C. Zhou, M. Prabhushankar, and G. AlRegib
This work leverages crowd-sourced practitioners to enhance seismic fault interpretation, reducing the need for costly expert annotations. By combining a small number of expert fault labels with a large number of practitioner fault annotations collected on the Amazon Mechanical Turk crowdsourcing platform, we achieve fault segmentation performance comparable to training on expert labels alone. Our framework first identifies the seismic sections that exhibit the highest labeling variation between the expert and practitioners, then allocates the remaining sections to practitioners, enhancing network performance with the practitioner-augmented labels.
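The section-selection step could look roughly like the minimal sketch below. The function names, the pixel-wise disagreement metric, and the dictionary data layout are illustrative assumptions for exposition, not the paper's actual implementation.

```python
import numpy as np

def labeling_variation(expert_mask, practitioner_masks):
    """Mean pixel-wise disagreement between the expert's fault mask and each
    practitioner's mask for a single seismic section (assumed metric).

    expert_mask: (H, W) binary array of expert fault picks.
    practitioner_masks: (P, H, W) binary array, one mask per practitioner.
    """
    disagreements = [np.mean(expert_mask != m) for m in practitioner_masks]
    return float(np.mean(disagreements))

def rank_sections_by_variation(expert_labels, practitioner_labels):
    """Rank section ids from highest to lowest expert-practitioner variation.

    expert_labels: dict {section_id: (H, W) expert mask}
    practitioner_labels: dict {section_id: (P, H, W) practitioner masks}
    """
    scores = {
        sid: labeling_variation(expert_labels[sid], practitioner_labels[sid])
        for sid in expert_labels
    }
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical usage: sections with the highest variation are flagged for
# expert labeling; the remaining sections are allocated to practitioners.
# ranked = rank_sections_by_variation(expert_labels, practitioner_labels)
# expert_sections, practitioner_sections = ranked[:k], ranked[k:]
```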