TY  - JOUR
AU  - Bigdely-Shamlo, Nima
AU  - Mullen, Tim
AU  - Kothe, Christian
AU  - Su, Kyung-Min
AU  - Robbins, Kay A.
PY  - 2015
M3  - Methods
TI  - The PREP pipeline: standardized preprocessing for large-scale EEG analysis
JO  - Frontiers in Neuroinformatics
UR  - https://www.frontiersin.org/articles/10.3389/fninf.2015.00016
VL  - 9
SN  - 1662-5196
N2  - The technology to collect brain imaging and physiological measures has become portable and ubiquitous, opening the possibility of large-scale analysis of real-world human imaging. By its nature, such data is large and complex, making automated processing essential. This paper shows how lack of attention to the very early stages of an EEG preprocessing pipeline can reduce the signal-to-noise ratio and introduce unwanted artifacts into the data, particularly for computations done in single precision. We demonstrate that ordinary average referencing improves the signal-to-noise ratio, but that noisy channels can contaminate the results. We also show that identification of noisy channels depends on the reference and examine the complex interaction of filtering, noisy channel identification, and referencing. We introduce a multi-stage robust referencing scheme to deal with the noisy channel-reference interaction. We propose a standardized early-stage EEG processing pipeline (PREP) and discuss the application of the pipeline to more than 600 EEG datasets. The pipeline includes an automatically generated report for each dataset processed. Users can download the PREP pipeline as a freely available MATLAB library from http://eegstudy.org/prepcode.
ER  - 