AUTHOR=Crossa, José; Martini, Johannes W. R.; Gianola, Daniel; Pérez-Rodríguez, Paulino; Jarquin, Diego; Juliana, Philomin; Montesinos-López, Osval; Cuevas, Jaime
TITLE=Deep Kernel and Deep Learning for Genome-Based Prediction of Single Traits in Multienvironment Breeding Trials
JOURNAL=Frontiers in Genetics
VOLUME=10
YEAR=2019
URL=https://www.frontiersin.org/journals/genetics/articles/10.3389/fgene.2019.01168
DOI=10.3389/fgene.2019.01168
ISSN=1664-8021
ABSTRACT=Deep learning (DL) is a promising method for genomic-enabled prediction. However, implementing DL is cumbersome because many hyper-parameters (number of hidden layers, number of neurons, learning rate, number of epochs, batch size, etc.) need to be tuned. In contrast, deep kernel methods require only the number of layers to be specified, emulating DL models through covariance matrices that correspond to networks with a large number of neurons. In this research we compare the genome-based prediction accuracy of DL with that of a deep kernel (arc-cosine kernel, AK), the commonly used non-additive Gaussian kernel (GK), and the conventional additive Genomic Best Linear Unbiased Predictor (GBLUP/GB). We used two real wheat data sets to benchmark these methods. On average, AK and GK outperformed DL and GB. The gain in prediction performance of AK and GK over DL and GB was not large, but AK and GK have the advantage that only one parameter has to be tuned in each method: the number of layers (AK) or the bandwidth parameter (GK). Furthermore, although AK and GK had similar performance, the deep kernel AK is easier to implement than GK, since the number of layers is more easily determined than the bandwidth parameter of GK. Our results suggest that AK is a good alternative to DL, with the advantage that practically no tuning is required.
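
The abstract does not reproduce the kernel formulas, so the following is a minimal NumPy sketch of the order-1 arc-cosine kernel (Cho and Saul, 2009), the deep kernel referred to as AK, built recursively to emulate a network with a chosen number of hidden layers. The function name, arguments, and the assumption of a centered and scaled marker matrix `X` are illustrative placeholders, not the authors' implementation.

```python
import numpy as np

def arc_cosine_kernel(X, n_layers=1):
    """Order-1 arc-cosine kernel computed recursively over n_layers
    hidden layers. X is an (n_lines x n_markers) genotype matrix,
    assumed centered and scaled. Returns an (n_lines x n_lines) kernel."""
    # Base (layer-0) kernel: linear dot-product kernel between lines
    K = X @ X.T
    diag = np.sqrt(np.diag(K))
    norm = np.outer(diag, diag)          # sqrt(K(x,x) * K(y,y)) for each pair
    for _ in range(n_layers):
        # Angle between pairs of lines in the current feature space
        cos_theta = np.clip(K / norm, -1.0, 1.0)
        theta = np.arccos(cos_theta)
        # Order-1 angular function: J_1(theta) = sin(theta) + (pi - theta) cos(theta)
        J = np.sin(theta) + (np.pi - theta) * cos_theta
        # Recursive update of the kernel for the next layer
        K = (1.0 / np.pi) * norm * J
        diag = np.sqrt(np.diag(K))
        norm = np.outer(diag, diag)
    return K
```

As a hedged usage note, the resulting matrix K would be supplied as the genomic covariance (relationship) matrix of a GBLUP/RKHS-type mixed model in place of the additive GBLUP kernel or the Gaussian kernel; the only quantity to choose is `n_layers`, which is the practical advantage of AK highlighted in the abstract.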