ORIGINAL RESEARCH article

Front. Plant Sci., 10 May 2019
Sec. Technical Advances in Plant Science
This article is part of the Research Topic: Advances in High-throughput Plant Phenotyping by Multi-platform Remote Sensing Technologies

High-Throughput Phenotyping of Fire Blight Disease Symptoms Using Sensing Techniques in Apple

  • 1Department of Biological Systems Engineering, Washington State University, Pullman, WA, United States
  • 2Donald Danforth Plant Science Center, St. Louis, MO, United States
  • 3Tree Fruit Research and Extension Center, Washington State University, Wenatchee, WA, United States
  • 4College of Information Science and Technology, Agricultural University of Hebei, Baoding, China

Washington State produces about 70% of the total fresh market apples in the United States. One of the primary goals of apple breeding programs is the development of new cultivars resistant to devastating diseases such as fire blight. The overall objective of this study was to investigate high-throughput phenotyping techniques to evaluate fire blight disease symptoms in apple trees. In this regard, normalized stomatal conductance data acquired using a portable photosynthesis system, image data collected using RGB and multispectral cameras, and visible-near infrared spectral reflectance acquired using a hyperspectral sensing system were independently evaluated to estimate the progression of fire blight infection in young apple trees. Sensors of varying complexity – from simple RGB to multispectral imaging to a hyperspectral system – were evaluated to select the most accurate technique for the assessment of fire blight disease symptoms. The proximal multispectral images and visible-near infrared spectral reflectance data were collected in two field seasons (2016, 2017), while proximal side-view RGB images and multispectral images acquired using unmanned aerial systems were collected in 2017. The normalized stomatal conductance data were correlated with disease severity rating (r = 0.51, P < 0.05). The features extracted from RGB images (e.g., maximum length of senesced leaves, area of senesced leaves, ratio between senesced and healthy leaf area) and multispectral images (e.g., vegetation indices) also demonstrated potential in evaluating disease rating (|r| > 0.35, P < 0.05). The average classification accuracy achieved using visible-near infrared spectral reflectance data during the classification of susceptible and symptomless groups ranged between 71 and 93% using partial least square regression and quadratic support vector machine. In addition, fire blight disease ratings were compared with normalized difference spectral indices (NDSIs) generated from visible-near infrared reflectance spectra. The selected spectral bands in the range of 710–2,340 nm used for computing NDSIs showed consistently higher correlation with disease severity rating than data acquired from RGB and multispectral imaging sensors across multiple seasons. In summary, these specific spectral bands can be used for evaluating fire blight disease severity in apple breeding programs and potentially as an early fire blight disease detection tool to assist in production systems.

Introduction

Apple (Malus pumila Mill.) belongs to the Rosaceae family, and is the most consumed and valued fruit crop in the United States (Lu and Lu, 2017) and other parts of the world. The United States is the second largest apple producer worldwide, and Washington State has the nation’s top apple production area. Washington State’s favorable climate with low humidity assists in the control of many of the typical apple diseases (Sutton, 1996). However, the introduction of new cultivars with different levels of disease susceptibility has revealed that in some growing seasons, the bacterial disease fire blight can be a problem. Fire blight is a major concern to commercial fruit production, as it results in significant production losses (Sutton, 1996; Salm and Geider, 2004). The causative agent of fire blight, Erwinia amylovora (Bereswill et al., 1995), can infect flowers, fruits, shoots, and the rootstock of the tree, potentially causing flower, tissue, and/or tree death (Norelli et al., 2003). E. amylovora uses wounds or natural openings, as well as nectarthodes, to enter the host (Vanneste, 2000).

Blackened crooked shoots (i.e., shepherd’s crook), bacterial ooze, necrotic leaves, and the formation of necrotic lesions and cankers are symptoms that characterize typical fire blight shoot infections (Van der Zwet et al., 2012). Fire blight infections can result in structural damage and potentially tree death. Several factors impact the incidence and severity of fire blight symptoms, such as environmental conditions, cultivar susceptibility, host vigor, amount of inoculum present, and management practices (Van der Zwet et al., 2012). Most modern commercial apple cultivars are susceptible to fire blight, and current control methods (e.g., pruning, antibiotics) are not effective against all disease phases (e.g., floral, shoot, rootstock) and/or are not sustainable. One major limitation of disease control is the ability of the pathogen to survive for long periods of time, becoming active when favorable weather conditions prevail. The use of antibiotics for control is limited both by legislation and by the need to prevent the development of resistant strains (Lespinasse and Aldwinckle, 2000). Breeding new apple cultivars that are resistant to fire blight is a logical progression to solve this issue (Kostick et al., 2019). The Washington State University (WSU) apple breeding program focuses on producing long-storing, high-quality cultivars developed for local production and has recently added resistance to fire blight as an important selection trait (Harshman et al., 2017).

Phenotyping fire blight susceptibility is challenging due to the large impacts of environment and host growth status on susceptibility, quantitative resistance, and the inconsistent nature of fire blight (Brown, 2012). Susceptibility to fire blight shoot infection has typically been evaluated by measuring the current season’s shoot length and the lesion (necrosis) or healthy tissue length. The proportion of healthy or necrotic tissue can then be calculated, which provides a standardized measure of infection (Durel et al., 2009). As such visual evaluations are labor intensive and subjective, the development of an accurate and rapid sensing technique for high-throughput screening of fire blight symptoms in apple would be beneficial.

In recent years, high-throughput phenotyping using non-invasive imaging and sensing systems has made advances toward evaluation of anatomical, physiological, and biochemical properties in plants (Mahlein, 2016). High-throughput phenotyping is currently being developed for grain crops such as wheat (Casadesús et al., 2007; Hosoi and Omasa, 2009; Römer et al., 2011), corn (Trachsel et al., 2011; Weber et al., 2012; Yang et al., 2014), and rice (Tanabata et al., 2012; Hairmansis et al., 2014). In spite of the progress in sensing systems, there are limited studies on optical spectrometric and imaging techniques for phenotyping diseases in trees. Current studies in sensor-based disease detection are focused on identifying onset of disease for management applications (Bock et al., 2008; Wijekoon et al., 2008; Neumann et al., 2014). Meanwhile, in breeding programs, the focus has been to quantify the extent of disease (resistant/susceptible) under inoculated conditions, excluding the genotypic differences in morphology.

In a controlled environment study (Delalieux et al., 2007), hyperspectral reflectance spectra were used to detect apple scab disease in tree leaves. Susceptible and resistant apple cultivars were inoculated with conidia of Venturia inaequalis. The study indicated that spectral reflectance in the 1,350–1,750 nm and 2,200–2,500 nm ranges was effective in distinguishing between healthy and infected leaves. The area under the receiver operating characteristic plot, indicated as the c-value, was used as a measure of discriminatory performance. Good predictability in the classification of infected and healthy trees was achieved using logistic regression, partial least squares logistic discriminant analysis, and tree-based modeling (c-value > 0.8). Similarly, Bauriegel et al. (2014) assessed the disease rating (caused by a fungal pathogen infection, Bremia lactucae) on 10 Butterhead and Batavia lettuce cultivars in a semi-controlled environment. Leaves were inoculated by spraying a conidia suspension, and the cultivars were rated visually for 14 days. A chlorophyll fluorescence imaging system was used to capture images of leaf discs each day after dark adaptation of the leaves for 10 min. The ratio of variable fluorescence to maximum fluorescence was calculated for each leaf disc. In pixel-wise analysis of the images, at 10–14 days after inoculation, a significant decrease in the ratio of up to 0.35 was observed in susceptible cultivars, while the ratio was 0.10 for resistant cultivars. The potential for rapid and automated detection of cultivar resistance using their sensing system was reported. Díaz-Varela et al. (2015) used unmanned aerial vehicle (UAV)-based imaging to estimate olive tree height and crown diameter for breeding applications. The results showed root mean square errors of 3–16% between the sensing data estimates and field measurements. Other studies in the literature (Gomez-Candon et al., 2014; Virlet et al., 2014) report the application of multispectral imaging (RGB, near infrared, and thermal infrared) to assess apple tree response to drought.

Diverse studies demonstrated the potential of sensing techniques for disease detection in both controlled environment and field conditions for precision agriculture applications. Khanal et al. (2017) reviewed the application of thermal sensing for stress monitoring and Dlamini et al. (2019) described numerous remote sensing indices that are relevant for disease mapping. Therefore, remote sensing techniques are widely employed in monitoring and managing crops (Usha and Singh, 2013). For instance, Salgadoe et al. (2018) found that the simplified ratio vegetation index (SRVI) extracted from the high resolution Worldview-3 satellite imagery was strongly correlated with four levels of disease severity resulting from Phytophthora root rot in avocado trees. Similarly, Zarco-Tejada et al. (2018) evaluated the performance of spectral signatures derived from airborne imaging spectroscopy and thermography in detecting the early stage of Xylella fastidiosa infection in olive trees. In addition, Al-Saddik et al. (2018) used red–green–blue and hyperspectral imaging to detect two grapevine diseases: Yellowness and Esca. In their study, texture and spectral features extracted from imaging data were able to classify healthy and infected grapevine leaves. Furthermore, the complex data acquired using these sensing techniques required advanced tools for data processing and analysis. Tzionas et al. (2005) extracted morphological features to classify leaves through image processing integrated feed forward neural network approaches. Machine learning methods improve disease detection accuracies, especially if the methods are integrated with hyperspectral data (Golhani et al., 2018) in a 3D environment (Roscher et al., 2016). Other studies reporting the use of machine learning and/or deep learning methods for disease detection/classification can be found in literature (Meunkaewjinda et al., 2008; Phadikar and Sil, 2008; Weizheng et al., 2008).

In this research, multiple sensing techniques at different scales (proximal and remote) were evaluated to assess fire blight infection levels in different apple cultivars, important breeding parents, and seedlings. Prior to utilizing sensor-based high-throughput phenotyping to assess disease severity in a specific crop (such as fire blight resistance in apple), it is important to evaluate and understand the benefits and limitations of each technique. Such evaluations are more challenging under field conditions, as multiple seasons of data are required to confidently evaluate sensors and to assess cultivar response to stress conditions. In this regard, the current study contributes to the application of sensor-based assessment of fire blight disease symptoms using several proximal and remote sensing technologies for high-throughput phenotyping in apple under field conditions. Such studies on fire blight symptoms or other disease evaluations in apple breeding programs have not been reported. Studies on disease detection for precision agriculture applications (e.g., crop management) may not be applicable to breeding programs, as those studies often focus on one or a few varieties and on early detection of diseases, while tens to hundreds of varieties are assessed on a disease severity scale in breeding studies. Therefore, the three major sensing approaches of varying complexity that were independently evaluated in this study were: (1) side-view RGB imaging to detect necrosis extent in multiple cultivars; (2) top-view remote sensing using multispectral imaging at different scales; and (3) proximal hyperspectral sensing (350–2,500 nm). In addition to disease rating by manual methods, stomatal conductance was measured to understand the physiological changes in the canopy upon infection.

Materials and Methods

Plant Materials and Inoculation

Data collection was performed on a set of 72 unique individual apple trees (e.g., cultivars, important breeding parents, seedlings). The trees were part of a larger planting located at the WSU Columbia View Orchard, Douglas County, WA, United States (47°33′52.1″N, 120°14′47.3″W). They were budded onto M.111 rootstocks and planted with 82 cm spacing between trees. Full details of the germplasm, orchard establishment, and maintenance can be found in Kostick et al. (2019). Freeze-dried E. amylovora 153n was used for inoculation, and the inoculum was diluted using 0.05 M dibasic phosphate buffer (pH 7) to a concentration of 5 × 10⁸ CFU mL⁻¹ in 2016 and 1 × 10⁹ CFU mL⁻¹ in 2017. Three to ten independent, actively growing shoots with ideally ≥15 cm of growth were chosen per tree for inoculation on April 28–29, 2016 and May 18, 2017. In 2017, 68 trees were evaluated, as four trees died. Kostick et al. (2019) describes the inoculation procedure in detail.

Data collection using multiple sensors (Supplementary Figure S1 and Supplementary Table S1) was performed as described in detail in the following sections. Most of the sensor data collection was performed at the complete disease development stage, which corresponded to about 40 days after inoculation (DAI; 9 June 2016 and 27 June 2017). The hyperspectral reflectance data were also collected at the mid-disease development stage at about 23 DAI (19 May 2016 and 13 June 2017). Visible development of disease symptoms ceased at about 40 DAI, and the disease symptoms were manually/visually rated at that time as described in the following section.

Disease Severity Rating and Stomatal Conductance

Several disease severity rating systems are used to quantify the severity of fire blight infection (Van Der Zwet and Keil, 1979). These ratings are based on the proportion of the current season’s growth that was blighted or healthy, the percentage of shoots that were infected per tree (i.e., incidence), and the age of wood that a lesion progressed into (i.e., infected wood age). As ground reference data (both years), the length of the current season’s shoot growth was measured at inoculation in 2016 or at the time of lesion measurement in 2017. As described by Kostick et al. (2019), disease progression within the current season’s shoot growth was evaluated by measuring the length of non-necrotic (i.e., healthy) tissue in 2016 or the length of necrotic fire blight lesions in 2017. From these measurements, the proportion of healthy tissue was calculated for a given shoot and averaged for each tree. These disease severity ratings are described henceforth as the proportion of shoot length blighted. Each shoot was also rated based on the age of wood that the lesion extended into, with 0, 1, 2, and 3 representing no infection, first year, second year, and third year wood, respectively. When a lesion extended into the previous season’s wood or into third year wood, the response was considered highly susceptible. In symptomless responses, no lesions were visible and only minor responses were observed on the inoculated leaves. In 2016, the average disease severity rating for each tree was used as ground reference data; in 2017, ratings of the individual shoots used for hyperspectral data collection served as ground reference data during hyperspectral data analysis, while the average disease severity rating per tree was used during RGB and multispectral image data analysis.
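
As a brief illustration of how these shoot-level measurements translate into the per-tree severity measures described above, the sketch below uses hypothetical values and column names; it is a simplified example and not the authors' own workflow.

```python
# Minimal sketch (assumed field names and values) of per-tree disease severity:
# shoot length, necrotic lesion length, and wood-age rating (0-3) per shoot.
import pandas as pd

shoots = pd.DataFrame({
    "tree_id":         [1, 1, 1, 2, 2],
    "shoot_len_cm":    [32.0, 28.5, 40.0, 25.0, 30.0],
    "lesion_len_cm":   [10.0, 28.5, 5.0, 0.0, 2.0],
    "wood_age_rating": [1, 2, 0, 0, 0],
})

# Proportion of shoot length blighted, per shoot, then averaged per tree,
# along with the average wood-age rating per tree.
shoots["prop_blighted"] = shoots["lesion_len_cm"] / shoots["shoot_len_cm"]
per_tree = shoots.groupby("tree_id").agg(
    mean_prop_blighted=("prop_blighted", "mean"),
    mean_wood_age_rating=("wood_age_rating", "mean"),
)
print(per_tree)
```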

In 2017, a portable photosynthesis system (LI-6800, LI-COR Biosciences, Lincoln, NE, United States) was used to collect stomatal conductance data at the complete disease development stage (about 40 DAI) from 20 trees (a subset of the 72). The rationale behind the use of this system was to evaluate whether physiological measurements such as stomatal conductance could be used as an alternative to disease severity rating. Data were collected from three healthy and three inoculated leaf samples from each of the 20 trees. The stomatal conductance data were normalized by subtracting the values of the inoculated leaves from those of the healthy leaves in each cultivar to eliminate the cultivar effect on stomatal conductance. The stomatal conductance of healthy leaves ranged between 186 and 473 mmol m⁻² s⁻¹.
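
The sketch below illustrates, with made-up numbers, the normalization just described: per tree, the mean stomatal conductance of inoculated leaves is subtracted from that of healthy leaves, and the resulting values are correlated with disease rating. The arrays and ratings are hypothetical placeholders; the study performed its correlation analyses in R.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-tree measurements: three healthy and three inoculated
# leaves per tree (mmol m-2 s-1), plus a wood-age disease rating per tree.
healthy = np.array([[410, 395, 420], [310, 300, 295], [250, 240, 260], [360, 355, 350]])
inoculated = np.array([[300, 310, 290], [280, 270, 275], [150, 160, 155], [340, 345, 330]])
rating = np.array([2, 1, 3, 0])

# Normalized stomatal conductance: healthy mean minus inoculated mean, per tree.
normalized_gs = healthy.mean(axis=1) - inoculated.mean(axis=1)
r, p = pearsonr(normalized_gs, rating)
print(f"r = {r:.2f}, P = {p:.3f}")
```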

RGB Image Acquisition, Image Processing, and Feature Extraction

In regard to the sensor data, sensors of varying complexity – from simple RGB to multispectral imaging to a hyperspectral system – were independently evaluated. This is important for practical application in disease symptom evaluation, as the sensor should be easy to use and the data processing should be simple for breeders to adopt the technology for high-throughput phenotyping. The rationale behind the use of the RGB imaging system to evaluate disease symptoms was that the disease rating was directly associated with visible symptoms and the measured length of necrotic fire blight lesions, which could be captured using RGB imaging. In regard to the use of multispectral imaging at different scales (proximal and remote) for disease resistance evaluation, it was hypothesized that, although only a limited number of shoots were inoculated within the tree canopy, the presence of the fire blight pathogen may induce an overall canopy stress response that can be captured using generic vegetation indices extracted from multispectral images. The rationale behind the use of the hyperspectral system was to capture the entire spectral reflectance response of the leaves to the disease, so as to derive novel spectral indices that could be translated into customized multispectral imaging sensors integrated with ground or aerial platforms in the future. The details of multispectral and hyperspectral data collection and analysis are described in the following sections.

An RGB digital camera (Canon PowerShot SX260HS, Carlstadt, NJ, United States) with a resolution of 4000 × 3000 pixels was used to capture side-view images of the trees with a white background board placed behind each tree (in 2017). The reference panel was placed at the tree trunk, and the distance between the camera and the tree was maintained at around 1.5 m. During image processing, the first step was to observe the image reflectance value patterns (Figure 1a) in each band across green leaves, senesced leaves, and shoots in Matlab®. This was important to separate the abaxial leaf surfaces from partially senesced leaves. Upon optimization of the image processing protocol, wilted and necrotic leaves could be separated from healthy leaves using defined red, green, and blue channel reflectance values (Eq. 1).

I(i,j) = infected pixels, if 60 > (R(i,j) - G(i,j)) > 0 and (G(i,j) - B(i,j)) > 0
I(i,j) = background pixels, otherwise          (1)

Figure 1. (a) Original RGB side-image with red, green, and blue lines in right side showing the relative gray scale intensities in the R, G, and B channels, respectively, of the yellow line highlighted in the image; (b) processed RGB image where the senesced leaves identified during image processing are marked in yellow; and (c) processed RGB image where the healthy leaves identified during image processing are marked in red. The total number of pixels representing senesced and healthy leaves, and maximum length of senesced leaf area were used for feature extraction.

where R(i,j), G(i,j), and B(i,j) are the red, green, and blue pixel intensities (0–255), respectively, and (i,j) represents the pixel coordinates.

Using this method, all pixels in the image were scanned and infected parts of the leaves were identified (yellow area in Figure 1b). Following this step, a region of interest was defined to eliminate the background. Healthy leaves in the canopy (Figure 1c) were identified using Eq. (2), with a manually selected threshold of 20 that was optimized during image processing for distinguishing the background from the area of interest:

I(i,j) = healthy leaf pixels, if (G(i,j) > R(i,j)) and (G(i,j) - B(i,j)) > 20          (2)

Finally, the total numbers of senesced and healthy pixels were computed and compared with the ground reference data. In addition, the length of infected shoots was calculated by selecting the two endpoints of the longest infected shoot; the coordinates of these two endpoints were obtained and the length of the infected shoot was computed. The three extracted features (maximum senesced shoot length, total senesced shoot leaf area, and ratio of senesced shoot leaf area to healthy/green shoot leaf area) were compared with the average senesced shoot length per tree, average proportion of shoot length blighted, and average tree disease severity rating based on wood age.
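
As a minimal sketch of the pixel rules in Eqs. (1) and (2) and the subsequent feature extraction, the Python code below applies the stated thresholds to a side-view image loaded as a NumPy array. The file name is hypothetical, and the maximum senesced length is approximated here by a bounding-box diagonal rather than the manually selected shoot endpoints used in the study (which implemented these steps in Matlab®).

```python
import numpy as np
from PIL import Image

# Load a hypothetical side-view RGB image; cast to int to avoid uint8 overflow.
img = np.asarray(Image.open("tree_side_view.jpg")).astype(int)
R, G, B = img[..., 0], img[..., 1], img[..., 2]

# Eq. (1): senesced/infected leaf pixels.
senesced = (R - G > 0) & (R - G < 60) & (G - B > 0)
# Eq. (2): healthy (green) leaf pixels.
healthy = (G > R) & (G - B > 20)

senesced_area = int(senesced.sum())           # total senesced leaf area (pixels)
healthy_area = int(healthy.sum())             # total healthy leaf area (pixels)
ratio = senesced_area / max(healthy_area, 1)  # senesced-to-healthy area ratio

# Proxy for maximum length of the senesced shoot: diagonal of the bounding box
# of senesced pixels (the study selected the two shoot endpoints manually).
ys, xs = np.nonzero(senesced)
max_length = float(np.hypot(ys.max() - ys.min(), xs.max() - xs.min())) if ys.size else 0.0
print(senesced_area, healthy_area, ratio, max_length)
```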

Multispectral Image Acquisition, Image Processing, and Feature Extraction

Multispectral images were acquired at two scales: 7 m above ground level (AGL) and 100 m AGL. An agricultural utility vehicle (AUV, John Deere Gator™ XUV590i, John Deere, Moline, IL, United States) with a retractable mast (FM50-25, Floatagraph Technologies, Santa Barbara, CA, United States) and a camera mount was used to capture top-view multispectral images of the trees (in 2016 and 2017, 7 m AGL). A modified multispectral 3-band digital camera (Canon ELPH110 HS, Carlstadt, NJ, United States) with a resolution of 4608 × 3456 pixels and the red channel replaced with a NIR channel (680–800 nm) was mounted on the camera mount on the platform mast. The AUV was driven along the rows at a speed of 0.5 m/s. The camera was operated under ambient light conditions and was equipped with an SD card for data storage. In 2017, aerial multispectral images were collected using an unmanned octocopter (ARF OktoXL 6S12, HiSystems GmbH, Moormerland, Germany) integrated with a multispectral camera (RedEdge, Micasense, Seattle, WA, United States) to capture red (R), green (G), blue (B), red edge (RE), and near infrared (NIR) band images. The imaging altitude was 100 m AGL. All spectral reflectance data were radiometrically corrected using a reference panel (Spectralon Reflectance Target, Labsphere®, North Sutton, NH, United States) placed within the camera’s field of view.

Image processing and analysis were performed in Matlab®. For the AUV-based multispectral images, five major steps of image processing were followed: (i) images were separated into individual bands and radiometrically corrected to compensate for incident light variation during imaging; (ii) the corresponding vegetation index images were extracted using the Matlab® Image Processing Toolbox (Mathworks, Natick, MA, United States); (iii) the soil background and shadows were eliminated using a combination of k-means clustering and thresholding methods that discriminate the regions of interest (trees) from the background; (iv) individual trees were segmented from the preprocessed image; and (v) the green normalized difference vegetation index (GNDVI) was extracted from the segmented trees and the data were recorded by matching tree IDs. Figure 2 describes these steps in detail, and a simplified sketch of the pipeline is given after the figure caption.

Figure 2. Data processing steps used to process AUV-based multispectral images.
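
The following Python sketch illustrates steps (i)–(v) in simplified form, assuming the NIR and green band arrays have already been extracted from the modified camera’s images. The arrays, reference-panel region, and cluster count are hypothetical, and the k-means/thresholding step is a generic stand-in for the background-removal procedure described above; the study implemented these steps in Matlab®.

```python
import numpy as np
from sklearn.cluster import KMeans

def radiometric_correction(band, panel_pixels, panel_reflectance=0.99):
    """Scale digital numbers so the reference panel reads its known reflectance."""
    return band * (panel_reflectance / panel_pixels.mean())

def gndvi(nir, green):
    """Green normalized difference vegetation index (Eq. 3)."""
    return (nir - green) / (nir + green + 1e-9)

# Hypothetical raw band arrays; the top-left corner stands in for the panel.
nir_raw = np.random.rand(600, 800) * 255
green_raw = np.random.rand(600, 800) * 255
nir = radiometric_correction(nir_raw, nir_raw[:20, :20])
green = radiometric_correction(green_raw, green_raw[:20, :20])

vi = gndvi(nir, green)

# Background/shadow removal: cluster pixels on (GNDVI, NIR) and keep the
# cluster with the highest mean GNDVI as vegetation.
features = np.column_stack([vi.ravel(), nir.ravel()])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
veg_cluster = max(range(3), key=lambda k: vi.ravel()[labels == k].mean())
veg_mask = (labels == veg_cluster).reshape(vi.shape)

# Per-tree GNDVI would then be averaged within each segmented tree region;
# here a single vegetation region is summarized.
print("mean canopy GNDVI:", vi[veg_mask].mean())
```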

For the aerial images, the mosaic of the NIR band was selected as the reference image to align the images in other bands. Mosaics of the red, green, blue, and red edge bands were geometrically corrected to match spatially with the NIR band. This process was performed using the “georeference tool” in QGIS software (QGIS Development Team, Graphic Information System, Open Source Geospatial Foundation Project1). With the same software, the five bands were merged into one color composite image following their conventional order: (1) red, (2) green, (3) blue, (4) red edge, and (5) NIR, using the “Build Virtual Raster” command. In AutoCAD (Autodesk Inc., San Rafael, CA, United States), vector layers were created using the “Polyline” command to isolate the area of each tree. For this purpose, a polygon surrounding the entire tree canopy area was developed. The canopy area polygon was then segmented into smaller polygons of 0.79 cm length, each representing a tree. The sum and average tree GNDVI, normalized difference red edge index (NDRE), and normalized difference vegetation index (NDVI) were extracted using the “Zonal Statistics” plugin in QGIS. The following equations describe the vegetation indices:

GNDVI = (NIR - G) / (NIR + G)          (3)
NDRE = (NIR - RE) / (NIR + RE)          (4)
NDVI = (NIR - R) / (NIR + R)          (5)

where NIR, G, RE, and R refer to the digital numbers representing reflectance in the near infrared, green, red edge, and red bands, respectively. The NDVI (Rouse et al., 1973) and GNDVI (Gitelson and Merzlyak, 1998) are broadband greenness indices that represent overall canopy vigor or greenness, while NDRE (Gitelson and Merzlyak, 1994; Sims and Gamon, 2002) is a narrowband greenness index that represents the canopy stress response. In addition, NDRE also captures differences in foliage content and senescence, which could be useful in capturing disease symptoms. Vegetation index data were correlated with the ground reference data (average disease severity rating based on wood age) during analysis. A sketch of how these indices and the per-tree zonal statistics can be computed is given below.
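
As a simplified illustration (not the QGIS/AutoCAD workflow used in the study), the sketch below computes Eqs. (3)–(5) from a co-registered five-band stack and summarizes them per tree, assuming the tree polygons have been rasterized into an integer label image; all arrays here are hypothetical placeholders.

```python
import numpy as np

def nd_index(a, b):
    """Generic normalized difference index (a - b) / (a + b)."""
    return (a - b) / (a + b + 1e-9)

bands = np.random.rand(5, 400, 400)             # hypothetical reflectance stack
R, G, B, RE, NIR = bands                        # conventional band order
tree_ids = np.random.randint(0, 4, (400, 400))  # hypothetical rasterized tree polygons (0 = background)

gndvi, ndre, ndvi = nd_index(NIR, G), nd_index(NIR, RE), nd_index(NIR, R)

# Zonal statistics analog: mean and sum of each index within each tree polygon.
for tree in np.unique(tree_ids):
    if tree == 0:
        continue  # skip background
    mask = tree_ids == tree
    print(tree, gndvi[mask].mean(), ndre[mask].mean(), ndvi[mask].sum())
```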

Hyperspectral Data Acquisition, Processing, and Feature Extraction

In addition to the remote sensing data, proximal visible-near infrared (Vis-NIR) reflectance spectra were captured from the leaves of the trees under study using a spectroradiometer (SVC HR-1024i, Spectra Vista Corporation, Poughkeepsie, NY, United States) in 2016 and 2017. This hyperspectral system measures reflectance in the range of 350–2,500 nm (992 channels overall) with a spectral resolution of ≤3.5 nm at 700 nm, ≤9.5 nm at 1,500 nm, and ≤6.5 nm at 2,100 nm. A leaf clip connected to the instrument through a fiber optic channel was used during data collection. Hyperspectral data were collected from two inoculated shoots on each tree (one young leaf on the shoot tip in the vicinity of the inoculated leaf and one mature leaf from the new season’s growth). During analysis, the average disease rating was utilized in 2016, while in 2017 the disease rating for the measured shoots was used.

The hyperspectral reflectance data captured using the spectroradiometer were radiometrically corrected, normalized (Sankaran et al., 2011), and binned at 10 nm intervals. Two models, partial least square regression (PLSR) and quadratic kernel support vector machine (QSVM), were applied to classify the spectra into the four classes 0, 1, 2, and 3 representing the disease rating (ground reference data), by separating the dataset into train and test sets at a 3:1 ratio after randomization. PLSR is a regression model that takes the structures of both the explanatory and response variables into account; the variables are decomposed into latent structures in an iterative manner. QSVM uses a quadratic kernel to transform the data into a higher-dimensional space and then defines a linear decision boundary. For two-class classification, the 0 and 1 ratings and the 2 and 3 ratings were combined. For three-class classification, only the classes 2 and 3 were combined. Overall classification accuracy was computed to assess performance for the four-class, three-class, and two-class classifications. A sketch of this classification step is given below.
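
The sketch below shows, under stated assumptions, how such a classification could be set up in Python with scikit-learn: PLSR is used here in a discriminant fashion (the rating is regressed and the prediction rounded), and the quadratic SVM is approximated with a degree-2 polynomial kernel. The spectra and ratings are randomly generated placeholders, and the study did not necessarily use this library.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.random((140, 214))     # hypothetical normalized, 10 nm-binned spectra
y = rng.integers(0, 4, 140)    # hypothetical wood-age ratings (0-3)

# 3:1 train/test split after randomization.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# PLSR used as a discriminant model: regress the rating, round to a class.
pls = PLSRegression(n_components=10).fit(X_tr, y_tr)
y_pls = np.clip(np.rint(pls.predict(X_te)).astype(int).ravel(), 0, 3)

# Quadratic-kernel SVM (degree-2 polynomial kernel).
qsvm = SVC(kernel="poly", degree=2).fit(X_tr, y_tr)
y_svm = qsvm.predict(X_te)

print("PLSR accuracy:", accuracy_score(y_te, y_pls))
print("QSVM accuracy:", accuracy_score(y_te, y_svm))
```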

Following the hyperspectral data-based classification, the binned reflectance data were utilized to generate normalized difference spectral indices (NDSIs) (Inoue et al., 2008), which represent every possible pairwise combination of reflectance wavelengths using the following equation:

NDSI[i,j] = (Ri - Rj) / (Ri + Rj)          (6)

where R refers to the reflectance data, and i and j refer to specific spectral bands. These spectral indices evaluate novel combinations of spectral bands, from which the spectral ratios most closely related to disease symptoms were selected. This would increase the accuracy of disease symptom assessment using a few spectral bands rather than the entire hyperspectral dataset.

Normalized difference spectral indices were extracted from the Vis-NIR spectral reflectance data of mature leaves from inoculated trees at the mid- and complete-disease development stages for two consecutive years. Considering the 214 spectral features after binning, 45,796 NDSIs were calculated for each spectrum and correlated with the ground reference measurements for each tree. NDSIs with correlation coefficients above 0.35 were selected for further analysis. To remove data redundancy from the selected NDSIs, stepwise regression analysis was applied to each dataset. This method is a variable selection procedure for independent variables; it consists of a series of steps that evaluate each variable against a defined criterion in order to decide whether it should be selected. In this study, only the NDSIs with the highest correlations were finally selected and redundant indices were removed. Figure 3 describes the data processing steps used for hyperspectral reflectance data analysis, and a sketch of the exhaustive NDSI search is given below. All Pearson’s correlation analyses between extracted features and ground reference data were performed in the R program (version 3.1.1, R Foundation for Statistical Computing, Vienna, Austria).

Figure 3. Data processing steps used to process hyperspectral reflectance data.
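
As a rough sketch of the exhaustive NDSI search (Eq. 6) and the |r| > 0.35 screening, the Python code below evaluates all unordered band pairs (NDSI[j,i] is simply the negative of NDSI[i,j], so its correlation magnitude is unchanged). The spectra and ratings are random placeholders, and the subsequent stepwise-regression pruning performed in R is not reproduced here.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_trees, n_bands = 68, 214
spectra = rng.random((n_trees, n_bands))    # hypothetical 10 nm-binned spectra
rating = rng.integers(0, 4, n_trees)        # hypothetical disease severity ratings

selected = []
for i in range(n_bands):
    for j in range(i + 1, n_bands):
        # Eq. (6): normalized difference spectral index for bands i and j.
        ndsi = (spectra[:, i] - spectra[:, j]) / (spectra[:, i] + spectra[:, j] + 1e-9)
        r, _ = pearsonr(ndsi, rating)
        if abs(r) > 0.35:
            selected.append((i, j, r))

print(f"{len(selected)} candidate NDSIs with |r| > 0.35")
```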

Results

Disease Rating and Stomatal Conductance

The proportion of shoot length blighted (Norelli et al., 2003; Khan et al., 2006; Durel et al., 2009) and the age of infected wood (Harshman et al., 2017) have been used for fire blight phenotyping in different studies. In this study, the length of healthy tissue on inoculated shoots was measured right after inoculation and at the complete disease development stage. Disease severity was also rated according to the age of wood infected on each inoculated shoot; the tree average was considered as the actual disease severity rating. Ratings based on the proportion of shoot length blighted were highly correlated with the disease severity rating based on the age of wood infected in 2016 (r = 0.96) and 2017 (r = 0.93). Therefore, for most parts of this study, disease severity based on the age of infected wood was used as the ground reference measure (Figure 4). While analyzing the side-view RGB images, the proportion of shoot length blighted was also considered, as it was more closely related to the image features.

Figure 4. Distribution of samples in four disease severity rating classes based on the age of the wood infection in 2016 and 2017.

Stomatal conductance measurements were collected from inoculated and healthy shoots in the experimental trees. How the bacteria travel through the plant tissue has yet to be fully determined; however, there is reasonable evidence that E. amylovora travels through intercellular spaces, as described in a review paper (Billing, 2011). We anticipated that, as this process of infection can affect water use efficiency, leaf stomatal conductance could be used to evaluate disease progression. Data were analyzed by computing the difference between the stomatal conductance data from healthy and inoculated shoots within the same tree (to normalize the data for variety differences in stomatal conductance) and correlating this normalized stomatal conductance difference with the disease severity rating. The results showed a statistically significant correlation among these parameters (r = 0.51, P < 0.05), with an increase in normalized stomatal conductance values (representing larger differences between data from inoculated and healthy shoot leaves) associated with increased disease severity rating based on wood age.

RGB Side Imaging

Three features were extracted from the RGB side-images (2017): (1) maximum length of shoots with senesced leaves (pixels), (2) total area of senesced leaves (pixels), and (3) ratio of senesced to healthy leaf area. These three features were compared to the actual lesion length, proportion of shoot length blighted, and disease severity rating based on wood age. The correlation coefficients were significant (P < 0.01) and are summarized in Table 1. In general, all RGB image features were correlated with the ground reference data. Amongst these features, the strongest correlation coefficient of 0.51 was found between the maximum length of shoot with senesced leaves as measured using RGB imaging and the total lesion length. The direct relationship between these two measures could be the reason for the high correlation.

Table 1. Correlation coefficient (r) between RGB image features and disease severity rating based on wood age in 2017.

Multispectral Imaging

Multispectral imaging at multiple scales was evaluated to measure overall tree stress resulting from the inoculation. Although only the inoculated shoots (limited in comparison to the overall branches on the tree canopy) were expected to experience necrosis during disease development, we anticipated that this process would induce stress at the canopy level, which could be measured using remote sensing. During AUV-based multispectral imaging (2016 and 2017), in some cases, a similar trend between disease rating and GNDVI data was observed, even if the pattern was not consistent. Figure 5 shows the color map of a few sample trees from data collected in 2016, where the ground reference rating refers to the disease rating and the observer rating refers to a rating by a non-expert evaluator (S. Jarolmasjed). In 2016, a significant correlation between GNDVI and disease severity rating was observed (r = -0.38, P < 0.01), which was absent in 2017 (r = -0.22, P = 0.08, outliers were removed). Higher canopy volume in 2017 may have contributed to minor errors in tree segmentation or masking of symptoms, which could have led to these results.

Figure 5. Color map showing disease rating and GNDVI data of representative 2016 diseased trees. Ground reference rating refers to disease rating; while observer rating refers to non-expert evaluator. Green box represents similar disease rating-GNDVI trend and red box represents dissimilar disease rating-GNDVI trend.

In regard to the UAV-based multispectral images (2017), the average and sum vegetation index values (Figure 6) were extracted for individual segmented trees. These features were significantly (Table 2, P < 0.01) correlated with the disease severity rating. Sensors used for phenotyping disease resistance should ideally be able to capture subtle differences in disease symptoms. It was interesting to note that although the AUV-based multispectral images at higher resolution could not capture canopy health differences using vegetation indices, the aerial images showed a trend similar to that observed using AUV-based multispectral images in 2016. Higher canopy vigor in 2017 (in comparison to 2016), combined with enhancement of image noise, could have resulted in the lack of correlation between AUV image-based GNDVI data and visual rating in 2017.

Figure 6. The original RGB, GNDVI, NDVI, and NDRE images of trees evaluated in 2017. The scales in GNDVI, NDVI, and NDRE represent the range of vegetation index data. The heterogeneity in tree canopy was a function of growth and disease status of each tree. The black vector superimposed on the original and vegetation index images represent segmentation of each tree.

Table 2. Correlation coefficient (r) between UAV-based multispectral image features and disease severity rating based on wood age in 2017.

Hyperspectral Spectrometry

The overall classification accuracies with PLSR and QSVM are summarized in Table 3. The purpose of the classification was to observe the difference in Vis-NIR spectral reflectance patterns between different disease ratings. The results indicated that the spectra are capable of delineating the fire blight disease rating in shoots. In general, classification accuracies were higher in 2017 than in 2016, which could result from differences in disease development between the two years. In general, the classification of two classes was more accurate than that of four classes. Observing the confusion matrix (Figure 7), it was found that the classification accuracies with four classes were often lower because 0 and 1 ratings were often misclassified as 1 and 0 ratings, respectively, while 2 and 3 ratings were misclassified as 3 and 2 ratings, respectively.

Table 3. Overall classification accuracies computed using two models partial least square regression (PLSR) and quadratic support vector machine (QSVM).

Figure 7. Confusion matrices of classification accuracies computed using quadratic support vector machine. The two classes represent 0 and 1 rating as class 0 and 2 and 3 rating as class 1; while four classes represent rating 0, 1, 2, and 3 as four classes.

In regard to the NDSI selection, mature leaf spectra were used for analysis, as their spectral pattern was considered to be more stable during the season and disease development. Figures 8, 9 show the distribution of correlation coefficients between the NDSIs generated from the entire spectra and the disease rating using the mid-season and end of disease development datasets for 2016 and 2017. One interesting observation from these data is the consistency in the relationship (correlation) pattern across the two seasons, at both mid-season and the end of the disease development period. This indicates that fire blight infection may progress in a predictable manner. Moreover, stronger correlations between NDSIs and disease rating were found at the end of the season than at mid-season, which confirms the development of symptoms at the end of the disease development phase. Using the raw NDSI dataset, stepwise regression was applied to select NDSIs that were significantly correlated with disease rating within a season (excluding visible bands). The final set of selected NDSIs is reported in Table 4. All NDSIs were significantly correlated with the visual ratings. Figure 10 shows boxplots of representative NDSIs.

Figure 8. Correlation between NDSIs generated using hyperspectral data collected during the mid-season and disease rating in 2016 and 2017. The color scale represents Pearson’s correlation coefficients (r).

Figure 9. Correlation between NDSIs generated using hyperspectral data collected during the end of disease development and disease rating in 2016 and 2017. The color scale represents Pearson’s correlation coefficients (r).

Table 4. Selected wavelength combinations used in the normalized difference spectral index (NDSI) resulting from feature selection from the 2016 and 2017 datasets.

Figure 10. Boxplots of selected NDSIs generated using 2016 and 2017 end of disease development phase datasets. The visual rating refers to disease severity rating by wood age.

Discussion

Apple breeders need to evaluate the performance of their germplasm under different conditions in order to select varieties that are more resistant to disease and other conditions, which is also an important aspect of integrated pest management (Lespinasse and Aldwinckle, 2000). In this study, WSU apple germplasm was inoculated with the bacterium causing fire blight (E. amylovora), a potentially devastating pathogen. Fire blight causes significant production losses for the commercial fruit industry worldwide, and consequently, resistant cultivars are sought to save crops from its devastating effects. Typically, phenotyping susceptibility to fire blight shoot infection is performed by estimating incidence under natural infection pressure or measuring shoot lesion length after artificial inoculation. These options are subjective and labor intensive (Lespinasse and Aldwinckle, 2000). For these reasons, high-throughput phenotyping techniques offer standardization of the disease rating process in an accurate and rapid manner. In this study, multiple sensing systems of varying complexity (RGB, multispectral, and hyperspectral sensing systems) were evaluated to select the most effective and accurate method for disease severity evaluation in apple. The benefits and limitations of each method are reported in Supplementary Table S2.

In regard to RGB imaging, a canopy trait such as senescence can be easily captured (Ahmad and Reid, 1996; Laliberte et al., 2007). RGB images were explored to estimate the senesced leaf area from the captured data, and the extracted features showed significant correlation with disease rating. One major limitation in the throughput of this method was the placement of the white background board. The accuracy and throughput of such evaluations can be further enhanced with the use of an automated phenotyping system that covers the canopy for controlled lighting and imaging conditions, in addition to accurate estimation of the distance between the canopy and the sensor using a 3D camera or other time-of-flight sensors. Moreover, the image processing protocol utilized in this study could not delineate minor differences between branches/shoots and senesced leaves, although noise from other tissues could be eliminated. Even if this noise was minimal, integrating image processing techniques such as the Hough transform and machine learning algorithms can potentially increase the accuracy of this technique. Nevertheless, other stress conditions influencing leaf color will affect the accuracy of this method.

Multispectral cameras were also integrated with multiple platforms (AUV, UAV) to capture images in order to phenotype disease rating and assess fire blight symptoms. The vegetation indices extracted from multispectral images acquired from AUV (GNDVI, 2016) and UAV (GNDVI, NDRE, NDVI; 2017) showed weak (yet significant) correlation with disease rating. In 2017, correlation between GNDVI and disease severity rating was absent, which could be associated with segmentation issues during image processing as reported in Laliberte and Rango (2009). Overlapping shoots could not be detected from the images in some cases (especially when the canopy vigor was high) as the trees were planted at high density.

The significant correlation between the vegetation indices extracted from remote sensing data and the disease severity rating showed the capability of vegetation indices to detect differences in canopy reflectance between symptomless and susceptible trees, although with low sensitivity (Gröll et al., 2007). The results observed in our study were in contrast with those reported in Naidu et al. (2009), where significant differences between the vegetation indices of non-infected grapevine plants and those infected with leafroll-associated virus-3 were not found. On the other hand, Mahlein et al. (2013) achieved an accuracy of 80% while classifying Cercospora leaf spot infected and healthy leaves using NDVI in sugarbeet. In the same crop, a correlation coefficient r of -0.89 was found between NDVI and diseased leaf severity (Cercospora leaf spot, powdery mildew; Mahlein et al., 2010). The efficacy of plant disease detection using vegetation indices depends on the crop, pathogen, and symptoms. Although AUV- and UAV-based multispectral images may provide information on canopy health, the limited number of spectral bands will limit their application in disease-specific evaluation, which can be better achieved with data captured over a broader visible-near infrared spectral reflectance range than with vegetation indices (Montesinos-López et al., 2017). The remote sensing technique can be used for scouting for fire blight disease incidence in production systems, which can be followed by pathological evaluation of samples.

At times, vegetation indices do not show a relationship with crop stress (Eitel et al., 2008; Jarolmasjed et al., 2018). NDVI has also been reported to record plant responses to water stress and to be indicative of chlorophyll content and yield (Ceccato et al., 2002; Gitelson et al., 2003; Eitel et al., 2008). Moreover, the specific absorption coefficient of chlorophyll in the red channel is high, and when combined with the low depth of light penetration into the leaf, the index can saturate. For a similar reason, NDVI cannot capture stress responses from a canopy with a higher leaf area index (Gitelson et al., 2003). In this regard, hyperspectral sensing offers a more specific evaluation of a given disease.

Visible-near infrared spectrometry was used to capture the spectral reflectance of the leaves and create a set of novel spectral indices (NDSIs) representing fire blight disease symptom progression. Such indices were created utilizing the high-spectral-resolution visible-near infrared reflectance data (Inoue et al., 2008). Other feature extraction methods reported in the literature (Sun et al., 2017; Sun and Du, 2018) can also be explored for hyperspectral data analysis. In this study, the NDSIs correlating most highly with the visual rating were reported as spectral bands capable of describing fire blight disease in apple trees. The selected wavelengths for the NDSIs ranged from 710 to 2,340 nm. Among the different wavelengths, the red edge band (696–704 nm), reported to contain a high amount of information in regard to chlorophyll content and vegetation stress (Mohd Shafr and Hamdan, 2009), was also present. In a lettuce maturity detection study (Brach et al., 1982), the ratio of 1,170 nm to 1,110 nm assisted in the morphogenetic differentiation of the tissue. The 1,170 nm band was also selected as one of the wavelength features in this work. A few wavelengths reported in the literature to discriminate wolfberries (Yin et al., 2017), such as 1,130, 1,160, 1,300, 1,328, and 1,423 nm, were also similar to the wavelengths identified during NDSI selection in the current study. The selected NDSIs found in this research were consistently highly correlated across seasons and showed promise for early fire blight disease detection that can be useful in precision agriculture applications. Thus, imaging and spectrometric techniques have the potential to be used as phenotyping tools, depending on the crop and disease conditions. In the future, multispectral imaging with customized spectral band combinations can be used for fire blight resistance evaluation.

Conclusion

In summary, the RGB and multispectral imaging sensors offered low to moderate accuracy in the detection of disease severity based on image features representing senescence and on vegetation indices extracted from the images, respectively. The absence of higher spectral resolution can limit the application of these sensing systems to disease severity evaluation in breeding programs. In addition, the presence of other stress conditions, such as heat stress and/or other diseases, may also limit the potential use of these techniques. In this regard, a hyperspectral sensing system can capture disease-specific responses that can be useful for disease severity evaluations. The normalized difference spectral indices derived from the hyperspectral data showed moderate to high accuracies in disease severity evaluation, which were consistent between the two seasons. Thus, these specific spectral bands could be used for evaluating fire blight disease severity in apple breeding programs. In addition, these indices also showed potential to be used as early disease detection tools that could assist in timely crop management in production systems.

Author Contributions

SJ, SS, and KE: conceptualization. SJ, SK, and KE: data collection. YS and SS: data processing and analysis – RGB images. JQ and SS: data processing and analysis – UAV images. SJ and SS: data processing and analysis – proximal multispectral images, hyperspectral data, and writing – original draft preparation. SS and KE: resources, supervision, project administration, and funding acquisition. SK, KE, and AM: writing – review and editing. SJ, AM, JQ, and SS: visualization.

Funding

This activity was funded in part by United States Department of Agriculture (USDA) – National Institute for Food and Agriculture (NIFA) Agriculture and Food Research Initiative Competitive Project WNP08532 (Accession Number 1008828), Washington Tree Fruit Research Commission projects # CP-15-100 (A, B, and C) and CP-12-104 (A and B), and USDA-NIFA Hatch project 1014919.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

The authors would like to thank Dr. Carlos Zúñiga Espinoza, Dr. Lav R. Khot, Dr. Jianfeng Zhou, Dr. Haitham Bahlol, Dr. Rajeev Sinha, and Chongyuan Zhang for their help in field data collection.

Supplementary Material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fpls.2019.00576/full#supplementary-material

Footnotes

  1. http://qgis.osgeo.org

References

Ahmad, I. S., and Reid, J. F. (1996). Evaluation of colour representations for maize images. J. Agric. Eng. Res. 63, 185–195. doi: 10.1006/jaer.1996.0020

Al-Saddik, H., Laybros, A., Billiot, B., and Cointault, F. (2018). Using image texture and spectral reflectance analysis to detect Yellowness and Esca in grapevines at leaf-level. Remote Sens. 10:618. doi: 10.3390/rs10040618

Bauriegel, E., Brabandt, H., Gärber, U., and Herppich, W. B. (2014). Chlorophyll fluorescence imaging to facilitate breeding of Bremia lactucae-resistant lettuce cultivars. Comput. Electron. Agric. 105, 74–82. doi: 10.1016/j.compag.2014.04.010

Bereswill, S., Bugert, P., Bruchmüller, I., and Geider, K. (1995). Identification of the fire blight pathogen, Erwinia amylovora, by PCR assays with chromosomal DNA. Appl. Environ. Microbiol. 61, 2636–2642.

Billing, E. (2011). Fire blight. Why do views on host invasion by Erwinia amylovora differ? Plant Pathol. 60, 178–189. doi: 10.1111/j.1365-3059.2010.02382.x

Bock, C. H., Parker, P. E., Cook, A. Z., and Gottwald, T. R. (2008). Visual rating and the use of image analysis for assessing different symptoms of citrus canker on grapefruit leaves. Plant Dis. 92, 530–541. doi: 10.1094/PDIS-92-4-0530

Brach, E. J., Phan, C. T., Poushinsky, G., Jasmin, J. J., and Aubé, C. B. (1982). Lettuce maturity detection in the visible (380-720 nm) far red (680-750 nm) and near infrared (800-1 850 nm) wavelength band. Agronomie 2, 685–694. doi: 10.1051/agro:19820801

Brown, S. (2012). “Apple,” in Fruit Breeding Handbook of Plant Breeding, eds M. L. Badenes and D. H. Byrne (Boston, MA: Springer), 329–367.

Casadesús, J., Kaya, Y., Bort, J., Nachit, M. M., Araus, J. L., Amor, S., et al. (2007). Using vegetation indices derived from conventional digital cameras as selection criteria for wheat breeding in water-limited environments. Ann. Appl. Biol. 150, 227–236. doi: 10.1111/j.1744-7348.2007.00116.x

Ceccato, P., Gobron, N., Flasse, S., Pinty, B., and Tarantola, S. (2002). Designing a spectral index to estimate vegetation water content from remote sensing data: Part 1: theoretical approach. Remote Sens. Environ. 82, 188–197. doi: 10.1016/S0034-4257(02)00037-8

Delalieux, S., van Aardt, J., Keulemans, W., Schrevens, E., and Coppin, P. (2007). Detection of biotic stress (Venturia inaequalis) in apple trees using hyperspectral data: non-parametric statistical approaches and physiological implications. Eur. J. Agron. 27, 130–143. doi: 10.1016/j.eja.2007.02.005

Díaz-Varela, R. A., de la Rosa, R., León, L., and Zarco-Tejada, P. J. (2015). High-resolution airborne uav imagery to assess olive tree crown parameters using 3D photo reconstruction: application in breeding trials. Remote Sens. 7, 4213–4232. doi: 10.3390/rs70404213

Dlamini, S. N., Beloconi, A., Mabaso, S., Vounatsou, P., Impouma, B., and Fall, I. S. (2019). Review of remotely sensed data products for disease mapping and epidemiology. Remote Sens. Appl. 14, 108–118. doi: 10.1016/j.rsase.2019.02.005

Durel, C.-E., Denancé, C., and Brisset, M.-N. (2009). Two distinct major QTL for resistance to fire blight co-localize on linkage group 12 in apple genotypes ‘Evereste’ and Malus floribunda clone 821. Genome 52, 139–147. doi: 10.1139/G08-111

Eitel, J. U. H., Long, D. S., Gessler, P. E., and Hunt, E. R. (2008). Combined spectral index to improve ground-based estimates of nitrogen status in dryland wheat. Agron. J. 100:1694. doi: 10.2134/agronj2007.0362

Gitelson, A., and Merzlyak, M. N. (1994). Spectral reflectance changes associated with autumn senescence of Aesculus hippocastanum L. and Acer platanoides L. leaves. spectral features and relation to chlorophyll estimation. J. Plant Physiol. 143, 286–292. doi: 10.1016/S0176-1617(11)81633-0

Gitelson, A. A., and Merzlyak, M. N. (1998). Remote sensing of chlorophyll concentration in higher plant leaves. Adv. Space Res. 22, 689–692. doi: 10.1016/S0273-1177(97)01133-2

Gitelson, A. A., Viña, A., Arkebauer, T. J., Rundquist, D. C., Keydan, G., and Leavitt, B. (2003). Remote estimation of leaf area index and green leaf biomass in maize canopies: remote estimation of leaf area index. Geophys. Res. Lett. 30:1248 doi: 10.1029/2002GL016450

Golhani, K., Balasundram, S. K., Vadamalai, G., and Pradhan, B. (2018). A review of neural networks in plant disease detection using hyperspectral data. Inform. Process. Agric. 5, 354–371. doi: 10.1016/j.inpa.2018.05.002

Gomez-Candon, D., Labbé, S., Virlet, N., Jolivot, A., and Regnard, J.-L. (2014). “High resolution thermal and multispectral UAV imagery for precision assessment of apple tree response to water stress,” in Proceedings of the 2nd International Conference on Robotics and associated High-technologies and Equipment for Agriculture and Forestry RHEA, (Madrid).

Gröll, K., Graeff, S., and Claupein, W. (2007). “Use of vegetation indices to detect plant diseases,” in Proceedings of the Agrarinformatik im Spannungsfeld zwischen Regionalisierung und globalen Wertschöpfungsketten, Referate der 27. GIL Jahrestagung, 5.-7, (Stuttgart).

Hairmansis, A., Berger, B., Tester, M., and Roy, S. J. (2014). Image-based phenotyping for non-destructive screening of different salinity tolerance traits in rice. Rice 7:16. doi: 10.1186/s12284-014-0016-3

Harshman, J. M., Evans, K. M., Allen, H., Potts, R., Flamenco, J., Aldwinckle, H. S., et al. (2017). Fire Blight resistance in wild accessions of Malus sieversii. Plant Dis. 101, 1738–1745. doi: 10.1094/PDIS-01-17-0077-RE

Hosoi, F., and Omasa, K. (2009). Estimating vertical plant area density profile and growth parameters of a wheat canopy at different growth stages using three-dimensional portable lidar imaging. ISPRS J. Photogramm. Remote Sens. 64, 151–158. doi: 10.1016/j.isprsjprs.2008.09.003

Inoue, Y., Penuelas, J., Miyata, A., and Mano, M. (2008). Normalized difference spectral indices for estimating photosynthetic efficiency and capacity at a canopy scale derived from hyperspectral and CO2 flux measurements in rice. Remote Sens. Environ. 112, 156–172. doi: 10.1016/j.rse.2007.04.011

Jarolmasjed, S., Sankaran, S., Kalcsits, L., and Khot, L. R. (2018). Proximal hyperspectral sensing of stomatal conductance to monitor the efficacy of exogenous abscisic acid applications in apple trees. Crop Protect. 109, 42–50. doi: 10.1016/j.cropro.2018.02.022

Khan, M. A., Duffy, B., Gessler, C., and Patocchi, A. (2006). QTL mapping of fire blight resistance in apple. Mol. Breed. 17, 299–306. doi: 10.1007/s11032-006-9000-y

Khanal, S., Fulton, J., and Shearer, S. (2017). An overview of current and potential applications of thermal remote sensing in precision agriculture. Comput. Electron. Agric. 139, 22–32. doi: 10.1016/j.compag.2017.05.001

Kostick, S. A., Norelli, J. L., and Evans, K. M. (2019). Novel metrics to classify fire blight resistance of 94 apple cultivars. Plant Pathol. doi: 10.1111/ppa.13012 [Epub ahead of print].

Laliberte, A. S., and Rango, A. (2009). Texture and scale in object-based analysis of subdecimeter resolution unmanned aerial vehicle (UAV) imagery. IEEE Trans. Geosci. Remote Sens. 47, 761–770. doi: 10.1109/TGRS.2008.2009355

Laliberte, A. S., Rango, A., Herrick, J. E., Fredrickson, E. L., and Burkett, L. (2007). An object-based image analysis approach for determining fractional cover of senescent and green vegetation with digital plot photography. J. Arid Environ. 69, 1–14. doi: 10.1016/j.jaridenv.2006.08.016

Lespinasse, Y., and Aldwinckle, H. S. (2000). “Breeding for resistance to fire blight,” in Fire Blight: The Disease and Its Causative Agent, Erwinia amylovora, ed. J. L. Vanneste (Wallingford: CAB International), 253–265.

Lu, Y., and Lu, R. (2017). Non-destructive defect detection of apples by spectroscopic and imaging technologies: a review. Trans. ASABE 60, 1765–1790. doi: 10.13031/trans.12431

Mahlein, A.-K. (2016). Plant disease detection by imaging sensors – parallels and specific demands for precision agriculture and plant phenotyping. Plant Dis. 100, 241–251. doi: 10.1094/pdis-03-15-0340-fe

Mahlein, A.-K., Rumpf, T., Welke, P., Dehne, H.-W., Plümer, L., Steiner, U., et al. (2013). Development of spectral indices for detecting and identifying plant diseases. Remote Sens. Environ. 128, 21–30. doi: 10.1016/j.rse.2012.09.019

Mahlein, A.-K., Steiner, U., Dehne, H.-W., and Oerke, E.-C. (2010). Spectral signatures of sugar beet leaves for the detection and differentiation of diseases. Precision Agric. 11, 413–431. doi: 10.1007/s11119-010-9180-7

Meunkaewjinda, A., Kumsawat, P., Attakitmongcol, K., and Srikaew, A. (2008). “Grape leaf disease detection from color imagery using hybrid intelligent system,” in Proceedings of the 2008 5th International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology, (Krabi), 513–516.

Mohd Shafri, H. Z., and Hamdan, N. (2009). Hyperspectral imagery for mapping disease infection in oil palm plantation using vegetation indices and red edge techniques. Am. J. Appl. Sci. 6, 1031–1035. doi: 10.3844/ajassp.2009.1031.1035

Montesinos-López, O. A., Montesinos-López, A., Crossa, J., de los Campos, G., Alvarado, G., Suchismita, M., et al. (2017). Predicting grain yield using canopy hyperspectral reflectance in wheat breeding data. Plant Methods 13, 1–23. doi: 10.1186/s13007-016-0154-2

Naidu, R. A., Perry, E. M., Pierce, F. J., and Mekuria, T. (2009). The potential of spectral reflectance technique for the detection of Grapevine leafroll-associated virus-3 in two red-berried wine grape cultivars. Comput. Electron. Agric. 66, 38–45. doi: 10.1016/j.compag.2008.11.007

Neumann, M., Hallau, L., Klatt, B., Kersting, K., and Bauckhage, C. (2014). “Erosion band features for cell phone image based plant disease classification,” in Proceedings of the 2014 22nd International Conference on Pattern Recognition, (Stockholm), 3315–3320. doi: 10.1109/ICPR.2014.571

Norelli, J. L., Jones, A. L., and Aldwinckle, H. S. (2003). Fire blight management in the twenty-first century: using new technologies that enhance host resistance in apple. Plant Dis. 87, 756–765. doi: 10.1094/PDIS.2003.87.7.756

Phadikar, S., and Sil, J. (2008). “Rice disease identification using pattern recognition techniques,” in Proceedings of the 2008 11th International Conference on Computer and Information Technology, (Khulna), 420–423.

Römer, C., Bürling, K., Hunsche, M., Rumpf, T., Noga, G., and Plümer, L. (2011). Robust fitting of fluorescence spectra for pre-symptomatic wheat leaf rust detection with support vector machines. Comput. Electron. Agric. 79, 180–188. doi: 10.1016/j.compag.2011.09.011

Roscher, R., Behmann, J., Mahlein, A.-K., Dupuis, J., Kuhlmann, H., and Plümer, L. (2016). Detection of disease symptoms on hyperspectral 3D plant models. ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences.

Rouse, J., Haas, R., Schell, J., and Deering, D. (1973). Monitoring Vegetation Systems in the Great Plains With ERTS. Available at: https://ntrs.nasa.gov/search.jsp?R=19740022614 (accessed March 6, 2019).

Salgadoe, A. S. A., Robson, A. J., Lamb, D. W., Dann, E. K., and Searle, C. (2018). Quantifying the severity of Phytophthora root rot disease in avocado trees using image analysis. Remote Sens. 10:226. doi: 10.3390/rs10020226

Salm, H., and Geider, K. (2004). Real-time PCR for detection and quantification of Erwinia amylovora, the causal agent of fireblight. Plant Pathol. 53, 602–610. doi: 10.1111/j.1365-3059.2004.01066.x

Sankaran, S., Mishra, A., Maja, J. M., and Ehsani, R. (2011). Visible-near infrared spectroscopy for detection of Huanglongbing in citrus orchards. Comput. Electron. Agric. 77, 127–134. doi: 10.1016/j.compag.2011.03.004

Sims, D. A., and Gamon, J. A. (2002). Relationships between leaf pigment content and spectral reflectance across a wide range of species, leaf structures and developmental stages. Remote Sens. Environ. 81, 337–354. doi: 10.1016/S0034-4257(02)00010-X

Sun, W., and Du, Q. (2018). Graph-regularized fast and robust principal component analysis for hyperspectral band selection. IEEE Trans. Geosci. Remote Sens. 56, 3185–3195. doi: 10.1109/tgrs.2018.2794443

Sun, W., Yang, G., Wu, K., Li, W., and Zhang, D. (2017). Pure endmember extraction using robust kernel archetypoid analysis for hyperspectral imagery. ISPRS J. Photogramm. Remote Sens. 131, 147–159. doi: 10.1016/j.isprsjprs.2017.08.001

Sutton, T. B. (1996). Changing options for the control of deciduous fruit tree diseases. Annu. Rev. Phytopathol. 34, 527–547. doi: 10.1146/annurev.phyto.34.1.527

Tanabata, T., Shibaya, T., Hori, K., Ebana, K., and Yano, M. (2012). SmartGrain: high-throughput phenotyping software for measuring seed shape through image analysis. Plant Physiol. 160, 1871–1880. doi: 10.1104/pp.112.205120

Trachsel, S., Kaeppler, S. M., Brown, K. M., and Lynch, J. P. (2011). Shovelomics: high throughput phenotyping of maize (Zea mays L.) root architecture in the field. Plant Soil 341, 75–87. doi: 10.1007/s11104-010-0623-8

Tzionas, P., Papadakis, S. E., and Manolakis, D. (2005). “Plant leaves classification based on morphological features and a fuzzy surface selection technique,” in Proceedings of the Fifth International Conference on Technology and Automation, (Thessaloniki), 365–370.

Usha, K., and Singh, B. (2013). Potential applications of remote sensing in horticulture—A review. Sci. Hortic. 153, 71–83. doi: 10.1016/j.scienta.2013.01.008

Van der Zwet, T., and Keil, H. L. (1979). Fire Blight: A Bacterial Disease of Rosaceous Plants. Washington, DC: U.S. Department of Agriculture, 200.

Van der Zwet, T., Orolaza-Halbrendt, N., and Zeller, W. (2012). Fire Blight: History, Biology, and Management. St. Paul: APS Press/American Phytopathological Society.

Vanneste, J. L. (2000). Fire Blight: The Disease and Its Causative Agent, Erwinia amylovora. Wallingford: CABI.

Virlet, N., Lebourgeois, V., Martinez, S., Costes, E., Labbé, S., and Regnard, J.-L. (2014). Stress indicators based on airborne thermal imagery for field phenotyping a heterogeneous tree population for response to water constraints. J. Exp. Bot. 65, 5429–5442. doi: 10.1093/jxb/eru309

Weber, V. S., Araus, J. L., Cairns, J. E., Sanchez, C., Melchinger, A. E., and Orsini, E. (2012). Prediction of grain yield using reflectance spectra of canopy and leaves in maize plants grown under different water regimes. Field Crops Res. 128, 82–90. doi: 10.1016/j.fcr.2011.12.016

Weizheng, S., Yachun, W., Zhanliang, C., and Hongda, W. (2008). “Grading method of leaf spot disease based on image processing,” in Proceedings of the 2008 International Conference on Computer Science and Software Engineering, (Piscataway, NJ), 491–494. doi: 10.1109/CSSE.2008.1649

Wijekoon, C. P., Goodwin, P. H., and Hsiang, T. (2008). Quantifying fungal infection of plant leaves by digital image analysis using Scion Image software. J. Microbiol. Methods 74, 94–101. doi: 10.1016/j.mimet.2008.03.008

Yang, W., Guo, Z., Huang, C., Duan, L., Chen, G., Jiang, N., et al. (2014). Combining high-throughput phenotyping and genome-wide association studies to reveal natural genetic variation in rice. Nat. Commun. 5:5087. doi: 10.1038/ncomms6087

Yin, W., Zhang, C., Zhu, H., Zhao, Y., and He, Y. (2017). Application of near-infrared hyperspectral imaging to discriminate different geographical origins of Chinese wolfberries. PLoS One 12:e0180534. doi: 10.1371/journal.pone.0180534

Zarco-Tejada, P. J., Camino, C., Beck, P. S. A., Calderon, R., Hornero, A., Hernández-Clemente, R., et al. (2018). Previsual symptoms of Xylella fastidiosa infection revealed in spectral plant-trait alterations. Nat. Plants 4:432. doi: 10.1038/s41477-018-0189-7

Keywords: proximal sensing, unmanned aerial systems, normalized difference spectral indices, classification, Malus pumila Mill

Citation: Jarolmasjed S, Sankaran S, Marzougui A, Kostick S, Si Y, Quirós Vargas JJ and Evans K (2019) High-Throughput Phenotyping of Fire Blight Disease Symptoms Using Sensing Techniques in Apple. Front. Plant Sci. 10:576. doi: 10.3389/fpls.2019.00576

Received: 29 October 2018; Accepted: 16 April 2019;
Published: 10 May 2019.

Edited by: Yanbo Huang, United States Department of Agriculture, United States

Reviewed by: Weiwei Sun, Ningbo University, China; Jingcheng Zhang, Hangzhou Dianzi University, China

Copyright © 2019 Jarolmasjed, Sankaran, Marzougui, Kostick, Si, Quirós Vargas and Evans. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Sindhuja Sankaran, sindhuja.sankaran@wsu.edu

†These authors have contributed equally to this work

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.