Research Article

Horticultural Science and Technology. 31 December 2020. 763-775
https://doi.org/10.7235/HORT.20200069


Introduction

Fruit tree fire blight first occurred in New York in 1780 and has since occurred in many parts of the world, including Europe and Central Asia (Calzolari et al., 1999; Bahadou et al., 2018; Zhao et al., 2019). The disease is caused by the bacterial pathogen Erwinia amylovora and is a highly contagious fruit tree disease that causes black-brown decay of the branches, fruits, and leaves of fruit trees such as apples and pears (Jeong et al., 2018). There is currently no treatment for fire blight, and the disease is difficult to control, especially since the pathogen is easily spread to nearby trees by bees and splashing rain and has a long incubation period. In addition, once the presence of the disease is confirmed, the damage to the orchard can be very serious because all of the trees within a radius of 100 m must be buried in accordance with the 'Guidelines for Preventing Fire Blight' (Preparation and Prevention Act, Article 36). In Korea, fire blight occurred in 43 orchards in Anseong and Cheonan in 2015, 8 orchards in 2016, and 17 orchards in 2017, resulting in the destruction of trees over an area of about 103 ha (Park et al., 2017). According to the Rural Development Administration, as of June 2020, fire blight had been reported on 312 farms over an area of 187 ha (RDA, 2020). In Korea, copper-based compounds, antibiotics such as kasugamycin and oxytetracycline, or antagonistic microorganisms are used to prevent fire blight (McGhee and Sundin, 2011; Lee et al., 2018).

A previous study on fire blight detection developed a strip-type diagnostic kit using locally isolated Erwinia amylovora strains, compared the specificity and sensitivity of this kit against domestic fire blight pathogens with those of a kit already commercialized in Europe, and examined the possibility of applying it to the field diagnosis of orchard fire blight (Heo et al., 2017). There are also an improved loop-mediated isothermal amplification assay for the diagnosis of apple and pear fire blight (Shin et al., 2018) and research on the in vitro screening of antibacterial agents for the suppression of fire blight in Korea (Lee et al., 2018). In addition, the possibility of future infection in Korea has been evaluated using Maryblyt, a Windows application used in the U.S., Israel, Spain, Canada, and Switzerland to study and prevent fire blight. Most of the existing studies concerned field diagnosis or prediction from environmental factors. In the case of on-site diagnostics, all of the pear trees in an orchard must be inspected to determine the incidence of fire blight, which requires large inputs of manpower, time, and cost. Precise forecasting for a single orchard is also difficult because prediction from environmental factors estimates the likelihood of occurrence over a wide area. Therefore, it is imperative to develop technologies that can reduce the manpower, time, and cost of predicting fire blight by remotely monitoring orchards rather than relying solely on on-site diagnostics.

Artificial intelligence (AI) is also widely used by many researchers for plant growth modeling and nonlinear data processing (Nam et al., 2019; Moon et al., 2020). A convolutional neural network (CNN) is an algorithm useful for finding patterns with which to analyze images; CNNs learn directly from data and classify images using those patterns. The core concept of a CNN is to learn while preserving the spatial structure of images, and CNNs handle images more effectively by applying filtering techniques to artificial neural networks (LeCun et al., 1998). Many researchers have used CNNs in medical image segmentation (Ronneberger et al., 2015; Litjens et al., 2017), often with U-Net, which is similar to SegNet and consists of an encoder and a decoder network. U-Net is very useful for segmenting medical images, and U-Net and U-Net-like models have been used in many biomedical areas, such as neuronal structures (Ronneberger et al., 2015), the liver (Christ et al., 2016), and skin lesions (Lin et al., 2017).

Therefore, in this study, RGB images were acquired using a rotary-wing drone equipped with an RGB sensor, the detection status of fire blight was analyzed using a deep-learning CNN, and a system was developed to predict the occurrence of fire blight on site. The performance of the fire blight forecasting system was evaluated by applying it to actual fire blight images.

Materials and Methods

Experimental Device

Fig. 1 and Table 1 show the rotary-wing drone and its specifications for image acquisition in areas where fire blight occurs. As shown in Fig. 1, the images of pear fire blight were acquired using a rotary-wing drone (Phantom4 Pro V2.0, DJI) as the data acquisition platform.

https://static.apub.kr/journalsite/sites/kshs/2020-038-06/N0130380601/images/HST_38_06_01_F1.jpg
Fig. 1

Rotary-wing drone for image acquisition of pear fire blight (DJI Phantom4 Pro V2.0).

Table 1.

Specifications of DJI Phantom4 Pro V2.0 Rotary-winged drone

Model DJI Phantom4 Pro V2.0
Company DJI
Weight 1,388 g
Sensing distance 30 m
Control distance 7 km
Flight speed 50-72 km/h
Flight time 55 min

Fig. 2 and Table 2 show the sensor and its specifications for obtaining images of pear fire blight. As shown in Fig. 2, the sensor used to acquire the pear fire blight image data was a 1" 20 MP RGB sensor (DJI Phantom4 Pro V2.0 camera). The image acquisition software was the DJI GO4 application, which sets the flight altitude and image-overlap area of the rotary-wing drone to create an autonomous flight path, and which also allows the shooting area to be displayed and the landing to be set.

https://static.apub.kr/journalsite/sites/kshs/2020-038-06/N0130380601/images/HST_38_06_01_F2.jpg
Fig. 2

1" 20 MP (CMOS) camera for RGB image acquisition.

Table 2.

Specifications of DJI Phantom4 Pro V2.0 camera

Model DJI Phantom4 Pro V2.0 camera
Company DJI
Sensor resolution 20 MP
Image resolution 5,472 pixels × 3,648 pixels
Band Red, Green, Blue

Image Acquisition

Actual pear fire blight images (from a fire blight outbreak in a pear orchard in Ipjang-myeon, Cheonan-si, Chungcheongnam-do, Korea, June 2018) were acquired at an altitude of over 6 m, with a spatial resolution of 0.11 cm/pixel. To increase the precision and reliability of the fire blight detection machine learning, a fire blight branch model for simulation was also installed in an orchard following the advice of a disease expert at the pear research center in Naju. The simulation images of fire blight were obtained at an altitude of over 12 m, with a spatial resolution of 0.32 cm/pixel.

Fig. 3 shows the location of an orchard in Cheonan, South Chungcheong Province, where the actual pear fire blight occurred, and the autonomous flight path of the drone. Fig. 4 shows the flight path of a rotary-wing drone and a regular image of the orchard at the pear lab in Naju, South Jeolla Province.

https://static.apub.kr/journalsite/sites/kshs/2020-038-06/N0130380601/images/HST_38_06_01_F3.jpg
Fig. 3

Fire blight-infected site and rotary-winged drone flight path (Cheonan, Chungnam).

https://static.apub.kr/journalsite/sites/kshs/2020-038-06/N0130380601/images/HST_38_06_01_F4.jpg
Fig. 4

Image acquisition flight path of rotary-winged drone for simulation of fire blight and the orchard (Naju, Jeonnam).

Pre-processing of an RGB Image

Fig. 5 shows the image preprocessing process for deep-learning analysis of the image of pear fire blight.

https://static.apub.kr/journalsite/sites/kshs/2020-038-06/N0130380601/images/HST_38_06_01_F5.jpg
Fig. 5

Image preprocessing process for deep learning analysis of pear fire blight images.

As shown in Fig. 5, the images acquired with the 1" 20 MP RGB camera mounted on the DJI Phantom4 Pro V2.0 drone had a resolution of 5,472 pixels × 3,648 pixels. For pre-processing, the acquired images were compressed to 2,048 pixels × 4,096 pixels, and each compressed image was divided into 16 images of 512 pixels × 1,024 pixels for normalization.
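As an illustration, the tiling step might be implemented as in the following Python sketch using Pillow and NumPy. The file name is a placeholder, and treating 2,048 pixels as the compressed width (with 4,096 as the height) is an assumption, since the paper does not state the orientation:

```python
# Minimal sketch of the compression and tiling step described above.
# File name and interpolation filter are assumptions, not from the paper.
from PIL import Image
import numpy as np

def tile_image(path):
    """Compress an aerial image and split it into 16 tiles of 512 x 1,024 pixels."""
    img = Image.open(path).resize((2048, 4096), Image.BILINEAR)  # (width, height)
    arr = np.asarray(img, dtype=np.float32) / 255.0              # normalize to [0, 1]
    tiles = [arr[r * 1024:(r + 1) * 1024, c * 512:(c + 1) * 512]
             for r in range(4) for c in range(4)]                # 4 x 4 grid of tiles
    return tiles                                                 # 16 arrays, 1,024 x 512 x 3

tiles = tile_image("pear_orchard_5472x3648.jpg")  # placeholder file name
```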

Fire Blight Learning Algorithm

Fig. 6 shows the RGB image segmentation algorithm for fire blight using a deep-learning CNN. The fire blight RGB image analysis algorithm and graph creation used TensorFlow (ver. 1.2.1, Google, Mountain View, CA, USA), a numerical computation library based on the Python programming language.

https://static.apub.kr/journalsite/sites/kshs/2020-038-06/N0130380601/images/HST_38_06_01_F6.jpg
Fig. 6

RGB image analysis algorithm using CNN in deep learning.

The CNN architecture consists of an image input layer, convolution layers, batch normalization layers, ReLU layers, max pooling layers, transpose convolution layers, and a sigmoid layer. In particular, batch normalization layers were applied to improve the performance of fire blight image detection (Ji et al., 2018). Batch normalization is a method proposed to solve the internal covariate shift problem of a network by normalizing the unit outputs of each mini-batch passing through an affine layer toward a standard normal distribution. In this paper, the mean ($\mu_B$) and variance ($\sigma_B^2$) of the mini-batch were calculated, the inputs were normalized ($\hat{x}_i$) based on these values, and a scale and shift ($\gamma$, $\beta$) were applied, as shown in equations (1-5) (Ioffe and Szegedy, 2015).

(1)
$\text{Input: values of } x \text{ over a mini-batch } B = \{x_{1 \dots m}\};\ \text{parameters to be learned: } \gamma, \beta;\ \text{Output: } \{y_i = \mathrm{BN}_{\gamma,\beta}(x_i)\}$
(2)
$\mu_B \leftarrow \dfrac{1}{m} \sum_{i=1}^{m} x_i$
(3)
$\sigma_B^2 \leftarrow \dfrac{1}{m} \sum_{i=1}^{m} \left(x_i - \mu_B\right)^2$
(4)
$\hat{x}_i \leftarrow \dfrac{x_i - \mu_B}{\sqrt{\sigma_B^2 + \epsilon}}$
(5)
$y_i \leftarrow \gamma \hat{x}_i + \beta \equiv \mathrm{BN}_{\gamma,\beta}(x_i)$
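The paper does not publish the network code, so the following Keras sketch only illustrates how the listed layer types (convolution, batch normalization, ReLU, max pooling, transpose convolution, sigmoid) could be assembled into an encoder-decoder segmenter. The filter counts and depth are assumptions, and the modern Keras API is used rather than the TensorFlow 1.2 graph API of the original study:

```python
# Hedged sketch of the layer sequence described above; filter counts and
# network depth are illustrative assumptions, not the authors' settings.
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_segmenter(height=1024, width=512, channels=3):
    inputs = layers.Input((height, width, channels))        # image input layer
    x = inputs
    for filters in (16, 32, 64):                            # encoder blocks
        x = layers.Conv2D(filters, 3, padding="same")(x)    # convolution layer
        x = layers.BatchNormalization()(x)                  # batch normalization (Eqs. 1-5)
        x = layers.ReLU()(x)                                # ReLU layer
        x = layers.MaxPooling2D(2)(x)                       # max pooling layer
    for filters in (64, 32, 16):                            # decoder blocks
        x = layers.Conv2DTranspose(filters, 3, strides=2,
                                   padding="same")(x)       # transpose convolution layer
        x = layers.BatchNormalization()(x)
        x = layers.ReLU()(x)
    outputs = layers.Conv2D(1, 1, activation="sigmoid")(x)  # sigmoid layer: per-pixel mask
    return Model(inputs, outputs)

model = build_segmenter()  # input size matches the 512 x 1,024 pre-processed tiles
```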

Image Learning Similarity Evaluation

In this study, the Dice similarity coefficient was used as an overlap-based method of assessment. The Dice similarity coefficient, defined in equation (6), indicates similarity by directly comparing the result of automatic detection with a gold standard image delineated by the naked eye (Kim and Kim, 2017).

(6)
$\mathrm{DICE} = \dfrac{2\,\left|S_{g1} \cap S_{t1}\right|}{\left|S_{g1}\right| + \left|S_{t1}\right|} = \dfrac{2\,TP}{(TP + FP) + (TP + FN)}$

where $S_{g1}$ and $S_{t1}$ represent the segmented region of interest in the gold standard image and the auto-detection image, respectively.

The components of the error matrix, TP, FP, TN, and FN, are the true positives, false positives, true negatives, and false negatives, respectively; these values are defined through equations (7-10) using the segment assignment functions $f_{gi}(x)$ and $f_{ti}(x)$.

(7)
$TP = \sum_{r=1}^{n} \min\!\left(f_{t1}(x_r),\, f_{g1}(x_r)\right)$
(8)
$FP = \sum_{r=1}^{n} \max\!\left(f_{t1}(x_r) - f_{g1}(x_r),\, 0\right)$
(9)
$TN = \sum_{r=1}^{n} \min\!\left(f_{t2}(x_r),\, f_{g2}(x_r)\right)$
(10)
$FN = \sum_{r=1}^{n} \max\!\left(f_{t2}(x_r) - f_{g2}(x_r),\, 0\right)$

where $f_{gi}(x)$ and $f_{ti}(x)$ are the segment assignment functions of the gold standard image and the auto-detection image, respectively: $f_{gi}(x) = 1$ if pixel $x$ belongs to segment $i$ of the gold standard image and $f_{gi}(x) = 0$ otherwise, and likewise $f_{ti}(x) = 1$ if $x$ belongs to segment $i$ of the auto-detection image and $f_{ti}(x) = 0$ otherwise. Here, $i = 1$ denotes the segmented region of interest and $i = 2$ denotes the background. Additionally, the learning variables of the graph were optimized according to the Dice loss value using the Adam optimizer.
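A minimal NumPy sketch of equations (6)-(10), assuming binary masks in which 1 marks the region of interest (i = 1) and 0 the background (i = 2):

```python
# Sketch of Eqs. (6)-(10) on binary {0, 1} masks of equal shape.
import numpy as np

def error_components(pred, gold):
    """pred: auto-detection mask; gold: gold standard mask (both binary)."""
    t1 = pred.ravel().astype(int)                # f_t1: region of interest, detection
    g1 = gold.ravel().astype(int)                # f_g1: region of interest, gold standard
    t2, g2 = 1 - t1, 1 - g1                      # f_t2, f_g2: background
    tp = np.minimum(t1, g1).sum()                # Eq. (7)
    fp = np.maximum(t1 - g1, 0).sum()            # Eq. (8)
    tn = np.minimum(t2, g2).sum()                # Eq. (9)
    fn = np.maximum(t2 - g2, 0).sum()            # Eq. (10)
    return tp, fp, tn, fn

def dice(pred, gold):
    tp, fp, tn, fn = error_components(pred, gold)
    return 2 * tp / ((tp + fp) + (tp + fn))      # Eq. (6)

# The Dice loss minimized with the Adam optimizer is then simply 1 - dice(...).
```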

Fig. 7 shows the original image and gold standard image used in image segmentation.

https://static.apub.kr/journalsite/sites/kshs/2020-038-06/N0130380601/images/HST_38_06_01_F7.jpg
Fig. 7

Original and gold standard images used for image segmentation.

Fire Blight Analysis Data Set

Table 3 shows the data sets for the analysis of fire blight. Of a total of 231 simulation images, 131 were used for training and 100 were used for evaluation of the learning results.

Table 3.

Data set for fire blight analysis

Item Total Training Evaluation
Fire blight simulation 231 131 100
Fire blight 140 100 40

In the case of the actual fire blight images, 100 of the total 140 images were used for training and 40 were used for evaluation of the learning results.
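For illustration only, such a split could be produced as follows; the file lists and random seed are placeholders, since the paper does not describe how images were assigned to each set:

```python
# Hedged sketch of the train/evaluation splits in Table 3.
import random

def split(files, n_train, seed=0):
    """Shuffle a list of image paths and split off n_train files for training."""
    files = list(files)
    random.Random(seed).shuffle(files)
    return files[:n_train], files[n_train:]

# Placeholder file lists standing in for the real image sets of Table 3.
simulation_images = [f"sim_{i:03d}.png" for i in range(231)]
fire_blight_images = [f"blight_{i:03d}.png" for i in range(140)]

sim_train, sim_eval = split(simulation_images, 131)      # 231 -> 131 train / 100 eval
real_train, real_eval = split(fire_blight_images, 100)   # 140 -> 100 train / 40 eval
```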

Results and Discussion

Learning Using a Fire Blight Simulation Branch

Fig. 8 illustrates the learning loss of the overlap-based fire blight image simulation using the Dice function. Training with the simulation images was iterated 3,600 times, and the model was trained to an average loss of about 22% to prevent overfitting of the image data.

https://static.apub.kr/journalsite/sites/kshs/2020-038-06/N0130380601/images/HST_38_06_01_F8.jpg
Fig. 8

Loss value using Dice similarity coefficient.
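A hedged sketch of such a training run, reusing the model and masks sketched above: the Dice loss (1 - DICE) is minimized with the Adam optimizer for 3,600 iterations. The learning rate, smoothing constant, and the train_ds dataset are assumptions, not the authors' settings:

```python
# Hedged training sketch: Adam minimizing a Dice-based loss, as described above.
import tensorflow as tf

def dice_loss(y_true, y_pred, eps=1e-6):
    """1 - DICE (Eq. 6) on soft sigmoid outputs; eps avoids division by zero."""
    inter = tf.reduce_sum(y_true * y_pred)
    return 1.0 - (2.0 * inter + eps) / (tf.reduce_sum(y_true) + tf.reduce_sum(y_pred) + eps)

# 'model' is the segmentation network sketched earlier; 'train_ds' is assumed to
# be a repeating tf.data.Dataset of (image tile, gold standard mask) pairs.
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4), loss=dice_loss)
model.fit(train_ds, steps_per_epoch=3600, epochs=1)  # 3,600 iterations as reported
```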

Detection of a Fire Blight Simulation Branch

Fig. 9 shows a still image acquired with a 1" 20 MP RGB camera at an altitude of 12 m and the result of detecting fire blight by deep learning.

https://static.apub.kr/journalsite/sites/kshs/2020-038-06/N0130380601/images/HST_38_06_01_F9.jpg
Fig. 9

Results of the detection of a fire blight simulation image obtained from the 12-m altitude.

As shown in Fig. 9, the areas marked by red ellipses are the sites of fire blight onset, and the deep-learning detection shows that the areas of fire blight can be detected accurately. In particular, as shown in the enlarged image, ground shading and dried orchard branches similar in color to a fire blight-infected branch were correctly not recognized as fire blight. However, there were also cases in which very small shaded areas of a normal branch were mis-detected, as seen in Fig. 10.

https://static.apub.kr/journalsite/sites/kshs/2020-038-06/N0130380601/images/HST_38_06_01_F10.jpg
Fig. 10

Erroneous detections of the fire blight simulation by deep learning.

Table 4 shows the learning results for the pear RGB images taken at an altitude of 12 m. For the fire blight simulation branch learning, 131 fire blight simulation images were used.

Table 4.

Results of the detection of a fire blight simulation using the CNN learning method in deep learning

Confusion matrix       Predicted infection   Predicted no-infection   Total   Prediction ratio (%)
Actual infection               64                     16               80          80.0
Actual no-infection             1                     19               20          95.0

As shown in Table 4, 100 images were used for the evaluation test, which consisted of 80 images of fire blight-infected fruit trees and 20 images of non-infected fruit trees.

As a result, 64 of the 80 images of fruit trees infected with fire blight were recognized as infected while 16 were recognized as non-infected, resulting in an 80.0% prediction rate. In addition, in the test of the 20 non-infected images, 19 were recognized as non-infected and 1 was recognized as infected, showing a 95.0% prediction rate. Therefore, the deep-learning CNN algorithm developed in this study appears sufficiently applicable to the detection of pear fire blight.
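The prediction ratios in Table 4 are simply each row's correct count divided by its row total, as the following check shows; the same arithmetic applies to Table 5:

```python
# Reproducing the row-wise prediction ratios of Table 4.
rows = {"infection": (64, 16), "no_infection": (1, 19)}  # (pred. infected, pred. healthy)

ratio_infected = rows["infection"][0] / sum(rows["infection"])       # 64 / 80 = 0.800
ratio_healthy = rows["no_infection"][1] / sum(rows["no_infection"])  # 19 / 20 = 0.950
print(f"{ratio_infected:.1%}, {ratio_healthy:.1%}")                  # 80.0%, 95.0%
```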

Detection of Branches of Fire Blight

Fig. 11 shows the detection results by size of each fire blight occurrence after training on the actual fire blight images acquired with the 1" 20 MP RGB camera at an altitude of 6 m, using the same algorithm as for the fire blight simulation analysis.

https://static.apub.kr/journalsite/sites/kshs/2020-038-06/N0130380601/images/HST_38_06_01_F11.jpg
Fig. 11

Detection results of images of pear fire blight acquired at a height of 6 m.

As shown in Fig. 11, the areas marked by solid red lines show large, medium, and small incidences of fire blight, approximately 50 cm, 20 cm, and 5 cm in size, respectively. In particular, the areas of shading between the ground and tree branches are marked with red dotted lines, and fallen dried branches at the test sites were correctly not recognized as fire blight. However, some of the test sites and very small shaded parts of normal branches were also mis-detected, as shown in Fig. 12. To increase predictive accuracy in the future, further training with various types of images will be required.

https://static.apub.kr/journalsite/sites/kshs/2020-038-06/N0130380601/images/HST_38_06_01_F12.jpg
Fig. 12

Results of misdetection of fire blight by deep learning.

Table 5 shows the results of actual detection of fire blight using learning results from RGB images of pear fire blight taken at a height of 6 m.

Table 5.

Results of detection of actual fire blight by the CNN learning method in deep learning

Confusion matrix       Predicted infection   Predicted no-infection   Total   Prediction ratio (%)
Actual infection               15                      6               21          71.4
Actual no-infection             5                     14               19          73.7

As shown in Table 5, 40 images were used for the evaluation test, consisting of 21 images of fire blight infection and 19 images of non-infected fruit trees.

As a result, 15 of the 21 images of infected fruit trees were recognized as fire blight, while 6 were recognized as non-infected, resulting in a prediction rate of 71.4%. In addition, for the 19 images of non-infected trees, 14 were recognized as non-infected and 5 were recognized as infected, a prediction rate of 73.7%.

Therefore, fire blight can be detected by applying a deep-learning CNN to images obtained with drones in a pear orchard. In particular, if this fire blight detection system can detect localized fire blight in a pear orchard during actual field forecasting, the spread of the disease and its damage can be reduced, and the manpower and time required for wide-area forecasting can also be reduced.

Conclusions

In this study, an RGB sensor was mounted on a rotary-wing drone to obtain RGB images of the fire blight simulation site and of actual fire blight infections, and a system was developed to determine the occurrence of fire blight in the field by analyzing the detection status of fire blight using a deep-learning CNN. Images of the infected areas were acquired at an altitude of 6 m with a spatial resolution of 0.11 cm/pixel, and the simulation images were acquired at an altitude of 12 m with a spatial resolution of 0.32 cm/pixel. Of a total of 231 simulation images, 131 were used for training and 100 for evaluation of the learning results. In addition, of a total of 140 images of actual infection, 100 were used for training and 40 for evaluation. The fire blight image learning algorithm used TensorFlow, a numerical computation library based on the Python programming language. For image learning similarity, the result of automatic detection was directly compared with the gold standard image, and the model was trained to an average loss of about 22% to prevent overfitting of the image data.

As a result, 64 of the 80 fire blight-infected fruit tree images were recognized as infected, while 16 were recognized as non-infected, for a prediction rate of 80.0%. In addition, in the test of the 20 non-infected images, 19 were recognized as non-infected and only 1 was recognized as infected, showing a 95.0% prediction rate. In the case of actual fire blight-infected branches, 15 of the 21 infected fruit tree images were recognized as infected, while 6 were recognized as non-infected, a prediction rate of 71.4%. In addition, the test of the 19 non-infected images showed that 14 were recognized as non-infected and 5 as infected, a prediction rate of 73.7%.

Therefore, the system developed in this study, which applies a deep-learning CNN to images acquired with drones, can analyze the occurrence of pear fire blight with high accuracy. In particular, by applying the developed system to sites of suspected pear fire blight, the manpower and time required to predict the affected area can be reduced, and the spread of the disease and the need for orchard destruction can also be reduced. More images of infected trees will need to be collected and learned to increase the accuracy of fire blight prediction. It is also necessary to analyze the possibility of detecting the highly damaging apple fire blight using the same system.

Acknowledgements

This study was supported by the Rural Development Administration through the Development of Image-based Fire Blight Regional Forecasting Technology Project (PJ01277604).

References

1. Bahadou SA, Ouijja A, Karfach A, Tahiri A, Lahlali R (2018) New potential bacterial antagonists for the biocontrol of fire blight disease (Erwinia amylovora) in Morocco. Microb Pathog 117:7-15. doi:10.1016/j.micpath.2018.02.011
2. Calzolari A, Finelli F, Mazzoli GL (1999) A severe unforeseen outbreak of fire blight in the Emilia-Romagna Region. Acta Hortic 489:171-176. doi:10.17660/ActaHortic.1999.489.26
3. Christ PF, Elshaer M, Ettlinger F, Tatavarty S, Bickel M, Bilic P, Rempfler M, Armbruster M, Hofmann F, et al. (2016) Automatic liver and lesion segmentation in CT using cascaded fully convolutional neural networks and 3D conditional random fields. International Conference on Medical Image Computing and Computer-Assisted Intervention, Springer, pp 415-423. doi:10.1007/978-3-319-46723-8_48
4. Heo GI, Shin DS, Son SH, Oh CS, Park DH, Lee YK, Cha JS (2017) On-site diagnosis of fire blight with antibody-based diagnostic strips. Res Plant Dis 23:306-313
5. Ioffe S, Szegedy C (2015) Batch normalization: Accelerating deep network training by reducing internal covariate shift. Int Conf Mach Learn 37:448-456
6. Jeong US, Kim SS, Kim TH, Seo ST (2018) Sharing loss compensation between the owner and the lessee caused by disease control order for fire blight in fruit. Korean J Agric Manag Policy 45:291-314. doi:10.30805/KJAMP.2018.45.2.291
7. Ji MG, Chun JC, Kim NG (2018) An improved image classification using batch normalization and CNN. J Internet Comput Serv 19:35-42
8. Kim JW, Kim JH (2017) Review of evaluation metrics for 3D medical image segmentation. J Korean Soc Imaging Inform Med 23:14-20
9. LeCun Y, Bottou L, Bengio Y, Haffner P (1998) Gradient-based learning applied to document recognition. Proc IEEE 86:2278-2324. doi:10.1109/5.726791
10. Lee MS, Lee IG, Kim SK, Oh CS, Park DH (2018) In vitro screening of antibacterial agents for suppression of fire blight disease in Korea. Res Plant Dis 24:41-51
11. Lin BS, Michael K, Kalra S, Tizhoosh HR (2017) Skin lesion segmentation: U-nets versus clustering. 2017 IEEE Symposium Series on Computational Intelligence (SSCI), pp 1-7. doi:10.1109/SSCI.2017.8280804
12. Litjens G, Kooi T, Bejnordi BE, Setio AAA, Ciompi F, Ghafoorian M, van der Laak J, van Ginneken B, Sanchez C (2017) A survey on deep learning in medical image analysis. Med Image Anal 42:60-88. doi:10.1016/j.media.2017.07.005
13. McGhee GC, Sundin GW (2011) Evaluation of kasugamycin for fire blight management, effect on nontarget bacteria, and assessment of kasugamycin resistance potential in Erwinia amylovora. Phytopathology 101:192-204. doi:10.1094/PHYTO-04-10-0128
14. Moon T, Choi HY, Jung DH, Chang SH, Son JE (2020) Prediction of CO2 concentration via long short-term memory using environmental factors in greenhouses. Hortic Sci Technol 38:201-209
15. Nam DS, Moon T, Lee JW, Son JE (2019) Estimating transpiration rates of hydroponically-grown paprika via an artificial neural network using aerial and root-zone environments and growth factors in greenhouses. Hortic Environ Biotechnol 60:913-923. doi:10.1007/s13580-019-00183-z
16. Park DH, Lee YG, Kim JS, Cha JS, Oh CS (2017) Current status of fire blight caused by Erwinia amylovora and action for its management in Korea. J Plant Pathol 99:59-63
17. Ronneberger O, Fischer P, Brox T (2015) U-net: Convolutional networks for biomedical image segmentation. International Conference on Medical Image Computing and Computer-Assisted Intervention, Springer, pp 234-241. doi:10.1007/978-3-319-24574-4_28
18. Rural Development Administration (RDA) (2020) Occurrence of fire blight and control. RDA, Jeonju, Korea
19. Shin DS, Heo GI, Son SH, Oh CS, Lee YK, Cha JS (2018) Development of an improved loop-mediated isothermal amplification assay for on-site diagnosis of fire blight in apple and pear. J Plant Pathol 34:191-198
20. Zhao Y, Tian Y, Wang L, Geng G, Zhao W, Hu B, Zhao Y (2019) Fire blight disease, a fast-approaching threat to apple and pear production in China. J Integr Agric 18:815-820. doi:10.1016/S2095-3119(18)62033-7