Article history:
Received 5 June 2016
Received in revised form 18 October 2016
Accepted 7 January 2017
Available online 27 January 2017

Keywords:
Plant classification
Crop/weed discrimination
3D sensing
Spectral imaging
Model adaptation
Bayesian classification

Abstract

State-of-the-art object-based approaches to automatic plant classification for crop/weed discrimination are reported to work with typical classification rates of 80–90% under laboratory or restricted field conditions. Adapting their parameter sets and classifiers to match changing field situations is laborious, yet it is required for practical application. Pixel-based classification allows adjusting the classifier model in the field easily by adding a few marks to sample data; however, pixel-based classification of camera data for crop/weed discrimination is impractical, as pixel features lack descriptiveness. This paper contributes a multi-wavelength laser line profile (MWLP) system for scanning the plants and obtaining sensor data, yielding image-based 3D range data, matched spectral reflectance, and scattering data at multiple wavelengths for each pixel. Using these descriptive pixel features, pixel-based Bayesian classification for crop/weed discrimination requires very little field-specific label data, thus allowing In-Field-Labeling for classifier adaptation to specific field situations. For different field situations and two different crops (carrots (Daucus carota) and corn salad (Valerianella locusta)), the classification using spectral and 3D features, applying classifiers generated from very few marks in sample data (i.e., with very little effort for labeling), was successfully demonstrated, thereby achieving misclassification rates comparable to the best literature values.

© 2017 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
1. Introduction

A weed control system operating autonomously for selective mechanical or thermal weed plant removal can help in many ways. With its use, the amount of herbicides applied could be reduced in conventional farming. For organic farming in dense cultivation of several vegetable plants, selective weed control is only possible by hand until today.

Carrots (Daucus carota) are a particular crop requiring much manual labor in organic growing (Fittje et al., 2015). The carrot shows relatively slow development in early growth stages. Hence, competing weeds may overtake the crop plant and limit its access to resources such as sunlight, moisture, and nutrients. Thus, weeds may cause major yield losses if uncontrolled. This makes manual weeding necessary in early growth stages (Laber and Stützel, 2003).

At present, weeding is done manually by persons lying on slowly moving vehicles while removing weeds, as Fig. 1 illustrates. The working conditions are harsh, and so is the ergonomic situation. Further, the costs per hectare caused by this production step are high. Hence, an automated solution would be preferred.

The main components of an autonomous mechanical weed control system are a carrier vehicle including autonomous navigation, a sensor system for contactlessly scanning plants, a classification system, which makes the necessary decision to treat or not to treat a plant based on the sensor data, and a weeding tool for selectively treating the weed plants. Of these components, the lack of weed detection and classification with robustness at a practicable level was identified to be the main limitation for robotic weed control (Slaughter et al., 2008).

* Corresponding author.
E-mail address: w.strothmann@hs-osnabrueck.de (W. Strothmann).
http://dx.doi.org/10.1016/j.compag.2017.01.003
0168-1699/© 2017 The Authors. Published by Elsevier B.V.
This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
80 W. Strothmann et al. / Computers and Electronics in Agriculture 134 (2017) 79–93
the processing chain. Oftentimes, this kind of adaptation is implemented by introducing complex algorithm parameter sets exposed in the user interface. However, adjusting parameters with respect to the specific situation is notoriously difficult (Burgos-Artizzu et al., 2009). Particularly, having farmers or operators of agricultural machines in mind as end users, it cannot be assumed that they are trained on parameterizing image processing algorithms (Strothmann et al., 2015).

Consequently, the image processing algorithm has to be 'abstracted away' from the user interface. Here, machine learning techniques can help simplify the user interaction. In order to create a model by supervised machine learning, the user only has to add labels to data samples. This task is typically much simpler. In particular, the targeted end users can be assumed proficient in inspecting and assessing plant objects.

However, if using classification systems with a design following the above-mentioned four main processing steps, steps 1–3 can only be adjusted by parametrization. Moreover, the classification in step 4 happens at object level, i.e., image segments are classified. As mentioned, meaningful classifiers based on machine learning techniques would typically require hundreds or thousands of labeled reference samples. I.e., for setting up a plant classifier on site, 100+ plant objects would need to be labeled. Thus, setting up a reference database of labeled image segments of plants is only possible during algorithm development. It cannot be done during system setup for field situations by end users, as labeling this much data would take too long.

Therefore, In-Field-Labeling implies a classification approach different from the mentioned processing steps. Classification cannot happen at segment level but has to be at pixel level. For pixel-based classification, even a few relatively small user marks in high-resolution image data provide tens of thousands of samples. Hence, for pixel-based classification, the user only has to add a few marks with labels to some sample images from the specific field situation. This can be done quickly and easily for adjusting the pixel-based classification system toward new field situations. Thus, In-Field-Labeling allows retraining the classification more often, thereby keeping classifier knowledge up-to-date and making classification more robust toward situation-specific changes (Strothmann et al., 2015).

The authors implemented a classification system with a Graphical User Interface (GUI) for In-Field-Labeling (Strothmann et al., 2015). It comprises a data management system for combining label reference and image data with a database-like relational mapping at pixel level, i.e., it allows arbitrary label marks/segments (Strothmann et al., 2013). The pixel-based classification is done by a Bayesian classifier (Strothmann et al., 2015).

For applying In-Field-Labeling in the field, the user first has to sample data on the field to be treated and to add a few marks to the data. The classifier is generated automatically from the marks and sample data. This field setup would only take a few minutes and allow using a plant classifier that is targeted to the field it is applied on. However, In-Field-Labeling does not mean that the user must follow these steps. Of course, classifiers can also be saved and reused on other days/fields if working satisfactorily. Our In-Field-Labeling concept with pixel-based classification only assures that the user can create a better classifier in the field if shipped classifiers do not work in the specific field situation, without having to descend into complex algorithm parameter sets.

2. Materials and methods

As described, the In-Field-Labeling approach allows quick and easy adaptation of image processing chains toward field-specific situations by inexperienced users. However, the range of possible applications based on camera data is limited. Pixel-based analysis is widely used in remote sensing, e.g., satellite imagery or analysis of hyper-spectral imaging data. For conventional cameras, the available features per pixel, i.e., red, green, blue values and – for multispectral cameras – optionally Near Infra-red (NIR) values, are few and, thus, a poor basis for complex classification problems. In particular, pixel-based crop/weed discrimination based on camera images is impracticable due to the color similarity of crops and weeds. Therefore, camera-based plant classification approaches rely on feature generation from the segmented shapes for providing the classifier with a sufficient feature vector (Weis and Sökefeld, 2010). However, using the descriptive features of the Multi-wavelength Laser Line Profiling (MWLP) sensor system, plant classification in a pixel-based manner might be feasible.

2.1. The MWLP sensor system

The MWLP approach is a novel sensing concept proposed and realized by the authors (Strothmann et al., 2014). It expands on the robust and broadly adopted laser line profiling method for triangulation-based sensing of dense 3D point clouds.

Triangulation-based line profile sensing is broadly adopted in industrial processes and has found first applications in agriculture for phenotyping (Paulus et al., 2013; Cai et al., 2013) and even for weed discrimination (Šeatović et al., 2010). However, in contrast to classical line profiling, the MWLP system scans multiple laser lines at different wavelengths using a single camera. Further, not just the range information is measured, but reflection intensity and backscatter features are extracted for each point of each detected laser line, too. Subject to relative movement, the objects are scanned by all laser lines. Laser lines scanning one point at different times are assembled based on motion estimation (rotary encoder) and optical tracking. Thus, scanning results in 3D point clouds with reflection and scattering information at multiple, selectable wavelengths available for each point (Strothmann et al., 2014). Fig. 2 sketches the measurement concept; Fig. 3 presents sample scans of field data acquired using the system.

MWLP's single-camera approach is in contrast to other approaches toward scanning and merging line profiles at multiple wavelengths from multiple scanners (Dupuis et al., 2015). Using a single camera and, thus, avoiding misalignment issues of multiple sensors particularly favors the outdoor use of the system, where defined movements of different sensors cannot be guaranteed (Strothmann et al., 2014). The camera used here is a Baumer HXG20NIR, providing, at its full resolution of 2048 × 1088 pixels, images at up to 105 Hz frame rate, which MWLP's line detection computer can process online. In addition, the camera offers a subsampling mode, where of each 2 × 2 pixels only one pixel is read out, resulting in a reduced resolution of 1024 × 544 pixels. In this mode, the frame rate can be increased up to 420 Hz without specifying a Region of Interest (ROI) to read out from the sensor, i.e., without reducing the field of view or measurement range of the system. Furthermore, reduced-resolution images get processed online at 420 Hz.

The line detection of the MWLP system is performed in a dedicated ROI for each mounted line laser (cf. Fig. 4). The relative motion between the sensor and the sensed objects is monitored by a rotary encoder mounted to the pulley of the conveyor (cf. Fig. 2) or by the odometry of the carrying vehicle. Further, the robust but crude rotary encoder data is improved by optical tracking for correct assembly of the scan data collected with different lasers. Moreover, the optical tracking allows obtaining differential images by subtracting, from the images in which lines should be detected, images captured with the laser lines shifted. This results in significant enhancements of the laser lines in the images, thus normalizing
Fig. 2. Overview of the measurement concept of the MWLP system (Strothmann et al., 2014).
Fig. 3. MWLP in field operation for scanning plants. Top-left: view of web-cam mounted inside BoniRob (cf. Fig. 9) looking at the laser lines; bottom-right: raw camera image
of the camera of the MWLP system; bottom-left: line detection of multiple lasers by the MWLP system; top-right: output data of the MWLP system visualized as point cloud.
Colors represent intensities in different laser wavelengths. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this
article.)
Fig. 4. Excerpt of a camera image from scanning two laser lines including detected laser lines. The line detection is conducted in a dedicated ROI for each laser line. The
straight colored lines indicate the boundaries of the ROIs. On the left side a potato is scanned, on the right side a stone is scanned (Strothmann et al., 2015).
the images before line detection and canceling out influences of uncontrolled ambient lighting (cf. Fig. 5) (Strothmann et al., 2014).

The line detection is done in the differential images by running along the image columns (Fig. 6). In each column of each dedicated ROI, only one segment is allowed; otherwise nothing is detected (Strothmann et al., 2014). Owing to the high level of normalization of the differential image, lines can be detected by applying a simple threshold. Fig. 7 shows an example of a cut along the image column of a ROI. The gray values of the pixels in the differential image are drawn on the y-axis; the x-axis indicates the image rows. Along with the line detection, feature extraction for describing the line points is done. First, the line start and line end of the segment are detected. The line position, which is relevant for determining the range value, i.e., z-position, is then assumed to be the center of the line, i.e., in the middle between line start and end. Further, features for describing scattering and reflection at this point are extracted. To describe the reflection features, the IntensityMaximum, the LineWidth, and the IntensitySum, i.e., the sum of pixel intensities below the line curve between line start and end, are determined (Fig. 7). The scattering, i.e., light entering the object being scattered back beneath the surface, results in higher pixel intensities before line start and after line end. It is described by summing up pixel intensities at different distance ranges before and after the laser line. E.g., the feature Scatter20Section is determined by summing up the intensities for the pixels 1–20 before and after the laser line; for Scatter40Section the pixels 21–40 are summed, and for Scatter40Sum pixels 1–40 are summed, i.e., Scatter40Sum equals the sum of Scatter20Section and Scatter40Section. The same is done for the Scatter60… and Scatter80… features (Strothmann et al., 2015).

2.2. Field operation mounted into BoniRob

As mentioned, a relative movement between sensor and sensed objects is required for operating the MWLP system. However, this is not limited to mounting the system above a conveyor, as sketched in Fig. 2. It is also possible to operate the system in a configuration of still objects and a moved sensor. This configuration was used in field trials of the system mounted into the autonomous field robot platform BoniRob. This platform has an 'App' concept, allowing different modules to be mounted on the platform via defined mechanical, electrical, and logical interfaces for supplying the module with power and process information (Bangert et al., 2013). Thereby, the rotary encoder data of the setup in Fig. 2 is replaced by odometry data of the robot platform. Figs. 8 and 9 depict mounting and field operation of the MWLP system on BoniRob. To avoid disturbances by harsh sunlight exposure, a sun shade is mounted. The system operates beneath the BoniRob looking down. An actuator for weed treatment could further be mounted there together with the system and inside the sun shade.

Fig. 10 shows an image-based color visualization of scan data gathered with the MWLP of carrot and weed plants scanned in-field. The distance information obtained along with the scans of Fig. 10 is depicted in Fig. 11.

2.3. Feature selection for background removal

The first step to be applied in detection and classification of crop/weed plants on data such as shown in Figs. 10 and 11 is background removal, i.e., filtering pixels representing soil and, thus, not belonging to any plant. This is done by a pixel-based foreground/background classification. Note that this does not represent a classical segmentation in the sense of the four main processing steps for camera-based plant classification mentioned in the introduction, due to the following points. First, the foreground/background separation is done by pixel-based classification, i.e., it is trainable by In-Field-Labeling. Second, it is determined on pixel level rather than segment level, i.e., a biomass pixel can be completely surrounded by soil pixels (and vice versa) and will still be correctly classified. Third, the determined soil pixels are only skipped for plant classification. This means that, during the following feature generation steps for a pixel, the entire neighborhood raw data is still available. Moreover, during feature generation, a pixel does not 'know' whether its neighbors are soil or biomass pixels. Thus, no plant classification features are generated based on the foreground/background classification. Hence, poor results of the soil/biomass separation have no significant influence on the plant classification.

Selection of features for classification should focus on maximum relevance of the selected features for the outcome and minimum redundancy of the selected features (Peng et al., 2005). For identifying features relevant for classification of soil and biomass, a set of collected field data was marked with labels using the In-Field-Labeling GUI. Fig. 12 shows a screenshot of the respective labeling tool.

For determining the relevance of the different features for soil/biomass classification, a numeric measure of the mutual information of the feature values and the categorical classification outcomes had to be determined. To obtain this, the feature values of pixels marked as background in Fig. 12 were aggregated into one histogram and the feature values of pixels marked as plants into another. Next, both histograms for each feature were compared by calculating the Sum of Absolute Differences (SAD), i.e., for each histogram cell, the absolute difference between both histogram values is derived, and these values are summed up. Further, the SAD is normalized by the sum of both histograms. This results in a mutual information measure scaling between 0.0 and 1.0, where 0.0 indicates identical histograms, i.e., the respective feature has absolutely no relevance, and 1.0 signifies complete separation, i.e., the outcome can safely be determined using only this feature. For values in between, the histograms partly overlap and partly differ. Hence, features with moderately high SAD-based relevance measures may contribute to the classification, but multiple features are required.
All features extracted from the laser lines by the MWLP system
for each pixel were tested for soil/biomass classification using the
described relevance measure (cf. Table 1). Further, the NDVI was
tested. It is not directly provided by the MWLP system, but it can
easily be calculated for each pixel using the IntensitySum features
of the red and NIR laser by applying the following formula:

NDVI = (NIR − Red) / (NIR + Red)

Fig. 5. Differential image corresponding to Fig. 4. The effects induced by the laser lighting (reflection and scattering) are highly emphasized, ambient lighting is canceled out. The potatoes on the left side cause the laser lines to scatter (due to high water content), while the stone on the right side does not (due to high optical density) (Strothmann et al., 2015).
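A minimal per-pixel sketch of this NDVI computation from the two IntensitySum channels (the function name and the epsilon safeguard against zero-intensity pixels are illustrative assumptions, not part of the MWLP software):

```python
import numpy as np

def ndvi(nir_intensity_sum, red_intensity_sum, eps=1e-9):
    """Per-pixel NDVI from the NIR and red IntensitySum channels.
    eps avoids division by zero where both channels are empty."""
    nir = np.asarray(nir_intensity_sum, dtype=float)
    red = np.asarray(red_intensity_sum, dtype=float)
    return (nir - red) / (nir + red + eps)
```

The function works elementwise, so whole IntensitySum images can be passed directly.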
Fig. 6. Gray values of one image column of the differential image plotted over the image rows. The plots show columns of the image data given in Fig. 5. Note that both plots
show image columns cutting the dedicated ROIs of two lasers. The left plot represents a column on the left side of Fig. 5 from scanning a potato, i.e., with significant scattering.
The right plot represents a column on the right side of Fig. 5 from scanning a stone, i.e., without significant scattering (Strothmann et al., 2015).
Fig. 7. Cut through a laser line along the image column for feature extraction of the detected laser lines by the MWLP system (Strothmann et al., 2015).
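The column-wise line detection and feature extraction sketched in Fig. 7 can be illustrated as follows. This is a simplified sketch under stated assumptions (a plain threshold, exactly one segment per column, and dictionary feature names mirroring the paper's terminology), not the MWLP line detection code:

```python
import numpy as np

def extract_line_features(column: np.ndarray, threshold: float):
    """Extract line and scatter features from one differential-image
    column of a laser ROI (cf. Fig. 7). Returns None unless exactly
    one above-threshold segment is found, as required by the method."""
    above = column > threshold
    edges = np.flatnonzero(np.diff(above.astype(int)))
    # exactly one rising and one falling edge -> one clean segment
    if above[0] or above[-1] or len(edges) != 2:
        return None
    start, end = edges[0] + 1, edges[1]   # inclusive segment indices
    seg = column[start:end + 1]
    feats = {
        "LinePosition": (start + end) / 2.0,   # center -> range/z value
        "IntensityMaximum": float(seg.max()),
        "LineWidth": int(end - start + 1),
        "IntensitySum": float(seg.sum()),      # area below the line curve
    }
    # scatter: intensities in bands before line start and after line end
    for a, b in ((1, 20), (21, 40), (41, 60), (61, 80)):
        before = column[max(start - b, 0):start - a + 1]
        after = column[end + a:end + b + 1]
        feats[f"Scatter{b}Section"] = float(before.sum() + after.sum())
    feats["Scatter40Sum"] = feats["Scatter20Section"] + feats["Scatter40Section"]
    return feats
```

For a scattering surface (e.g., the potato in Fig. 5), the Scatter…Section sums are elevated; for the stone, they stay near zero.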
Fig. 8. Photo of the sensing head of the MWLP system mounted into BoniRob for scanning corn salad (Valerianella locusta). For this operation, a red, a green, and a NIR (invisible here) laser are used. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)

Fig. 9. The autonomous field robot BoniRob equipped with the MWLP system scanning corn salad.
Fig. 12. Screenshot from labeling carrots and weed plants. Mind that classification, labeling, and data persistence are pixel-based, i.e., the form of the label marks does not
matter. It is important to point out regions typical for each class. Green labels mark crop plants, red labels weed plants and black labels the background, i.e., soil. (For
interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)
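The pixel-based Bayesian classification trained from such label marks can be sketched as a Gaussian class-conditional model (this is an illustrative sketch, assuming independent per-feature Gaussians; the function names, data layout, and the exact probabilistic model of the authors' classifier are not specified here):

```python
import numpy as np

def fit_bayes(samples):
    """samples maps class label -> (n_pixels, n_features) array of
    feature vectors taken from the user's label marks."""
    total = sum(len(x) for x in samples.values())
    model = {}
    for label, x in samples.items():
        model[label] = (
            np.log(len(x) / total),   # log prior from mark sizes
            x.mean(axis=0),           # per-feature mean
            x.var(axis=0) + 1e-9,     # per-feature variance (regularized)
        )
    return model

def classify(model, pixels):
    """pixels: (n, n_features) array; returns predicted label per pixel."""
    labels = list(model)
    scores = []
    for label in labels:
        prior, mu, var = model[label]
        # Gaussian log-likelihood plus log prior, summed over features
        ll = prior - 0.5 * np.sum(
            np.log(2 * np.pi * var) + (pixels - mu) ** 2 / var, axis=1)
        scores.append(ll)
    return np.array(labels)[np.argmax(scores, axis=0)]
```

Because each label mark contributes thousands of pixels, even a few marks yield stable per-class statistics, which is what makes the In-Field-Labeling retraining step quick.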
relevance measures are given in Table 2. As Table 2 shows, the minimum and maximum radii estimated by the original RSD have moderate significance. Of the features estimated by the extended RSD, the AngleNeighborMax and AngleNeighborMin features as well as the NeighborCount show moderate relevances. MagicToValidCount, ValueCount, and ChangeSum have the highest relevances of the features extracted by the line flicker. Consequently, the configuration of MWLP and classification given in Table 3 was used for the following experiments.

2.5. System concept of the weeding system and grid aggregation

To validate the pixel-based crop/weed classification, a grid aggregation was set up. The aggregated treatment grids serve as an input for actuator control. Langsenkamp et al. showed the effectiveness of stamp-like weeding tools for mechanical weed control (Langsenkamp et al., 2014). Stamp-like actuators can be scaled to a line of individual actuators that can treat weeds while being moved over the plant rows. This setup has the advantage that no manipulator arm is required for moving the weeding tool to the target weed. The weeding tool is only activated when it is located over a weed while the entire weeding system is in continuous motion over the crop rows. Thereby, adjacent cells get treated independently.

Treating the weeds in a grid is a common technique for patch spraying (Gonzalez-de Soto et al., 2016). However, for mechanical weed treatment, camera-based approaches typically attempt to detect individual plants. With the described approach, we perform the weed detection by applying the grid-based method. A much higher grid resolution than for patch spraying is required, though. Following the concept of treating plants using a line of stamp actuators without a manipulator arm leads to treating the field in
Table 1
SAD-based relevance measures for the pixel feature channels extracted by the MWLP system and the NDVI for soil/biomass classification.
Table 2
Descriptions and SAD-based relevances for crop/weed classification of the generated numeric 3D surface feature values. Angle α is between the normals of neighbor points, β between the normal and the connection vector to the neighbor point.
Fig. 14. Treatment of weeds in a grid using a line sensor, e.g., MWLP system, and a line actuator such as a line of stamp tools.
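The aggregation of per-pixel classifications into a treatment grid (cf. Fig. 14) can be sketched as follows. This is an illustrative sketch only: the cell size and the rule for deciding a cell (weed pixels outnumbering crop pixels) are assumptions, as the paper does not spell out the exact aggregation rule:

```python
import numpy as np

def aggregate_grid(pixel_labels, cell):
    """Aggregate a per-pixel label image (0 = soil, 1 = crop, 2 = weed)
    into treatment-grid cells of cell x cell pixels. A cell is marked
    'treat' when its weed pixels outnumber its crop pixels."""
    h, w = pixel_labels.shape
    gh, gw = h // cell, w // cell
    treat = np.zeros((gh, gw), dtype=bool)
    for i in range(gh):
        for j in range(gw):
            block = pixel_labels[i * cell:(i + 1) * cell,
                                 j * cell:(j + 1) * cell]
            weed = np.count_nonzero(block == 2)
            crop = np.count_nonzero(block == 1)
            treat[i, j] = weed > crop and weed > 0
    return treat
```

The resulting boolean grid maps directly onto a line of independently switched stamp actuators: a cell is stamped exactly when its flag is set while the tool line passes over it.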
Fig. 15. Color visualization of the original MWLP sensor data as input for plant classification. Note that the intensity values are colored with NIR drawn in blue, i.e., the same colorization as in Fig. 10. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)

Fig. 16. Binary image generated by the soil/plant classifier for filtering out soil pixels. Soil pixels are drawn in black, biomass pixels in white.
grid cells are to be treated. Note that most of the weeds are treated, while the crop plants remain mostly untreated. To verify the classification results, reference grids were created manually: a human user referenced the grid cells as soil, crop, or weed as ground truth to be compared with the automatically generated treatment grids. During referencing, the user only saw the visualized MWLP sensor data, but no intermediate or final results of the classification pipeline, thereby avoiding these influencing the referencing. Fig. 19 depicts the reference grid for comparison with the automatically generated treatment grid given in Fig. 18.

The manually referenced grids were compared with the automatically generated treatment grids. Consequently, three kinds of misclassification for the cells of the automatically generated treatment grid were possible (cf. Table 4):
Fig. 17. Probability image of the pixel-based crop/weed classification. Safely classified crop pixels are drawn in green, weed pixels in red. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)

Fig. 18. Overlay of the original input image and the grid with the final treatment decision. Red-shaded grid cells should be treated for weed removal according to the automatic classification. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)
1. Cell manually referenced as soil, automatic decision to treat the cell.
In practice, this kind of misclassification is uncritical, as it would not affect yield. Only energy consumption or tool wear might be affected.

2. Cell manually referenced as crop, automatic decision to treat the cell.
In practical application this kind of misclassification is critical. Crop plants must not be affected by the weeding tool, no matter where.

3. Cell manually referenced as weed, automatic decision to not treat the cell.
This kind of misclassification should be avoided. However, comparing manually referenced grids with auto-generated grids, we noted that the user mostly referenced the weed plants including their boundaries, while for automatically generated grids only the plant center was treated. Therefore, a case distinction was made. If a grid cell referenced as a weed cell but not automatically treated has an adjacent treated cell, it is assumed that the weed plant is damaged in the other cell. In this case, the misclassification is uncritical. If it has no adjacent treated cell, it is assumed that the respective weed plant stays completely untreated, in which case the misclassification is critical.

3.1. Carrot cultivation

The classification was verified by comparison against the manually referenced grids for field data gathered on two different dates and on different fields. For the data of both days, a labeling of a few marks in a small set of sample data according to the In-Field-Labeling concept was made beforehand, in order to train the classifiers with up-to-date label data. Table 5 shows the result statistics of the misclassification on the first day at different driving speeds of the system. The described case distinction is made for the misclassification of weed cells (cf. Table 4). This table gives the full descriptive statistics at the example of the data set with driving speed 25 mm/s mentioned in Table 5. All weed cells shall be treated; soil and crop cells shall not be treated. Thus, cells included in the italic percentages are correctly treated. For soil misclassification, treatment is uncritical. On the other hand, for crop plant misclassification, treatment is critical, regardless where. Further, for weed plants the mentioned case distinction is made: the plain percentage in the manual weed/auto not treat cell (here 31.35%) includes untreated cells with adjacent treated cells. The weed is damaged there, thus this misclassification is uncritical (comparing Figs. 18 and 19, note that the user often marked including plant boundaries, while the algorithm only decided to treat the plant center). Only the untreated weed cells without adjacent treated cells (here 9.62%, separated by '/') are critical. Hence, the bold percentages mark the critical misclassifications.
The result statistics tables are shortened: the skipped figures are redundant values that can be calculated using the pattern shown in Table 4. This is also the case for Tables 6–10.
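The case distinction for untreated weed cells (kind 3 above) can be sketched as a simple adjacency check over the two grids. This is an illustrative sketch; the function name and 8-neighborhood assumption are ours, not taken from the paper:

```python
import numpy as np

def critical_untreated_weeds(reference, treated):
    """Split weed cells the system left untreated into 'uncritical'
    (an adjacent cell was treated, so the weed is assumed damaged
    there) and 'critical' (no adjacent treated cell; the weed stays
    completely untouched). reference: grid of labels
    ('soil' | 'crop' | 'weed'); treated: boolean treatment grid."""
    h, w = reference.shape
    uncritical, critical = 0, 0
    for i in range(h):
        for j in range(w):
            if reference[i, j] != "weed" or treated[i, j]:
                continue
            # inspect the 8-neighborhood, clipped at the grid border
            neighbors = [
                treated[y, x]
                for y in range(max(i - 1, 0), min(i + 2, h))
                for x in range(max(j - 1, 0), min(j + 2, w))
                if (y, x) != (i, j)
            ]
            if any(neighbors):
                uncritical += 1
            else:
                critical += 1
    return uncritical, critical
```

Applied to the 25 mm/s example above, the two returned counts would correspond to the 31.35% (uncritical) and 9.62% (critical) shares after normalization by the number of weed cells.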
The bold values in Table 5 indicate the misclassification kinds
identified as critical. The MWLP was operating with full camera
resolution at a camera frame rate of approx. 50 Hz for these tests.
The row length scanned for each data set is further mentioned in
Table 5. The width of scanned stripe in these and all following data
sets is approx. 20 cm, i.e., matching the size of the ridge crown of a
carrot ridge. Note that the critical misclassification safely stayed
below 10% for that field situation throughout the tested speed
range.
On day two, the system was again tested with full camera resolution. The frame rate was approx. 75 Hz. The result statistics for this test are given in Table 6. Here, the misclassification rates are slightly increased, but stayed at around 10–15%. Further, on the second day the system was tested with the camera in 2 × 2 subsampling mode, i.e., with the image resolution reduced by a factor of 4 and the frame rate increased to approx. 360 Hz. Table 7 shows that the misclassification further increases. Even intolerable values of 20–30% are occasionally reached. Apparently, full camera resolution is required for carrot classification due to the finely feathered pinnate leaves of carrots.
Table 5
Shortened result statistics for different data sets of day 1.
Table 6
Shortened result statistics for different data sets of day 2 with full camera resolution.
Table 7
Shortened result statistics for different data sets of day 2 in subsampling mode.
Table 8
Shortened result statistics for different data sets of corn salad.
Table 9
Comparing up-to-date classifiers with classifiers trained with data of the respective other day.
Table 10
Comparing classifiers of correct crop with wrong classifiers trained with data of the respective other crop.
data from both days were processed two times: first using a classifier generated from labeled marks in data of the same day, and second using a classifier trained with label data of the respective other day. Table 9 shows the results. For both subsets, the classifications with the classifier of the respective other date cause drastic increases of the critical misclassification for both crops and weeds. Thus, the In-Field-Labeling concept helped to adjust the classifiers toward the particular field conditions of the respective days.

Further, the effect of the label knowledge for adapting classifiers toward the different crops was tested. Therefore, a subset of corn salad data was processed using a classifier for carrots and vice versa. Table 10 shows the results. In both cases, using the wrong classifiers for the respective other crop resulted in drastically increased rates of untreated weeds.

4. Discussion

We have shown that pixel-based classification of MWLP sensor data is feasible for crop/weed discrimination. We used spectral laser line reflection and scattering features for separating soil and plant pixels; the generated 3D features served for crop/weed classification. The achieved misclassification rates of around 5–15% for carrots and below 10% for corn salad are within a good level, as
92 W. Strothmann et al. / Computers and Electronics in Agriculture 134 (2017) 79–93
Fig. 20. Labeling for corn salad data. Green labels mark crop plants, red labels weed plants and black labels the background, i.e., soil. Cf. Fig. 12. (For interpretation of the
references to colour in this figure legend, the reader is referred to the web version of this article.)
speeds of up to 0.1 m/s for carrots and up to 0.4 m/s for corn salad
are well in the range of practical figures observed for hand-
weeding (Fittje et al., 2015), too. In addition, there are still some
technical options to be explored for further increasing speed. As
the results showed, the misclassification rates did not vary much
with the speeds in the tested ranges. Thus, increases are possible.
Future tests will have to show how far the driving speeds can be increased. In other applications, the MWLP system and its classification have already been applied at speeds of up to 1.09 m/s.
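The classifier cross-testing reported in Tables 9 and 10 (training on one day's or one crop's label marks and evaluating on the other) can be sketched with a minimal Gaussian naive Bayes pixel classifier. This is an illustrative reconstruction under stated assumptions, not the paper's implementation; the feature values are synthetic two-dimensional stand-ins for the MWLP spectral/3D pixel features, and the two-class setup simplifies the soil/crop/weed problem.

```python
import math
from collections import defaultdict

class GaussianNB:
    """Minimal Gaussian naive Bayes over per-pixel feature vectors."""

    def fit(self, X, y):
        groups = defaultdict(list)
        for x, label in zip(X, y):
            groups[label].append(x)
        self.stats, self.priors = {}, {}
        for label, rows in groups.items():
            n = len(rows)
            # Per-class, per-feature mean and (variance-floored) variance.
            means = [sum(col) / n for col in zip(*rows)]
            varis = [max(sum((v - m) ** 2 for v in col) / n, 1e-9)
                     for col, m in zip(zip(*rows), means)]
            self.stats[label] = (means, varis)
            self.priors[label] = n / len(X)
        return self

    def predict(self, x):
        def log_posterior(label):
            means, varis = self.stats[label]
            lp = math.log(self.priors[label])
            for v, m, s2 in zip(x, means, varis):
                lp -= 0.5 * (math.log(2 * math.pi * s2) + (v - m) ** 2 / s2)
            return lp
        return max(self.stats, key=log_posterior)

# Synthetic per-pixel features (say, height and a reflectance ratio) labeled
# on day 1; the day-2 pixels are shifted to mimic changed field conditions.
day1_X = [(2.0, 0.8), (2.2, 0.9), (0.5, 0.4), (0.6, 0.5)]
day1_y = ["crop", "crop", "weed", "weed"]
day2_X = [(2.6, 1.1), (0.9, 0.6)]

clf = GaussianNB().fit(day1_X, day1_y)       # classifier from day-1 marks only
print([clf.predict(x) for x in day2_X])      # evaluated on day-2 pixels
```

In the paper's protocol, the same evaluation is run twice per subset (classifier of the same day versus the other day, or of the correct crop versus the wrong crop), and the two misclassification rates are compared.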
Additionally, the capacity of a weeding system based on the MWLP, as drafted in Fig. 13 for treating in-row weeds on carrot ridges, can be scaled by adding multiple systems for different rows to a common carrier, thus treating multiple rows in a single pass.
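The scaling argument is straightforward area arithmetic: the treated area per hour grows linearly with the number of row units on the carrier. The sketch below is illustrative only; the 0.5 m row spacing is a hypothetical figure (the paper does not state one), while the 0.1 m/s speed is the carrot figure from the text.

```python
# Area capacity of a multi-row weeding carrier: rows * row spacing * speed,
# converted from m^2/s to ha/h. Ignores headland turns and downtime.

def capacity_ha_per_h(rows: int, row_spacing_m: float, speed_m_per_s: float) -> float:
    """Treated area in hectares per hour for a carrier with `rows` units."""
    return rows * row_spacing_m * speed_m_per_s * 3600.0 / 10000.0

print(round(capacity_ha_per_h(1, 0.5, 0.1), 3))  # single unit:  0.018 ha/h
print(round(capacity_ha_per_h(4, 0.5, 0.1), 3))  # four units:   0.072 ha/h
```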
The impact of the In-Field-Labeling for pixel-based classification
was demonstrated. In both tested cases (classifiers for wrong dates
and classifiers for wrong crops), the respective wrong classifiers
performed drastically worse than the correct ones. Thus, our
pixel-based In-Field-Labeling method for MWLP appears to have
a very high potential for quick and easy adaptation of classifier
models to different field situations. This is a particularly important feature for practical adoption, where a classifier developed for a specific set of field situations may work poorly for other field situations. Our system, in contrast, can be geared toward the specific field situation by the end user in a quick and easy manner.
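The adaptation step itself amounts to refreshing the per-class feature statistics of the Bayesian classifier from a handful of newly marked pixels. One way such an update might be implemented is with an online mean/variance recursion (Welford's algorithm); this is a sketch under that assumption, not the paper's code, and the class/feature names and values are illustrative.

```python
class RunningStats:
    """Online mean and variance (Welford's algorithm); one instance could
    serve per class-feature pair of a Gaussian Bayesian pixel classifier."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self._m2 = 0.0  # sum of squared deviations from the running mean

    def add(self, x: float) -> None:
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self._m2 += delta * (x - self.mean)

    @property
    def variance(self) -> float:
        return self._m2 / self.n if self.n else 0.0

# A few new in-field marks on a hypothetical reflectance feature of the crop
# class (values made up for illustration):
crop_reflectance = RunningStats()
for value in [0.81, 0.84, 0.79, 0.86]:
    crop_reflectance.add(value)
print(crop_reflectance.mean, crop_reflectance.variance)
```

Because each mark only updates running sums, the classifier model can be adjusted in the field in constant time per labeled pixel, without retraining from scratch.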
5. Conclusion
lems for field application of automatic plant classification are canceled out (overlapping, segmentation). Moreover, enabled by the pixel-based classification, quick and easy adaptation of the system to new field situations is possible by In-Field-Labeling. This allows untrained users to train the classification with up-to-date data. Thereby, robustness with respect to different field situations is achieved. Finally, labeling small sets of data at a huge number of field situations in the described manner would open the door for long-term (or life-long) machine learning strategies.

Acknowledgements

This work was conducted in the context of the research project RemoteFarming.1, supported by the German Federal Ministry for Food and Agriculture (BMEL) and managed by the German Federal Office for Agriculture and Food (BLE).

References

Aitkenhead, M.J., Dalgetty, I.A., Mullins, C.E., McDonald, A.J.S., Strachan, N.J.C., 2003. Weed and crop discrimination using image analysis and artificial intelligence methods. Comput. Electron. Agric. 39 (3), 157–171. http://dx.doi.org/10.1016/S0168-1699(03)00076-0.
Bangert, W., Kielhorn, A., Rahe, F., Albert, A., Biber, P., Grzonka, S., Haug, S., Michaels, A., Mentrup, D., Hänsel, M., Kinski, D., Möller, D., Ruckelshausen, A., Scholz, C., Sellmann, F., Strothmann, W., Trautz, D., 2013. Field-Robot-Based Agriculture: "RemoteFarming.1" and "BoniRob-Apps". In: 71st Conference LAND.TECHNIK-AgEng 2013. VDI Verlag GmbH, Düsseldorf, pp. 439–446. <https://www.hs-osnabrueck.de/fileadmin/HSOS/Homepages/COALA/Veroeffentlichungen/2013-VDIAgEng-BoniRob-Apps.pdf> (accessed 05/21/2016).
Burgos-Artizzu, X.P., Ribeiro, A., Tellaeche, A., Pajares, G., Fernández-Quintanilla, C., 2009. Improving weed pressure assessment using digital images from an experience-based reasoning approach. Comput. Electron. Agric. 65 (2), 176–185. http://dx.doi.org/10.1016/j.compag.2008.09.001.
Cai, X., Sun, Y., Zhao, Y., Damerow, L., Lammers, P.S., Sun, W., Lin, J., Zheng, L., Tang, Y., 2013. Smart detection of leaf wilting by 3D image processing and 2D Fourier transform. Comput. Electron. Agric. 90, 68–75. http://dx.doi.org/10.1016/j.compag.2012.11.005.
De Rainville, F.-M., Durand, A., Fortin, F.-A., Tanguy, K., Maldague, X., Panneton, B., Simard, M.-J., 2014. Bayesian classification and unsupervised learning for isolating weeds in row crops. Pattern Anal. Appl. 17 (2), 401–414. http://dx.doi.org/10.1007/s10044-012-0307-5.
Dupuis, J., Paulus, S., Mahlein, A.-K., Kuhlmann, H., Eichert, T., 2015. The impact of different leaf surface tissues on active 3D laser triangulation measurements. Photogramm. – Fernerkundung – Geoinform. 6, 437–447. http://dx.doi.org/10.1127/pfg/2015/0280.
Dzinaj, T., Kleine Hörstkamp, S., Linz, A., Ruckelshausen, A., Böttger, O., Kemper, M., Marquering, J., Naescher, J., Trautz, D., Wigerodt, E., 1998. Multi-Sensor-System zur Unterscheidung von Nutzpflanzen und Beikräutern. Zeitschrift für Pflanzenkrankheiten und Pflanzenschutz 16, 233–242.
Emmi, L., Gonzalez-de Soto, M., Pajares, G., Gonzalez-de Santos, P., 2014. Integrating sensory/actuation systems in agricultural vehicles. Sensors 14 (3), 4014–4049. http://dx.doi.org/10.3390/s140304014.
Fittje, S., Hänsel, M., Langsenkamp, F., Kielhorn, A., Kohlbrecher, M., Vergara, M., Trautz, D., 2015. Praxiserhebungen zu Aufwand und Erfolg der Handjäte in Möhren unter ökologischer Bewirtschaftung. In: Am Mut hängt der Erfolg – Rückblicke und Ausblicke auf die ökologische Landbewirtschaftung. Beiträge zur 13. Wissenschaftstagung Ökologischer Landbau, Verlag Dr. Köster, Berlin. <http://orgprints.org/27154/1/27154_fittje.pdf> (accessed 01/10/2016).
Gonzalez-de Soto, M., Emmi, L., Perez-Ruiz, M., Aguera, J., Gonzalez-de Santos, P. Autonomous systems for precise spraying – evaluation of a robotised patch sprayer. Biosyst. Eng. http://dx.doi.org/10.1016/j.biosystemseng.2015.12.018.
Haug, S., Michaels, A., Biber, P., Ostermann, J., 2014. Plant classification system for crop/weed discrimination without segmentation. In: 2014 IEEE Winter Conference on Applications of Computer Vision (WACV). IEEE, pp. 1142–1149. http://dx.doi.org/10.1109/WACV.2014.6835733.
Hemming, J., Rath, T., 2001. PA – precision agriculture: computer-vision-based weed identification under field conditions using controlled lighting. J. Agric. Eng. Res. 78 (3), 233–243. http://dx.doi.org/10.1006/jaer.2000.0639.
Holzer, S., Rusu, R.B., Dixon, M., Gedikli, S., Navab, N., 2012. Adaptive neighborhood selection for real-time surface normal estimation from organized point cloud data using integral images. In: 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, pp. 2684–2689. http://dx.doi.org/10.1109/IROS.2012.6385999.
Japkowicz, N., Stephen, S., 2002. The class imbalance problem: a systematic study. Intell. Data Anal. 6 (5), 429–449. <http://dl.acm.org/citation.cfm?id=1293951.1293954> (accessed 01/10/2016).
Komi, P.J., Jackson, M.R., Parkin, R.M., 2007. Plant classification combining colour and spectral cameras for weed control purposes. In: 2007 IEEE International Symposium on Industrial Electronics. IEEE, pp. 2039–2042. http://dx.doi.org/10.1109/ISIE.2007.4374921.
Laber, H., Stützel, H., 2003. Ertragswirksamkeit der Restverunkrautung in Gemüsekulturen nach nichtchemischen Unkrautregulationsmaßnahmen. Pflanzenbauwissenschaften 7 (1), 29–38.
Langsenkamp, F., Sellmann, F., Kohlbrecher, M., Kielhorn, A., Strothmann, W., Michaels, A., Ruckelshausen, A., Trautz, D., 2014. Tube Stamp for mechanical intra-row individual Plant Weed Control. In: 18th World Congress of CIGR, CIGR2014, Beijing, China, Sept. 16–19, 2014. <https://www.hs-osnabrueck.de/fileadmin/HSOS/Homepages/COALA/Veroeffentlichungen/2014-CIGR_2014_Tube_Stamp_for_mechanical_intra-row_individual_Plant_Weed_Control.pdf> (accessed 05/21/2016).
Marton, Z.-C., Pangercic, D., Blodow, N., Kleinehellefort, J., Beetz, M., 2010. General 3D modelling of novel objects from a single view. In: 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, pp. 3700–3705. http://dx.doi.org/10.1109/IROS.2010.5650434.
Moshou, D., Kateris, D., Pantazi, X., Gravalos, I., 2013. Crop and weed species recognition based on hyperspectral sensing and active learning. In: Precision Agriculture, vol. 13. Springer, pp. 555–561.
Okamoto, H., Murata, T., Kataoka, T., Hata, S.-I., 2007. Plant classification for weed detection using hyperspectral imaging with wavelet analysis. Weed Biol. Manage. 7 (1), 31–37. http://dx.doi.org/10.1111/j.1445-6664.2006.00234.x.
Pastrana, J.C., Rath, T., 2013. Novel image processing approach for solving the overlapping problem in agriculture. Biosyst. Eng. 115 (1), 106–115. http://dx.doi.org/10.1016/j.biosystemseng.2012.12.006.
Paulus, S., Dupuis, J., Mahlein, A.-K., Kuhlmann, H., 2013. Surface feature based classification of plant organs from 3D laserscanned point clouds for plant phenotyping. BMC Bioinform. 14 (1), 238. http://dx.doi.org/10.1186/1471-2105-14-238.
Peng, H., Long, F., Ding, C., 2005. Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy. IEEE Trans. Pattern Anal. Mach. Intell. 27 (8), 1226–1238. http://dx.doi.org/10.1109/TPAMI.2005.159.
Persson, M., Åstrand, B., 2008. Classification of crops and weeds extracted by active shape models. Biosyst. Eng. 100 (4), 484–497. http://dx.doi.org/10.1016/j.biosystemseng.2008.05.003.
Rusu, R.B., Cousins, S., 2011. 3D is here: Point Cloud Library (PCL). In: IEEE International Conference on Robotics and Automation (ICRA), Shanghai, China. http://dx.doi.org/10.1109/ICRA.2011.5980567.
Šeatović, D., Kutterer, H., Anken, T., 2010. Automatic weed detection and treatment in grasslands. In: 2010 Proc. ELMAR. IEEE, pp. 65–68. <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5606087&isnumber=5606063> (accessed 09/10/2016).
Slaughter, D., Giles, D., Downey, D., 2008. Autonomous robotic weed control systems: a review. Comput. Electron. Agric. 61 (1), 63–78. http://dx.doi.org/10.1016/j.compag.2007.05.008.
Søgaard, H., 2005. Weed classification by active shape models. Biosyst. Eng. 91 (3), 271–281. http://dx.doi.org/10.1016/j.biosystemseng.2005.04.011.
Strothmann, W., Kielhorn, A., Tsukor, V., Trautz, D., Ruckelshausen, A., 2013. Interactive Image Segmentation for Model Adaption and Decision Support. In: 9th European Conference on Precision Agriculture, Book of Posters, pp. 95–96. <https://www.hs-osnabrueck.de/fileadmin/HSOS/Homepages/COALA/Veroeffentlichungen/2013-ECPA-Interactive-Image-Segmentation.pdf> (accessed 05/21/2016).
Strothmann, W., Ruckelshausen, A., Hertzberg, J., 2014. Multiwavelength laser line profile sensing for agricultural crop characterization. In: SPIE Photonics Europe. International Society for Optics and Photonics, p. 91411K. http://dx.doi.org/10.1117/12.2052009.
Strothmann, W., Tsukor, V., Ruckelshausen, A., 2015. In-Field-Labeling-HMI für automatische Klassifizierung bei der Pflanzen- und Erntegutcharakterisierung mittels bildgebender Sensordaten. In: Informatik in der Land-, Forst- und Ernährungswirtschaft, Referate der 35. GIL-Jahrestagung, Geisenheim, pp. 177–180. <https://www.hs-osnabrueck.de/fileadmin/HSOS/Homepages/COALA/Veroeffentlichungen/2015-In-Field-Labeling-HMI_fuer_automatische_Klassifizierung.pdf> (accessed 05/21/2016).
Strothmann, W., Tsukor, V., Hertzberg, J., Ruckelshausen, A., 2015. Konfigurationsmöglichkeiten und Datenkonzepte des Multiwavelength Line Profiling (MWLP) Systems. In: Bornimer Agrartechnische Berichte, vol. 88, pp. 42–52. <https://www.hs-osnabrueck.de/fileadmin/HSOS/Homepages/COALA/Veroeffentlichungen/2015-CBA-MWLPMultiwavelength.pdf> (accessed 05/21/2016).
Suzuki, Y., Okamoto, H., Kataoka, T., 2008. Image segmentation between crop and weed using hyperspectral imaging for weed detection in soybean field. Environ. Control Biol. 46 (3), 163–173. http://dx.doi.org/10.2525/ecb.46.163.
Weis, M., Sökefeld, M., 2010. Detection and identification of weeds. In: Precision Crop Protection – the Challenge and Use of Heterogeneity. Springer, pp. 119–134. http://dx.doi.org/10.1007/978-90-481-9277-9_8.
Weiss, U., Biber, P., Laible, S., Bohlmann, K., Zell, A., 2010. Plant species classification using a 3D lidar sensor and machine learning. In: 2010 Ninth International Conference on Machine Learning and Applications (ICMLA). IEEE, pp. 339–345. http://dx.doi.org/10.1109/ICMLA.2010.57.