
Computers and Electronics in Agriculture 134 (2017) 79–93


Plant classification with In-Field-Labeling for crop/weed discrimination using spectral features and 3D surface features from a multi-wavelength laser line profile system

Wolfram Strothmann a,*, Arno Ruckelshausen a, Joachim Hertzberg c,d, Christian Scholz a, Frederik Langsenkamp b

a University of Applied Sciences Osnabrueck, Faculty of Engineering and Computer Science, Albrechtstrasse 30, Osnabrueck, Germany
b University of Applied Sciences Osnabrueck, Faculty of Agricultural Sciences and Landscape Architecture, Oldenburger Landstrasse 30, Osnabrueck, Germany
c Osnabrueck University, Institute of Computer Science, Knowledge-based Systems Group, Albrechtstrasse 28, Osnabrueck, Germany
d German Research Center for Artificial Intelligence (DFKI), Robotics Innovation Center, Albert-Einstein-Strasse 1, Osnabrueck, Germany

* Corresponding author. E-mail address: w.strothmann@hs-osnabrueck.de (W. Strothmann).

Article info

Article history:
Received 5 June 2016
Received in revised form 18 October 2016
Accepted 7 January 2017
Available online 27 January 2017

Keywords:
Plant classification
Crop/weed discrimination
3D sensing
Spectral imaging
Model adaptation
Bayesian classification

Abstract

State-of-the-art object-based approaches to automatic plant classification for crop/weed discrimination are reported to work with typical classification rates of 80–90% under laboratory or restricted field conditions. Adapting their parameter sets and classifiers to match changing field situations is laborious, yet it is required for practical application. Pixel-based classification allows adjusting the classifier model in the field easily by adding a few marks to sample data; however, pixel-based classification of camera data for crop/weed discrimination is impractical, as pixel features lack descriptiveness. This paper contributes a multi-wavelength laser line profile (MWLP) system for scanning the plants and obtaining sensor data, yielding image-based 3D range data, matched spectral reflectance, and scattering data at multiple wavelengths for each pixel. Using these descriptive pixel features, pixel-based Bayesian classification for crop/weed discrimination requires very little field-specific label data, thus allowing In-Field-Labeling for classifier adaptation to specific field situations. For different field situations and two different crops (carrots (Daucus carota) and corn salad (Valerianella locusta)), the classification using spectral and 3D features, applying classifiers generated from very few marks in sample data (i.e., with very little effort for labeling), was successfully demonstrated, thereby achieving misclassification rates comparable to the best literature values.

© 2017 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

1. Introduction

A weed control system operating autonomously for selective mechanical or thermal weed plant removal can help in many ways. With its use, the amount of herbicides applied could be reduced in conventional farming. For organic farming in dense cultivation of several vegetable plants, selective weed control is until today only possible by hand.

Carrots (Daucus carota) are a particular crop requiring much manual labor in organic growing (Fittje et al., 2015). The carrot shows relatively slow development in early growth stages. Hence, competing weeds may overtake the crop plant and limit its access to resources such as sunlight, moisture, and nutrients. Thus, weeds may cause major yield losses if uncontrolled. This makes manual weeding necessary in early growth stages (Laber and Stützel, 2003).

At present, weeding is done manually by persons lying on slowly moving vehicles while removing weeds, as Fig. 1 illustrates. The working conditions are harsh, and so is the ergonomic situation. Further, the costs per hectare caused by this production step are high. Hence, an automated solution would be preferred.

The main components of an autonomous mechanical weed control system are a carrier vehicle including autonomous navigation, a sensor system for contactlessly scanning plants, a classification system, which makes the necessary decisions to treat or not to treat a plant based on the sensor data, and a weeding tool for selectively treating the weed plants. Of these components, the lack of weed detection and classification with robustness at a practicable level was identified to be the main limitation for robotic weed control (Slaughter et al., 2008).

http://dx.doi.org/10.1016/j.compag.2017.01.003
0168-1699/© 2017 The Authors. Published by Elsevier B.V.

Glossary

GPU   Graphics Processing Unit
GUI   Graphical User Interface
MWLP  Multi-wavelength Laser Line Profiling
NDVI  Normalized Difference Vegetation Index
NE    Normal Estimation
NIR   Near Infra-red
PCL   Point Cloud Library
RGB   Red, Green, Blue
ROI   Region of Interest
RSD   Radius-based Surface Descriptor
SAD   Sum of Absolute Differences

1.1. Related work

Weed detection and classification is a topic with quite a research record. Ratios of successful classifications in the range of 60–90% are frequently reported given appropriate field conditions. However, sufficient robustness is usually not achieved under variations of environmental conditions or over multiple field situations. Overlapping and occlusion of plants as well as unsatisfying results of segmentation algorithms under outdoor conditions are frequently reported issues causing misclassification (Slaughter et al., 2008).

Most of the approaches toward automated weed control apply cameras for sensing plants, either standard RGB cameras (e.g. Hemming and Rath, 2001) or bi-/multispectral cameras (e.g. Haug et al., 2014). There are some approaches using other sensors, such as hyperspectral imaging (Moshou et al., 2013; Okamoto et al., 2007; Suzuki et al., 2008) or 3D lidars (Weiss et al., 2010). Triangulation-based 3D measurement systems are also applied (Šeatović et al., 2010). Other approaches follow sensor fusion concepts with multiple sensors (Dzinaj et al., 1998; Komi et al., 2007). However, camera-based approaches are the most broadly adopted (Weis and Sökefeld, 2010; Emmi et al., 2014; Persson and Åstrand, 2008; Aitkenhead et al., 2003; De Rainville et al., 2014).

In general, camera-based weed detection and classification are performed in the following four image processing steps (Weis and Sökefeld, 2010, p. 126):

1. Image acquisition
   A single camera image is taken. Noise reduction or normalization may additionally be applied (Weis and Sökefeld, 2010, p. 126).
2. Segmentation/generation of a binary biomass image
   The image pixels are divided into soil pixels and biomass pixels (regardless of plant species) by different algorithms. For a bi-/multispectral camera, the Normalized Difference Vegetation Index (NDVI) is frequently calculated and thresholded. Edge detection can also be applied. Further, morphological operations (e.g. erosion, dilation) for filtering small segments may be executed on the segmented binary image (Weis and Sökefeld, 2010, p. 127). A minimal code sketch of this step follows after this list.
3. Feature extraction
   Next, image segments containing biomass pixels, i.e., the plant objects, are described by numeric features. These features may be skeleton-based or shape-based (cf. Weis and Sökefeld, 2010, p. 128; Haug et al., 2014, p. 1144). Further, statistical features, e.g. mean/variance of pixels' color values, may be used (Haug et al., 2014, p. 1144), as well as texture features (Weis and Sökefeld, 2010, p. 125).
4. Classification
   For a set of pre-collected image data of crop and weed plants, the images are processed through steps 1–3. Next, all plant segments detected in these images are labeled as crop plants or weed plants by a human user. These labeled examples with extracted features are used for training statistical classifiers. Typically, the size of the training set must be in the range of hundreds or thousands of labeled reference samples for obtaining a meaningful classifier (Japkowicz and Stephen, 2002). Finally, the generated classifier can be used for automatic discrimination of segments belonging to crop plants or weeds.
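Steps 1 and 2 can be made concrete with a short sketch. The following Python fragment is our illustration only, not code from the cited works; the NDVI threshold of 0.3 and the 3 × 3 structuring element are assumed values.

import numpy as np
from scipy import ndimage

def biomass_mask(red, nir, ndvi_threshold=0.3):
    """Steps 1-2 of the classic pipeline: threshold the NDVI of a
    registered red/NIR image pair, then clean small segments by a
    morphological opening (erosion followed by dilation)."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    ndvi = (nir - red) / np.maximum(nir + red, 1e-9)  # avoid division by zero
    mask = ndvi > ndvi_threshold                      # binary biomass image
    return ndimage.binary_opening(mask, structure=np.ones((3, 3)))

The result is the binary biomass image that steps 3 and 4 operate on; everything downstream depends on how well this mask captures the plant silhouettes.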

Fig. 1. State of the art for weed control in organic carrot cultivation.

Most camera-based plant classification approaches follow these basic steps with slight modifications or additions. However, these steps have some drawbacks. The overlap problem can be tackled by partial shape classification using Active Shape Models (Søgaard, 2005; Pastrana and Rath, 2013) or by plant classification without segmentation (Haug et al., 2014). However, shape-based features require the plant segment silhouette to be detected correctly by the segmentation algorithm. Thus, the classification crucially depends on the weakly defined pixels at the plant boundary. In turn, this makes the classification more sensitive to changes of the image capturing conditions.

1.2. In-Field-Labeling

Another major issue when applying automatic image processing for agricultural applications in general, or for crop/weed discrimination in particular, is that the processing chain needs to be adapted to field-situation-specific conditions. Sensor data of image-based sensors collected under agricultural field conditions usually varies significantly with the field conditions (Hemming and Rath, 2001). Variations can be induced by local field conditions (e.g. soil types, plant development stages, plant varieties), weather (e.g. uncontrolled sunlight exposure, moisture, wind) or machine application (e.g. variations in movement speed, shocks, vibrations) (Strothmann et al., 2015). I.e., not all real field situations can be addressed during algorithm development. Hence, the end user must have a means for applying field-specific modifications of the processing chain. Oftentimes, this kind of adaptation is implemented by exposing complex algorithm parameter sets in the user interface. However, adjusting parameters with respect to the specific situation is notoriously difficult (Burgos-Artizzu et al., 2009). Particularly, having farmers or operators of agricultural machines in mind as end users, it cannot be assumed that they are trained in parameterizing image processing algorithms (Strothmann et al., 2015).

Consequently, the image processing algorithm has to be 'abstracted away' from the user interface. Here, machine learning techniques can help simplify the user interaction. In order to create a model by supervised machine learning, the user only has to add labels to data samples. This task is typically much simpler. In particular, the targeted end users can be assumed proficient in inspecting and assessing plant objects.

However, if using classification systems designed around the above-mentioned four main processing steps, steps 1–3 can only be adjusted by parametrization. Moreover, the classification in step 4 happens at object level, i.e., image segments are classified. As mentioned, meaningful classifiers based on machine learning techniques would typically require hundreds or thousands of labeled reference samples, i.e., for setting up a plant classifier on site, 100+ plant objects would need to be labeled. Thus, setting up a reference database of labeled image segments of plants is only possible during algorithm development. It cannot be done during system setup for field situations by end users, as labeling this many data would take too long.

Therefore, In-Field-Labeling implies a classification approach different from the mentioned processing steps. Classification cannot happen at segment level but has to be at pixel level. For pixel-based classification, even a few relatively small user marks in high-resolution image data provide tens of thousands of samples. Hence, for pixel-based classification, the user only has to add a few marks with labels to some sample images from the specific field situation. This can be done quickly and easily for adjusting the pixel-based classification system to new field situations. Thus, In-Field-Labeling allows retraining the classification more often, thereby keeping classifier knowledge up-to-date and making classification more robust toward situation-specific changes (Strothmann et al., 2015).

The authors implemented a classification system with a Graphical User Interface (GUI) for In-Field-Labeling (Strothmann et al., 2015). It comprises a data management system for combining label reference and image data with a database-like relational mapping at pixel level, i.e., it allows arbitrary label marks/segments (Strothmann et al., 2013). The pixel-based classification is done by a Bayesian classifier (Strothmann et al., 2015).

For applying In-Field-Labeling in the field, the user first has to sample data on the field to be treated and to add a few marks to the data. The classifier is generated automatically from the marks and sample data. This field setup would only take a few minutes and allow using a plant classifier that is targeted to the field it is applied on. However, In-Field-Labeling does not mean that the user must follow these steps. Of course, classifiers can also be saved and reused on other days/fields if working satisfactorily. Our In-Field-Labeling concept with pixel-based classification only assures that the user can create a better classifier in the field, without having to descend into complex algorithm parameter sets, if shipped classifiers do not work in the specific field situation.

2. Materials and methods

As described, the In-Field-Labeling approach allows quick and easy adaptation of image processing chains to field-specific situations by inexperienced users. However, the range of possible applications based on camera data is limited. Pixel-based analysis is widely used in remote sensing, e.g., satellite imagery or analysis of hyperspectral imaging data. For conventional cameras, the available features per pixel, i.e., red, green, blue values and, for multispectral cameras, optionally Near Infra-red (NIR) values, are few and, thus, a poor basis for complex classification problems. In particular, pixel-based crop/weed discrimination based on camera images is impracticable due to the color similarity of crops and weeds. Therefore, camera-based plant classification approaches rely on feature generation from the segmented shapes for providing the classifier with a sufficient feature vector (Weis and Sökefeld, 2010). However, using the descriptive features of the Multi-wavelength Laser Line Profiling (MWLP) sensor system, plant classification in a pixel-based manner might be feasible.

2.1. The MWLP sensor system

The MWLP approach is a novel sensing concept proposed and realized by the authors (Strothmann et al., 2014). It expands on the robust and broadly adopted laser line profiling method for triangulation-based sensing of dense 3D point clouds. Triangulation-based line profile sensing is broadly adopted in industrial processes and has found first applications in agriculture for phenotyping (Paulus et al., 2013; Cai et al., 2013) and even for weed discrimination (Šeatović et al., 2010). However, in contrast to classical line profiling, the MWLP system scans multiple laser lines at different wavelengths using a single camera. Further, not just the range information is measured; reflection intensity and backscatter features are extracted for each point of each detected laser line, too. Subject to relative movement, the objects are scanned by all laser lines. Laser lines scanning one point at different times are assembled based on motion estimation (rotary encoder) and optical tracking. Thus, scanning results in 3D point clouds with reflection and scattering information at multiple, selectable wavelengths available for each point (Strothmann et al., 2014). Fig. 2 sketches the measurement concept; Fig. 3 presents sample scans of field data acquired using the system.

MWLP's single-camera approach is in contrast to other approaches toward scanning and merging line profiles at multiple wavelengths from multiple scanners (Dupuis et al., 2015). Using a single camera, and thus avoiding misalignment issues of multiple sensors, particularly favors the outdoor use of the system, where defined movements of different sensors cannot be guaranteed (Strothmann et al., 2014). The camera used here is a Baumer HXG20NIR, providing, at its full resolution of 2048 × 1088 pixels, images at up to 105 Hz frame rate, which MWLP's line detection computer can process online. In addition, the camera offers a subsampling mode, where of each 2 × 2 pixels only one pixel is read out, resulting in a reduced resolution of 1024 × 544 pixels. In this mode, the frame rate can be increased up to 420 Hz without specifying a Region of Interest (ROI) to read out from the sensor, i.e., without reducing the field of view or measurement range of the system. Furthermore, the reduced-resolution images get processed online at 420 Hz.

The line detection of the MWLP system is performed in a dedicated ROI for each mounted line laser (cf. Fig. 4). The relative motion between the sensor and the sensed objects is monitored by a rotary encoder mounted to the pulley of the conveyor (cf. Fig. 2) or by the odometry of the carrying vehicle. Further, the robust but crude rotary encoder data is improved by optical tracking for correct assembly of the scan data collected with different lasers. Moreover, the optical tracking allows obtaining differential images by subtracting, from the images in which lines should be detected, images captured with the laser lines shifted. This results in significant enhancements of the laser lines in the images, thus normalizing the images before line detection and canceling out influences of uncontrolled ambient lighting (cf. Fig. 5) (Strothmann et al., 2014).
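As a rough illustration of this normalization (a sketch under our own assumptions, not the authors' implementation), the differential image can be formed by subtracting a motion-compensated frame in which the laser lines sit at other positions:

import numpy as np

def differential_image(frame, shifted_frame):
    """Subtract a frame whose laser lines are elsewhere (found via the
    rotary encoder and optical tracking) from the frame to be analyzed.
    Ambient light common to both frames cancels out, while the laser
    lines remain; negative values are clipped."""
    diff = frame.astype(np.int32) - shifted_frame.astype(np.int32)
    return np.clip(diff, 0, None).astype(np.uint16)

Line detection then reduces to a simple threshold on such differential images, as described next.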

Fig. 2. Overview of the measurement concept of the MWLP system (Strothmann et al., 2014).

Fig. 3. MWLP in field operation for scanning plants. Top-left: view of web-cam mounted inside BoniRob (cf. Fig. 9) looking at the laser lines; bottom-right: raw camera image
of the camera of the MWLP system; bottom-left: line detection of multiple lasers by the MWLP system; top-right: output data of the MWLP system visualized as point cloud.
Colors represent intensities in different laser wavelengths. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this
article.)

Fig. 4. Excerpt of a camera image from scanning two laser lines including detected laser lines. The line detection is conducted in a dedicated ROI for each laser line. The
straight colored lines indicate the boundaries of the ROIs. On the left side a potato is scanned, on the right side a stone is scanned (Strothmann et al., 2015).

The line detection is done in the differential images by running along the image columns (Fig. 6). In each column of each dedicated ROI, only one segment is allowed; otherwise nothing is detected (Strothmann et al., 2014). Owing to the high level of normalization of the differential image, lines can be detected by applying a simple threshold. Fig. 7 shows an example of a cut along the image column of a ROI: the gray values of the pixels in the differential image are drawn on the y-axis; the x-axis indicates the image rows. Along with the line detection, feature extraction for describing the line points is done. First, the line start and line end of the segment are detected. The line position, which is relevant for determining the range value, i.e., z-position, is then assumed to be the center of the line, i.e., in the middle between line start and end. Further, features describing scattering and reflection at this point are extracted. To describe the reflection features, the IntensityMaximum, the LineWidth, and the IntensitySum, i.e., the sum of pixel intensities below the line curve between line start and end, are determined (Fig. 7). The scattering, i.e., light entering the object being scattered back beneath the surface, results in higher pixel intensities before line start and after line end. It is described by summing up pixel intensities at different distance ranges before and after the laser line. E.g., the feature Scatter20Section is determined by summing up the intensities for the pixels 1–20 before and after the laser line; for Scatter40Section the pixels 21–40 are summed; and for Scatter40Sum pixels 1–40 are summed, i.e., Scatter40Sum equals the sum of Scatter20Section and Scatter40Section. The same is done for the Scatter60… and Scatter80… features (Strothmann et al., 2015).

Fig. 5. Differential image corresponding to Fig. 4. The effects induced by the laser lighting (reflection and scattering) are highly emphasized; ambient lighting is canceled out. The potatoes on the left side cause the laser lines to scatter (due to high water content), while the stone on the right side does not (due to high optical density) (Strothmann et al., 2015).

2.2. Field operation mounted into BoniRob

As mentioned, a relative movement between sensor and sensed objects is required for operating the MWLP system. However, this is not limited to mounting the system above a conveyor, as sketched in Fig. 2. It is also possible to operate the system in a configuration of still objects and a moved sensor. This configuration was used in field trials of the system mounted into the autonomous field robot platform BoniRob. This platform has an 'App' concept, allowing different modules to be mounted on the platform via defined mechanical, electrical, and logical interfaces for supplying the module with power and process information (Bangert et al., 2013). Thereby, the rotary encoder data of the setup in Fig. 2 is replaced by odometry data of the robot platform. Figs. 8 and 9 depict mounting and field operation of the MWLP system on BoniRob. To avoid disturbances by hard sunlight exposure, a sun shade is mounted. The system operates beneath the BoniRob looking down. An actuator for weed treatment could further be mounted there, together with the system and inside the sun shade.

Fig. 10 shows an image-based color visualization of scan data of carrot and weed plants gathered with the MWLP in-field. The distance information obtained along with the scans of Fig. 10 is depicted in Fig. 11.

2.3. Feature selection for background removal

The first step to be applied for detection and classification of crop/weed plants on data such as shown in Figs. 10 and 11 is background removal, i.e., filtering pixels representing soil and, thus, not belonging to any plant. This is done by a pixel-based foreground/background classification. Note that this does not represent a classical segmentation in the sense of the four main processing steps for camera-based plant classification mentioned in the introduction, due to the following points. First, the foreground/background separation is done by pixel-based classification, i.e., it is trainable by In-Field-Labeling. Second, it is determined at pixel level rather than segment level, i.e., a biomass pixel can be completely surrounded by soil pixels (and vice versa) and will still be correctly classified. Third, the determined soil pixels are only skipped for plant classification. This means that, during the following feature generation steps for a pixel, the entire neighborhood raw data is still available. Moreover, during feature generation, a pixel does not 'know' whether its neighbors are soil or biomass pixels. Thus, no plant classification features are generated based on the foreground/background classification. Hence, poor results of the soil/biomass separation have no significant influence on the plant classification.

Selection of features for classification should focus on maximum relevance of the selected features for the outcome and minimum redundancy of the selected features (Peng et al., 2005). For identifying features relevant for classification of soil and biomass, a set of collected field data was marked with labels using the In-Field-Labeling GUI. Fig. 12 shows a screenshot of the respective labeling tool.

For determining the relevance of the different features for soil/biomass classification, a numeric measure of the mutual information of the feature values and the categorical classification outcomes had to be determined. To obtain this, the feature values of pixels marked as background in Fig. 12 were aggregated into one histogram and the feature values of pixels marked as plants into another. Next, both histograms for each feature were compared by calculating the Sum of Absolute Differences (SAD), i.e., for each histogram cell, the absolute difference between both histogram values is derived, and these values are summed up. Further, the SAD is normalized by the sum of both histograms. This results in a mutual information measure scaling between 0.0 and 1.0, where 0.0 indicates identical histograms, i.e., the respective feature has absolutely no relevance, and 1.0 signifies complete separation, i.e., the outcome can safely be determined using only this feature. For values in between, the histograms partly overlap and partly differ. Hence, features with moderately high SAD-based relevance measures may contribute to the classification, but multiple features are required.

All features extracted from the laser lines by the MWLP system for each pixel were tested for soil/biomass classification using the described relevance measure (cf. Table 1). Further, the NDVI was tested. It is not directly provided by the MWLP system, but it can easily be calculated for each pixel from the IntensitySum features of the red and NIR lasers by applying the following formula:

NDVI = (NIR − Red) / (NIR + Red)

Fig. 6. Gray values of one image column of the differential image plotted over the image rows. The plots show columns of the image data given in Fig. 5. Note that both plots
show image columns cutting the dedicated ROIs of two lasers. The left plot represents a column on the left side of Fig. 5 from scanning a potato, i.e., with significant scattering.
The right plot represents a column on the right side of Fig. 5 from scanning a stone, i.e., without significant scattering (Strothmann et al., 2015).

Fig. 7. Cut through a laser line along the image column for feature extraction of the detected laser lines by the MWLP system (Strothmann et al., 2015).
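The per-column extraction shown in Fig. 7 can be paraphrased in a few lines. The sketch below is illustrative; the threshold value is an assumed parameter, and the single-segment rule is simplified to taking the first and last pixel above the threshold.

import numpy as np

def column_features(column, threshold=40):
    """Extract the Fig. 7 features from one differential-image column
    inside a laser ROI; returns None if no line segment is found."""
    above = np.flatnonzero(column > threshold)
    if above.size == 0:
        return None
    start, end = above[0], above[-1]              # line start / line end
    feats = {
        "LineWidth": int(end - start + 1),
        "IntensityMaximum": int(column[start:end + 1].max()),
        "IntensitySum": int(column[start:end + 1].sum()),
        "z_row": (start + end) / 2.0,             # line center used for ranging
    }
    # Scattering: intensities summed in bands before line start and
    # after line end (pixels 1-20, 21-40, 41-60, 61-80).
    for lo, hi in ((1, 20), (21, 40), (41, 60), (61, 80)):
        before = column[max(start - hi, 0):max(start - lo + 1, 0)]
        after = column[end + lo:end + hi + 1]
        feats[f"Scatter{hi}Section"] = int(before.sum() + after.sum())
    feats["Scatter40Sum"] = feats["Scatter20Section"] + feats["Scatter40Section"]
    return feats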

Fig. 8. Photo of the sensing head of the MWLP system mounted into BoniRob for scanning corn salad (Valerianella locusta). For this operation, a red, a green and a NIR (invisible here) laser are used. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)

Fig. 9. The autonomous field robot BoniRob equipped with the MWLP system scanning corn salad.

As Table 1 shows, applying the described feature generation allows obtaining the new feature NDVI with a very high relevance of 0.925 by combining the two features IntensitySum of the red and NIR lasers, which have moderate relevances of 0.758 and 0.700, respectively. Further, the IntenMax values of all lasers and the Scatter… values of the NIR laser provide high relevancies. However, feature redundancies have to be kept in mind. Therefore, only the IntenMax feature of the green laser was selected to be passed on to the classifier, as the reflection properties of the other two lasers already influence the classification via the NDVI. Further, only the Scatter60Sum of the NIR laser was selected to be passed to the classifier. It is the scattering feature with the highest relevancy, and passing on multiple scattering features of the NIR laser would again lead to redundancies.

Fig. 10. Image visualization of intensity scan data of the MWLP system from scanning carrots and weed plants. Note that red, green and NIR lasers were applied here. The IntensitySum pixel features of the NIR laser are drawn in blue (along with values of the red and green lasers in the red and green channels, respectively), thus causing the blue sheen. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)

Fig. 11. Heat map of depth information gathered by MWLP simultaneously with the intensity data visualized in Fig. 10.

2.4. Feature generation for crop/weed classification

The spectral reflection and backscattering features extracted by MWLP were also tested for relevance on crop/weed classification for carrots and weeds, in the same way as described for soil/biomass classification (cf. Table 1). However, none of the tested data was significantly relevant, and neither was the NDVI. Hence, feature generation from the 3D data provided by MWLP was needed.

For feature generation from the 3D data, the Point Cloud Library (PCL) (Rusu and Cousins, 2011) and some estimators implemented as part of this work were used. Many feature estimators require normals for each point of the point cloud as input. As the point cloud data obtained by the MWLP system is organized, Normal Estimation (NE) can be done by pcl::IntegralImageNormalEstimation in covariance matrix mode (Holzer et al., 2012).

Based on the 3D point clouds with surface normals for each point, a set of feature estimators is available. Particularly the Radius-based Surface Descriptor (RSD) estimation shipped with PCL showed good runtime performance and significance for crop/weed discrimination. It calculates angles between neighboring points for estimating local surface radii (Marton et al., 2010). Starting from the RSD estimation of PCL, a modified and extended RSD estimation was implemented as part of this work. It does not return radius features obtained from the angle between the normals and the point distance, but the angle features directly. The point distances are very similar for organized clouds throughout all samples; hence, taking them into account does not add much descriptiveness. Another modification is that it does not return the minimum and maximum values but the 5% and 95% percentiles of the angles observed in a point neighborhood. This avoids influence by crude outliers. Finally, the extended RSD does not only respect the angle between the normals of a point pair, but also the angle between the normal of the described point and the vector connecting the described point with its neighboring points.

Another feature estimator implemented as part of this work is the line flicker estimator. It was implemented due to the observation that the fine-feathered pinnate leaves of the carrots cause many abrupt distance changes in distance images of carrots and weed plants (cf. Fig. 11), while weeds with oval leaves show a smooth behavior in the distance maps and grass-like weeds only cause a single distance change pair. To generate numeric descriptors of a surface point, the line flicker estimator runs along the horizontal line, the vertical line, and the two diagonals of a kernel around the point to be described. Thereby, it sums up the differences in the distance values, i.e., distance changes, of adjacent points and counts the number of sign changes in the deltas. Further, due to the triangulation principle, sharp edges may cause invalid pixels in the distance image (white pixels in Fig. 11). Therefore, the line flicker estimator also counts the number of changes from valid to invalid adjacent pixels along the monitored lines.

The described surface features were tested on their relevance for crop/weed classification. Labeled MWLP data acquired in the field from carrots and weed plants were used again. The SAD-based relevance measures are given in Table 2.
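As an illustration of the line flicker idea, the following simplified sketch shows one kernel axis only (the GPU implementation additionally processes the vertical axis and both diagonals and reports the remaining features of Table 2):

import numpy as np

def line_flicker_1d(depth_row, valid_row):
    """Line flicker features along one kernel axis; depth_row holds the
    distance values, valid_row marks pixels with a valid measurement."""
    magic_to_valid = int(np.count_nonzero(valid_row[1:] != valid_row[:-1]))
    values = depth_row[valid_row]
    deltas = np.diff(values)
    signs = np.sign(deltas)
    return {
        "MagicToValidCount": magic_to_valid,          # valid/invalid changes
        "ValueCount": int(values.size),               # valid pixels on the axis
        "ChangeSum": float(np.abs(deltas).sum()),     # summed distance changes
        "DirectionChangeCount": int(np.count_nonzero(signs[1:] * signs[:-1] < 0)),
    }

# Fine-feathered carrot leaves give many direction changes and a large
# ChangeSum; a smooth oval leaf gives a low-flicker profile.
row = np.array([10., 14., 9., 15., 8., 16., 9.])
print(line_flicker_1d(row, np.ones(7, dtype=bool)))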

Fig. 12. Screenshot from labeling carrots and weed plants. Note that classification, labeling, and data persistence are pixel-based, i.e., the form of the label marks does not matter; it is important to point out regions typical for each class. Green labels mark crop plants, red labels weed plants and black labels the background, i.e., soil. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)
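Marks such as these are all the Bayesian pixel classifier is trained from. As an illustration only (the actual classifier is described in Strothmann et al. (2015)), a generic Gaussian naive Bayes over the selected per-pixel features could look like this:

import numpy as np

class GaussianNaiveBayes:
    """Minimal Gaussian naive Bayes; X is (n_pixels, n_features) taken
    from the labeled marks, y holds the class ids of the marks."""
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.mu = np.stack([X[y == c].mean(axis=0) for c in self.classes])
        self.var = np.stack([X[y == c].var(axis=0) + 1e-9 for c in self.classes])
        self.logprior = np.log(np.array([np.mean(y == c) for c in self.classes]))
        return self

    def predict(self, X):
        # log p(c|x) up to a constant: log p(c) + sum_f log N(x_f; mu, var)
        ll = -0.5 * (np.log(2 * np.pi * self.var[:, None, :])
                     + (X[None] - self.mu[:, None, :]) ** 2
                     / self.var[:, None, :]).sum(axis=2)
        return self.classes[np.argmax(ll + self.logprior[:, None], axis=0)]

# Two marked classes with two features each; predictions reproduce the marks.
X = np.array([[0.9, 5.0], [0.8, 4.0], [0.1, 1.0], [0.2, 2.0]])
y = np.array([1, 1, 0, 0])
print(GaussianNaiveBayes().fit(X, y).predict(X))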

Fig. 13. Draft of a weeding system based on a MWLP sensor system.
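Before turning to the relevance figures in Table 2, the extended RSD of Section 2.4 can be condensed into a sketch. This is our reading of the description above; the surface normals are assumed to be precomputed (e.g., by PCL's integral-image normal estimation), and making the normal-normal angle orientation-invariant via the absolute cosine is our assumption.

import numpy as np

def extended_rsd_features(p, n_p, neighbors, normals):
    """Angle features for one point p with unit normal n_p; neighbors
    and normals are (k, 3) arrays for the points found inside the 3D
    search radius. Returns 5%/95% percentiles of the normal-normal
    angles (alpha) and the normal-to-connection-vector angles (beta)."""
    if len(neighbors) == 0:
        return None
    alpha = np.arccos(np.abs(np.clip(normals @ n_p, -1.0, 1.0)))
    d = (neighbors - p).astype(float)
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    beta = np.arccos(np.clip(d @ n_p, -1.0, 1.0))
    return {
        "AngleNormalsMin": float(np.percentile(alpha, 5)),
        "AngleNormalsMax": float(np.percentile(alpha, 95)),
        "AngleNeighborMin": float(np.percentile(beta, 5)),
        "AngleNeighborMax": float(np.percentile(beta, 95)),
        "NeighborCount": int(len(neighbors)),
    }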

As Table 2 shows, the minimum and maximum radii estimated by the original RSD have moderate significance. Of the features estimated by the extended RSD, the AngleNeighborMax and AngleNeighborMin features as well as the NeighborCount show moderate relevances. MagicToValidCount, ValueCount and ChangeSum have the highest relevances of the features extracted by the line flicker estimator. Consequently, the configuration of MWLP and classification given in Table 3 was used for the following experiments.

2.5. System concept of the weeding system and grid aggregation

To validate the pixel-based crop/weed classification, a grid aggregation was set up. The aggregated treatment grids serve as input for actuator control. Langsenkamp et al. (2014) showed the effectiveness of stamp-like weeding tools for mechanical weed control. Stamp-like actuators can be scaled to a line of individual actuators that can treat weeds while being moved over the plant rows. This setup has the advantage that no manipulator arm is required for moving the weeding tool to the target weed. The weeding tool is only activated when it is located over a weed while the entire weeding system is in continuous motion over the crop rows. Thereby, adjacent cells get treated independently.

Treating the weeds in a grid is a common technique for patch spraying (Gonzalez-de Soto et al., 2016). However, for mechanical weed treatment, camera-based approaches typically attempt to detect individual plants. With the described approach, we perform the weed detection by applying the grid-based method. A much higher grid resolution than for patch spraying is required, though. Following the concept of treating plants using a line of stamp actuators without a manipulator arm leads to treating the field in small grid cells (see Fig. 14). The dimensions of these grid cells depend on the diameter of the stamp; hence, 1 cm × 1 cm seems a reasonable grid size. However, following these thoughts from the weeding tool's point of view, it does not matter whether a cell is occupied by two weeds or one weed plant, or whether two adjacent cells must be treated for one large weed plant or due to two smaller ones. Thus, 'hand-shaking' each crop/weed plant is not required for treating them. By combining pixel-based classification with grid aggregation, the actuation decision can be taken on a grid-cell base, rather than on the base of segmenting and interpreting individual plants. This strategy of treating the field in a grid avoids major fault causes (overlapping, segmentation, etc.) for field operation.

The required grid aggregation assembles the high-resolution image-based data, with pixel sizes of 0.1 mm at full-frame scan and 0.2 mm in subsampling mode, into grid cells of size 1 cm by 1 cm. Further, it decides whether a particular grid cell needs to be treated or not. Grid cells are determined to be treated if the majority of the pixels in the cell are classified as weed pixels, or not treated if the majority of pixels are classified as crop or soil pixels. The ratios can be adjusted to address different weeding strategies.

Table 1
SAD-based relevance measures for the pixel feature channels extracted by the MWLP system and the NDVI for soil/biomass classification.

Laser            Feature           Soil/biomass SAD
Green @ 532 nm   LineWidth         0.37
                 IntenMax          0.91
                 IntensitySum      0.27
                 Scatter20Section  0.59
                 Scatter40Section  0.69
                 Scatter40Sum      0.71
                 Scatter60Section  0.62
                 Scatter60Sum      0.75
                 Scatter80Section  0.42
                 Scatter80Sum      0.75
Red @ 650 nm     LineWidth         0.64
                 IntenMax          0.95
                 IntensitySum      0.76
                 Scatter20Section  0.31
                 Scatter40Section  0.35
                 Scatter40Sum      0.20
                 Scatter60Section  0.36
                 Scatter60Sum      0.14
                 Scatter80Section  0.36
                 Scatter80Sum      0.11
NIR @ 850 nm     LineWidth         0.81
                 IntenMax          0.93
                 IntensitySum      0.70
                 Scatter20Section  0.87
                 Scatter40Section  0.90
                 Scatter40Sum      0.91
                 Scatter60Section  0.85
                 Scatter60Sum      0.92
                 Scatter80Section  0.65
                 Scatter80Sum      0.92
                 NDVI              0.93

Table 2
Descriptions and SAD-based relevances for crop/weed classification of the generated numeric 3D surface feature values. Angle α is between the normals of neighbor points, β between the normal and the connection vector to a neighbor point.

Estimator      Feature               Description                                                       SAD
RSD            RSD_rmax              Maximum surface radius by PCL RSD                                 0.424
               RSD_rmin              Minimum surface radius by PCL RSD                                 0.432
Extended RSD   AngleNormalsMax       Upper percentile (95%) of angles α                                0.456
               AngleNormalsMin       Lower percentile (5%) of angles α                                 0.436
               AngleNeighborMax      Upper percentile (95%) of angles β                                0.486
               AngleNeighborMin      Lower percentile (5%) of angles β                                 0.602
               NeighborCount         Count of valid neighbor points found inside search radius (3D)    0.599
Line flicker   MagicToValidCount     Number of valid/invalid changes along line axes                   0.753
               ValueRange            Difference between minimum and maximum observed distance values   0.631
               ValueCount            Number of valid values on analyzed line axes                      0.681
               DirectionChangeCount  Number of sign changes of the distance differences between
                                     adjacent pixels along line axes                                   0.240
               ChangeSum             Summed-up absolute changes of distance values of adjacent
                                     pixels while running along line axes                              0.711
               NormalizedChangeSum   Change sum divided by direction change count                      0.617

Table 3
Configuration of the MWLP system and classification pipeline for plant classification.

MWLP
  Lasers:                            Green (@ 532 nm), Red (@ 650 nm), NIR (@ 850 nm)

Classification
  Preprocessing soil/biomass:        Calculation of NDVI
  Preprocessing crop/weed:           NE using PCL; Extended RSD (GPU); Distance line flicker (GPU)
  Features soil/biomass classifier:  IntenMax_Green; Scatter60Sum_NIR; NDVI
  Features crop/weed classifier:     MagicToValidChangeCount; ValueCount; ChangeSum; AngleNeighborMin; AngleNeighborMax; NeighborCount

3. Results

The pixel-based classification of MWLP data using spectral and 3D surface features was developed for classifying carrots and weed plants. It was first tested for this application. Figs. 15–18 show example results of these tests. Fig. 15 depicts the input data collected by MWLP. Fig. 16 gives the pixel-based result of the binary soil/biomass classification (biomass pixels white). The result is no perfect segmentation (which was not intended either), but the vast majority of soil pixels are filtered. Fig. 17 illustrates the result of the pixel-based plant classification. There are some misclassified pixels, mostly at plant boundaries. However, the plant centers are mainly classified correctly. Fig. 18 shows the result of the grid aggregation based on the pixel-based classification.

Fig. 14. Treatment of weeds in a grid using a line sensor, e.g., MWLP system, and a line actuator such as a line of stamp tools.
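A minimal sketch of the grid aggregation described in Section 2.5 (the 50% majority rule and the 1 cm cell size follow the text; the array layout and class encoding are our assumptions):

import numpy as np

SOIL, CROP, WEED = 0, 1, 2

def treatment_grid(pixel_classes, cell_px, weed_ratio=0.5):
    """Aggregate a per-pixel class image into a treat/no-treat grid; a
    cell is treated when its weed fraction exceeds weed_ratio, which
    can be tuned to different weeding strategies."""
    h, w = pixel_classes.shape
    gh, gw = h // cell_px, w // cell_px
    cells = pixel_classes[:gh * cell_px, :gw * cell_px]
    cells = cells.reshape(gh, cell_px, gw, cell_px)
    return (cells == WEED).mean(axis=(1, 3)) > weed_ratio

# With 0.1 mm pixels, 1 cm x 1 cm cells correspond to cell_px = 100.
img = np.random.default_rng(1).integers(0, 3, (400, 600))
print(treatment_grid(img, cell_px=100).shape)   # -> (4, 6)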

Fig. 15. Color visualization of the original MWLP sensor data as input for plant classification. Note that the intensity values are colored with NIR drawn in blue, i.e., the same colorization as in Fig. 10. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)

Fig. 16. Binary image generated by the soil/plant classifier for filtering out soil pixels. Soil pixels are drawn in black, biomass pixels in white.

The red-shaded grid cells are to be treated. Note that most of the weeds are treated, while the crop plants remain mostly untreated. To verify the classification results, reference grids were created manually: a human user referenced the grid cells as soil, crop or weed as ground truth to be compared with the automatically generated treatment grids. During referencing, the user only saw the visualized MWLP sensor data, but no intermediate or final results of the classification pipeline, thereby avoiding these influencing him. Fig. 19 depicts the reference grid for comparison with the automatically generated treatment grid given in Fig. 18.

The manually referenced grids were compared with the automatically generated treatment grids. Consequently, three kinds of misclassification for the cells of the automatically generated treatment grid were possible (cf. Table 4):

Fig. 17. Probability image of the pixel-based crop/weed classification. Safely classified crop pixels are drawn in green, weed pixels in red. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)

Fig. 18. Overlay of the original input image and the grid with the final treatment decision. Red-shaded grid cells should be treated for weed removal according to the automatic classification. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)

1. Cell manually referenced as soil, automatic decision to treat the cell
   In practice, this kind of misclassification is uncritical, as it would not affect yield. Only energy consumption or tool wear might be affected.
2. Cell manually referenced as crop, automatic decision to treat the cell
   In practical application, this kind of misclassification is critical. Crop plants must not be affected by the weeding tool, no matter where.
3. Cell manually referenced as weed, automatic decision to not treat the cell
   This kind of misclassification should be avoided. However, comparing manually referenced grids with auto-generated grids, we noted that the user mostly referenced the weed plants including their boundaries, while in the automatically generated grids only the plant center was treated. Therefore, a case distinction was made (a sketch of this check follows the list). If a grid cell referenced as a weed cell but not automatically treated has an adjacent treated cell, it is assumed that the weed plant is damaged in the other cell; in this case, the misclassification is uncritical. If it has no adjacent treated cell, it is assumed that the respective weed plant stays completely untreated, in which case the misclassification is critical.
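The case distinction of item 3 can be evaluated mechanically. The sketch below is our illustration (the 8-neighborhood is an assumption): an untreated weed cell counts as critical only if none of its adjacent cells was treated.

import numpy as np
from scipy import ndimage

SOIL, CROP, WEED = 0, 1, 2

def weed_miss_rates(reference, treated):
    """reference: per-cell ground truth (SOIL/CROP/WEED); treated:
    boolean grid of automatic treat decisions. An untreated weed cell
    is uncritical if any 8-neighbor was treated (the weed is damaged
    there) and critical otherwise."""
    neighbor_treated = ndimage.binary_dilation(
        treated, structure=np.ones((3, 3), dtype=bool))
    missed = (reference == WEED) & ~treated
    n_weed = max(int((reference == WEED).sum()), 1)
    uncritical = (missed & neighbor_treated).sum() / n_weed
    critical = (missed & ~neighbor_treated).sum() / n_weed
    return uncritical, critical

These two rates correspond to the two percentages reported in the 'manual weed' columns of Tables 4–10.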

3.1. Carrot cultivation

The classification was verified by comparison against the manually referenced grids for field data gathered on two different dates and on different fields. For the data of both days, a labeling of a few marks in a small set of sample data according to the In-Field-Labeling concept was made beforehand, in order to train the classifiers with up-to-date label data. Table 5 shows the result statistics of the misclassification on the first day at different driving speeds of the system. The described case distinction is made for the misclassification of weed cells (cf. Table 4). Table 4 gives the full descriptive statistics at the example of the data set with driving speed 25 mm/s mentioned in Table 5. All weed cells shall be treated; soil and crop cells shall not be treated. Thus, cells included in the italic percentages are correctly treated. For soil misclassification, treatment is uncritical. On the other hand, for crop plant misclassification, treatment is critical, regardless of where. Further, for weed plants the mentioned case distinction is done: the plain percentage in the manual weed/auto not treat cell (here 31.35%) includes untreated cells with adjacent treated cells. The weed is damaged there, thus this misclassification is uncritical (comparing Figs. 18 and 19, note that the user often marked including plant boundaries, while the algorithm only decided to treat the plant center). Only the untreated weed cells without adjacent treated cells (here 9.62%, separated by /) are critical. Hence, the bold percentages describe the critical kinds of misclassification to be minimized.

Table 5 contains the shortened statistics for the driving speed of 25 mm/s as well as for the increased speeds. Note that the statistics are shortened: the skipped figures are redundant values that can be calculated using the pattern shown in Table 4. This is also the case for Tables 6–10.

The bold values in Table 5 indicate the misclassification kinds identified as critical. The MWLP was operating with full camera resolution at a camera frame rate of approx. 50 Hz for these tests. The row length scanned for each data set is also given in Table 5. The width of the scanned stripe in these and all following data sets is approx. 20 cm, i.e., matching the size of the ridge crown of a carrot ridge. Note that the critical misclassification safely stayed below 10% for that field situation throughout the tested speed range.

On day two, the system was again tested with full camera resolution. The frame rate was approx. 75 Hz. The result statistics for this test are given in Table 6. Here, the misclassification rates are slightly increased, but stayed at around 10–15%. Further, on the second day the system was tested with the camera in 2 × 2 subsampling mode, i.e., an image resolution reduced by a factor of 4 together with a frame rate increased to approx. 360 Hz. Table 7 shows that the misclassification further increases. Even intolerable values of 20–30% are occasionally reached. Apparently, full camera resolution is required for carrot classification due to the fine-feathered pinnate leaves of carrots.

Fig. 19. Overlay of the original input image and the manually referenced grid (ground truth). Red-shaded grid cells were referenced as weed cells, green-shaded cells are crop-plant cells. The gray-shaded cells are referenced as soil cells. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)

Table 4
Full result statistics at the example of the day 1 data set with driving speed 25 mm/s (cf. Table 5). Tables 5–10 contain shortened statistics for different data sets in each row.

Reference (manual)   Treat decision (automatic)
                     Auto treat   Auto not treat   N grid cells
Manual soil          4.15%        95.85%           8775 = 100%
Manual crop          6.00%        94.00%           483 = 100%
Manual weed          59.03%       31.35%/9.62%     1694 = 100%
N grid cells         1393         9559             10,952

3.2. Corn salad cultivation

As mentioned, the classification system was developed with a focus on organic carrot growing. However, it was further tested for crop/weed classification in corn salad cultivation. For this test, first, a set of data from corn salad was labeled using the In-Field-Labeling GUI (cf. Fig. 20). Next, other collected data was processed. Fig. 21 shows an example result. Note that the (few) weeds are treated, while the crop plants stay untreated. When inspecting this result, keep in mind that no changes in programming or parametrization were made to adapt the classification to corn salad. Hence, the labeled marks shown in Fig. 20 are essentially all the classifier 'knows' about corn salad.

With these encouraging first results, the classification was verified against manually created reference grids for corn salad. On the day of the corn salad test, the system was also operated in full-resolution camera mode (at approx. 100 Hz) as well as in subsampling mode (at approx. 410 Hz). The results are given in Table 8. Note that very good misclassification rates significantly below 10% are reached. The number of weeds is partly very low due to low weed pressure on the monitored field, but the occurring weeds are detected. Even the misclassification rates of the subsampling tests at driving speeds up to 0.4 m/s are good. Likely, for the less finely structured oval leaves of corn salad, full camera resolution is not absolutely required. Thus, it is possible to increase frame rate and driving speed by using the subsampling mode.

3.3. The impact of up-to-date classifier knowledge

In addition to the test results for different field situations and crops, the impact of the up-to-date classifier knowledge due to In-Field-Labeling was assessed.

Table 5
Shortened result statistics for different data sets of day 1. Misclassification is given as % of N grid cells.

Driving speed (mm/s)   Manual soil     Manual crop    Manual weed            Row length (m)
25                     4.15% of 8775   6.00% of 483   31.35%/9.62% of 1694   5.5
50                     2.09% of 3534   3.70% of 162   32.28%/6.60% of 697    2.2
100                    4.27% of 7500   8.93% of 224   34.22%/6.01% of 895    4.3

Table 6
Shortened result statistics for different data sets of day 2 with full camera resolution. Misclassification is given as % of N grid cells.

Driving speed (mm/s)   Manual soil      Manual crop     Manual weed            Row length (m)
25                     2.40% of 6649    12.44% of 217   29.51%/7.10% of 183    3.5
50                     4.65% of 26,292  10.08% of 833   33.52%/17.10% of 883   14.0
100                    1.06% of 29,252  11.73% of 733   27.08%/10.97% of 720   15.4

Table 7
Shortened result statistics for different data sets of day 2 in subsampling mode. Misclassification is given as % of N grid cells.

Driving speed (mm/s)   Manual soil      Manual crop      Manual weed             Row length (m)
50                     6.96% of 23,808  15.35% of 593    26.56%/18.13% of 720    12.6
100                    2.66% of 11,411  14.61% of 394    31.07%/19.82% of 338    6.1
150                    3.17% of 44,098  14.54% of 1417   24.24%/12.36% of 1060   23.3
200                    3.83% of 31,039  33.03% of 1562   27.61%/10.33% of 1094   16.9

Table 8
Shortened result statistics for different data sets of corn salad. Misclassification is given as % of N grid cells.

Driving speed/mode           Manual soil      Manual crop      Manual weed          Row length (m)
50 mm/s, full resolution     0.00% of 1719    0.20% of 500     66.67%/0.00% of 3    1.1
100 mm/s, full resolution    0.00% of 4737    0.00% of 2643    50.00%/0.00% of 12   3.7
150 mm/s, subsampling        1.64% of 16,571  0.72% of 6385    21.21%/9.09% of 66   11.5
400 mm/s, subsampling        1.90% of 30,602  1.55% of 14,281  16.07%/2.35% of 85   22.5

Table 9
Comparing up-to-date classifiers with classifiers trained with data of the respective other day. Misclassification is given as % of N grid cells.

Date of label data for classifier   Manual soil      Manual crop     Manual weed
Processing data subset of day 1:
Day 1 (correct classifier)          1.40% of 1576    4.44% of 90     35.52%/7.76% of 335
Day 2 (wrong classifier)            12.37% of 1576   41.11% of 90    37.01%/13.43% of 335
Processing data subset of day 2:
Day 2 (correct classifier)          0.45% of 2225    9.62% of 52     39.13%/2.17% of 46
Day 1 (wrong classifier)            35.19% of 2225   59.62% of 52    23.91%/8.70% of 46

Table 10
Comparing classifiers of the correct crop with wrong classifiers trained with data of the respective other crop. Misclassification is given as % of N grid cells.

Date/crop of label data for classifier      Manual soil     Manual crop     Manual weed
Processing data subset of day 3 from corn salad:
Day 3 - corn salad (correct classifier)     0.00% of 1794   0.00% of 1215   40.00%/0.00% of 5
Day 1 - carrots (wrong classifier)          0.00% of 1794   0.00% of 1215   0.00%/100.00% of 5
Processing data subset of day 1 from carrots:
Day 1 - carrots (correct classifier)        1.40% of 1576   4.44% of 90     35.52%/7.76% of 335
Day 3 - corn salad (wrong classifier)       0.19% of 1576   4.44% of 90     10.75%/87.46% of 335

For this test, data subsets of carrot data from both days were processed twice: first using a classifier generated from labeled marks in data of the same day, and second using a classifier trained with label data of the respective other day. Table 9 shows the results. For both subsets, the classifications with the classifier of the respective other date cause drastic increases of the critical misclassification for both crops and weeds. Thus, the In-Field-Labeling concept helped to adjust the classifiers toward the particular field conditions of the respective days.

Further, the effect of the label knowledge for adapting classifiers toward the different crops was tested. Therefore, a subset of corn salad data was processed using a classifier for carrots and vice versa. Table 10 shows the results. In both cases, using the wrong classifier for the respective other crop resulted in drastically increased rates of untreated weeds.

4. Discussion

We have shown that pixel-based classification of MWLP sensor data is feasible for crop/weed discrimination. We used spectral laser line reflection and scattering features for separating soil and plant pixels; the generated 3D features served for crop/weed classification. The achieved misclassification rates of around 5–15% for carrots and below 10% for corn salad are at a good level, as compared with figures from the literature (Slaughter et al., 2008; Haug et al., 2014) and compared with monitoring of manual hand-weeding (Fittje et al., 2015). Further, the tested driving speeds of up to 0.1 m/s for carrots and up to 0.4 m/s for corn salad are well in the range of practical figures observed for hand-weeding (Fittje et al., 2015), too. In addition, there are still some technical options to be explored for further increasing speed. As the results showed, the misclassification rates did not vary much with the speeds in the tested ranges. Thus, increases are possible. Following tests have to show how far the driving speeds can be increased. For other applications, the MWLP system and classification have been applied at speeds up to 1.09 m/s.

Additionally, the capacity of a weeding system based on the MWLP, like the one drafted in Fig. 13 for treating weeds on carrot ridges in a row, can be scaled by adding multiple systems for different rows to a common carrier, thus treating multiple rows in a single pass.

The impact of the In-Field-Labeling for pixel-based classification was demonstrated. In both tested cases (classifiers for wrong dates and classifiers for wrong crops), the respective wrong classifiers performed drastically worse than the correct ones. Thus, our pixel-based In-Field-Labeling method for MWLP appears to have a very high potential for quick and easy adaptation of classifier models to different field situations. This is a particularly important feature for practical adoption, where a classifier developed for a specific set of field situations may work poorly for other field situations. Our system, in contrast, can be geared toward the specific field situation by the end user in a quick and easy manner.

Fig. 20. Labeling for corn salad data. Green labels mark crop plants, red labels weed plants and black labels the background, i.e., soil. Cf. Fig. 12. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)

Fig. 21. Auto-generated treatment grid.

5. Conclusion

This work showed in field trials that pixel-based classification of MWLP sensor data is usable for crop/weed discrimination. The classification is based on spectral features for soil/biomass separation and on 3D surface features for crop/weed discrimination. Combined with a grid aggregation, the system provides ready-to-use input data for weeding actuators. Thereby, 'hand-shaking' each plant object is intentionally avoided. Thus, major robustness problems

Acknowledgements

This work was conducted in the context of the research project RemoteFarming.1, supported by the German Federal Ministry of Food and Agriculture (BMEL) and managed by the German Federal Office for Agriculture and Food (BLE).

References

Aitkenhead, M.J., Dalgetty, I.A., Mullins, C.E., McDonald, A.J.S., Strachan, N.J.C., 2003. Weed and crop discrimination using image analysis and artificial intelligence methods. Comput. Electron. Agric. 39 (3), 157–171. http://dx.doi.org/10.1016/S0168-1699(03)00076-0.
Bangert, W., Kielhorn, A., Rahe, F., Albert, A., Biber, P., Grzonka, S., Haug, S., Michaels, A., Mentrup, D., Hänsel, M., Kinski, D., Möller, D., Ruckelshausen, A., Scholz, C., Sellmann, F., Strothmann, W., Trautz, D., 2013. Field-Robot-Based Agriculture: "RemoteFarming.1" and "BoniRob-Apps". In: 71st Conference LAND.TECHNIK-AgEng 2013. VDI Verlag GmbH, Düsseldorf, pp. 439–446. <https://www.hs-osnabrueck.de/fileadmin/HSOS/Homepages/COALA/Veroeffentlichungen/2013-VDIAgEng-BoniRob-Apps.pdf> (05/21/2016).
Burgos-Artizzu, X.P., Ribeiro, A., Tellaeche, A., Pajares, G., Fernández-Quintanilla, C., 2009. Improving weed pressure assessment using digital images from an experience-based reasoning approach. Comput. Electron. Agric. 65 (2), 176–185. http://dx.doi.org/10.1016/j.compag.2008.09.001.
Cai, X., Sun, Y., Zhao, Y., Damerow, L., Lammers, P.S., Sun, W., Lin, J., Zheng, L., Tang, Y., 2013. Smart detection of leaf wilting by 3D image processing and 2D Fourier transform. Comput. Electron. Agric. 90, 68–75. http://dx.doi.org/10.1016/j.compag.2012.11.005.
De Rainville, F.-M., Durand, A., Fortin, F.-A., Tanguy, K., Maldague, X., Panneton, B., Simard, M.-J., 2014. Bayesian classification and unsupervised learning for isolating weeds in row crops. Pattern Anal. Appl. 17 (2), 401–414. http://dx.doi.org/10.1007/s10044-012-0307-5.
Dupuis, J., Paulus, S., Mahlein, A.-K., Kuhlmann, H., Eichert, T., 2015. The impact of different leaf surface tissues on active 3D laser triangulation measurements. Photogramm. – Fernerkundung – Geoinform. 6, 437–447. http://dx.doi.org/10.1127/pfg/2015/0280.
Dzinaj, T., Kleine Hörstkamp, S., Linz, A., Ruckelshausen, A., Böttger, O., Kemper, M., Marquering, J., Naescher, J., Trautz, D., Wigerodt, E., 1998. Multi-sensor-system zur Unterscheidung von Nutzpflanzen und Beikräutern. Zeitschrift für Pflanzenkrankheiten und Pflanzenschutz 16, 233–242.
Emmi, L., Gonzalez-de Soto, M., Pajares, G., Gonzalez-de Santos, P., 2014. Integrating sensory/actuation systems in agricultural vehicles. Sensors 14 (3), 4014–4049. http://dx.doi.org/10.3390/s140304014.
Fittje, S., Hänsel, M., Langsenkamp, F., Kielhorn, A., Kohlbrecher, M., Vergara, M., Trautz, D., 2015. Praxiserhebungen zu Aufwand und Erfolg der Handjäte in Möhren unter ökologischer Bewirtschaftung. In: Am Mut hängt der Erfolg - Rückblicke und Ausblicke auf die ökologische Landbewirtschaftung. Beiträge zur 13. Wissenschaftstagung Ökologischer Landbau, Verlag Dr. Köster, Berlin. <http://orgprints.org/27154/1/27154_fittje.pdf> (01/10/2016).
Gonzalez-de Soto, M., Emmi, L., Perez-Ruiz, M., Aguera, J., Gonzalez-de Santos, P. Autonomous systems for precise spraying - evaluation of a robotised patch sprayer. Biosyst. Eng. http://dx.doi.org/10.1016/j.biosystemseng.2015.12.018.
Haug, S., Michaels, A., Biber, P., Ostermann, J., 2014. Plant classification system for crop/weed discrimination without segmentation. In: 2014 IEEE Winter Conference on Applications of Computer Vision (WACV). IEEE, pp. 1142–1149. http://dx.doi.org/10.1109/WACV.2014.6835733.
Hemming, J., Rath, T., 2001. PA - precision agriculture: computer-vision-based weed identification under field conditions using controlled lighting. J. Agric. Eng. Res. 78 (3), 233–243. http://dx.doi.org/10.1006/jaer.2000.0639.
Holzer, S., Rusu, R.B., Dixon, M., Gedikli, S., Navab, N., 2012. Adaptive neighborhood selection for real-time surface normal estimation from organized point cloud data using integral images. In: 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, pp. 2684–2689. http://dx.doi.org/10.1109/IROS.2012.6385999.
Japkowicz, N., Stephen, S., 2002. The class imbalance problem: a systematic study. Intell. Data Anal. 6 (5), 429–449. <http://dl.acm.org/citation.cfm?id=1293951.1293954> (01/10/2016).
Komi, P.J., Jackson, M.R., Parkin, R.M., 2007. Plant classification combining colour and spectral cameras for weed control purposes. In: 2007 IEEE International Symposium on Industrial Electronics. IEEE, pp. 2039–2042. http://dx.doi.org/10.1109/ISIE.2007.4374921.
Laber, H., Stützel, H., 2003. Ertragswirksamkeit der Restverunkrautung in Gemüsekulturen nach nichtchemischen Unkrautregulationsmaßnahmen. Pflanzenbauwissenschaften 7 (1), 29–38.
Langsenkamp, F., Sellmann, F., Kohlbrecher, M., Kielhorn, A., Strothmann, W., Michaels, A., Ruckelshausen, A., Trautz, D., 2014. Tube Stamp for mechanical intra-row individual plant weed control. In: 18th World Congress of CIGR, CIGR2014, Beijing, China, Sept. 16–19, 2014. <https://www.hs-osnabrueck.de/fileadmin/HSOS/Homepages/COALA/Veroeffentlichungen/2014-CIGR_2014_Tube_Stamp_for_mechanical_intra-row_individual_Plant_Weed_Control.pdf> (05/21/2016).
Marton, Z.-C., Pangercic, D., Blodow, N., Kleinehellefort, J., Beetz, M., 2010. General 3D modelling of novel objects from a single view. In: 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, pp. 3700–3705. http://dx.doi.org/10.1109/IROS.2010.5650434.
Moshou, D., Kateris, D., Pantazi, X., Gravalos, I., 2013. Crop and weed species recognition based on hyperspectral sensing and active learning. In: Precision Agriculture, vol. 13. Springer, pp. 555–561.
Okamoto, H., Murata, T., Kataoka, T., Hata, S.-I., 2007. Plant classification for weed detection using hyperspectral imaging with wavelet analysis. Weed Biol. Manage. 7 (1), 31–37. http://dx.doi.org/10.1111/j.1445-6664.2006.00234.x.
Pastrana, J.C., Rath, T., 2013. Novel image processing approach for solving the overlapping problem in agriculture. Biosyst. Eng. 115 (1), 106–115. http://dx.doi.org/10.1016/j.biosystemseng.2012.12.006.
Paulus, S., Dupuis, J., Mahlein, A.-K., Kuhlmann, H., 2013. Surface feature based classification of plant organs from 3D laserscanned point clouds for plant phenotyping. BMC Bioinform. 14 (1), 238. http://dx.doi.org/10.1186/1471-2105-14-238.
Peng, H., Long, F., Ding, C., 2005. Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy. IEEE Trans. Pattern Anal. Mach. Intell. 27 (8), 1226–1238. http://dx.doi.org/10.1109/TPAMI.2005.159.
Persson, M., Åstrand, B., 2008. Classification of crops and weeds extracted by active shape models. Biosyst. Eng. 100 (4), 484–497. http://dx.doi.org/10.1016/j.biosystemseng.2008.05.003.
Rusu, R.B., Cousins, S., 2011. 3D is here: Point Cloud Library (PCL). In: IEEE International Conference on Robotics and Automation (ICRA), Shanghai, China. http://dx.doi.org/10.1109/ICRA.2011.5980567.
Šeatović, D., Kutterer, H., Anken, T., 2010. Automatic weed detection and treatment in grasslands. In: 2010 Proc. ELMAR. IEEE, pp. 65–68. <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5606087&isnumber=5606063> (09/10/2016).
Slaughter, D., Giles, D., Downey, D., 2008. Autonomous robotic weed control systems: a review. Comput. Electron. Agric. 61 (1), 63–78. http://dx.doi.org/10.1016/j.compag.2007.05.008.
Søgaard, H., 2005. Weed classification by active shape models. Biosyst. Eng. 91 (3), 271–281. http://dx.doi.org/10.1016/j.biosystemseng.2005.04.011.
Strothmann, W., Kielhorn, A., Tsukor, V., Trautz, D., Ruckelshausen, A., 2013. Interactive image segmentation for model adaption and decision support. In: 9th European Conference on Precision Agriculture, Book of Posters, pp. 95–96. <https://www.hs-osnabrueck.de/fileadmin/HSOS/Homepages/COALA/Veroeffentlichungen/2013-ECPA-Interactive-Image-Segmentation.pdf> (05/21/2016).
Strothmann, W., Ruckelshausen, A., Hertzberg, J., 2014. Multiwavelength laser line profile sensing for agricultural crop characterization. In: SPIE Photonics Europe. International Society for Optics and Photonics, p. 91411K. http://dx.doi.org/10.1117/12.2052009.
Strothmann, W., Tsukor, V., Ruckelshausen, A., 2015. In-Field-Labeling-HMI für automatische Klassifizierung bei der Pflanzen- und Erntegutcharakterisierung mittels bildgebender Sensordaten. In: Informatik in der Land-, Forst- und Ernährungswirtschaft, Referate der 35. GIL-Jahrestagung, Geisenheim, pp. 177–180. <https://www.hs-osnabrueck.de/fileadmin/HSOS/Homepages/COALA/Veroeffentlichungen/2015-In-Field-Labeling-HMI_fuer_automatische_Klassifizierung.pdf> (05/21/2016).
Strothmann, W., Tsukor, V., Hertzberg, J., Ruckelshausen, A., 2015. Konfigurationsmöglichkeiten und Datenkonzepte des Multiwavelength Line Profiling (MWLP) Systems. In: Bornimer Agrartechnische Berichte, vol. 88, pp. 42–52. <https://www.hs-osnabrueck.de/fileadmin/HSOS/Homepages/COALA/Veroeffentlichungen/2015-CBA-MWLPMultiwavelength.pdf> (05/21/2016).
Suzuki, Y., Okamoto, H., Kataoka, T., 2008. Image segmentation between crop and weed using hyperspectral imaging for weed detection in soybean field. Environ. Control Biol. 46 (3), 163–173. http://dx.doi.org/10.2525/ecb.46.163.
Weis, M., Sökefeld, M., 2010. Detection and identification of weeds. In: Precision Crop Protection - the Challenge and Use of Heterogeneity. Springer, pp. 119–134. http://dx.doi.org/10.1007/978-90-481-9277-9_8.
Weiss, U., Biber, P., Laible, S., Bohlmann, K., Zell, A., 2010. Plant species classification using a 3D lidar sensor and machine learning. In: 2010 Ninth International Conference on Machine Learning and Applications (ICMLA). IEEE, pp. 339–345. http://dx.doi.org/10.1109/ICMLA.2010.57.
