
ISSN: 2168-9695

Advances in Robotics & Automation


A Mobile Robotic Platform for Crop Monitoring

Mostafa Bayati and Reza Fotouhi*
Department of Mechanical Engineering, University of Saskatchewan, Saskatoon, Canada
*Corresponding Author: Reza Fotouhi, Department of Mechanical Engineering, University of Saskatchewan, Saskatoon, Canada, Tel: 1-306-966-5453, Email: [email protected]

Received Date: Jun 27, 2018 / Accepted Date: Jul 12, 2018 / Published Date: Jul 31, 2018

Abstract

Development, implementation, and performance verification of a field-based high-throughput plant phenotyping mobile platform for monitoring canola plants, including both the data acquisition/visualization software and the measurement system, is the main contribution of this research. Plant breeders need an efficient tool to monitor a number of plant traits in order to achieve higher yields. Currently, manual measurement is the customary way to gather the required information, but it has many limitations, especially for studying a large-scale field. In this research, a high-throughput plant phenotyping platform was developed. A long boom was attached to a farm vehicle to carry sensors, cameras, and other measurement equipment. A program was developed to read sensor signals and to geo-tag the data using GPS for future retrieval. Three further programs were developed: two for image acquisition via webcams and still cameras, and a central program for data processing and data visualization. The efficiency of different system architectures, including different data transmission networks, was examined by conducting several laboratory and field tests on a canola nursery in the spring and summer of 2017. The performance of the phenotyping platform was validated in several ways, such as conducting manual measurements and comparing the results with the values given by the platform. The main contributions of this research for plant phenotyping are automation of image acquisition, an enhanced data acquisition cycle that minimizes data geo-referencing error, a modular program for data visualization, and faster data collection in a high-throughput fashion (almost 125 times faster). Work is underway to replace the vehicle with an autonomous ground vehicle (AGV).

Keywords: Mobile platform; Sensors; Automated crop monitoring

Introduction

The heart of plant biology is to understand the link between a genetic difference and the expressed phenotypes. Genotyping is now accomplished with low-cost, high-throughput analyses. Phenotyping, however, remains labor intensive and has become a restrictive factor in plant biology studies and crop improvement research. Platforms for high-throughput phenotyping, mainly under field settings, are needed to match the genomic information [1-3].

There is growing interest in adapting agricultural machinery and electronic sensors for field-based high-throughput phenotyping [4]. Potential applications include monitoring the crop response to soil and management variability, and correcting crop management with variable inputs to improve yield and economic efficiency, i.e., precision agriculture [5].

Crop breeding and associated genetic research are based on phenotyping a large number of experimental lines. To date, this assessment has been largely labor-intensive [6].

Recent developments in automation, robotics, accurate environmental control, and remote sensing are likely to offer opportunities for more precise field phenotyping of crop plants through high-throughput plant phenotyping (HTP) platforms. To date, no commercial HTP vehicles (manual or autonomous ground robots) capable of phenotyping are available; research in this area is still in its infancy. The most recent developments in this area were presented at a recent world symposium on plant phenotyping [7-9]. A French team presented their Phenomobile-LV prototype, a field phenotyping robot equipped with a few sensors for wheat. An American team presented Enviratron [8]. A Norwegian team explored the use of robots and drones for phenotyping [9].

Phenotyping identifies the most desirable and adaptive plant/crop architecture by monitoring how the genetic behavior of the crop responds to the environmental changes imposed on it. In particular, it is desirable to find the effects of heat, drought stress, soil properties, and crop disease on the growth dynamics, yield, disease resistance, and quality of the crop [10]. Imaging is one of the most common tools used to monitor the response of crops to the stresses introduced. However, phenotyping can be a tedious, time-consuming, and labor-intensive job for plant breeders [2]. Manually measuring many traits (transpiration, photosynthesis, heat stress, drought stress) across a large number of plots and genotypes in a time-sensitive manner is not feasible. HTP via a farm vehicle or automated platform is the way around this bottleneck.

This research presents the development and analysis of a prototype mobile field-based phenotyping (FBP) platform built on an existing farm vehicle (a swather). This vehicle is narrow enough (about 8 feet wide) to travel down the pathways between ranges of canola plots without damaging any plants while collecting data. A frame was designed to carry the required imaging sensors on both sides of the vehicle. Ultrasonic sensors were used to measure the height of the plant canopies, and vision was considered an appropriate method for measuring the fullness of the crop during the growing period. Infrared thermometers were used to capture canopy temperature. Two RGB cameras were installed on the boom, and a novel image acquisition program was developed to capture images of each plot autonomously. Further image processing of these data may give a good indication of crop health or stress, and the images allow us to analyze visible differences between plots. Crop Circle sensors were also used to capture NDVI information [11]. The existing RTK-GPS (real-time kinematic global positioning system) on the farm vehicle was exploited to geo-reference all collected data and each plot in the field. With this GPS system, the location of the vehicle and crops is known to within about one inch.

Materials and Methods

The developed HTP platform consists of three parts: A) the mechanical structure, B) the electrical and measurement system, and C) the software components. In the following sections, each part is introduced and discussed in detail.

Mechanical structure

Overall, the proposed HTP platform has two main mechanical components.

The carrying vehicle: A high-clearance swather was used to carry a mechanical boom and the scanning equipment (Figure 1). The farm vehicle, shown in Figure 2, was equipped with a highly accurate RTK-GPS and was capable of being driven autonomously. As can be seen in Figure 1, the HTP platform can scan two plots at the same time (one on the left and one on the right); this feature significantly improves the data collection speed.


Figure 1: The traveling path of the platform through the field.


Figure 2: The developed boom system.

Mechanical boom: To carry the sensors and scanning equipment, two different mechanical booms were developed and attached to the rear of the swather, as can be seen in Figure 2a.

As can be seen in Figure 2b, the 2nd-generation boom is foldable, so it can be conveniently transported on highways, in contrast to the 1st-generation design, which was a long, rigid mechanical frame. Figure 3 shows the developed platform in action, with its different components marked in the image.


Figure 3: The proposed HTP platform in action.

Electrical and measurement system

The current system uses four ultrasonic sensors to measure the height of the plant canopies (two sensors on each side of the boom). It was found that ultrasonic sensors can efficiently measure the height of different objects, including plants, with an acceptable level of measurement error. Table 1 shows the result of height measurement of four different plants in a laboratory configuration. As can be seen, the maximum measurement error was 2.3%, which seems acceptable for a non-contact height measurement system [12].

hu (ultrasonic height, cm)   hm (manual height, cm)   Diff = hm - hu (cm)   Relative error (%)
79.0                         79.0                     0.0                   0.0
70.0                         71.0                     1.0                   1.4
63.5                         65.0                     1.5                   2.3
50.0                         51.0                     1.0                   2.0

Table 1: Plant height measurement in lab.
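A common way to obtain canopy height with a downward-facing ultrasonic sensor is to subtract the measured range from the sensor's mounting height above the ground. The MATLAB sketch below illustrates this calculation and the relative-error check reported in Table 1; the mounting height and readings are illustrative values, not measurements from the platform.

```matlab
% Canopy height from a downward-facing ultrasonic sensor (illustrative sketch).
% The mounting height and readings below are assumed example values.
sensorMountHeight = 150.0;     % cm above ground (assumed)
rangeReading      = 80.0;      % cm, measured sensor-to-canopy distance (example)

hu = sensorMountHeight - rangeReading;   % ultrasonic canopy height, cm

% Relative error against a manual reference measurement hm, as in Table 1
hm     = 71.0;                           % cm, manual height (example)
relErr = abs(hm - hu) / hm * 100;        % percent
fprintf('hu = %.1f cm, hm = %.1f cm, error = %.1f%%\n', hu, hm, relErr);
```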

Two Crop Circle sensors were used to measure canopy NDVI (normalized difference vegetation index), two infrared thermometers to capture canopy temperature, and two RGB cameras to capture images of the canopies (one on each side of the boom). A Campbell Scientific data logger was used to communicate with the sensors and capture their data, and a laptop was used to communicate with the two webcams and capture RGB images. The data acquisition task was assigned to a dedicated unit (the data logger) and the image acquisition task to the laptop, so that the platform could capture the required phenotypic data (height, temperature, and NDVI) and concurrently capture RGB images of the plant canopies during one trip through the field. This significantly improved the phenotyping process in terms of the time required to conduct a field trial and acquire a high volume of data in one traverse.

In addition, to attach time and geospatial tags to each data point and captured image, a GPS system had to be accessible to both the data logger and the laptop. Longitude, latitude, and UTC time were captured at each sampling point and tagged to the data and images. As shown in Figure 3, the GPS data stream was fed into both the laptop and the data logger to geo-tag the captured images and the collected data.
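To make the geo-tagging step concrete, the MATLAB sketch below parses a GPGGA NMEA sentence into UTC time and decimal-degree latitude and longitude, which is the information attached to each data point and image. The sentence shown is a generic made-up example, and the ddmm.mmmm-to-degrees conversion follows the standard NMEA convention rather than any code from the platform.

```matlab
% Parse a GPGGA NMEA sentence into UTC time and decimal-degree lat/long.
% The sentence below is a fabricated example for illustration only.
nmea = '$GPGGA,172814.0,5207.6543,N,10638.1234,W,4,10,0.9,482.0,M,-16.3,M,,*47';

f   = strsplit(nmea, ',');
utc = f{2};                                                  % hhmmss.s

lat = str2double(f{3}(1:2)) + str2double(f{3}(3:end))/60;    % ddmm.mmmm -> deg
if strcmp(f{4}, 'S'), lat = -lat; end
lon = str2double(f{5}(1:3)) + str2double(f{5}(4:end))/60;    % dddmm.mmmm -> deg
if strcmp(f{6}, 'W'), lon = -lon; end

fprintf('UTC %s  lat %.6f  lon %.6f\n', utc, lat, lon);
```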

Software components

The software part of the proposed HTP platform consists of three programs. The first program was developed for image acquisition: it sends trigger commands to the cameras to capture pictures when needed, reads the GPS data string for geo-referencing purposes, and saves the captured images with a predefined filename format [13]. The second program was developed for data acquisition: it reads the sensor values and GPS data on the data logger and writes them into an Excel file. The third program was developed for data visualization and post-processing. The details of each program are discussed in the next three sections.

Image acquisition program: The ability to autonomously capture images of plant canopies during data collection was one of the most important aspects of the proposed HTP platform; in fact, an automatic image acquisition feature has not been reported in the related literature on field-based HTP platforms. As shown in Figure 4, an interactive graphical user interface (GUI) was developed in MATLAB to control the image acquisition program, including starting the image capturing cycle, stopping at the end of data collection, and finally saving the images. After the user starts the program by pressing the Start button, the GPS string is acquired and, using the Image Acquisition and Image Processing toolboxes, the program sends control commands to the cameras to capture pictures; this operation is repeated every 0.5 seconds (the interval can be adjusted by the user, as shown in Figure 4). After all plots in a field have been scanned, the user stops the image acquisition process. At this point, the captured images (which can number up to 5000, depending on the field size), as well as the GPS data, are held in the laptop's temporary memory (RAM). The user can then click Save to write the captured images to disk. It was found that saving an image is a time-consuming task, so the program was designed not to save images in real time, in order to avoid delays in geo-referencing. This technique significantly reduced the positioning error.


Figure 4: GUI for image acquisition program.
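The buffer-in-RAM, save-at-the-end strategy described above can be sketched as follows in MATLAB. The sketch assumes the MATLAB Support Package for USB Webcams; the camera index, capture interval, frame count, GPS stand-in string, and filename pattern are illustrative assumptions rather than the program's actual implementation.

```matlab
% Buffered image acquisition sketch: capture frames at a fixed interval,
% keep them in RAM together with a GPS tag, and write them to disk only
% after acquisition stops (to avoid geo-referencing delays).
cam      = webcam(1);   % boom camera (assumed index); needs USB Webcam support package
interval = 0.5;         % seconds between captures (user-adjustable in the GUI)
nFrames  = 20;          % stop condition for this sketch only

latestGps = '$GPGGA,172814.0,5207.6543,N,10638.1234,W,4,10,0.9,482.0,M,-16.3,M,,*47'; % stand-in

frames  = cell(1, nFrames);    % in-memory image buffer
gpsTags = cell(1, nFrames);    % matching GPS strings

for k = 1:nFrames
    frames{k}  = snapshot(cam);   % grab a frame; it stays in RAM
    gpsTags{k} = latestGps;       % in the real system: the most recent NMEA string
    pause(interval);
end
clear cam;                        % release the camera

% Saving is deferred to the end of the run, as in the developed program.
for k = 1:nFrames
    imwrite(frames{k}, sprintf('canopy_%03d.jpg', k));
end
```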

Data acquisition program: The data acquisition program, which runs on the data logger, reads four differential analog input channels to capture the signals from the ultrasonic sensors, two single-ended analog inputs to read the IR thermometer data, two serial RS232 ports to read the NDVI sensor data, and a third RS232 port to read the GPS NMEA string for geo-referencing; all data are recorded for convenient retrieval, visualization, and further analysis. Figure 5 illustrates the structure of the resulting Excel file, which stores all collected data in separate rows. The second column stores the time of data collection, the 3rd and 4th columns store the temperature information, the 5th and 6th columns store the NDVI information, the 7th column holds the GPS string, and the last four columns store the height information. The length of this Excel file depends on the scale of the field and the traveling speed of the HTP platform.


Figure 5: The structure of the Excel database storing the phenotypic data.
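The layout in Figure 5 can be mirrored with a simple tabular record. The actual logging runs on the Campbell Scientific data logger in its own programming environment, so the MATLAB sketch below is only an illustration of the record structure; the column names and sample values are assumptions.

```matlab
% One record of the phenotypic database, following the column layout of
% Figure 5: record ID, time, two canopy temperatures, two NDVI values, the
% raw GPS string, and four ultrasonic heights. All values are illustrative.
rec = table( ...
    1, "17:28:14", ...                      % record ID, UTC time
    21.5, 22.1, ...                         % IR canopy temperatures, left/right (degC)
    0.74, 0.71, ...                         % NDVI, left/right
    "$GPGGA,172814.0,5207.6543,N,10638.1234,W,4,10,0.9,482.0,M,-16.3,M,,*47", ...
    68.0, 66.5, 64.0, 63.0, ...             % ultrasonic heights (cm)
    'VariableNames', {'Record','Time','TempL','TempR','NDVIL','NDVIR', ...
                      'GPS','H1','H2','H3','H4'});

% Append the record to the Excel database ('WriteMode' needs R2020a or newer).
writetable(rec, 'phenotype_data.xlsx', 'WriteMode', 'append');
```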

It was found that any further processing of the data, including the GPS string, must be strictly avoided during collection to prevent delays in geo-referencing. Because of the limited processing speed of data loggers, any further processing should be postponed until after data collection and carried out in the post-processing and visualization programs; it is sufficient to store the raw data in real time (Figure 6).


Figure 6: The visualization program after importing collected data and images.

Visualization and data analysis program: Figure 7 shows the different sections of the program developed for data visualization and further post-processing of the collected data. As can be seen in this figure, the program consists of a main display area, which shows the map of the field and graphical objects representing the available data, along with several control buttons and a display area on the right side of the main window. This program was developed to help breeders conveniently extract the required information from the raw database. To do so, the user first imports the phenotypic database (the Excel file) and then the folder that contains all captured images. Figure 6 shows the display area of the visualization program after all data and images have been imported.


Figure 7: The GUI of the developed data visualization and post-processing program.
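As a rough illustration of how the collected records can be laid out over the field map, the MATLAB sketch below loads the Excel database, converts each stored GPGGA string to decimal degrees (using the same convention as the parsing sketch above), and scatters the records coloured by average canopy height. The file name and column names follow the earlier logging sketch and are assumptions, not the program's actual code.

```matlab
% Minimal visualization sketch: load the phenotypic database and plot every
% record by position, coloured by mean canopy height. Names are assumed.
T = readtable('phenotype_data.xlsx');

n   = height(T);                  % number of records
lat = zeros(n,1);  lon = zeros(n,1);
for k = 1:n
    f      = strsplit(char(T.GPS(k)), ',');                         % raw NMEA string
    lat(k) = str2double(f{3}(1:2)) + str2double(f{3}(3:end))/60;    % ddmm.mmmm -> deg
    lon(k) = -(str2double(f{5}(1:3)) + str2double(f{5}(4:end))/60); % western hemisphere
end

meanH = mean([T.H1 T.H2 T.H3 T.H4], 2);     % average of the four ultrasonic sensors
scatter(lon, lat, 36, meanH, 'filled');
colorbar;  xlabel('Longitude (deg)');  ylabel('Latitude (deg)');
title('Height data points over the field');
```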

In Figure 6, a green rectangle on the background layer represents the location of a plot in the field, with its corresponding number on the lower left side; a blue circle represents a height data point; the red circles represent temperature data points; and the triangles represent the images available within a plot. For example, it can be seen that there are seven available images for plot #3.1.

After all data are imported as graphical objects, the user can simply click on different objects to view the captured value for a specific plot. For example, Figure 8 shows a height data point near the left border of plot #1.2 selected by the user; the result appears in the gray display area on the lower right side of the program window. As can be seen in Figure 8, the latitude, longitude, and captured value for that data point can be viewed directly [14].


Figure 8: Monitoring a height data point within plot #1.2.
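The click-to-inspect behaviour can be reproduced with a standard graphics callback. The MATLAB sketch below attaches a ButtonDownFcn to a single marker so that clicking it prints the latitude, longitude, and measured value; the coordinates and height are illustrative, not data from plot #1.2.

```matlab
% Sketch of an interactive data point marker: clicking it prints the
% latitude, longitude, and measured value. All values are illustrative.
latPt = 52.12714;  lonPt = -106.63571;  heightCm = 66.5;   % example data point

figure;
h = plot(lonPt, latPt, 'ob', 'MarkerFaceColor', 'b', 'MarkerSize', 8);
xlabel('Longitude (deg)');  ylabel('Latitude (deg)');

% ButtonDownFcn fires when the user clicks the marker, mimicking the
% display behaviour of the visualization GUI.
h.ButtonDownFcn = @(src, evt) fprintf( ...
    'Lat: %.6f  Lon: %.6f  Height: %.1f cm\n', latPt, lonPt, heightCm);
```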

Experimental Results

This section discusses part of the experimental results obtained from nine data collection runs with the developed HTP platform at a canola breeding facility in Saskatchewan, Canada, between June and August 2017. The developed visualization program was used to extract information from the raw data. A few strategies were used to verify the validity of the measurements, including the accuracy of the height measurements as well as the positioning error and data geo-referencing. Breeders can employ the same approach to extract more information from the resulting phenotypic databases for further analysis. The details of each experiment are discussed in the following sections.

Verifying the validity of the collected data

The first concern about the performance of the proposed HTP platform was the validity of the collected data: it had to be verified that the collected data points are associated with their actual plots in the field. For example, if the visualization program shows five available images for plot number N, we must make sure this is true for the actual plot number N in the field as well. To verify this, visible signs were made and placed next to about 80 plots at different locations across the field. Each sign consisted of a red square and a tag showing the corresponding plot number. Figure 9 shows an example for plot number 1.14: we selected an available image (a triangle object) on the left border of that plot in the visualization program. Because we know there is a physical sign at that location in the field, the validity of the measurement is confirmed only if the plot numbers in the actual field and in the program match.


Figure 9: Visible signs used to verify the validity of measurements for plot number 1.14.

This examination was repeated for all other signs, and it was found that 100% of the signs matched their corresponding plots.

Comparison between manual and ultrasonic height measurements

As can be seen in Figure 10, the height of several randomly selected plots in the field was measured manually using a scaled bar. Because a single value is needed to represent the height of each plot, the height of each plot was captured at three different locations (left, middle, and right sides); the average of these three values can be considered a true representative of the plot height.


Figure 10: Manual measurement of canopy height to verify automatic height measurement.

As the results show, the ultrasonic sensors could capture the height of different plots with an acceptable level of error for a field-based measurement system. Figure 11 shows a bar graph comparing manual and ultrasonic measurements for 19 random plots. As can be seen, the manual height measurements corresponded well with the automatically collected data.


Figure 11: Comparison between manual vs. ultrasonic height measurements.

Analyzing the collected data for the whole population

One of the advantages of using a high-throughput plant phenotyping platform is that the resulting database can be used in different ways for different purposes; breeders have the flexibility to study both the entire population and individual plots for closer investigation. This section provides a summary of the results obtained after studying a population of 252 canola plots with the proposed plant phenotyping platform. Figure 12 illustrates the variation in the average canopy (a) NDVI and (b) temperature values of the studied population between June 23 and August 18, 2017. For example, Figure 12 shows that the average NDVI and temperature of the entire population (252 plots) reached their maximum values on July 7 and June 30, respectively. We therefore expect the majority of the plant canopies to have been at their maximum greenness on July 7, and this can be verified by inspecting the images captured during the July 7 trial. Likewise, the average temperature of the entire set of plots in the field increased continuously between June 23 and July 7; after that, the temperature fluctuated until August 4, when another continuous increase was observed. Breeders can study drought stress by examining the resulting graphs.
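The population-level curves in Figure 12 amount to averaging the NDVI and temperature columns over all plots for each collection date. The MATLAB sketch below shows one way to produce such curves from per-trip databases; the file naming follows the earlier sketches, the listed dates are those mentioned in the text (a subset of the nine trips), and both are assumptions about how the data are organised.

```matlab
% Population-level summary sketch: mean NDVI and canopy temperature over all
% plots for each collection date. File naming and date list are assumed.
dates = ["2017-06-23" "2017-06-30" "2017-07-07" "2017-07-13" ...
         "2017-07-18" "2017-08-04" "2017-08-18"];

meanNDVI = zeros(size(dates));
meanTemp = zeros(size(dates));
for k = 1:numel(dates)
    T = readtable("phenotype_data_" + dates(k) + ".xlsx");   % one file per trip (assumed)
    meanNDVI(k) = mean([T.NDVIL; T.NDVIR], 'omitnan');        % both boom sides
    meanTemp(k) = mean([T.TempL; T.TempR], 'omitnan');
end

subplot(2,1,1); plot(datetime(dates), meanNDVI, '-o'); ylabel('Mean NDVI');
subplot(2,1,2); plot(datetime(dates), meanTemp, '-o'); ylabel('Mean temp (degC)');
xlabel('Date');
```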


Figure 12: Variation in the entire studied population between June 23 and August 18.

Analyzing the growth of individual plots

The previous section discussed the variation in the average values of different parameters across the entire studied population (252 plots). This section shows how the proposed HTP platform can be used to analyze individual plots as well. For example, Figure 13a illustrates the growth stages of plot number 1.35 on three different days. Note that all images were extracted from the developed visualization program.


Figure 13: Growth pictures of plot number 1.35 on three different days.

As discussed before, the capability of autonomously capturing images of the canopies is an important feature of the developed crop phenotyping platform. The growth stages of plot number 1.35 can be clearly seen in Figure 13b. As can be seen in Figure 13c, this plot came into flower on July 13. Breeders can employ the same approach to analyze any desired plot.

Furthermore, the variation in different parameters of individual plots can be analyzed as well. Figure 14 presents two line graphs showing the variation in temperature and NDVI values of plot number 1.35 between June 23 and August 18. As can be seen in Figure 14b, the temperature of this plot was about 17°C on June 23 and, after some ups and downs, settled around 22°C by August 18. The NDVI value of this plot, on the other hand, increased continuously after June 23 until the maximum value of about 0.78 was captured on July 7. After July 7, the majority of plants came into flower (yellow), so the NDVI value, which indicates the greenness level of a canopy, decreased. A turning point can be seen on July 18, when the plots were observed at maximum flowering (yellow was dominant over green). After July 18, as the flowers began to shrink, the NDVI value started to increase again because green gradually became dominant over yellow.
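Producing per-plot curves like those in Figure 14 requires associating each geo-tagged record with its plot. One simple way to do this, sketched below with synthetic values, is to test whether a record's position falls inside the plot's latitude/longitude bounding box; the coordinates, NDVI values, and the bounding-box approach itself are illustrative assumptions rather than the method used in the visualization program.

```matlab
% Associate geo-tagged records with a single plot using a lat/long bounding
% box, then average that plot's NDVI for one trip. All coordinates and NDVI
% values below are synthetic placeholders, not real plot boundaries.
rng(0);
lat  = 52.1270   + 0.0010*rand(50,1);    % record latitudes (synthetic)
lon  = -106.6360 + 0.0010*rand(50,1);    % record longitudes (synthetic)
ndvi = 0.60      + 0.20*rand(50,1);      % record NDVI values (synthetic)

plotBox = [52.1272 52.1276 -106.6358 -106.6354];   % [latMin latMax lonMin lonMax]

inPlot = lat >= plotBox(1) & lat <= plotBox(2) & ...
         lon >= plotBox(3) & lon <= plotBox(4);

fprintf('Records in plot: %d, mean NDVI: %.2f\n', nnz(inPlot), mean(ndvi(inPlot)));
```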


Figure 14: Variation of different traits in plot #1.35.

Conclusions

A new mobile platform for field-based high-throughput plant phenotyping was developed. It was shown that the mobile platform can be reliably used to phenotype canola cultivars (and wheat or other plants with minor modification) with high repeatability and accuracy. Canopy height, temperature, and NDVI, as well as RGB images, of 252 canola plots in a nursery in Saskatchewan, Canada, were captured through field experiments between June and August 2017. An autonomous image acquisition feature was implemented in a field-based HTP platform for the first time in Canada. The data acquisition cycle minimized geo-referencing error. In addition, a modular program with an interactive user interface was developed in MATLAB for data visualization. This research is a proof of concept that some of the most popular high-throughput phenotyping features can be applied automatically to achieve precision phenotyping of crop plants. This technology improves the productivity of farms and reduces the cost of crop research in the intermediate and long run.

Acknowledgment

The authors wish to acknowledge the financial support provided by NSERC (Natural Sciences and Engineering Research Council of Canada) and the industrial partner Cargill Canada Ltd. The contributions of Mr. Tyler Zhang, Dr. Rahim Oraji, and Mr. Farzam Ayatizadeh (Robotics Laboratory, University of Saskatchewan) are also gratefully acknowledged.

References

Citation: Bayati M, Fotouhi R (2018) A Mobile Robotic Platform for Crop Monitoring. Adv Robot Autom 7: 186. DOI: 10.4172/2168-9695.1000186

Copyright: © 2018 Bayati M, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
