ASAE Conference Proceeding

This is not a peer-reviewed article.

Automatic Guidance System With Crop Row Sensor

H. Okamoto, K. Hamada, T. Kataoka, M. Terawaki and S. Hata

Pp. 307-316 in Automation Technology for Off-Road Equipment (ATOE), Proceedings of the 26-27 July 2002 Conference (Chicago, Illinois, USA), ed. Qin Zhang. Pub. date 6 July 2002. ASAE Pub. #701P0509

ABSTRACT

The objective of this study was to develop an automatic guidance system with a crop row sensor. The crop row sensor consisted of a CCD camera and an image-processing algorithm, and was intended for the automatic guidance of a weeding cultivator. This report discusses the performance of the developed sensor and of the crop-row-following control system.

The crop row location was sensed as follows: a color CCD camera captured crop row images, and a computer detected the crop row location by image processing. Field test results showed that the crop row sensor performed well in both accuracy and response.

The developed sensor was applied to the weeding machine. According to the offset between the weeding machine and the target crop row, a hydraulic cylinder adjusted the machine to follow the crop row. Field test results showed that the system followed the target crop row well and that the weeding was successful without any damage to the crops. The crop-row sensor was also applied to an automatic steering control system for a tractor, which steered the tractor in proportion to the offset between the tractor centerline and the target crop row. The field test showed that this system also worked successfully.

KEYWORDS. Automatic control, Automatic guidance, Automatic steering, Machine vision, Image processing, Weed control, Weeders.

Introduction

Recently, field operations by agricultural machinery, especially fertilizing, spraying and weeding, have required high accuracy at high traveling speeds, in keeping with the philosophy of precision agriculture. In cultivating, weeding or thinning, the operator has to keep to the crop rows without damaging any crops. In addition, since most implements are rear-mounted, the operator bears the extra workload of frequently looking behind the implement to check for crop damage. If the implement automatically followed the crop rows, the operator could concentrate on steering the tractor without looking backward. This technology also promises to form the basis of automatic guidance systems for agricultural vehicles and implements.

The objective of this study was to develop a crop-row sensor, with the goal of applying the sensor to an automatic row-following control system for a weeding cultivator. This report discusses the feasibility of a new method for detecting crop rows and the performance of the row-following control systems.

CROP ROW SENSOR

Image-Processing Algorithm for Detecting Crop Row Location

When the CCD video camera is located at point Pc with a tilt angle of θ_X, as shown in Figure 1, the image projected onto the image plane is as shown in Figure 2. In this study, however, the method for detecting crop rows does not use the original images obtained by the camera. Instead, the original images (in the camera coordinate system) are transformed into ground coordinate images (Figure 3). The ground coordinate system corresponds to a vertical view from above; on the transformed image, the crop rows are nearly parallel and vertical, as shown in Figure 3.

Figure 1. Geometry of crop row detection

307-316_files/image1.gif

Figure 2. Crop row image (Camera coordinates)

307-316_files/image2.jpg

Figure 3. Crop row image (Ground coordinates)

307-316_files/image3.jpg

To obtain a ground coordinate image, all points P(x, y) in the camera coordinate system are transformed into points P'(X, Y) in the ground coordinate system using the perspective transformation shown in Figure 4. The perspective transformation can be represented by the following equations.

307-316_files/image4.gif (1)

307-316_files/image5.gif (2)

307-316_files/image6.gif (3)

Here, f is the focal length of the lens, h is the height of the camera, and θ_X is the tilt angle of the camera.

The transformed image does, however, contain blank points, as in Figure 5; therefore, it is necessary to fill these in by interpolation. In this study, nearest-neighbor interpolation is applied.
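The following sketch illustrates this transformation and fill, assuming one standard form of the ground-plane perspective mapping consistent with the definitions above (focal length f, camera height h, tilt angle θ_X). Equations (1)-(3) are given only as images in the source, so the exact formulas, scaling constants, and function names here are illustrative assumptions, not the paper's implementation:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def camera_to_ground(img, f, h, tilt, out_shape=(240, 320), mm_per_px=5.0):
    """Forward-map a camera-coordinate image onto a ground-coordinate grid.

    Assumes a standard perspective model: for an image point (x, y)
    measured from the image center with y positive downward,
        X = h * x / (f*sin(t) + y*cos(t))
        Y = h * (f*cos(t) - y*sin(t)) / (f*sin(t) + y*cos(t))
    where f is in pixels, h in mm, and t = tilt is theta_X in radians.
    """
    rows, cols = img.shape[:2]
    yy, xx = np.mgrid[0:rows, 0:cols].astype(float)
    xx -= cols / 2.0
    yy -= rows / 2.0
    s, c = np.sin(tilt), np.cos(tilt)
    denom = f * s + yy * c
    valid = denom > 1e-6                       # rays that actually hit the ground
    X = h * xx / np.where(valid, denom, 1.0)
    Y = h * (f * c - yy * s) / np.where(valid, denom, 1.0)

    # Quantize ground coordinates onto the output grid (X centered, Y upward).
    gx = np.round(X / mm_per_px + out_shape[1] / 2.0).astype(int)
    gy = np.round(out_shape[0] - 1 - Y / mm_per_px).astype(int)
    ok = valid & (gx >= 0) & (gx < out_shape[1]) & (gy >= 0) & (gy < out_shape[0])

    out = np.zeros(out_shape + img.shape[2:], dtype=img.dtype)
    filled = np.zeros(out_shape, dtype=bool)
    out[gy[ok], gx[ok]] = img[ok]
    filled[gy[ok], gx[ok]] = True

    # Nearest-neighbor interpolation for the blank points of Figure 5:
    # every empty cell copies the value of its nearest filled cell.
    _, (iy, ix) = distance_transform_edt(~filled, return_indices=True)
    return out[iy, ix]
```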

Figure 4. Perspective transformation

307-316_files/image7.gif

Figure 5. Blank points caused by transformation

307-316_files/image8.gif

In addition, since the transformed image becomes trapezoidal, blank triangular areas remain on both sides, as in Figure 1. Those areas are painted over with soil color.

In order to extract crop areas, the R and G signals of the RGB color system are used. A ground coordinate image is divided into five fields of vision (F1-F5), as shown in Figure 6; the image intensities of R and G in each field are vertically integrated, and G/R values are obtained as in Figure 7. If each field of vision were short, a field might contain no crop, so each field overlaps its neighbors to enlarge its length, as in Figure 6. Since this method converts a two-dimensional crop-row image into compact one-dimensional data, the amount of data is reduced drastically and the processing time can be shortened. In Figure 7, higher G/R values correspond to crop rows, and lower values correspond to inter-row soil.
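As a concrete illustration of this step, a minimal sketch of the profile computation follows, assuming the ground coordinate image is an RGB NumPy array; the overlap ratio and field sizing are assumptions, since the text does not specify them:

```python
import numpy as np

def gr_profiles(ground_img, n_fields=5, overlap=0.5):
    """Vertically integrate R and G within overlapping fields of vision
    (F1-F5) and return one G/R value per image column for each field.
    `overlap` (fraction shared with each neighbor) is an assumed value."""
    rows = ground_img.shape[0]
    field_len = int(rows / (n_fields - (n_fields - 1) * overlap))
    step = max(1, int(field_len * (1.0 - overlap)))
    profiles = []
    for i in range(n_fields):
        top = min(i * step, rows - field_len)
        band = ground_img[top:top + field_len].astype(float)
        r = band[:, :, 0].sum(axis=0)            # vertical integration of R
        g = band[:, :, 1].sum(axis=0)            # vertical integration of G
        profiles.append(g / np.maximum(r, 1.0))  # high G/R = crop, low = soil
    return profiles
```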

This method detects rows as straight lines, not individual plants. A regression line obtained by the least squares method corresponds to a crop row, which is assumed to be a straight line. During testing, to avoid problems in the use of the least squares method, G/R values were normalized, and values under 50% were replaced with zero, as in Figure 8.
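A sketch of this normalization and line fit follows, under the assumption that each field of vision contributes one intensity-weighted row position to the regression; the paper does not spell out how the five fields are combined, so that step is an assumption:

```python
import numpy as np

def fit_row_line(profiles, field_centers_y):
    """Normalize each field's G/R profile, zero values under 50% of the
    maximum (Figure 8), and fit a least-squares line X = a*Y + b through
    the per-field row positions."""
    xs, ys = [], []
    for prof, y_center in zip(profiles, field_centers_y):
        p = 100.0 * prof / max(prof.max(), 1e-9)   # normalize to percent
        p[p < 50.0] = 0.0                          # suppress soil response
        if p.sum() > 0:
            cols = np.arange(p.size)
            xs.append((cols * p).sum() / p.sum())  # weighted row position
            ys.append(y_center)
    a, b = np.polyfit(ys, xs, 1)                   # least-squares fit
    return a, b                                    # row line: X = a*Y + b
```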

Figure 6. Image divided into five fields (F1-F5)

307-316_files/image9.jpg

Figure 7. G/R values

307-316_files/image10.gif

First, a border L0, which divided the left and right rows, was determined by discriminant analysis, as in Figure 9. Next, borders L1 and L2, the outside lines of the left and right rows, were determined. The zone between L0 and L1 was the zone of the left row, and the zone between L0 and L2 was the zone of the right row. The regression line in each of the left and right zones was then calculated by the least squares method.
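One plausible reading of the border search is sketched below: "discriminant analysis" is taken here to mean the maximum between-class-variance (Otsu) criterion applied to the G/R profile treated as a histogram of column positions; this interpretation is an assumption.

```python
import numpy as np

def border_by_discriminant_analysis(profile):
    """Return the column L0 splitting the profile into left- and right-row
    zones by maximizing the between-class variance of G/R-weighted column
    positions (Otsu's criterion over positions rather than gray levels)."""
    cols = np.arange(profile.size, dtype=float)
    total = profile.sum()
    best_split, best_var = profile.size // 2, -1.0
    for t in range(1, profile.size):
        w_left = profile[:t].sum()
        w_right = total - w_left
        if w_left <= 0 or w_right <= 0:
            continue
        mu_left = (cols[:t] * profile[:t]).sum() / w_left
        mu_right = (cols[t:] * profile[t:]).sum() / w_right
        between = w_left * w_right * (mu_left - mu_right) ** 2
        if between > best_var:
            best_split, best_var = t, between
    return best_split
```

The outer borders L1 and L2 can then be taken where the profile falls to zero on each side of L0, and each zone receives its own least-squares line as in the previous sketch.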

The detected lines can be represented by the coordinates of four points (P1'-P4'), as shown in Figure 10. P1' and P2' are the upper end points of the lines on the right and left, and P3' and P4' are the lower end points. As shown in Figure 11, the offset e and heading angle a between the sensor and the target crop row can be represented by the following equations.

307-316_files/image11.gif (4)

307-316_files/image12.gif (5)
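Equations (4) and (5) are likewise given only as images in the source; the sketch below reconstructs one plausible form from Figures 10 and 11, taking the midline of the detected left and right lines and measuring its lateral offset and inclination against the sensor centerline (X = 0). The endpoint ordering and the midline construction are assumptions:

```python
import math

def offset_and_heading(p1, p2, p3, p4):
    """Offset e and heading angle a (degrees) between the sensor and the
    target crop row.  p1, p2 are the upper end points (X, Y) of the right
    and left lines, and p3, p4 the lower end points (Figure 10)."""
    x_upper = (p1[0] + p2[0]) / 2.0                  # midline X at the far end
    x_lower = (p3[0] + p4[0]) / 2.0                  # midline X at the near end
    dy = ((p1[1] + p2[1]) - (p3[1] + p4[1])) / 2.0   # longitudinal span
    e = x_lower                                      # lateral offset at the sensor
    a = math.degrees(math.atan2(x_upper - x_lower, dy))
    return e, a
```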

Figure 8. Modified G/R values

307-316_files/image13.gif

Figure 9. Extraction of left and right row zones

307-316_files/image14.gif

Figure 10. Detected crop row

307-316_files/image15.jpg

Figure 11. Offset e and heading angle a between the sensor and the target crop row

307-316_files/image16.gif

Experimental System and Methodology

To evaluate the method for detecting crop rows, an experimental crop-row detecting system was built and tested.

The crop-row detecting system is composed of readily available devices: a CCD video camera, a video capture board, and a personal computer. A tractor or field machine equipped with the video camera travels along a crop row while continuous images are captured. Crop-row images for the test were recorded at the experimental farm of Hokkaido University.

In the test, three sequences of crop-row images (Crop-rows A-C) were used. Crop-rows A and B were beet, and Crop-row C was soybean; Crop-row A was the smallest and Crop-row C the largest. Crop-row A was captured by a backward-facing camera fixed on a cultivator traveling at 1.3 m/s, while Crop-rows B and C were captured by a forward-facing camera fixed on a tractor traveling at 2.0 m/s.

Crop-row pictures were recorded with a video tape recorder and converted into digital movie data (image size, 320 × 240 pixels; sampling rate, 10 Hz) by a video capture board (I-O DATA DEVICE Inc. GV-VCP/PCI). The crop-row detecting software loads still images from the movie data one at a time and detects the crop row. One hundred frames from each row (Crop-rows A-C) were sampled.

To estimate the accuracy of crop-row detection, the root-mean-square (R.M.S.) of the detection errors of the offset e and the heading angle a between the sensor and the target crop row was calculated. To calculate the detection error, accurate information on the position of the crop row must be obtained in advance. However, this is very difficult because the images were captured continuously from a traveling vehicle. Therefore, a line was drawn manually over the crop row on the image, and the position of the drawn line was taken as the actual position. The difference between the actual position and the detected position was regarded as the detection error.
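For reference, the statistic itself is a straightforward computation over the 100 sampled frames per row:

```python
import numpy as np

def rms(errors):
    """Root-mean-square of per-frame detection errors (e in cm or a in deg)."""
    e = np.asarray(errors, dtype=float)
    return float(np.sqrt(np.mean(e ** 2)))
```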

Results and Discussion

Table 1 shows the R.M.S. of the detection errors. The R.M.S. of the offset (e) errors ranged from 2.165 to 2.911 cm, and the R.M.S. of the heading angle (a) errors ranged from 0.218 to 0.316 degrees. These results show that the proposed method for detecting crop rows appears to be accurate enough for practical use.

This processing took 20 to 30 ms per image on a personal computer (PC-AT compatible; CPU, Intel Celeron at 450 MHz; OS, Microsoft Windows 98). This speed appears to be fast enough for practical use.

Table 1. R.M.S. of detection errors

                  Crop-row A   Crop-row B   Crop-row C
e [cm]               2.40         2.17         2.91
a [deg]              0.32         0.22         0.31

Row-following Control System

The developed crop-row sensor was applied to the automatic row-following control system for the weeding cultivator (Nichinoki Seiko Inc. NAK-5, shown in Figure 12) and tested in a farm field.

This system was composed of the crop-row sensor and a row-following control unit, as shown in Figure 13. The CCD camera of the crop-row sensor was mounted on the cultivator and captured the backward crop row images. After capturing each crop-row image, the sensor detected the target crop row lines and calculated their location in real time. The row-following control unit received the offset between the target row and the cultivator, and according to the detected offset, a hydraulic cylinder moved the whole cultivator right or left to minimize the offset.

While the system is working, the position of the weeding cleaner essentially has to be kept on the target crop row. However, the hydraulic cylinder moves the cultivator slowly while the system is traveling, so if the system were controlled according to the offset at the position of the weeding cleaner, the control would be delayed. To avoid this delay, the system has to be controlled by predicting the offset; therefore, the datum point of the offset was dislocated ahead. In the field test, the datum point of the offset was set 0, 25, and 50 cm ahead of the weeding cleaner, as shown in Figure 14.
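The prediction rule is not stated explicitly; a natural reading, sketched below, is to extrapolate the detected offset along the heading angle to a datum point a distance d ahead of the weeding cleaner. This linear extrapolation is an assumption:

```python
import math

def predicted_offset(e_cm, a_deg, lookahead_cm):
    """Offset at a datum point `lookahead_cm` ahead of the weeding cleaner:
    the detected offset e plus the lateral drift implied by heading angle a.
    (The exact prediction rule is not stated in the paper.)"""
    return e_cm + lookahead_cm * math.tan(math.radians(a_deg))

# Example: e = 2.0 cm, a = 0.5 deg, datum 50 cm ahead -> about 2.44 cm
```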

In the field test, the weeding cultivator was driven at traveling velocities of 0.9, 1.1, and 1.3 m/s. Each test field was 150 m long. The test was also conducted without row-following control.

Figure 12. Cultivator with weeding cleaners

307-316_files/image17.gif

Figure 13. Row-following control system

307-316_files/image18.gif

Figure 14. Datum point of offset

307-316_files/image19.gif

Figure 15. Tracks of cultivator and crop row

307-316_files/image20.gif

Figure 15 shows the tracks of the cultivator and the target crop row in the field test. The upper graph illustrates that the cultivator did not follow the crop row when the system was not controlled; the lower graph illustrates that the cultivator adjusted well to the target row when the system was controlled.

Table 2 shows the R.M.S. of the distance between the track of the cultivator and the target crop row in the field test. The R.M.S. under automatic row-following control was smaller than that without control, which shows that the system was clearly effective for precise row following. In addition, dislocating the datum point of the offset ahead, that is, predicting the offset, improved the accuracy of adjustment to the target crop row. As the traveling velocity increased, predicting the offset became more effective, and the accuracy of adjustment was almost equivalent across velocities when the datum point of the offset was 50 cm ahead.

Table 2. R.M.S. of row-following errors

                       R.M.S. of row-following errors [cm]
Traveling          Datum point of offset [cm]        Without
velocity [m/s]      0         25        50           control
0.9                2.72      2.63      2.50           5.69
1.1                3.12      2.15      2.60           8.09
1.3                4.25      3.46      2.39           6.19

Conclusion

A new method for detecting crop rows was developed and tested. The results of the field test confirmed the feasibility of the new method. The R.M.S. of the offset errors between the camera and a target crop row was less than about 3 cm, and the R.M.S. of the heading errors was less than about 0.3 degrees. Processing took 20 to 30 ms per image (CPU, Intel Celeron at 450 MHz). This is accurate and fast enough for practical use.

The developed crop-row sensor was applied to an automatic guidance system for the weeding cultivator. Field tests showed that the system was clearly effective for precise row following, and that predicting the offset between the cultivator and the crop row improved the row-following accuracy.
