Artificial Neural Network-based Model for Predicting Moisture Content in Rice Using UAV Remote Sensing Data
- Authors
- Sarkar, Tapash Kumar; Ryu, Chan-Seok; Kang, Jeong-Gyun; Kang, Ye-Seong; Jun, Sae-Rom; Jang, Si-Hyeong; Park, Jun-Woo; Song, Hye-Young
- Issue Date
- Aug-2018
- Publisher
- KOREAN SOC REMOTE SENSING
- Keywords
- ANN; Moisture content; Model simulation; Precision agriculture; UAV remote sensing
- Citation
- KOREAN JOURNAL OF REMOTE SENSING, v.34, no.4, pp 611 - 624
- Pages
- 14
- Indexed
- ESCI; KCI
- Journal Title
- KOREAN JOURNAL OF REMOTE SENSING
- Volume
- 34
- Number
- 4
- Start Page
- 611
- End Page
- 624
- URI
- https://scholarworks.gnu.ac.kr/handle/sw.gnu/11440
- DOI
- 10.7780/kjrs.2018.34.4.4
- ISSN
- 1225-6161; 2287-9307
- Abstract
- The percentage of moisture content in rice before harvest is crucial to reducing economic loss in terms of yield, quality, and drying cost. This paper discusses the application of an artificial neural network (ANN) to develop a reliable prediction model using reflectance values of the green, red, and NIR bands acquired by a low-altitude fixed-wing unmanned aerial vehicle (UAV), together with statistical moisture content data. A comparison between the actual statistical data and the predicted data was performed to evaluate the performance of the model. The correlation coefficient (R) of 0.862 and the mean absolute percentage error (MAPE) of 0.914% indicate very good accuracy of the model in predicting moisture content in rice before harvest. The predicted values match the measured values well (R² = 0.743, Nash-Sutcliffe Efficiency = 0.730). The results are very promising and show reliable potential to predict moisture content with a prediction error of less than 7%. This model might be potentially helpful for the rice production system in the field of precision agriculture (PA).
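- Illustration (not part of the record): the record contains no code or data, but as a minimal sketch of the workflow described in the abstract, the Python snippet below fits a small feed-forward ANN regressor on band reflectance features and evaluates it with the metrics the abstract names (R, MAPE, R², NSE). All reflectance and moisture values, the network architecture, and the scikit-learn pipeline are hypothetical assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical example: each row is a plot's mean reflectance in the
# green, red, and NIR bands extracted from UAV imagery; the target is
# the measured grain moisture content (%). All values are placeholders.
X = np.array([
    [0.12, 0.10, 0.45],
    [0.14, 0.11, 0.42],
    [0.11, 0.09, 0.48],
    [0.15, 0.12, 0.40],
    [0.13, 0.10, 0.44],
    [0.12, 0.11, 0.43],
])
y = np.array([22.5, 21.0, 23.8, 19.6, 22.0, 21.4])

# Small feed-forward network; architecture chosen for illustration only,
# not taken from the paper.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), activation="tanh",
                 solver="lbfgs", max_iter=5000, random_state=0),
)
model.fit(X, y)
y_pred = model.predict(X)

# Evaluation metrics of the kind reported in the abstract.
r = np.corrcoef(y, y_pred)[0, 1]                  # correlation coefficient R
mape = np.mean(np.abs((y - y_pred) / y)) * 100    # MAPE (%)
r2 = r ** 2                                       # coefficient of determination from R
nse = 1 - np.sum((y - y_pred) ** 2) / np.sum((y - y.mean()) ** 2)  # Nash-Sutcliffe Efficiency

print(f"R = {r:.3f}, MAPE = {mape:.3f}%, R^2 = {r2:.3f}, NSE = {nse:.3f}")
```

- In practice the reflectance features would come from orthomosaic imagery aggregated per plot, and the model would be evaluated on held-out data rather than on the training samples as in this toy sketch.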
- Appears in Collections
- College of Agriculture and Life Sciences > Department of Bio-industrial Machinery Engineering > Journal Articles
