Classification of Rice Growth Stage on UAV Image Based on Convolutional Neural Network Method
DOI: https://doi.org/10.23887/janapati.v12i1.60959

Keywords: classification, high resolution UAV imagery, deep learning, rice growth

Abstract
Most of Indonesia's agricultural sector is currently run by smallholder communities; roughly 10 million Indonesians work in agriculture and cultivate agricultural land. Some farmers still rely on traditional tools, while others have adopted modern equipment. In general, agricultural tools fall into three categories: tools used before seeds are planted, tools used while tending growing seedlings, and tools used at harvest. One technology now applied in agriculture is the drone, or Unmanned Aerial Vehicle (UAV), used for sowing fertilizer and seeds and for spraying pesticides. Current agricultural UAVs are operated manually or follow GPS waypoints; the visual information the UAV captures is not yet taken into account, so the entire field receives identical treatment. Uniform treatment of heterogeneous agricultural land is a problem: land should be treated according to its condition, because that condition affects the growth of the vegetation planted on it. A related problem is that rice grows differently in each paddy field. Farmers can assess rice growth visually, but they cannot directly observe the condition of the whole crop because of the large land area. High-resolution aerial imagery captured by a UAV can provide a view of the overall condition of the rice from multiple capture angles. The general objective of this research is to classify rice growth in high-resolution UAV images using a Convolutional Neural Network (CNN). The data used in this study were acquired with a multirotor UAV over the same rice-field area.
The dataset consists of 500 images divided into 5 groups: groups 1–2 correspond to the vegetative phase, group 3 to the generative phase, and groups 4–5 to the ripening phase. The CNN was trained with 100, 250, and 500 epochs; the best result, 96% accuracy, was obtained at 500 epochs.
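The abstract does not specify the network architecture, so the following is only a minimal sketch of the building blocks any such CNN classifier stacks — convolution, ReLU, max-pooling, and a softmax output over the 5 growth-stage groups. The patch size, kernel, and weights below are illustrative stand-ins, not the authors' trained model.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a single-channel image with one filter."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(0.0, x)

def max_pool(x, size=2):
    """Non-overlapping max-pooling that halves each spatial dimension."""
    h2, w2 = x.shape[0] // size, x.shape[1] // size
    return x[:h2 * size, :w2 * size].reshape(h2, size, w2, size).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
patch = rng.random((16, 16))          # stand-in for one UAV image patch
kernel = rng.standard_normal((3, 3))  # one convolutional filter (random here)

features = max_pool(relu(conv2d(patch, kernel)))   # 14x14 map pooled to 7x7
weights = rng.standard_normal((5, features.size))  # dense layer: 5 classes
probs = softmax(weights @ features.ravel())        # probability per group
```

In a real pipeline these layers are stacked several times and the filter and dense weights are learned by backpropagation over the 500 labeled images; training accuracy is then the fraction of images whose argmax class matches the labeled growth-stage group.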
License
Copyright (c) 2023 I Made Gede Sunarya, I Wayan Treman, Putu Zasya Eka Satya Nugraha
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.