무인기 기반 RGB 영상 활용 U-Net을 이용한 수수 재배지 분할 (Segmentation of Sorghum Cultivation Fields Using U-Net with UAV-Based RGB Imagery)
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Park, Kisu | - |
| dc.contributor.author | Ryu, Chanseok | - |
| dc.contributor.author | Kang, Yeseong | - |
| dc.contributor.author | Kim, Eunri | - |
| dc.contributor.author | Jeong, Jongchan | - |
| dc.contributor.author | Park, Jinki | - |
| dc.date.accessioned | 2023-11-28T05:42:39Z | - |
| dc.date.available | 2023-11-28T05:42:39Z | - |
| dc.date.issued | 2023-10 | - |
| dc.identifier.issn | 1225-6161 | - |
| dc.identifier.issn | 2287-9307 | - |
| dc.identifier.uri | https://scholarworks.gnu.ac.kr/handle/sw.gnu/68599 | - |
| dc.description.abstract | When rice paddies are converted to upland fields, sorghum (Sorghum bicolor L. Moench) offers excellent moisture resistance, enabling stable production alongside soybeans. It is therefore a crop expected to improve the self-sufficiency rate of domestic food crops and ease the rice supply-demand imbalance. However, fundamental statistics such as the cultivation area required for yield estimation are lacking, because the traditional survey method takes a long time even with a large workforce. In this study, U-Net was applied to RGB images acquired by an unmanned aerial vehicle to examine the feasibility of non-destructive segmentation of sorghum cultivation fields. RGB images were acquired on July 28, August 13, and August 25, 2022. For each acquisition date, the data were divided into 6,000 training images and 1,000 validation images of 512 × 512 pixels. Classification models were developed for three classes, consisting of sorghum fields (sorghum), rice and soybean fields (others), and non-agricultural fields (background), and for two classes, consisting of sorghum and non-sorghum (others + background). The classification accuracy for sorghum cultivation fields was higher than 0.91 in the three-class models at all acquisition dates, but learning confusion occurred in the other classes on the August datasets. In contrast, the two-class model showed an accuracy of 0.95 or better in all classes, with stable learning on the August datasets. As a result, two-class models trained on August imagery will be advantageous for calculating the sorghum cultivation area. Copyright © 2023 by The Korean Society of Remote Sensing. | - |
| dc.format.extent | 15 | - |
| dc.language | Korean | - |
| dc.language.iso | KOR | - |
| dc.publisher | Korean Society of Remote Sensing | - |
| dc.title | 무인기 기반 RGB 영상 활용 U-Net을 이용한 수수 재배지 분할 (Segmentation of Sorghum Cultivation Fields Using U-Net with UAV-Based RGB Imagery) | - |
| dc.title.alternative | Sorghum Field Segmentation with U-Net from UAV RGB | - |
| dc.type | Article | - |
| dc.publisher.location | Republic of Korea | - |
| dc.identifier.doi | 10.7780/kjrs.2023.39.5.1.5 | - |
| dc.identifier.scopusid | 2-s2.0-85177037915 | - |
| dc.identifier.wosid | 001111495700011 | - |
| dc.identifier.bibliographicCitation | Korean Journal of Remote Sensing, v.39, no.5-1, pp 521 - 535 | - |
| dc.citation.title | Korean Journal of Remote Sensing | - |
| dc.citation.volume | 39 | - |
| dc.citation.number | 5-1 | - |
| dc.citation.startPage | 521 | - |
| dc.citation.endPage | 535 | - |
| dc.type.docType | Article | - |
| dc.identifier.kciid | ART003014634 | - |
| dc.description.isOpenAccess | N | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.description.journalRegisteredClass | esci | - |
| dc.description.journalRegisteredClass | kci | - |
| dc.relation.journalResearchArea | Remote Sensing | - |
| dc.relation.journalWebOfScienceCategory | Remote Sensing | - |
| dc.subject.keywordAuthor | Remote sensing | - |
| dc.subject.keywordAuthor | RGB | - |
| dc.subject.keywordAuthor | Sorghum | - |
| dc.subject.keywordAuthor | U-Net | - |
| dc.subject.keywordAuthor | UAV | - |
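The abstract above describes a U-Net pixel-wise classifier trained on 512 × 512 UAV RGB patches, with a two-class (sorghum vs. non-sorghum) variant proving most stable. The paper's code is not included in this record, so the following is only a minimal illustrative sketch of such a setup in PyTorch; the network depth, channel counts, loss, optimizer, and learning rate are assumptions for illustration, not values reported in the study.

```python
# Minimal illustrative sketch (not the authors' code): a compact U-Net for
# two-class (sorghum vs. non-sorghum) segmentation of 512 x 512 RGB patches.
# All layer sizes and training settings are assumptions for illustration.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with BatchNorm and ReLU, the basic U-Net unit.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )

class UNet(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.enc1 = conv_block(3, 64)          # encoder path
        self.enc2 = conv_block(64, 128)
        self.enc3 = conv_block(128, 256)
        self.bottleneck = conv_block(256, 512)
        self.pool = nn.MaxPool2d(2)
        self.up3 = nn.ConvTranspose2d(512, 256, 2, stride=2)  # decoder path
        self.dec3 = conv_block(512, 256)
        self.up2 = nn.ConvTranspose2d(256, 128, 2, stride=2)
        self.dec2 = conv_block(256, 128)
        self.up1 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec1 = conv_block(128, 64)
        self.head = nn.Conv2d(64, n_classes, 1)  # per-pixel class logits

    def forward(self, x):
        e1 = self.enc1(x)                      # 512 x 512
        e2 = self.enc2(self.pool(e1))          # 256 x 256
        e3 = self.enc3(self.pool(e2))          # 128 x 128
        b = self.bottleneck(self.pool(e3))     # 64 x 64
        # Upsample and concatenate skip connections, then refine.
        d3 = self.dec3(torch.cat([self.up3(b), e3], dim=1))
        d2 = self.dec2(torch.cat([self.up2(d3), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)                   # (N, n_classes, 512, 512)

# Example: one training step on a dummy batch standing in for UAV RGB patches.
model = UNet(n_classes=2)                      # sorghum vs. non-sorghum
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

images = torch.randn(2, 3, 512, 512)           # stand-in for RGB patches
masks = torch.randint(0, 2, (2, 512, 512))     # stand-in for label masks
optimizer.zero_grad()
loss = criterion(model(images), masks)
loss.backward()
optimizer.step()
```

The three-class variant described in the abstract (sorghum, others, background) would only change `n_classes` to 3 and supply masks labeled 0 through 2.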