Assessment of Automated Identification of Phases in Videos of Total Hip Arthroplasty Using Deep Learning Techniques
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Kang, Yang Jae | - |
| dc.contributor.author | Kim, Shin June | - |
| dc.contributor.author | Seo, Sung Hyo | - |
| dc.contributor.author | Lee, Sangyeob | - |
| dc.contributor.author | Kim, Hyeon Su | - |
| dc.contributor.author | Yoo, Jun-Il | - |
| dc.date.accessioned | 2024-04-08T02:30:30Z | - |
| dc.date.available | 2024-04-08T02:30:30Z | - |
| dc.date.issued | 2024-04 | - |
| dc.identifier.issn | 2005-291X | - |
| dc.identifier.issn | 2005-4408 | - |
| dc.identifier.uri | https://scholarworks.gnu.ac.kr/handle/sw.gnu/70109 | - |
| dc.description.abstract | Background: As the population ages, the rates of hip diseases and fragility fractures are increasing, making total hip arthroplasty (THA) one of the best methods for treating elderly patients. With the increasing number of THA surgeries and diverse surgical methods, there is a need for standard evaluation protocols. This study aimed to use deep learning algorithms to classify THA videos and evaluate the accuracy of the labelling of these videos. Methods: In our study, we manually annotated 7 phases in THA, including skin incision, broaching, exposure of acetabulum, acetabular reaming, acetabular cup positioning, femoral stem insertion, and skin closure. Within each phase, a second trained annotator marked the beginning and end of instrument usage, such as the skin blade, forceps, Bovie, suction device, suture material, retractor, rasp, femoral stem, acetabular reamer, head trial, and real head. Results: In our study, we utilized YOLOv3 to collect 540 operating images of THA procedures and create a scene annotation model. The results of our study showed relatively high accuracy in the clear classification of surgical techniques such as skin incision and closure, broaching, acetabular reaming, and femoral stem insertion, with a mean average precision (mAP) of 0.75 or higher. Most of the equipment showed good accuracy of mAP 0.7 or higher, except for the suction device, suture material, and retractor. Conclusions: Scene annotation for the instruments and phases in THA using deep learning techniques may provide potentially useful tools for subsequent documentation, assessment of skills, and feedback. © 2024 by The Korean Orthopaedic Association. | - |
| dc.format.extent | 7 | - |
| dc.language | English | - |
| dc.language.iso | ENG | - |
| dc.publisher | The Korean Orthopaedic Association | - |
| dc.title | Assessment of Automated Identification of Phases in Videos of Total Hip Arthroplasty Using Deep Learning Techniques | - |
| dc.type | Article | - |
| dc.publisher.location | Republic of Korea | - |
| dc.identifier.doi | 10.4055/cios23280 | - |
| dc.identifier.scopusid | 2-s2.0-85188506134 | - |
| dc.identifier.wosid | 001197125000016 | - |
| dc.identifier.bibliographicCitation | Clinics in Orthopedic Surgery, v.16, no.2, pp 210 - 216 | - |
| dc.citation.title | Clinics in Orthopedic Surgery | - |
| dc.citation.volume | 16 | - |
| dc.citation.number | 2 | - |
| dc.citation.startPage | 210 | - |
| dc.citation.endPage | 216 | - |
| dc.type.docType | Article | - |
| dc.identifier.kciid | ART003067730 | - |
| dc.description.isOpenAccess | Y | - |
| dc.description.journalRegisteredClass | scie | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.description.journalRegisteredClass | kci | - |
| dc.relation.journalResearchArea | Orthopedics | - |
| dc.relation.journalWebOfScienceCategory | Orthopedics | - |
| dc.subject.keywordAuthor | Arthroplasty | - |
| dc.subject.keywordAuthor | Deep learning | - |
| dc.subject.keywordAuthor | Hip | - |
| dc.subject.keywordAuthor | Surgical procedures | - |
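
The abstract reports detection quality as mean average precision (mAP), i.e., the average precision (AP) per class (each instrument or phase), averaged over classes. The sketch below is a minimal, illustrative Python implementation of single-class AP with IoU-based matching — it is not the authors' evaluation code, and the function names (`iou`, `average_precision`) and the IoU threshold of 0.5 are assumptions for illustration.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def average_precision(detections, ground_truths, iou_thresh=0.5):
    """AP for one class.

    detections: list of (confidence_score, box); ground_truths: list of box.
    A detection is a true positive if it matches an unmatched ground-truth
    box with IoU >= iou_thresh; otherwise it is a false positive.
    """
    matched = set()
    tp_cum = fp_cum = 0
    ap = prev_recall = 0.0
    n_gt = len(ground_truths)
    # Process detections from most to least confident.
    for score, box in sorted(detections, key=lambda d: -d[0]):
        best_iou, best_gt = 0.0, None
        for gi, gt in enumerate(ground_truths):
            if gi in matched:
                continue
            v = iou(box, gt)
            if v > best_iou:
                best_iou, best_gt = v, gi
        if best_iou >= iou_thresh:
            matched.add(best_gt)
            tp_cum += 1
        else:
            fp_cum += 1
        # Accumulate area under the precision-recall curve.
        precision = tp_cum / (tp_cum + fp_cum)
        recall = tp_cum / n_gt
        ap += precision * (recall - prev_recall)
        prev_recall = recall
    return ap
```

To get mAP as reported in the paper, `average_precision` would be computed once per class (e.g., per instrument) and the results averaged.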
