2D-3D Reconstruction of a Femur by Single X-Ray Image Based on Deep Transfer Learning Network
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Ha, Ho-Gun | - |
| dc.contributor.author | Lee, Jinhan | - |
| dc.contributor.author | Jung, Gu-Hee | - |
| dc.contributor.author | Hong, Jaesung | - |
| dc.contributor.author | Lee, HyunKi | - |
| dc.date.accessioned | 2024-02-20T09:00:24Z | - |
| dc.date.available | 2024-02-20T09:00:24Z | - |
| dc.date.issued | 2024-02 | - |
| dc.identifier.issn | 1959-0318 | - |
| dc.identifier.issn | 1876-0988 | - |
| dc.identifier.uri | https://scholarworks.gnu.ac.kr/handle/sw.gnu/69694 | - |
| dc.description.abstract | Objective: Constructing a 3D model from its 2D images, known as 2D-3D reconstruction, is a challenging task. Conventionally, a parametric 3D model such as a statistical shape model (SSM) is deformed by matching the shapes in its 2D images through a series of processes, including calibration, 2D-3D registration, and optimization for nonrigid deformation. To overcome this complicated procedure, a streamlined 2D-3D reconstruction using a single X-ray image is developed in this study. Methods: We propose 2D-3D reconstruction of a femur by adopting a deep neural network, where the deformation parameters in the SSM determining the 3D shape of the femur are predicted from a single X-ray image using a deep transfer-learning network. For learning the network from distinct features representing the 3D shape information in the X-ray image, a specific proximal part of the femur from a unique X-ray pose that allows accurate prediction of the 3D femur shape is designated and used to train the network. Then, the corresponding proximal/distal 3D femur model is reconstructed from only the single X-ray image acquired at the designated position. Results: Experiments were conducted using actual X-ray images of a femur phantom and X-ray images of a patient's femur derived from computed tomography to verify the proposed method. The average errors of the reconstructed 3D shape of the proximal and distal femurs from the proposed method were 1.20 mm and 1.08 mm in terms of root mean squared point-to-surface distance, respectively. Conclusion: The proposed method presents an innovative approach to simplifying the 2D-3D reconstruction using deep neural networks that exhibits performance compatible with the existing methodologies. © 2024 AGBM | - |
| dc.language | English | - |
| dc.language.iso | ENG | - |
| dc.publisher | Elsevier Masson | - |
| dc.title | 2D-3D Reconstruction of a Femur by Single X-Ray Image Based on Deep Transfer Learning Network | - |
| dc.type | Article | - |
| dc.publisher.location | United States | - |
| dc.identifier.doi | 10.1016/j.irbm.2024.100822 | - |
| dc.identifier.scopusid | 2-s2.0-85183969775 | - |
| dc.identifier.wosid | 001176441800001 | - |
| dc.identifier.bibliographicCitation | IRBM, v.45, no.1 | - |
| dc.citation.title | IRBM | - |
| dc.citation.volume | 45 | - |
| dc.citation.number | 1 | - |
| dc.type.docType | Article | - |
| dc.description.isOpenAccess | N | - |
| dc.description.journalRegisteredClass | scie | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.relation.journalResearchArea | Engineering | - |
| dc.relation.journalWebOfScienceCategory | Engineering, Biomedical | - |
| dc.subject.keywordPlus | STATISTICAL SHAPE MODEL | - |
| dc.subject.keywordPlus | 3D RECONSTRUCTION | - |
| dc.subject.keywordPlus | PROXIMAL FEMUR | - |
| dc.subject.keywordPlus | SURFACE MODEL | - |
| dc.subject.keywordPlus | RADIOGRAPHS | - |
| dc.subject.keywordAuthor | 2D-3D reconstruction | - |
| dc.subject.keywordAuthor | 3D modeling | - |
| dc.subject.keywordAuthor | Deep transfer learning network | - |
| dc.subject.keywordAuthor | Statistical shape model | - |
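The abstract describes predicting the deformation parameters of a statistical shape model (SSM) from a single X-ray image, after which the 3D femur surface follows from the linear SSM formulation. The sketch below illustrates only that final reconstruction step, under the standard linear-SSM assumption (mean shape plus a weighted sum of principal modes); all array sizes, variable names, and the dummy `params` vector are hypothetical placeholders, not values from the paper.

```python
import numpy as np

# Illustrative sketch of a linear statistical shape model (SSM):
# a 3D surface is the mean shape plus a weighted sum of principal
# deformation modes. In the paper, the weights ("deformation
# parameters") are predicted from a single X-ray image by a deep
# transfer-learning network; here they are dummy values.

rng = np.random.default_rng(0)

n_vertices = 500   # vertices in the femur surface mesh (hypothetical)
n_modes = 10       # number of principal deformation modes (hypothetical)

mean_shape = rng.normal(size=(n_vertices, 3))      # mean femur surface
modes = rng.normal(size=(n_modes, n_vertices, 3))  # principal modes

def reconstruct(params):
    """Linear SSM: mean shape + sum_i params[i] * modes[i]."""
    return mean_shape + np.tensordot(params, modes, axes=1)

# Dummy stand-in for the network's predicted deformation parameters.
params = rng.normal(size=n_modes)
femur = reconstruct(params)
print(femur.shape)  # (500, 3): one 3D point per mesh vertex
```

With zero parameters the reconstruction collapses to the mean shape, which is the usual sanity check for a linear SSM.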