Detailed Information

Cited 0 times in Web of Science · Cited 0 times in Scopus

Does the Quality and Readability of Information Related to Varicocele Obtained from ChatGPT 4.0 Remain Consistent Across Different Models of Inquiry?

Full metadata record
DC Field: Value
dc.contributor.author: Luo, Zhao
dc.contributor.author: Kam, Sung Chul
dc.contributor.author: Kim, Ji Yong
dc.contributor.author: Hu, Wenhao
dc.contributor.author: Lin, Chuan
dc.contributor.author: Park, Hyun Jun
dc.contributor.author: Shin, Yu Seob
dc.date.accessioned: 2025-07-02T05:00:09Z
dc.date.available: 2025-07-02T05:00:09Z
dc.date.issued: 2025-05
dc.identifier.issn: 2287-4208
dc.identifier.issn: 2287-4690
dc.identifier.uri: https://scholarworks.gnu.ac.kr/handle/sw.gnu/79115
dc.description.abstract: Purpose: Individuals are increasingly turning to the Chat-Generative Pre-trained Transformer (ChatGPT) as a source of medical information on specific ailments. Varicocele is a prevalent condition of the male reproductive system. The quality, readability, and consistency of the varicocele-related information that individuals obtain through interactive access to ChatGPT remain uncertain.

Materials and Methods: This study used Google Trends data to extract the 25 trending questions since 2004. Two distinct inquiry methodologies were employed with ChatGPT 4.0: repetition mode (each question repeated three times) and cyclic mode (each question input once in each of three consecutive cycles). The generated texts were evaluated against several criteria: the Automated Readability Index (ARI), the Flesch Reading Ease Score (FRES), the Gunning Fog Index (GFI), the DISCERN score, and the Ensuring Quality Information for Patients (EQIP) tool. Kruskal-Wallis and Mann-Whitney U tests were used to compare text quality, readability, and consistency between the two modes.

Results: Texts generated in repetition and cyclic modes showed no statistically significant differences in ARI (12.06 ± 1.29 vs. 12.27 ± 1.74), FRES (36.08 ± 8.70 vs. 36.87 ± 7.73), GFI (13.14 ± 1.81 vs. 13.25 ± 1.50), DISCERN (38.08 ± 6.55 vs. 38.35 ± 6.50), or EQIP (47.92 ± 6.84 vs. 48.35 ± 5.56) scores (p>0.05). These findings indicate that ChatGPT 4.0 consistently produces information of comparable complexity and quality across different inquiry modes.

Conclusions: ChatGPT-generated medical information on "varicocele" demonstrates consistent quality and readability across different inquiry modes, highlighting its potential for stable healthcare information provision. However, the content's complexity poses challenges for general readers, and notable limitations in quality and reliability underscore the need for improved accuracy, credibility, and readability in AI-generated medical content.
dc.format.extent: 10
dc.language: English
dc.language.iso: ENG
dc.publisher: Korean Society for Sexual Medicine and Andrology (대한남성과학회)
dc.title: Does the Quality and Readability of Information Related to Varicocele Obtained from ChatGPT 4.0 Remain Consistent Across Different Models of Inquiry?
dc.type: Article
dc.publisher.location: Republic of Korea
dc.identifier.doi: 10.5534/wjmh.240331
dc.identifier.scopusid: 2-s2.0-105025454408
dc.identifier.wosid: 001513652500001
dc.identifier.bibliographicCitation: The World Journal of Men's Health, v.44, no.1, pp. 161-170
dc.citation.title: The World Journal of Men's Health
dc.citation.volume: 44
dc.citation.number: 1
dc.citation.startPage: 161
dc.citation.endPage: 170
dc.type.docType: Article; Early Access
dc.identifier.kciid: ART003278484
dc.description.isOpenAccess: Y
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.description.journalRegisteredClass: kci
dc.relation.journalResearchArea: Endocrinology & Metabolism
dc.relation.journalResearchArea: Health Care Sciences & Services
dc.relation.journalResearchArea: Urology & Nephrology
dc.relation.journalWebOfScienceCategory: Andrology
dc.relation.journalWebOfScienceCategory: Health Care Sciences & Services
dc.relation.journalWebOfScienceCategory: Urology & Nephrology
dc.subject.keywordPlus: HEALTH
dc.subject.keywordAuthor: Comprehension
dc.subject.keywordAuthor: Infertility
dc.subject.keywordAuthor: Large language models
dc.subject.keywordAuthor: Varicocele
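For context on the readability metrics named in the abstract (ARI, FRES, GFI), the sketch below implements their standard published formulas in plain Python. This is an illustrative approximation, not the tooling the study used: the syllable counter is a crude vowel-group heuristic, and production readability tools use more careful tokenization.

```python
import re

def _sentences(text):
    """Split on terminal punctuation; drop empty fragments."""
    return [s for s in re.split(r"[.!?]+", text) if s.strip()]

def _words(text):
    return re.findall(r"[A-Za-z]+", text)

def _syllables(word):
    # Crude heuristic: count runs of consecutive vowels (y included).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def ari(text):
    """Automated Readability Index: 4.71*(chars/words) + 0.5*(words/sentences) - 21.43."""
    words, sents = _words(text), _sentences(text)
    chars = sum(len(w) for w in words)
    return 4.71 * chars / len(words) + 0.5 * len(words) / len(sents) - 21.43

def fres(text):
    """Flesch Reading Ease: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)."""
    words, sents = _words(text), _sentences(text)
    syl = sum(_syllables(w) for w in words)
    return 206.835 - 1.015 * len(words) / len(sents) - 84.6 * syl / len(words)

def gfi(text):
    """Gunning Fog Index: 0.4*((words/sentences) + 100*(complex words/words))."""
    words, sents = _words(text), _sentences(text)
    complex_n = sum(1 for w in words if _syllables(w) >= 3)
    return 0.4 * (len(words) / len(sents) + 100 * complex_n / len(words))
```

On these scales, the study's reported means (ARI ≈ 12, FRES ≈ 36, GFI ≈ 13) correspond to college-level text, which is why the conclusion flags complexity as a barrier for general readers.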
Files in This Item
There are no files associated with this item.
Appears in Collections
College of Medicine > Department of Medicine > Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Kam, Sung Chul
College of Medicine (Department of Medicine)
