Improving I-ELM structure through optimal addition of hidden nodes: Compact I-ELM
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Seo, Sunghyo | - |
| dc.contributor.author | Jo, Jongkwon | - |
| dc.contributor.author | Hamza, Muhammad | - |
| dc.contributor.author | Kim, Youngsoon | - |
| dc.date.accessioned | 2024-12-03T06:30:41Z | - |
| dc.date.available | 2024-12-03T06:30:41Z | - |
| dc.date.issued | 2024-10 | - |
| dc.identifier.issn | 2045-2322 | - |
| dc.identifier.uri | https://scholarworks.gnu.ac.kr/handle/sw.gnu/74459 | - |
| dc.description.abstract | Incremental extreme learning machines (I-ELMs) can automatically determine the structure of neural networks and achieve high learning speeds. However, during the process of adding hidden nodes, unnecessary hidden nodes that have little relevance to the target may be added. Several studies have proposed methods to overcome this problem by measuring the relevance between hidden nodes and outputs and adding or removing hidden nodes accordingly. Random hidden nodes have the advantage of creating diverse patterns, but they encounter a problem in which hidden nodes that generate patterns with little or no relevance to the target can be added, thereby increasing the number of hidden nodes. Unlike in existing I-ELMs, which use random hidden nodes, we propose a compact I-ELM algorithm that initially adds linear regression nodes and subsequently applies a method to ensure that the hidden nodes have patterns differing from the existing ones. Based on benchmark data, we confirmed that the proposed method constructs a compact neural network structure with fewer hidden nodes compared to the existing I-ELM systems. | - |
| dc.language | English | - |
| dc.language.iso | ENG | - |
| dc.publisher | Nature Publishing Group | - |
| dc.title | Improving I-ELM structure through optimal addition of hidden nodes: Compact I-ELM | - |
| dc.type | Article | - |
| dc.publisher.location | United Kingdom | - |
| dc.identifier.doi | 10.1038/s41598-024-74446-w | - |
| dc.identifier.scopusid | 2-s2.0-85205527425 | - |
| dc.identifier.wosid | 001328801300034 | - |
| dc.identifier.bibliographicCitation | Scientific Reports, v.14, no.1 | - |
| dc.citation.title | Scientific Reports | - |
| dc.citation.volume | 14 | - |
| dc.citation.number | 1 | - |
| dc.type.docType | Article | - |
| dc.description.isOpenAccess | Y | - |
| dc.description.journalRegisteredClass | scie | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.relation.journalResearchArea | Science & Technology - Other Topics | - |
| dc.relation.journalWebOfScienceCategory | Multidisciplinary Sciences | - |
| dc.subject.keywordPlus | EXTREME LEARNING-MACHINE | - |
| dc.subject.keywordPlus | FEEDFORWARD NETWORKS | - |
| dc.subject.keywordPlus | CLASSIFICATION | - |
| dc.subject.keywordPlus | REGRESSION | - |
| dc.subject.keywordPlus | BOUNDS | - |
| dc.subject.keywordAuthor | Compact node | - |
| dc.subject.keywordAuthor | Hidden nodes | - |
| dc.subject.keywordAuthor | Incremental extreme learning machine | - |
| dc.subject.keywordAuthor | Neural networks | - |
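
For readers unfamiliar with the incremental extreme learning machine that the abstract contrasts against, the following is a minimal Python sketch of the conventional I-ELM loop (random hidden nodes added one at a time, each output weight chosen to shrink the training residual). It is not the Compact I-ELM proposed in the article; the sigmoid activation, stopping tolerance, and function names are illustrative assumptions.

```python
# Minimal sketch of a basic I-ELM loop; NOT the authors' Compact I-ELM.
# Activation choice, tolerance, and API are assumptions for illustration.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def i_elm(X, y, max_nodes=50, tol=1e-3, rng=np.random.default_rng(0)):
    """Incrementally add random sigmoid hidden nodes until the residual
    training error is small enough or max_nodes is reached."""
    n, d = X.shape
    residual = y.astype(float).copy()    # current training residual
    nodes = []                           # (input weights, bias, output weight)
    for _ in range(max_nodes):
        w = rng.standard_normal(d)       # random input weights for the new node
        b = rng.standard_normal()        # random bias
        h = sigmoid(X @ w + b)           # activation vector of the new node
        beta = (residual @ h) / (h @ h)  # output weight minimizing residual norm
        residual = residual - beta * h   # update residual after adding the node
        nodes.append((w, b, beta))
        if np.linalg.norm(residual) < tol:
            break
    return nodes

def predict(nodes, X):
    """Sum the contributions of all added hidden nodes."""
    return sum(beta * sigmoid(X @ w + b) for w, b, beta in nodes)
```

Because the hidden nodes are drawn at random, many contribute little to reducing the residual, which is the growth-in-node-count problem the article's Compact I-ELM is designed to avoid.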
