Prediction of Upset Length and Upset Time in Inertia Friction Welding Process Using Deep Neural Network
- Other Titles
- 관성 마찰용접 공정에서 심층 신경망을 이용한 업셋 길이와 업셋 시간의 예측
- Authors
- 양영수; 배강열
- Issue Date
- 2019
- Publisher
- 한국기계가공학회 (Korean Society of Manufacturing Process Engineers)
- Keywords
- Inertia Friction Welding (관성 마찰용접); Upset (업셋); Numerical Analysis (수치해석); Process Parameters (공정 매개변수); Deep Neural Network (심층 신경망)
- Citation
- 한국기계가공학회지, v.18, no.11, pp. 47-56
- Pages
- 10
- Indexed
- KCI
- Journal Title
- 한국기계가공학회지 (Journal of the Korean Society of Manufacturing Process Engineers)
- Volume
- 18
- Number
- 11
- Start Page
- 47
- End Page
- 56
- URI
- https://scholarworks.gnu.ac.kr/handle/sw.gnu/9674
- ISSN
- 1598-6721; 2288-0771
- Abstract
- A deep neural network (DNN) model was proposed to predict the upset in the inertia friction welding process, using a database built from a series of FEM analyses. For the database, the upset length, upset beginning time, and upset completion time were extracted from FEM results obtained with various combinations of axial pressure and initial rotational speed. A total of 35 training sets were constructed to train the proposed DNN, which has 4 hidden layers with 512 neurons in each layer and relates the input parameters to the welding results. After training, the mean squared error between the predicted and true results could be kept below 1.0e-4. The network model was then tested with another 10 sets of welding input parameters and compared with the corresponding FEM results; the relative error of the DNN in predicting the upset was within 2.8%. These results show that the model can effectively provide welding results, in terms of both accuracy and computational cost, for each combination of the welding input parameters.
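- The abstract specifies the network topology (two process inputs, four hidden layers of 512 neurons, three regression outputs) but not the activation function, optimizer, data scaling, or training schedule. The sketch below is a minimal PyTorch illustration of such a network under assumed choices (ReLU activations, Adam, mean-squared-error loss); the tensors are random placeholders standing in for the 35 FEM-derived training sets, and this is not the authors' implementation.

```python
import torch
import torch.nn as nn

class UpsetDNN(nn.Module):
    """DNN mapping (axial pressure, initial rotational speed) to
    (upset length, upset beginning time, upset completion time),
    with 4 hidden layers of 512 neurons as described in the abstract."""
    def __init__(self, n_in=2, n_out=3, hidden=512, n_hidden_layers=4):
        super().__init__()
        layers, width = [], n_in
        for _ in range(n_hidden_layers):
            layers += [nn.Linear(width, hidden), nn.ReLU()]  # ReLU is an assumption
            width = hidden
        layers.append(nn.Linear(width, n_out))
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

# Placeholder data standing in for the 35 FEM-derived training sets
# (in practice, inputs and targets would be normalized first).
X = torch.rand(35, 2)   # axial pressure, initial rotational speed
Y = torch.rand(35, 3)   # upset length, upset beginning/completion time

model = UpsetDNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # optimizer choice assumed
loss_fn = nn.MSELoss()  # mean squared error, as in the abstract

for epoch in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(X), Y)
    loss.backward()
    optimizer.step()
    if loss.item() < 1e-4:  # training-error level reported in the abstract
        break
```

- With the actual FEM database in place, the remaining 10 parameter sets would be held out as a test set for the comparison against FEM reported in the abstract.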
- Appears in Collections
- 융합기술공과대학 > Division of Mechatronics Engineering > Journal Articles