Detailed Information


Recurrent neural network-induced Gaussian process

Authors
Sun, X.; Kim, Seongyoon; Choi, J.-I.
Issue Date
Oct-2022
Publisher
ELSEVIER
Citation
NEUROCOMPUTING, v.509, pp. 75-84
Pages
10
Journal Title
NEUROCOMPUTING
Volume
509
Start Page
75
End Page
84
URI
https://yscholarhub.yonsei.ac.kr/handle/2021.sw.yonsei/6392
DOI
10.1016/j.neucom.2022.07.066
ISSN
0925-2312
Abstract
In this study, we develop a recurrent neural network-induced Gaussian process (RNNGP) to model sequence data. We derive the equivalence between infinitely wide neural networks and Gaussian processes (GPs) for a relaxed recurrent neural network (RNN) with untied weights. We compute the covariance function of the RNNGP using an analytical iteration formula derived through the RNN procedure with an error-function-based activation function. To simplify our discussion, we use the RNNGP to perform Bayesian inference on vanilla RNNs for various problems, such as Modified National Institute of Standards and Technology digit identification, Mackey–Glass time-series forecasting, and lithium-ion battery state-of-health estimation. The results demonstrate the flexibility of the RNNGP in modeling sequence data. Furthermore, the RNNGP predictions typically outperform those of the original RNNs and GPs, demonstrating the efficiency of the RNNGP as a data-driven model. Moreover, the RNNGP can quantify the uncertainty in the predictions, which implies the significant potential of the RNNGP in uncertainty quantification analyses.
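The abstract describes computing the RNNGP covariance through an iteration over time steps with an error-function-based activation and then using the resulting kernel for Bayesian (GP) inference. The paper gives the exact iteration formula; the snippet below is only a minimal sketch of what such a recursion and the downstream GP regression can look like, assuming a per-step kernel update of the standard NNGP form combined with the classical erf dual kernel (Williams, 1997). The hyperparameter names (SIGMA_W2, SIGMA_U2, SIGMA_B2), the specific recursion, and the function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Assumed (illustrative) variance hyperparameters for recurrent weights, input weights, and biases
SIGMA_W2, SIGMA_U2, SIGMA_B2 = 1.0, 1.0, 0.1


def erf_dual(kxx, kxy, kyy):
    """Closed-form E[erf(u) erf(v)] for (u, v) ~ N(0, [[kxx, kxy], [kxy, kyy]])
    (Williams, 1997); stands in here for the error-function-based activation."""
    return (2.0 / np.pi) * np.arcsin(
        2.0 * kxy / np.sqrt((1.0 + 2.0 * kxx) * (1.0 + 2.0 * kyy))
    )


def rnn_gp_kernel(X, Y):
    """Assumed kernel recursion between sequence batches X (n, T, d) and Y (m, T, d),
    mirroring the recurrent update h_t = phi(W h_{t-1} + U x_t + b) in the infinite-width limit."""
    T = X.shape[1]
    # Step 0: only input and bias contributions, no recurrent term yet
    kxy = SIGMA_U2 * np.einsum('id,jd->ij', X[:, 0], Y[:, 0]) + SIGMA_B2
    kxx = SIGMA_U2 * np.einsum('id,id->i', X[:, 0], X[:, 0]) + SIGMA_B2
    kyy = SIGMA_U2 * np.einsum('jd,jd->j', Y[:, 0], Y[:, 0]) + SIGMA_B2
    for t in range(1, T):
        # Propagate the previous step's covariance through the erf dual kernel,
        # then add the current input and bias contributions
        cxy = erf_dual(kxx[:, None], kxy, kyy[None, :])
        kxy = SIGMA_W2 * cxy + SIGMA_U2 * np.einsum('id,jd->ij', X[:, t], Y[:, t]) + SIGMA_B2
        kxx = SIGMA_W2 * erf_dual(kxx, kxx, kxx) + SIGMA_U2 * np.einsum('id,id->i', X[:, t], X[:, t]) + SIGMA_B2
        kyy = SIGMA_W2 * erf_dual(kyy, kyy, kyy) + SIGMA_U2 * np.einsum('jd,jd->j', Y[:, t], Y[:, t]) + SIGMA_B2
    return kxy


def gp_predict(X_train, y_train, X_test, noise=1e-2):
    """Standard GP regression with the induced kernel: posterior mean and per-point variance."""
    K = rnn_gp_kernel(X_train, X_train) + noise * np.eye(X_train.shape[0])
    Ks = rnn_gp_kernel(X_test, X_train)
    Kss = rnn_gp_kernel(X_test, X_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = np.diag(Kss) - np.sum(v ** 2, axis=0)
    return mean, var
```

Under these assumptions, the predictive variance returned by gp_predict is what supports the uncertainty quantification mentioned in the abstract; the actual covariance recursion, activation treatment, and hyperparameterization should be taken from the paper itself.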
Appears in Collections
Graduate School > Department of Computational Science and Engineering > 1. Journal Articles
College of Science > Department of Mathematics > 1. Journal Articles


Related Researcher
Kim, Seongyoon
College of Science, Department of Mathematics / Computational Science and Engineering
