Preprint

Fast Bayesian Updates for Deep Learning with a Use Case in Active Learning



Publication details
Authors:
Herde, M.; Huang, Z.; Huseljic, D.; Kottke, D.; Vogt, S.; Sick, B.

Year of publication:
2022
Journal:
arXiv Preprint
Page range:
TBD
Journal abbreviation:
arXiv
DOI link of the first publication:


Abstract
Retraining deep neural networks when new data arrives is typically computationally expensive. Moreover, certain applications do not allow such costly retraining due to time or computational constraints. Fast Bayesian updates are a possible solution to this issue. Therefore, we propose a Bayesian update based on Monte-Carlo samples and a last-layer Laplace approximation for different Bayesian neural network types, i.e., Dropout, Ensemble, and Spectral Normalized Neural Gaussian Process (SNGP). In a large-scale evaluation study, we show that our updates combined with SNGP represent a fast and competitive alternative to costly retraining. As a use case, we combine the Bayesian updates for SNGP with different sequential query strategies to exemplarily demonstrate their improved selection performance in active learning.
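To illustrate the core idea of a Monte-Carlo-based Bayesian update on a last-layer Laplace approximation, the following is a minimal, hypothetical sketch (not the authors' implementation): the last-layer weights carry a Gaussian posterior from a Laplace approximation, and newly arriving labeled data updates the predictive via importance-weighted Monte-Carlo samples instead of retraining. All names, dimensions, and the binary-classification setting are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mc_bayesian_update(mu, Sigma, feats_new, y_new, n_samples=1000):
    """Importance-sampling update of a last-layer Gaussian posterior.

    mu, Sigma: mean/covariance of last-layer weights (Laplace approx.).
    feats_new, y_new: penultimate-layer features and binary labels of new data.
    Returns weight samples and their normalized importance weights.
    """
    # Draw weight samples from the current (pre-update) posterior.
    W = rng.multivariate_normal(mu, Sigma, size=n_samples)   # (S, d)
    # Likelihood of the new data under each weight sample.
    p = sigmoid(W @ feats_new.T)                             # (S, n_new)
    lik = np.prod(np.where(y_new == 1, p, 1 - p), axis=1)    # (S,)
    iw = lik / lik.sum()                                     # importance weights
    return W, iw

def predict(W, iw, feats):
    """Updated predictive: importance-weighted average over weight samples."""
    return iw @ sigmoid(W @ feats.T)

# Toy example with hypothetical 2-D last-layer features.
mu, Sigma = np.zeros(2), np.eye(2)
feats_new = np.array([[2.0, 0.0]])   # one newly labeled point
y_new = np.array([1])
W, iw = mc_bayesian_update(mu, Sigma, feats_new, y_new)
p_before = sigmoid(W @ feats_new.T).mean()   # predictive before the update
p_after = predict(W, iw, feats_new)[0]       # predictive after the update
```

Observing the label 1 upweights samples that predict it, so the updated predictive probability for that point exceeds the pre-update one; no gradient step or retraining is needed, which is what makes such updates attractive in sequential active-learning loops.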



Last updated 2025-07-05 at 11:49