Sequential neural models with stochastic layers

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed

  • Marco Fraccaro
  • Søren Kaae Sønderby
  • Ulrich Paquet
  • Ole Winther

How can we efficiently propagate uncertainty in a latent state representation with recurrent neural networks? This paper introduces stochastic recurrent neural networks, which glue a deterministic recurrent neural network and a state space model together to form a stochastic and sequential neural generative model. The clear separation of deterministic and stochastic layers allows a structured variational inference network to track the factorization of the model's posterior distribution. By retaining both the nonlinear recursive structure of a recurrent neural network and averaging over the uncertainty in a latent path, like a state space model, we improve the state-of-the-art results on the Blizzard and TIMIT speech modeling data sets by a large margin, while achieving comparable performance to competing methods on polyphonic music modeling.
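The abstract's core idea, a deterministic recurrent layer stacked with a stochastic state-space layer, can be illustrated with a minimal ancestral-sampling sketch. This is not the authors' implementation: all dimensions, weight matrices, and the Gaussian parameterization below are illustrative assumptions, chosen only to show the separation the abstract describes (a deterministic recursion d_t that depends on d_{t-1} and the input u_t, and a stochastic transition z_t that depends on z_{t-1} and d_t).

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper)
D_u, D_d, D_z, D_x = 3, 8, 4, 5

# Randomly initialised parameters; a trained model would learn these
W_d = rng.normal(scale=0.1, size=(D_d, D_d + D_u))
W_mu = rng.normal(scale=0.1, size=(D_z, D_d + D_z))
W_sig = rng.normal(scale=0.1, size=(D_z, D_d + D_z))
W_x = rng.normal(scale=0.1, size=(D_x, D_d + D_z))

def generate(inputs):
    """Ancestral sampling through the two-layer generative model."""
    d = np.zeros(D_d)  # deterministic state
    z = np.zeros(D_z)  # stochastic latent state
    xs = []
    for u in inputs:
        # Deterministic RNN recursion: d_t = f(d_{t-1}, u_t)
        d = np.tanh(W_d @ np.concatenate([d, u]))
        # Stochastic state-space transition: z_t ~ N(mu_t, sigma_t^2),
        # conditioned on z_{t-1} and d_t
        h = np.concatenate([d, z])
        mu, log_sig = W_mu @ h, W_sig @ h
        z = mu + np.exp(log_sig) * rng.normal(size=D_z)
        # Emission: x_t depends on both layers, d_t and z_t
        xs.append(W_x @ np.concatenate([d, z]))
    return np.stack(xs)

x = generate(rng.normal(size=(6, D_u)))
print(x.shape)  # (6, 5): one 5-dimensional output per time step
```

Because d_t is a deterministic function of the inputs, the only uncertainty to average over at each step is the latent path z_1:t, which is what lets a structured inference network mirror the model's posterior factorization.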

Original language: English
Title: Neural Information Processing Systems 2016
Editors: D. D. Lee, M. Sugiyama, U. V. Luxburg, I. Guyon, R. Garnett
Number of pages: 9
Publisher: Neural Information Processing Systems Foundation
Publication date: 2016
Pages: 2207-2215
Status: Published - 2016
Event: 30th Annual Conference on Neural Information Processing Systems - Barcelona, Spain
Duration: 5 Dec 2016 - 10 Dec 2016
Conference number: 30

Conference

Conference: 30th Annual Conference on Neural Information Processing Systems
Number: 30
Country: Spain
City: Barcelona
Period: 05/12/2016 - 10/12/2016
Series: Advances in Neural Information Processing Systems
Volume: 29
ISSN: 1049-5258
