Sequential neural models with stochastic layers

Publication: Contribution to book/anthology/report › Article in proceedings › Research › peer-reviewed

Standard

Sequential neural models with stochastic layers. / Fraccaro, Marco; Sønderby, Søren Kaae; Paquet, Ulrich; Winther, Ole.

Neural Information Processing Systems 2016. ed. / D. D. Lee; M. Sugiyama; U. V. Luxburg; I. Guyon; R. Garnett. Neural Information Processing Systems Foundation, 2016. pp. 2207-2215 (Advances in Neural Information Processing Systems, Vol. 29).


Harvard

Fraccaro, M, Sønderby, SK, Paquet, U & Winther, O 2016, Sequential neural models with stochastic layers. in DD Lee, M Sugiyama, UV Luxburg, I Guyon & R Garnett (eds), Neural Information Processing Systems 2016. Neural Information Processing Systems Foundation, Advances in Neural Information Processing Systems, vol. 29, pp. 2207-2215, 30th Annual Conference on Neural Information Processing Systems, Barcelona, Spain, 05/12/2016. <https://papers.nips.cc/paper/6039-sequential-neural-models-with-stochastic-layers.pdf>

APA

Fraccaro, M., Sønderby, S. K., Paquet, U., & Winther, O. (2016). Sequential neural models with stochastic layers. In D. D. Lee, M. Sugiyama, U. V. Luxburg, I. Guyon, & R. Garnett (Eds.), Neural Information Processing Systems 2016 (pp. 2207-2215). Neural Information Processing Systems Foundation. Advances in Neural Information Processing Systems Vol. 29. https://papers.nips.cc/paper/6039-sequential-neural-models-with-stochastic-layers.pdf

Vancouver

Fraccaro M, Sønderby SK, Paquet U, Winther O. Sequential neural models with stochastic layers. In Lee DD, Sugiyama M, Luxburg UV, Guyon I, Garnett R, editors, Neural Information Processing Systems 2016. Neural Information Processing Systems Foundation. 2016. p. 2207-2215. (Advances in Neural Information Processing Systems, Vol. 29).

Author

Fraccaro, Marco ; Sønderby, Søren Kaae ; Paquet, Ulrich ; Winther, Ole. / Sequential neural models with stochastic layers. Neural Information Processing Systems 2016. ed. / D. D. Lee ; M. Sugiyama ; U. V. Luxburg ; I. Guyon ; R. Garnett. Neural Information Processing Systems Foundation, 2016. pp. 2207-2215 (Advances in Neural Information Processing Systems, Vol. 29).

Bibtex

@inproceedings{8a9e65c4da6f402086401f2391291514,
title = "Sequential neural models with stochastic layers",
abstract = "How can we efficiently propagate uncertainty in a latent state representation with recurrent neural networks? This paper introduces stochastic recurrent neural networks which glue a deterministic recurrent neural network and a state space model together to form a stochastic and sequential neural generative model. The clear separation of deterministic and stochastic layers allows a structured variational inference network to track the factorization of the model's posterior distribution. By retaining both the nonlinear recursive structure of a recurrent neural network and averaging over the uncertainty in a latent path, like a state space model, we improve the state of the art results on the Blizzard and TIMIT speech modeling data sets by a large margin, while achieving comparable performances to competing methods on polyphonic music modeling.",
author = "Marco Fraccaro and S{\o}nderby, {S{\o}ren Kaae} and Ulrich Paquet and Ole Winther",
year = "2016",
language = "English",
series = "Advances in Neural Information Processing Systems",
publisher = "Neural Information Processing Systems Foundation",
pages = "2207--2215",
editor = "Lee, {D. D.} and M. Sugiyama and Luxburg, {U. V.} and I. Guyon and R. Garnett",
booktitle = "Neural Information Processing Systems 2016",
note = "null ; Conference date: 05-12-2016 Through 10-12-2016",

}
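
The abstract above describes the model only at a high level: a deterministic recurrent layer is stacked with a stochastic state-space layer to form a sequential generative model. The following is a minimal, hypothetical PyTorch sketch of that structure, not the authors' code; every class name, method name, and dimension here is invented for illustration, and the actual model and its structured inference network are defined in the paper linked above.

import torch
import torch.nn as nn


class StochasticRNNSketch(nn.Module):
    """Hypothetical sketch: deterministic GRU layer d_t + stochastic latent layer z_t."""

    def __init__(self, u_dim, d_dim, z_dim, x_dim):
        super().__init__()
        self.gru = nn.GRUCell(u_dim, d_dim)              # deterministic layer d_t = f(d_{t-1}, u_t)
        self.prior_mu = nn.Linear(d_dim + z_dim, z_dim)  # mean of p(z_t | z_{t-1}, d_t)
        self.prior_logvar = nn.Linear(d_dim + z_dim, z_dim)
        self.dec_mu = nn.Linear(d_dim + z_dim, x_dim)    # mean of the emission p(x_t | z_t, d_t)

    def generate(self, u_seq):
        # u_seq: (T, batch, u_dim) conditioning inputs, e.g. the previous observation.
        T, batch, _ = u_seq.shape
        d = u_seq.new_zeros(batch, self.gru.hidden_size)
        z = u_seq.new_zeros(batch, self.prior_mu.out_features)
        means = []
        for t in range(T):
            d = self.gru(u_seq[t], d)                    # update the deterministic layer
            h = torch.cat([d, z], dim=-1)
            mu, logvar = self.prior_mu(h), self.prior_logvar(h)
            z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)   # sample z_t (stochastic layer)
            means.append(self.dec_mu(torch.cat([d, z], dim=-1)))      # emission mean for x_t
        return torch.stack(means)                        # (T, batch, x_dim)


# Example usage with made-up sizes (e.g. 88-key piano-roll frames):
# model = StochasticRNNSketch(u_dim=88, d_dim=256, z_dim=64, x_dim=88)
# x_means = model.generate(torch.zeros(10, 1, 88))

The point of the separation is visible in the loop: d_t carries the nonlinear recursive structure of an RNN, while z_t carries the uncertainty that a state space model would average over.
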

RIS

TY - GEN

T1 - Sequential neural models with stochastic layers

AU - Fraccaro, Marco

AU - Sønderby, Søren Kaae

AU - Paquet, Ulrich

AU - Winther, Ole

N1 - Conference code: 30

PY - 2016

Y1 - 2016

N2 - How can we efficiently propagate uncertainty in a latent state representation with recurrent neural networks? This paper introduces stochastic recurrent neural networks which glue a deterministic recurrent neural network and a state space model together to form a stochastic and sequential neural generative model. The clear separation of deterministic and stochastic layers allows a structured variational inference network to track the factorization of the model's posterior distribution. By retaining both the nonlinear recursive structure of a recurrent neural network and averaging over the uncertainty in a latent path, like a state space model, we improve the state of the art results on the Blizzard and TIMIT speech modeling data sets by a large margin, while achieving comparable performances to competing methods on polyphonic music modeling.

AB - How can we efficiently propagate uncertainty in a latent state representation with recurrent neural networks? This paper introduces stochastic recurrent neural networks which glue a deterministic recurrent neural network and a state space model together to form a stochastic and sequential neural generative model. The clear separation of deterministic and stochastic layers allows a structured variational inference network to track the factorization of the model's posterior distribution. By retaining both the nonlinear recursive structure of a recurrent neural network and averaging over the uncertainty in a latent path, like a state space model, we improve the state of the art results on the Blizzard and TIMIT speech modeling data sets by a large margin, while achieving comparable performances to competing methods on polyphonic music modeling.

UR - http://www.scopus.com/inward/record.url?scp=85019203505&partnerID=8YFLogxK

M3 - Article in proceedings

AN - SCOPUS:85019203505

T3 - Advances in Neural Information Processing Systems

SP - 2207

EP - 2215

BT - Neural Information Processing Systems 2016

A2 - Lee, D. D.

A2 - Sugiyama, M.

A2 - Luxburg, U. V.

A2 - Guyon, I.

A2 - Garnett, R.

PB - Neural Information Processing Systems Foundation

Y2 - 5 December 2016 through 10 December 2016

ER -
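
The abstract's second claim, that a structured variational inference network tracks the factorization of the model's posterior, corresponds to a sequence ELBO of roughly the following form. This is a generic sketch, not the exact objective from the paper; in particular the conditioning sets in q (here z_{t-1}, d_t and future observations x_{t:T}) are an assumption made for illustration.

\mathcal{F}(\theta, \phi) \;=\; \mathbb{E}_{q_\phi(z_{1:T} \mid d_{1:T}, x_{1:T})}
\Big[ \sum_{t=1}^{T} \log p_\theta(x_t \mid z_t, d_t)
      + \log p_\theta(z_t \mid z_{t-1}, d_t)
      - \log q_\phi(z_t \mid z_{t-1}, d_t, x_{t:T}) \Big],

where d_t = f_\theta(d_{t-1}, u_t) is the deterministic recurrent state. Training would maximize this lower bound on log p_\theta(x_{1:T}) with the reparameterization trick, which is what the reported Blizzard, TIMIT, and polyphonic-music log-likelihood comparisons measure.
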

ID: 179362819