Classification of pig calls produced from birth to slaughter according to their emotional valence and context of production [Incl. correction]

Research output: Contribution to journal › Journal article › Research › Peer-reviewed


  • Fulltext: Final published version, 1.3 MB, PDF document

  • Elodie Floriane Mandel-Briefer
  • Ciara C.-R. Sypherd
  • Pavel Linhart
  • Lisette M. C. Leliveld
  • Monica Padilla de la Torre
  • Eva R. Read
  • Carole Guérin
  • Véronique Deiss
  • Chloé Monestier
  • Jeppe H. Rasmussen
  • Marek Špinka
  • Sandra Düpjan
  • Alain Boissy
  • Andrew M. Janczak
  • Edna Hillmann
  • Céline Tallet

Vocal expression of emotions has been observed across species and could provide a non-invasive and reliable means of assessing animal emotions. We investigated whether pig vocal indicators of emotions revealed in previous studies are valid across call types and contexts, and could potentially be used to develop an automated emotion-monitoring tool. We analysed an extensive and unique dataset of low-frequency (LF) and high-frequency (HF) calls emitted by pigs across numerous commercial contexts from birth to slaughter (7414 calls from 411 pigs). Our results revealed that the valence attributed to the contexts of production (positive versus negative) affected all investigated parameters in both LF and HF calls, as did the context category. We then tested two automated methods for call classification; a neural network achieved much higher classification accuracy than a permuted discriminant function analysis (pDFA), both for valence (neural network: 91.5%; pDFA weighted average across LF and HF calls, cross-classified: 61.7%, with a chance level of 50.5%) and for context (neural network: 81.5%; pDFA weighted average: 19.4%, with a chance level of 14.3%). These results suggest that an automated recognition system could be developed to monitor pig welfare on-farm.
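To illustrate the neural-network classification approach described above, the following is a minimal sketch of a one-hidden-layer network that labels calls as positive or negative valence from acoustic features. The features (call duration and mean pitch) and the synthetic data are illustrative assumptions only; they are not the study's dataset, features, or architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300
# Synthetic acoustic features per call: [duration (s), mean pitch (Hz)].
# Illustrative assumption: negative-valence calls are longer and higher-pitched.
neg = np.column_stack([rng.normal(1.2, 0.3, n), rng.normal(1800, 300, n)])
pos = np.column_stack([rng.normal(0.4, 0.1, n), rng.normal(700, 150, n)])
X = np.vstack([neg, pos])
y = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = negative, 1 = positive

# Standardize features so both carry comparable weight
X = (X - X.mean(axis=0)) / X.std(axis=0)

# One hidden layer (8 tanh units), logistic output, full-batch gradient descent
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, 8);      b2 = 0.0
lr = 0.5
for _ in range(500):
    h = np.tanh(X @ W1 + b1)              # hidden activations
    p = 1 / (1 + np.exp(-(h @ W2 + b2)))  # predicted P(positive valence)
    g = (p - y) / len(y)                  # cross-entropy gradient w.r.t. logit
    gW2 = h.T @ g; gb2 = g.sum()
    gh = np.outer(g, W2) * (1 - h ** 2)   # backprop through tanh
    gW1 = X.T @ gh; gb1 = gh.sum(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

accuracy = float(((p > 0.5) == (y == 1)).mean())
```

On well-separated synthetic classes like these, the network converges to near-perfect training accuracy; the paper's reported figures (91.5% for valence, 81.5% for context) reflect the much harder real-world task.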

Original language: English
Article number: 3409
Journal: Scientific Reports
Number of pages: 10
Publication status: Published - 2022

Bibliographical note

Correction: 10.1038/s41598-023-45242-9

Publisher Copyright:
© 2022, The Author(s).


ID: 302899677