{"id":8,"date":"2017-06-28T07:36:33","date_gmt":"2017-06-28T07:36:33","guid":{"rendered":"https:\/\/mbic-auditorylab.nl\/?page_id=8"},"modified":"2025-06-03T09:07:22","modified_gmt":"2025-06-03T09:07:22","slug":"home","status":"publish","type":"page","link":"https:\/\/mbic-auditorylab.nl\/home\/","title":{"rendered":"Home"},"content":{"rendered":"<div class=\"wpb-content-wrapper\"><p>[vc_row][vc_column][vc_column_text css=&#8221;&#8221; el_class=&#8221;white-text mb100&#8243;]<\/p>\n<h1 style=\"text-align: center;\"><strong>Auditory Cognition in <\/strong><strong>Humans and Machines<\/strong><\/h1>\n<h1 style=\"text-align: center;\"><strong> (<a href=\"https:\/\/mbic-auditorylab.nl\/home\/archie_new\/\">ARCHIE<\/a>)<\/strong><\/h1>\n<h3 style=\"text-align: center;\" data-start=\"222\" data-end=\"609\">AuditoRy Cognition in Humans and MachInEs (ARCHIE) is a research initiative exploring how humans and machines make sense of sound. Led by Elia Formisano (Maastricht University), ARCHIE brings together insights from neuroscience, psychology, artificial intelligence, and information science.<\/h3>\n<h3 style=\"text-align: center;\" data-start=\"222\" data-end=\"609\">Our mission is to understand how the brain recognizes and interprets everyday sounds\u2014and to develop intelligent systems that can do the same. 
By combining cutting-edge brain imaging with computational modeling, we aim to unravel the neural mechanisms of sound processing and build neurobiologically informed AI systems that &#8220;hear&#8221; the world more like we do.<\/h3>\n<h3 style=\"text-align: center;\">Whether you&#8217;re a student curious about the science of sound, a researcher in cognitive or computer science, or a company interested in auditory AI applications\u2014we invite you to explore our work and get involved.<\/h3>\n<h5 style=\"text-align: center;\"><\/h5>\n<p>[\/vc_column_text][\/vc_column][\/vc_row][vc_row][vc_column][vc_column_text]    <h2>News<\/h2>\r\n    <div class=\"row\">\r\n                 <div class=\"col-12\">\r\n             <a class=\"newsitem text-grey\" href=\"https:\/\/mbic-auditorylab.nl\/home\/new-paper-in-neurocomputing-deciphering-the-transformation-of-sounds-into-meaning-insights-from-disentangling-intermediate-representations-in-sound-to-event-dnns\/\">\r\n                 <div class=\"news-wrapper wpb_text_column wpb_content_element round-border bg-white p25 m5\">\r\n                     <div class=\"row\">\r\n                         <div class=\"col-12 col-lg-4\">\r\n                             <div class=\"newsitem-block__image\">\r\n                                                                      <img decoding=\"async\" alt=\"New paper in Neurocomputing: Deciphering the transformation of sounds into meaning: Insights from disentangling intermediate representations in sound-to-event DNNs afbeelding\" class=\"img-fluid\" src=\"https:\/\/mbic-auditorylab.nl\/wp-content\/uploads\/2025\/10\/1-s2.0-S0925231225022726-gr4.jpg\">\r\n                                                              <\/div>\r\n                         <\/div>\r\n                         <div class=\"col-12 col-lg-8\">\r\n                             <div class=\"wpb_wrapper\">\r\n                                 <h3>New paper in Neurocomputing: Deciphering the transformation of sounds 
into meaning: Insights from disentangling intermediate representations in sound-to-event DNNs<\/h3>\r\n                                 <p class=\"mb-3\">\r\n                                        2 October 2026                                 <\/p>\r\n\r\n                                 <p class=\"description\">\r\n                                     In neuroscientific applications of deep neural networks (DNNs), interpretability of latent representations is crucial. Otherwise, we risk replacing one unknown (the brain) with another (the network). This new Neurocomputing article...                                 <\/p>\r\n                             <\/div>\r\n                         <\/div>\r\n                     <\/div>\r\n                 <\/div>\r\n             <\/a>\r\n         <\/div>\r\n                 <div class=\"col-12\">\r\n             <a class=\"newsitem text-grey\" href=\"https:\/\/mbic-auditorylab.nl\/home\/archie-goes-to-neurips2025\/\">\r\n                 <div class=\"news-wrapper wpb_text_column wpb_content_element round-border bg-white p25 m5\">\r\n                     <div class=\"row\">\r\n                         <div class=\"col-12 col-lg-4\">\r\n                             <div class=\"newsitem-block__image\">\r\n                                                                      <img decoding=\"async\" alt=\"ARCHIE Goes to NeurIPS2025! 
afbeelding\" class=\"img-fluid\" src=\"https:\/\/mbic-auditorylab.nl\/wp-content\/uploads\/2025\/10\/AudsemThinker1.png\">\r\n                                                              <\/div>\r\n                         <\/div>\r\n                         <div class=\"col-12 col-lg-8\">\r\n                             <div class=\"wpb_wrapper\">\r\n                                 <h3>ARCHIE Goes to NeurIPS2025!<\/h3>\r\n                                 <p class=\"mb-3\">\r\n                                        2 October 2026                                 <\/p>\r\n\r\n                                 <p class=\"description\">\r\n                                     Our paper \"AudSemThinker: Enhancing Audio-Language Models through Reasoning over Semantics of Sound\" has been accepted at NeurIPS2025 in\u00a0San Diego!\u00a0\ud83c\udf89 The paper by Gijs Wingaard, Elia Formisano, Michele Esposito and Michel...                                 <\/p>\r\n                             <\/div>\r\n                         <\/div>\r\n                     <\/div>\r\n                 <\/div>\r\n             <\/a>\r\n         <\/div>\r\n                 <div class=\"col-12\">\r\n             <a class=\"newsitem text-grey\" href=\"https:\/\/mbic-auditorylab.nl\/home\/716-2\/\">\r\n                 <div class=\"news-wrapper wpb_text_column wpb_content_element round-border bg-white p25 m5\">\r\n                     <div class=\"row\">\r\n                         <div class=\"col-12 col-lg-4\">\r\n                             <div class=\"newsitem-block__image\">\r\n                                                                      <img decoding=\"async\" alt=\"Launching of ERC-Synergy project NASCE afbeelding\" class=\"img-fluid\" src=\"https:\/\/mbic-auditorylab.nl\/wp-content\/uploads\/2025\/05\/Logo-ERC-black_0_broad-e1749723408854.png\">\r\n                                                              <\/div>\r\n                         <\/div>\r\n                   
      <div class=\"col-12 col-lg-8\">\r\n                             <div class=\"wpb_wrapper\">\r\n                                 <h3>Launching of ERC-Synergy project NASCE<\/h3>\r\n                                 <p class=\"mb-3\">\r\n                                        7 February 2026                                 <\/p>\r\n\r\n                                 <p class=\"description\">\r\n                                     In October 2024, Natural Auditory SCEnes in Humans and Machines (NASCE), a collaborative project by Elia Formisano and Bruno Giordano, received ERC Synergy funding (8.6 M\u20ac). NASCE integrates AI and multimodal...                                 <\/p>\r\n                             <\/div>\r\n                         <\/div>\r\n                     <\/div>\r\n                 <\/div>\r\n             <\/a>\r\n         <\/div>\r\n                 <div class=\"col-12\">\r\n             <a class=\"newsitem text-grey\" href=\"https:\/\/mbic-auditorylab.nl\/home\/new-article-in-ieee-access-audio-language-datasets-of-scenes-and-events-a-survey\/\">\r\n                 <div class=\"news-wrapper wpb_text_column wpb_content_element round-border bg-white p25 m5\">\r\n                     <div class=\"row\">\r\n                         <div class=\"col-12 col-lg-4\">\r\n                             <div class=\"newsitem-block__image\">\r\n                                                              <\/div>\r\n                         <\/div>\r\n                         <div class=\"col-12 col-lg-8\">\r\n                             <div class=\"wpb_wrapper\">\r\n                                 <h3>New Article in IEEE Access: Audio-Language Datasets of Scenes and Events: A Survey<\/h3>\r\n                                 <p class=\"mb-3\">\r\n                                        7 February 2026                                 <\/p>\r\n\r\n                                 <p class=\"description\">\r\n                                   
  This work, published in IEEE Access (Open Access), offers a comprehensive analysis of many audio-language datasets used to train audio-language models, examining dataset origins, audio-linguistic characteristics, and use...                                 <\/p>\r\n                             <\/div>\r\n                         <\/div>\r\n                     <\/div>\r\n                 <\/div>\r\n             <\/a>\r\n         <\/div>\r\n            <\/div>\r\n    <a class=\"sub-link\" href=\"https:\/\/mbic-auditorylab.nl\/home\/news\/\">All News<\/a>\r\n\r\n    [\/vc_column_text][\/vc_column][\/vc_row]<\/p>\n<\/div>","protected":false},"excerpt":{"rendered":"<p>[vc_row][vc_column][vc_column_text css=&#8221;&#8221; el_class=&#8221;white-text mb100&#8243;] Auditory Cognition in Humans and Machines (ARCHIE) AuditoRy Cognition in Humans and MachInEs (ARCHIE) is a research initiative exploring how humans and machines make sense of sound. Led by Elia Formisano (Maastricht University), ARCHIE brings together insights from neuroscience, psychology, artificial intelligence, and information science. 
Our mission is to understand how [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_acf_changed":false,"footnotes":""},"class_list":["post-8","page","type-page","status-publish","hentry"],"acf":[],"_links":{"self":[{"href":"https:\/\/mbic-auditorylab.nl\/home\/wp-json\/wp\/v2\/pages\/8","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/mbic-auditorylab.nl\/home\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/mbic-auditorylab.nl\/home\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/mbic-auditorylab.nl\/home\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/mbic-auditorylab.nl\/home\/wp-json\/wp\/v2\/comments?post=8"}],"version-history":[{"count":63,"href":"https:\/\/mbic-auditorylab.nl\/home\/wp-json\/wp\/v2\/pages\/8\/revisions"}],"predecessor-version":[{"id":790,"href":"https:\/\/mbic-auditorylab.nl\/home\/wp-json\/wp\/v2\/pages\/8\/revisions\/790"}],"wp:attachment":[{"href":"https:\/\/mbic-auditorylab.nl\/home\/wp-json\/wp\/v2\/media?parent=8"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}