
CIS Lecture Series | Representational learning in brain and artificial neural networks: Lessons from the olfactory system

Time

14:30-16:00, Tuesday, March 26, 2024

Venue

Lecture Hall A2-108, Faculty & Student Service Center, Yungu Campus, Westlake University

Host

Dr. Leihan Tang, Chair Professor, Center for Interdisciplinary Studies, Westlake University

Audience

All faculty and students

Category

Academics & Research

Time: 14:30-16:00, Tuesday, March 26, 2024

Host: Dr. Leihan Tang, Chair Professor, Center for Interdisciplinary Studies, Westlake University

Venue: Lecture Hall A2-108, Yungu Campus, Westlake University


Prof. Yuhai Tu, Research Staff Member,

IBM Thomas J. Watson Research Center


Speaker

Yuhai Tu received his PhD in theoretical physics from UCSD in 1991. He was a Division Prize Fellow at Caltech from 1991 to 1994. He joined the IBM Watson Research Center as a Research Staff Member in 1994 and served as head of the Theory Group from 2003 to 2015. He has been an APS Fellow since 2004 and served as Chair of the APS Division of Biological Physics (DBIO) in 2017. He is also a Fellow of the AAAS. For his work in theoretical statistical physics, he was awarded (together with John Toner and Tamas Vicsek) the 2020 Lars Onsager Prize from the APS: "For seminal work on the theory of flocking that marked the birth and contributed greatly to the development of the field of active matter."


Abstract:

Learning happens in biological neural networks in the brain as well as in artificial neural networks (ANNs), such as those used in deep learning, which have achieved near- or above-human-level performance on a growing list of specific tasks. However, both their network architectures and their underlying learning rules differ significantly. On the architecture side, biological neural networks in the brain have recurrent connections between neurons, while ANNs in deep learning have a simple feedforward architecture. More importantly, the brain learns by updating synaptic weights through a local learning rule, such as the Hebbian learning rule, while the weight parameters in deep-learning ANN models are updated to minimize a global loss function.
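The contrast between a local Hebbian update and a global, loss-driven update can be sketched in a few lines for a single linear layer (a minimal illustration, not material from the talk; the array shapes, target pattern, and learning rate are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
eta = 0.01                      # learning rate (arbitrary)
x = rng.normal(size=5)          # presynaptic activity
W = rng.normal(size=(3, 5))     # synaptic weights
y = W @ x                       # postsynaptic activity (linear neurons)

# Hebbian rule: purely local. Each weight change depends only on the
# activities of the two neurons it connects ("fire together, wire together").
dW_hebb = eta * np.outer(y, x)

# Gradient descent: global. Each weight change is driven by a loss defined
# over the whole output, here 0.5 * ||y - target||^2 for some target pattern.
target = np.ones(3)
error = y - target
dW_grad = -eta * np.outer(error, x)

W_hebb = W + dW_hebb
W_grad = W + dW_grad
```

Note that the Hebbian update uses only quantities available at each synapse (`y[i]`, `x[j]`), whereas the gradient update requires the error signal computed from the full output against a global objective.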

In this talk, we will discuss the commonalities and differences between the learning dynamics of biological neural networks and artificial neural networks in the context of representational learning, using two examples from the mammalian olfactory system: 1) alignment of neural representations from the two sides of the brain; 2) representational drift in the piriform cortex.


Contact:

School of Science, Kaiyin Xu, Email: xukaiyin@westlake.edu.cn