Center for Quantitative Biology, Peking University
Academic Seminar
Title: Representational learning in brain and artificial neural networks: Lessons from the olfactory system
Speaker: Professor Yuhai Tu
IBM Thomas J. Watson Research Center, Yorktown Heights, NY USA
AAAS Fellow, APS Fellow, Chair of the APS Division of Biophysics (DBIO)
Time: April 2 (Tuesday), 13:30-14:30
Venue: Lecture Hall 101 (Deng Youcai Lecture Hall), Jinguang Life Sciences Building
Host: Professor Dong-Gen Luo
Abstract:
Learning happens both in the realistic neural networks of the brain and in artificial neural networks (ANNs) such as those used in deep learning, which have achieved near- or above-human-level performance on a growing list of specific tasks. However, the two differ significantly in both their network architectures and their underlying learning rules. On the architecture side, realistic neural networks in the brain have recurrent connections between neurons, while ANNs in deep learning have a simple feedforward architecture. More importantly, the brain learns by updating synaptic weights through a local learning rule such as the Hebbian rule, whereas the weight parameters in deep-learning ANN models are updated to minimize a global loss function (a minimal sketch of this contrast follows the abstract).
In this talk, we will discuss the commonalities and differences between the learning dynamics of realistic neural networks and those of artificial neural networks in the context of representational learning, using two examples from the mammalian olfactory system: 1) alignment of neural representations from the two sides of the brain; 2) representational drift in the piriform cortex.
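The following minimal Python sketch illustrates the contrast drawn in the abstract between a local Hebbian update and a global loss-driven (gradient-descent) update for a single linear layer. It is illustrative only, not code from the talk; the layer sizes, learning rate, and squared-error loss are assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)          # presynaptic activity
W = rng.normal(size=(2, 3))     # synaptic weights of a linear layer y = W x
y_target = rng.normal(size=2)   # hypothetical target output
eta = 0.01                      # learning rate

# Local Hebbian rule: each weight changes using only its own pre- and post-synaptic activity.
y = W @ x
dW_hebb = eta * np.outer(y, x)

# Global rule: weights change to reduce a loss defined over the whole output,
# here a squared error, via its gradient (as in deep-learning ANNs).
loss_grad = W @ x - y_target              # dL/dy for L = 0.5 * ||W x - y_target||^2
dW_grad = -eta * np.outer(loss_grad, x)   # gradient-descent step

W_hebbian = W + dW_hebb
W_gradient = W + dW_grad

The Hebbian step depends only on locally available activities, while the gradient step requires the globally defined error signal; this is the distinction between the two learning rules discussed in the talk.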
Professor Yuhai Tu graduated from the University of Science and Technology of China in 1987. He came to the US under the CUSPEA program and received his PhD in physics from UCSD in 1991. He was a Division Prize Fellow at Caltech from 1991 to 1994. He joined the IBM Watson Research Center as a Research Staff Member in 1994 and served as head of its theory group from 2003 to 2015. He has been an APS Fellow since 2004 and served as the APS Division of Biophysics (DBIO) Chair in 2017. He is also a Fellow of AAAS.
Yuhai Tu has broad research interests, including nonequilibrium statistical physics, biological physics, theoretical neuroscience, and, most recently, the theoretical foundations of deep learning. He has made seminal contributions in diverse areas including flocking theory, growth dynamics of the Si/a-SiO2 interface, pattern discovery in RNA microarray analysis, quantitative models of bacterial chemotaxis, the circadian clock, and the energy-speed-accuracy relation in biological systems.
For his work in theoretical statistical physics, he was awarded (together with John Toner and Tamas Vicsek) the 2020 Lars Onsager Prize from APS: "For seminal work on the theory of flocking that marked the birth and contributed greatly to the development of the field of active matter." https://www.aps.org/programs/honors/prizes/prizerecipient.cfm?last_nm=Tu&first_nm=Yuhai&year=2020