News

July 2016

July 10: “All In The Mind” radio interview with Jack Gallant and Alex Huth


Lynne Malcolm from the ABC (not that ABC!) radio show, All In The Mind, did a nice interview with Jack Gallant and Alex Huth. The interview covered the basic approach used in our lab, our previous results on decoding and reconstructing natural movies, and our recent results on semantic representation. You can listen to the interview here.

June 2016

June 9: New video posted, “Using Image Processing to improve reconstruction of movies from brain activity”

In 2011 we published a nice paper, Reconstructing visual experiences from brain activity elicited by natural movies, by Shinji Nishimoto and others from our lab. Now Natalia Bilenko and Valkyrie Savage have developed an improved algorithm for reconstructing movies from brain activity, and a new video describes and demonstrates their work. (A detailed explanation of the algorithm is given at the end of the video.)

June 9: Alex Huth wins early career award from the Burroughs Wellcome Fund!

Alex Huth has won a prestigious early career award from the Burroughs Wellcome Fund. Similar to an NIH K award, these funds can be used to support Alex’s post-doctoral research and the work that he will do in his own lab. You can read the UC Berkeley press release here. Congratulations Alex!

April 2016

April 27: Nature article on our semantic atlas

Alex Huth’s paper, “Semantic information in natural narrative speech is represented in complex maps that tile human cerebral cortex”, has just been published in Nature.

The meaning of language is represented in regions of the cerebral cortex known collectively as the “semantic system”. However, little of the semantic system has been mapped comprehensively, and the semantic selectivity of most regions is still unknown. Here we systematically map semantic selectivity across the cerebral cortex using voxel-wise modeling of fMRI data collected while subjects listened to several hours of natural narrative stories. We show that the semantic system is organized into intricate patterns that appear highly consistent across individuals. We then use a novel Bayesian generative model to map these patterns and create a detailed semantic atlas. Our results suggest that most areas within the semantic system represent information about specific semantic domains, and our atlas shows which domains are represented in each area.

You can find a detailed writeup about the paper here, and you can find a video summary of the paper here. And be sure to check out the new brain viewer! To request a reprint, please send an email to <huthreprint@gmail.com>.
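
For readers curious about what “voxel-wise modeling” looks like in practice, the sketch below illustrates the general idea: each voxel’s response time course is predicted from stimulus features using regularized (ridge) regression, and voxels are scored by how well the fitted model predicts held-out data. This is a simplified illustration under stated assumptions, not the paper’s actual pipeline; the synthetic data, single train/test split, and small grid of regularization values are placeholder choices.

```python
# Minimal sketch of voxel-wise encoding-model fitting with ridge regression.
# Illustrative only: feature construction, hemodynamic delays, and the
# regularization search are simplified placeholder assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: stimulus features (TRs x semantic features) and
# BOLD responses (TRs x voxels). In a real analysis the features would be
# derived from the stories and time-delayed to account for hemodynamic lag.
n_trs, n_feats, n_voxels = 600, 50, 200
X = rng.standard_normal((n_trs, n_feats))
true_weights = rng.standard_normal((n_feats, n_voxels))
Y = X @ true_weights + rng.standard_normal((n_trs, n_voxels))

# Split into training and held-out data.
X_train, X_test = X[:500], X[500:]
Y_train, Y_test = Y[:500], Y[500:]

def ridge_fit(X, Y, alpha):
    """Closed-form ridge solution; one weight vector per voxel (column of Y)."""
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(k), X.T @ Y)

# Pick the regularization strength that best predicts held-out responses,
# scoring each voxel by the correlation between predicted and actual BOLD.
best_alpha, best_score = None, -np.inf
for alpha in [0.1, 1.0, 10.0, 100.0]:
    W = ridge_fit(X_train, Y_train, alpha)
    pred = X_test @ W
    corrs = [np.corrcoef(pred[:, v], Y_test[:, v])[0, 1] for v in range(n_voxels)]
    score = float(np.mean(corrs))
    if score > best_score:
        best_alpha, best_score = alpha, score

print(f"best alpha: {best_alpha}, mean held-out correlation: {best_score:.3f}")
```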

April 26: Caseforge team wins the Delta Prize!

James Gao, Alex Huth and Young Park, the founders of Caseforge, have won the Delta Prize! The Delta Prize is a UC Berkeley startup prize aimed at accelerating the growth of early-stage ventures. Caseforge is a company formed to manufacture the headcase, a head stabilization device developed in our laboratory. This device can dramatically increase the quality of fMRI data, and it may also have important medical applications.

About our lab

This is the web home of Professor Jack Gallant’s cognitive, computational and systems neuroscience lab at the University of California, Berkeley. Our lab uses functional MRI, computational modeling and machine learning to map perceptual, language and cognitive functions across the human brain. We also study how these maps are altered by top-down processes such as attention, learning and memory, and how they differ across individuals. The computational modeling framework that we have developed for brain mapping can also be used to decode human brain activity with remarkable fidelity.

Our laboratory is located in the Department of Psychology, University of California at Berkeley. We are also affiliated with the programs in Neuroscience, Bioengineering, Biophysics and Vision Science, and with the Department of Electrical Engineering and Computer Sciences.

Information about joining us as a graduate student or post-doc can be found here.