Fred Simard organized a Computational Neuroscience day by and for CAMBAM students, which took place on May 7, 2013. He came up with the brilliant idea of recording the workshop lectures on video, and we now also have a CAMBAM student YouTube account. Check out what we have on offer! (Adapted from the original post on INSIDE CAMBAM by Fred Simard.) Thank you Fred, and all the great speakers!
Speaker: Greg Stacey
This video is the first in a series of two talks on the topic of regression, optimization, classification, and decoding. In it you will learn the basics of linear regression, along with an explanation of the concept of a cost/loss function and of the gradient method for optimizing such a function.
You can find a code sample here:
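As a minimal illustration of the ideas in this talk (a separate sketch, not the linked sample), here is linear regression fit by gradient descent on a mean-squared-error cost, using NumPy; the data, learning rate, and iteration count are all illustrative choices:

```python
# Sketch: fit y = w*x + b by gradient descent on the MSE cost.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 100)
y = 2.0 * x + 0.5 + 0.1 * rng.standard_normal(100)  # noisy line

w, b = 0.0, 0.0
lr = 0.1  # learning rate
for _ in range(500):
    err = (w * x + b) - y
    # Gradients of the mean-squared-error cost with respect to w and b
    grad_w = 2 * np.mean(err * x)
    grad_b = 2 * np.mean(err)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # close to the true slope 2.0 and intercept 0.5
```

Each step moves the parameters a small distance down the gradient of the cost, which is exactly the gradient method described in the talk.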
Speaker: Nathan Friedman
In this talk, the second in the series on linear regression, optimization, classification and decoding, I give a very brief overview of some machine learning classification algorithms. I explain what a linear classifier is and demonstrate both binary and multi-class classifiers. The algorithms presented are: Regularized Least Squares, Logistic Regression, Perceptron, Support Vector Machine, and Fisher’s Discriminant.
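To make one of the algorithms above concrete, here is a small sketch of the perceptron as a binary linear classifier on 2-D points; the clusters and labels are made up for illustration:

```python
# Sketch: the perceptron learning rule on linearly separable 2-D data.
import numpy as np

rng = np.random.default_rng(1)
# Two well-separated clusters, labeled -1 and +1
X = np.vstack([rng.normal(-2, 0.5, (50, 2)), rng.normal(2, 0.5, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)

w = np.zeros(2)
b = 0.0
for _ in range(20):  # passes over the data
    for xi, yi in zip(X, y):
        if yi * (w @ xi + b) <= 0:  # misclassified: nudge the boundary
            w += yi * xi
            b += yi

preds = np.sign(X @ w + b)
print((preds == y).mean())  # perfect accuracy on this separable toy set
```

The learned `(w, b)` defines the separating hyperplane `w·x + b = 0`; the multi-class case in the talk extends this idea with one boundary per class.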
Speaker: Frederic Simard
In this talk, you will learn the basics of dimensionality reduction. The first algorithm presented is principal component analysis (PCA), which is based on the variance in the data set. You will learn how to select a subset of dimensions while retaining most of the information in your data, in order, for example, to train a classifier. A quick presentation of Gaussian Process Factor Analysis follows. This algorithm extracts trajectories of the system state in a lower-dimensional space.
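A minimal sketch of the PCA step described above, computed from the eigendecomposition of the sample covariance matrix (the toy data here is illustrative):

```python
# Sketch: PCA on correlated 2-D data, keeping the direction of maximum variance.
import numpy as np

rng = np.random.default_rng(2)
# Two measured dimensions driven by one shared latent variable
latent = rng.standard_normal(200)
X = np.column_stack([latent + 0.1 * rng.standard_normal(200),
                     latent + 0.1 * rng.standard_normal(200)])

Xc = X - X.mean(axis=0)                 # center the data
cov = Xc.T @ Xc / (len(Xc) - 1)         # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
pc1 = eigvecs[:, -1]                    # first principal component

explained = eigvals[-1] / eigvals.sum() # fraction of variance captured
Z = Xc @ pc1                            # 1-D representation of the 2-D data
print(round(explained, 2))
```

Because the two dimensions are strongly correlated, nearly all of the variance lies along the first component, which is exactly why a low-dimensional subset can stand in for the full data when building a classifier.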
Speaker: Adam Schneider
Collaborator: Mohsen Jamali
Information theory, developed by Claude Shannon in 1948, provides mathematically rigorous tools to quantify the precision with which a system's output contains information about its inputs, setting physical limits on a system's capacity for information transmission. In this talk I present a brief summary of the fundamental concepts underlying information theory, in the context of its application to neuronal signal processing.
A useful, well-documented MATLAB toolbox for calculating coherence and mutual information in neural systems can be found at www.chronux.org.
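As a brief numeric illustration of the quantities from the talk (a toy example, unrelated to the Chronux toolbox), here are entropy and mutual information for a discrete stimulus/response pair:

```python
# Sketch: Shannon entropy and mutual information, in bits.
import numpy as np

def entropy(p):
    """Shannon entropy of a discrete distribution, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Joint distribution P(stimulus, response) for a noiseless binary channel
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
p_s = joint.sum(axis=1)  # marginal over stimuli
p_r = joint.sum(axis=0)  # marginal over responses

# Mutual information: I(S;R) = H(S) + H(R) - H(S,R)
mi = entropy(p_s) + entropy(p_r) - entropy(joint.ravel())
print(mi)  # 1.0 bit: the response resolves the stimulus completely
```

With any noise in the channel, `I(S;R)` drops below 1 bit, which is the sense in which mutual information bounds the system's transmission capacity.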
Speaker: Ashkan Golzar
Collaborator: Mohsen Jamali
Following the fundamental concepts of information theory introduced in the previous part, in this second part of the talk we present three methods to calculate the mutual information between a stimulus and a neural signal: the direct method, the upper-bound method, and the lower-bound method. We discuss the advantages and disadvantages of each method, as well as the assumptions inherent to each. We also show how information theory can address central questions in the field of neural coding.
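To give a flavor of the direct-method idea, here is a sketch that plugs the empirical joint histogram of (stimulus, response) samples into the definition of mutual information; the function name, the binary channel, and the 10% noise level are all illustrative assumptions, not the speakers' code:

```python
# Sketch: plug-in ("direct") estimate of I(S;R) from discrete samples, in bits.
import numpy as np

def plugin_mi(s, r):
    """Plug-in mutual information estimate for two discrete sample arrays."""
    s_vals, s_idx = np.unique(s, return_inverse=True)
    r_vals, r_idx = np.unique(r, return_inverse=True)
    joint = np.zeros((len(s_vals), len(r_vals)))
    for i, j in zip(s_idx, r_idx):
        joint[i, j] += 1
    joint /= joint.sum()                    # empirical joint distribution
    ps = joint.sum(axis=1, keepdims=True)   # marginal over s
    pr = joint.sum(axis=0, keepdims=True)   # marginal over r
    mask = joint > 0
    return np.sum(joint[mask] * np.log2(joint[mask] / (ps @ pr)[mask]))

rng = np.random.default_rng(3)
s = rng.integers(0, 2, 5000)       # binary stimulus
noise = rng.random(5000) < 0.1     # response flips 10% of the time
r = s ^ noise

print(round(plugin_mi(s, r), 2))   # near the true value for this noisy channel
```

The plug-in estimator is biased upward for small sample sizes, which is one reason the talk contrasts the direct method with upper- and lower-bound approaches.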