
9.40 Introduction to Neural Computation

9.40 Introduction to Neural Computation (Spring 2018, MIT OCW). Instructor: Prof. Michale Fee. This course introduces quantitative approaches to understanding brain and cognitive functions. Topics include the mathematical description of neurons, the response of neurons to sensory stimuli, simple neuronal networks, and statistical inference and decision making. It also covers foundational quantitative tools of data analysis in neuroscience (correlation, convolution, spectral analysis, and principal components analysis) as well as mathematical concepts including simple differential equations and linear algebra. (from ocw.mit.edu)

Lecture 14 - Rate Models and Perceptrons

This lecture introduces rate models, a mathematically tractable description of neural networks, and uses them to discuss receptive fields, vector algebra, and perceptrons.
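
For readers who want a concrete picture of the perceptron idea mentioned above, the following is a minimal sketch in Python. It is illustrative only and not taken from the course materials: the toy data, learning rate, and threshold unit are assumptions made for the example. It trains a single threshold unit with the classic perceptron learning rule on linearly separable 2-D points.

import numpy as np

def step(u):
    """Threshold activation: output 1 if the input drive exceeds 0, else 0."""
    return (u > 0).astype(float)

# Toy linearly separable data (assumed for illustration): label is 1 when x1 + x2 > 1.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 2, size=(100, 2))
y = (X.sum(axis=1) > 1).astype(float)

w = np.zeros(2)   # synaptic weights
b = 0.0           # bias (negative threshold)
eta = 0.1         # learning rate

for epoch in range(20):
    for xi, ti in zip(X, y):
        out = step(w @ xi + b)          # unit's output for this input
        w += eta * (ti - out) * xi      # perceptron learning rule
        b += eta * (ti - out)

accuracy = np.mean(step(X @ w + b) == y)
print(f"weights={w}, bias={b:.2f}, training accuracy={accuracy:.2f}")

Running the script prints the learned weights and the fraction of training points classified correctly; on this separable toy set the rule converges to perfect accuracy.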


Go to the Course Home or watch other lectures:

Lecture 01 - Overview and Ionic Currents
Lecture 02 - RC Circuit and Nernst Potential
Lecture 03 - Nernst Potential and Integrate and Fire Models
Lecture 04 - Hodgkin-Huxley Model, Part 1
Lecture 05 - Hodgkin-Huxley Model, Part 2
Lecture 06 - Dendrites
Lecture 07 - Synapses
Lecture 08 - Spike Trains
Lecture 09 - Receptive Fields
Lecture 10 - Time Series
Lecture 11 - Spectral Analysis, Part 1
Lecture 12 - Spectral Analysis, Part 2
Lecture 13 - Spectral Analysis, Part 3
Lecture 14 - Rate Models and Perceptrons
Lecture 15 - Matrix Operations
Lecture 16 - Basis Sets
Lecture 17 - Principal Components Analysis
Lecture 18 - Recurrent Networks
Lecture 19 - Neural Integrators
Lecture 20 - Hopfield Networks