
CSCI 564 (NEUR 535): Fall 2008
Brain Theory and Artificial Intelligence

Tu Th 11:00am-12:20pm

Instructor: Prof. Michael A. Arbib; HNB 03, (213) 740-9220. (Office hours: 1-2 pm Tuesdays, HNB 03.)

TA: Jinyong Lee. Office Hours: to be announced, HNB 10.

Brains have proved highly successful in integrating perception, planning, memory and action in guiding creatures that interact with a complex world. The course has two overlapping aims: “To understand the workings of our own brains” and “To explore the implications of brain function for developing exotic, highly distributed adaptive embodied computing systems.” As we move to distributed computation, sensor networks, embedded systems, and robots interacting with humans in complex ways, we will discover Brain Operating Principles (BOPs) that will not only illuminate our understanding of ourselves but will also guide us in the development of new brain-style adaptive, distributed embedded computing technologies.

The course will introduce you to the basic facts about the brain; teach you how to model the brain conceptually and how to implement those models in our Neural Simulation Language (NSL); and show you how to keep track of BOPs and brain models in our Brain Operation Database (BODB).

Course Requirements:

One “mid-term” will cover the entire contents of the lectures and required readings up to that time. The final will emphasize, but not be restricted to, material covered after the mid-term.

Each student will be required to prepare a four-part project to get an overall feel for the architecture of a largish brain model, understand how models are related to empirical data, and think through the details of at least one important subsystem. Joint work on Parts 3 and 4 of the Project is allowed but not required.

Prerequisites:

Graduate standing; ability to program in Java, C++, or MatLab, or willingness to learn one of these languages. A basic background in neuroscience will be supplied, but students with experience in this area are still invited to join the course to gain an understanding of the computational approach to the brain.

Neuroscience students less skilled in computer programming will still study MatLab and do the NSL homework, but may either (a) negotiate a project that involves analysis of a neural system without computer implementation, or (b) conduct joint work on Parts 3 and 4 of the Project, taking responsibility for literature review and system design rather than programming.

Texts:

[NSL Book]: A. Weitzenfeld, M.A. Arbib and A. Alexander, 2002, NSL Neural Simulation Language, MIT Press (A draft version is available at

[HBTNN] Selected articles from M.A. Arbib, Ed., 2003, The Handbook of Brain Theory and Neural Networks, MIT Press, Second Edition. (The Handbook is available as one of the reference works on-line at the Cognet website of The MIT Press. This can be reached from USC machines by going to

Other articles will be placed on the class Website, including extracts from

[TMB2] M.A. Arbib, 1989, The Metaphorical Brain 2: Neural Networks and Beyond, Wiley-Interscience.

Grades: Homework: 20%; Mid-term: 20%; Final: 20%; Project: 40% (5% for Part 1, 10% for Part 2, 5% for Part 3, 20% for Part 4).

Syllabus

Date / Topic / Readings
8/26 / Brain-Inspired Computer Architecture 1 / Arbib, M.A., 2003, Towards a neurally-inspired computer architecture, Natural Computing, 2:1-46.
8/28 / The Brain as a Network of Neurons / TMB2: Chapter 1; Section 2.3; Section 9.1.
HBTNN: Part I, Sections I.1 and I.2; Part III, Single Cell Models.
9/2 / The Structure of Brains / TMB2: Section 2.4.
9/4 / Early Visual Processing / TMB2: Section 3.3.
HBTNN 2e: Feature Analysis.
9/9 / Schemas & Cooperative Computation; Brief Introduction to BOPs and BODB / TMB2: Sections 2.1, 2.2, and 4.2 (on HEARSAY).
Anon Plangprasopchok, Nantana Tinroongroj, and Michael A. Arbib, User's Manual for the Brain Operation Database.
Supplementary Reading:
HBTNN: Schema Theory; Multiagent Systems.
TMB2: Section 5.1 (more on frogs) and Section 5.4 (a brief look at language).
HBTNN: Visuomotor Coordination in Frog and Toad; Hybrid Symbolic/Connectionist Systems.
9/11 / Differential Equations for Neural Networks; Arrays; Winner-Take-All / TMB2: Section 4.3, pp. 194-197 (Prey Selection, or Winner-Take-All); Section 4.4, A Mathematical Analysis of Neural Competition.
HBTNN: Part I, Sections I.1 and I.2; Part III, Single Cell Models.
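
To give a concrete feel for these dynamics, here is a minimal MatLab sketch (illustrative only; not NSL or course code, and all parameter values are assumptions) of a leaky-integrator network with self-excitation and a shared inhibitory pool, Euler-integrated in the spirit of the TMB2 treatment of neural competition:

    tau  = 10;                       % membrane time constant (ms)
    dt   = 1;                        % Euler integration step (ms)
    s    = [0.8 1.0 0.6 0.9 0.7]';   % external inputs to 5 competing units
    m    = zeros(5,1);               % membrane potentials
    wexc = 1.0;                      % self-excitation strength
    winh = 0.5;                      % strength of the global inhibitory pool
    for t = 1:500
        f  = max(m, 0);                              % rectified firing rates
        dm = (-m + s + wexc*f - winh*sum(f)) / tau;  % leaky-integrator dynamics
        m  = m + dt*dm;                              % Euler step
    end
    [fmax, winner] = max(m);         % unit 2, with the strongest input, wins

Self-excitation lets the leading unit sustain its activity while inhibition driven by the summed firing suppresses its competitors; this is the winner-take-all behavior you will implement in NSL on 9/18.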
9/16 / Introduction to the MatLab & Simulink Environment (Lee) / MatLab & Simulink documentation (to be specified).
9/18 / Practical Introduction to NSL-MatLab 1: Winner-Take-All (Lee) / NSL Book: Chapters 1 and 2. (The book uses Java and C++, but we will use the MatLab version of NSL.)
9/23 / Eye Movements / Required Reading: TMB2: Section 6.2.
HBTNN: Collicular Visuomotor Transformations for Saccades.
9/25 / The Dominey-Arbib Model; Introduction to the Project / Reprint: Dominey, P. F., and Arbib, M. A., 1992, A Cortico-Subcortical Model for Generation of Spatially Accurate Sequential Saccades, Cerebral Cortex, 2:153-175.
Supplementary Reading: NSL Book: Chapter 14, The Modular Design of the Oculomotor System in Monkeys.
[Even though it uses Java-NSL, not MatLab-NSL, this chapter will help you prepare for the practical session on 10/16 (Practical Introduction to NSL-MatLab 3: Basal Ganglia).]
9/30 / Adaptive Networks 1: Hebbian Learning, Perceptrons; Landmark Learning [includes an overview of later lectures on reinforcement learning and backpropagation] / Required Readings: TMB2: Section 3.4. HBTNN: Associative Networks; Perceptrons, Adalines, and Back-Propagation.
Supplementary Readings: HBTNN: Competitive Learning; Hebbian Synaptic Plasticity. NSL Book: Chapter 12, The Associative Search Network: Landmark Learning and Hill Climbing.
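
As a concrete preview of error-correction learning, here is a minimal perceptron-rule sketch in MatLab (a toy AND problem; illustrative only, not course code). Unlike the pure Hebb rule, which strengthens any co-active input and output, the perceptron updates its weights only when the unit's response is wrong:

    X   = [0 0; 0 1; 1 0; 1 1];      % four input patterns (one per row)
    t   = [0; 0; 0; 1];              % targets for logical AND (linearly separable)
    w   = zeros(2,1);  b = 0;        % weights and bias
    eta = 0.1;                       % learning rate
    for epoch = 1:25
        for i = 1:4
            y = double(X(i,:)*w + b > 0);       % threshold unit's response
            w = w + eta*(t(i) - y)*X(i,:)';     % error-correcting weight update
            b = b + eta*(t(i) - y);             % bias update
        end
    end
    % After training, double(X*w + b > 0) reproduces the AND targets.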
10/2 / Adaptive Networks 2: Reinforcement Learning; Conditional Motor Learning / Required Reading: HBTNN: Reinforcement Learning; Reinforcement Learning in Motor Control.
Supplementary reading (not a lecture), on Hopfield Networks, Constraint Satisfaction, and Optimization: TMB2: Section 3.2, material on stability (pp. 106-114); Section 8.2, material on associative networks and Hopfield networks (pp. 375-382). HBTNN 2e: Associative Networks; Computing with Attractors (Hertz).
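
To make the temporal-difference idea at the core of these readings concrete, here is a minimal TD(0) sketch in MatLab (a five-state chain with reward only at the goal; the task and all parameter values are illustrative assumptions):

    nS    = 5;                       % states 1..5; state 5 is the goal
    V     = zeros(nS,1);             % value estimates
    alpha = 0.1;                     % learning rate
    gamma = 0.9;                     % discount factor
    for episode = 1:500
        s = 1;
        while s < nS
            s2 = s + 1;                                    % fixed rightward policy
            r  = double(s2 == nS);                         % reward 1 at the goal
            V(s) = V(s) + alpha*(r + gamma*V(s2) - V(s));  % TD-error update
            s  = s2;
        end
    end
    % V(1:4) approaches [gamma^3 gamma^2 gamma 1]': values fall off with
    % distance from reward, the signature of the dopamine-like TD error.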
10/7 / Adaptive Networks 3: Gradient Descent and Backpropagation; Forward & Inverse Models 1 / HBTNN: I.3, Dynamics and Adaptation in Neural Networks; Perceptrons, Adalines, and Backpropagation; Sensorimotor Learning.
Supplementary Reading: HBTNN: Backpropagation.
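
Here is a minimal one-hidden-layer backpropagation sketch in MatLab (batch gradient descent on XOR; illustrative only, and training from random initial weights can occasionally stall in a poor local minimum):

    X = [0 0; 0 1; 1 0; 1 1];  t = [0; 1; 1; 0];  % XOR: not linearly separable
    W1 = 0.5*randn(2,3);  b1 = zeros(1,3);        % 2 inputs -> 3 hidden units
    W2 = 0.5*randn(3,1);  b2 = 0;                 % 3 hidden units -> 1 output
    eta = 0.5;                                    % learning rate
    sig = @(z) 1 ./ (1 + exp(-z));                % logistic squashing function
    for epoch = 1:5000
        H  = sig(X*W1 + repmat(b1,4,1));          % forward pass: hidden layer
        Y  = sig(H*W2 + b2);                      % forward pass: output
        dY = (Y - t) .* Y .* (1 - Y);             % output delta (squared error)
        dH = (dY*W2') .* H .* (1 - H);            % delta backpropagated to hidden
        W2 = W2 - eta*H'*dY;   b2 = b2 - eta*sum(dY);    % gradient steps
        W1 = W1 - eta*X'*dH;   b1 = b1 - eta*sum(dH,1);
    end
    % Y now approximates [0 1 1 0]': the hidden layer re-codes the inputs so
    % that the output unit can separate them.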
10/9 / Mid-Term / Closed book.
10/14 / Practical Introduction to NSL-MatLab 2: Dynamic Remapping (Lee) / NSL Book: Chapter 14, The Modular Design of the Oculomotor System in Monkeys.
10/16 / Practical Introduction to NSL-MatLab 3: Basal Ganglia: Learning Associations with Reinforcement Learning (Lee)
10/21 / Scene Perception & Visual Attention / TMB2: Section 5.3; Itti & Koch; Navalpakkam, Itti & Arbib; Didday & Arbib; a nod to Bayesian models.
10/23 / The FARS Model of the Control of Grasping / TMB2: Sections 5.3, 6.3. Fagg, A. H., and Arbib, M. A., 1998, Modeling Parietal-Premotor Interactions in Primate Control of Grasping, Neural Networks, 11:1277-1303.
Supplementary Reading from HBTNN: Decoding Population Codes; Grasping Movements: Visuomotor Transformations.
10/28 / The First Model of the Mirror Neuron System / Oztop, E., and Arbib, M.A., 2002, Schema Design and Implementation of the Grasp-Related Mirror Neuron System, Biological Cybernetics, 87:116-140.
10/30 / Basal Ganglia: Learning Associations and Sequences / HBTNN: Basal Ganglia; Dopamine, Roles of.
Reprint: Dominey, P.F., Arbib, M.A., and Joseph, J.-P., 1995, A Model of Corticostriatal Plasticity for Learning Associations and Sequences, J. Cog. Neurosci., 7:311-336.
11/4 / Adaptive Networks 4: Learning Sequences / Required: TMB2: Section 3.2, material on stability; Section 8.2, material on associative networks and Hopfield networks. HBTNN 2e: Recurrent Networks: Learning Algorithms (Doya); Associative Networks (Anderson); Computing with Attractors (Hertz).
Optional: HBTNN: Dynamics and Bifurcations in Neural Nets (Ermentrout).
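
Since the Hopfield readings recur here, a minimal attractor-network sketch in MatLab may help (two stored +/-1 patterns, outer-product weights, asynchronous recall from a corrupted cue; illustrative only, not course code):

    p1 = [1 -1 1 -1 1 -1]';   p2 = [1 1 1 -1 -1 -1]';  % two +/-1 patterns
    W  = (p1*p1' + p2*p2') / 6;     % Hebbian outer-product weight matrix
    W(1:7:end) = 0;                 % zero the diagonal: no self-connections
    x  = p1;  x(1) = -x(1);         % cue: p1 with its first bit flipped
    for pass = 1:10                 % asynchronous updates until stable
        for i = randperm(6)
            x(i) = sign(W(i,:)*x);  % align each unit with its local field
        end
    end
    % x is restored to p1; each accepted flip lowers the energy -x'*W*x/2,
    % so the dynamics settle into a stored attractor.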
11/6 / Practical Introduction to NSL-MatLab 4: From Backprop to Learning Sequences (BPTT) (Bonaiuto)
11/11 / Further Modeling of the Mirror Neuron System / Bonaiuto, J., Rosta, E., and Arbib, M.A., 2005, Extending the Mirror Neuron System Model, I: Audible Actions and Invisible Grasps.
11/13 / (Augmented) Competitive Queuing / Bonaiuto & Arbib: ACQ paper.
HBTNN: Competitive Queuing.
11/18 / Neural Models of Imitation / Oztop, E., Kawato, M., and Arbib, M.A., 2006, Mirror Neurons and Imitation: A Computationally Guided Review, Neural Networks.
11/20 / Prefrontal Cortex: Working Memories; Neural Mechanisms for Planning / Required: HBTNN 2e: Cortical Memory; Prefrontal Cortex in Temporal Organization of Action. Goldman-Rakic, P.S., Ó Scalaidhe, S.P., and Chafee, M.V., 2000, Domain Specificity in Cognitive Systems, in The New Cognitive Neurosciences (2nd ed.), M.S. Gazzaniga et al., Eds., Cambridge, MA: MIT Press. (You can find this book by going to Cognet.)
11/25
12/2 / From Action to Language: The Mirror System Hypothesis 1 / Required: Arbib, M.A., 2005, From Monkey-like Action Recognition to Human Language: An Evolutionary Framework for Neurolinguistics, Behavioral and Brain Sciences, 28:105-167, with supplement at
12/4 / From Action to Language: The Mirror System Hypothesis 2; Project Part 4 due / Readings from the previous lecture, continued.

Final Exam