From Noisebridge
Revision as of 01:19, 27 June 2017


Our first meetup will be July 6th at 7pm at Noisebridge Hackerspace, 2169 Mission St., San Francisco.

I'll talk about some of the key ideas at the intersection of stat mech and AI, in the context of what's known as the Boltzmann machine. A specialized version of this -- the restricted Boltzmann machine (RBM) -- is an algorithm/AI architecture that was behind the resurgence of interest in neural networks in the late aughts, prompted by Geoffrey Hinton "figuring out how to make them 10,000 times faster" via a procedure known as contrastive divergence. A significant amount of the buzz today re: DEEP LEARNING came from this development.
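To make the contrastive divergence idea concrete, here is a minimal numpy sketch of one CD-1 update for a tiny binary RBM. Everything here (the sizes, learning rate, and the `cd1_step` helper) is illustrative, not Hinton's actual implementation:

```python
# A minimal sketch of contrastive divergence (CD-1) for a tiny binary RBM.
# Sizes, names, and hyperparameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

n_visible, n_hidden = 6, 3
W = rng.normal(0, 0.1, size=(n_visible, n_hidden))  # weights
b = np.zeros(n_visible)  # visible biases
c = np.zeros(n_hidden)   # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, lr=0.1):
    """One CD-1 update from a single binary visible vector v0."""
    global W, b, c
    # Positive phase: hidden activations/samples driven by the data
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(n_hidden) < ph0).astype(float)
    # Negative phase: one step of Gibbs sampling back to visibles, then hiddens
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(n_visible) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    # CD-1 approximation to the log-likelihood gradient:
    # <v h> under the data minus <v h> after one Gibbs step
    W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
    b += lr * (v0 - v1)
    c += lr * (ph0 - ph1)

v = np.array([1, 0, 1, 1, 0, 0], dtype=float)
for _ in range(100):
    cd1_step(v)
```

The point of CD is that the negative phase truncates Gibbs sampling to a single step instead of running the chain to equilibrium, which is what makes training fast.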

Some concepts to (hopefully re-)acquaint yourselves with before the first meeting: the Ising model, partition functions, thermodynamic equilibrium, Lagrange multipliers (and the Lagrangian dual), Bayesian inference, global versus local extrema, and simulated annealing.
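Two of those concepts in a few lines: the partition function of a small 1D Ising chain, computed by brute-force enumeration and checked against the exact zero-field transfer-matrix result. Function names and defaults here are mine:

```python
# Partition function of a small periodic 1D Ising chain by brute force.
# Names and defaults are illustrative.
import itertools
import numpy as np

def ising_partition_1d(n=8, J=1.0, beta=1.0):
    """Z = sum over all 2^n spin configurations of exp(-beta * E)."""
    Z = 0.0
    for spins in itertools.product((-1, 1), repeat=n):
        # Nearest-neighbor energy with periodic boundary conditions
        E = -J * sum(spins[i] * spins[(i + 1) % n] for i in range(n))
        Z += np.exp(-beta * E)
    return Z

def ising_partition_exact(n=8, J=1.0, beta=1.0):
    # Transfer-matrix result at zero field:
    # Z = (2 cosh(beta*J))^n + (2 sinh(beta*J))^n
    return (2 * np.cosh(beta * J)) ** n + (2 * np.sinh(beta * J)) ** n
```

For n = 8 the brute-force sum (256 configurations) agrees with the closed form to machine precision; the exponential blowup of the enumeration is exactly why sampling methods like simulated annealing and Gibbs sampling matter.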

Please try to

  • be on top of the prerequisites (below)
  • watch/look at all the links (also below)
  • skim the papers below (most importantly: read the abstract, introduction, and conclusions, slowly. Look up words you don't understand, and try to connect what they seem to be talking about with what you already know. Research is ALL about throwing yourself into stuff you don't understand and learning by exposure.)

But if you can't, come anyway!


nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet weekly to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only Noisebridge DreamTeam.

We're focused on theory. Implementation is fun too, but has its own set of skills that are mostly orthogonal to what we'll be covering, so our focus on it will be light.


Our discussions are roughly at the upper-division to graduate level in machine learning and statistical mechanics. To get something out of them, you should have at least undergraduate proficiency in

  • linear algebra (at the level of D. Lay's book)
  • single and multi-variable calculus, vector calculus, Lagrange multipliers, Taylor expansions (all of Stewart's textbook)
  • statistics, including Bayesian
  • statistical mechanics (at the level of McGreevy's MIT lecture notes)
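As a small worked example of how these prerequisites connect: maximizing the Gibbs entropy subject to normalization and a fixed mean energy, via Lagrange multipliers, yields the Boltzmann distribution, with the partition function appearing as the normalizer.

```latex
\text{Maximize } S = -\sum_i p_i \ln p_i
\quad \text{subject to} \quad
\sum_i p_i = 1, \qquad \sum_i p_i E_i = \langle E \rangle.

\mathcal{L} = -\sum_i p_i \ln p_i
  - \alpha \Big( \sum_i p_i - 1 \Big)
  - \beta \Big( \sum_i p_i E_i - \langle E \rangle \Big)

\frac{\partial \mathcal{L}}{\partial p_i}
  = -\ln p_i - 1 - \alpha - \beta E_i = 0
\quad \Longrightarrow \quad
p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i}
```

The multiplier α is absorbed into Z by the normalization constraint, and β turns out to be the inverse temperature. The energy function of a Boltzmann machine plays the role of E_i here.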


Here are some cool links (you can use these to figure out what to study to get up to speed):


Good large scale overview of why the stat mech side is important

  • Advani et al. - Stat mech of complex neural systems and high dimensional data - arXiv:1301.7115v1

Less emphasis on the physics, more emphasis on the stat mech <-> statistical inference connection.

  • Mastromatteo - On the typical properties of inverse problems in stat mech - arXiv:1311.0910v1


  • A great place to find books and articles is Library Genesis. I use these links for books and articles: [1], [2].
  • Huang's text is a standard for grad level stat mech, and also a rite of passage: I hated it and thought it was complete shite when I first read it, but loved it after giving it sufficient time.