NBDSM

From Noisebridge
== Schedule ==
=== 8/10/17 - Informal Meetup ===
 
Bring papers you've been reading, material/texts you've been working through, questions to ask, and Things You Understand to teach.
 
Dress code this week is all black.
 
Social events to follow.
 
=== 7/6/17 - Talk and Discussion: Dr. Steve Young - Boltzmann Machines and Statistical Mechanics. ===


PREREADINGS:
* [http://www.inference.org.uk/itila/p0.html MacKay - Information Theory, Inference, and Learning Algorithms]: Chapter 43 on the Boltzmann machine; Chapter 42 on Hopfield networks.
* [[Media:hinton_lect11.pdf]] [[Media:hinton_lect12.pdf]] - Lecture notes from Hinton's Coursera class. Good overview of Boltzmann machines and Hopfield nets. You can sign up for the free course and watch the accompanying videos; they're also on YouTube.
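Both prereadings build on the same energy-based picture. As a concrete warm-up, here is a minimal Hopfield-net sketch (our own toy code, not taken from MacKay or Hinton): store one binary pattern with the Hebbian rule, corrupt it, and watch the update dynamics repair it.

```python
import numpy as np

# Toy Hopfield network: store +/-1 patterns with the Hebbian rule
# W = x x^T / N (zero diagonal), then iterate x <- sign(W x), which
# never increases the energy E(x) = -x^T W x / 2.

rng = np.random.default_rng(0)
N = 100                                   # number of neurons
pattern = rng.choice([-1, 1], size=N)     # one stored pattern

W = np.outer(pattern, pattern) / N        # Hebbian weights
np.fill_diagonal(W, 0)                    # no self-connections

probe = pattern.copy()
flip = rng.choice(N, size=10, replace=False)
probe[flip] *= -1                         # corrupt 10 of the 100 bits

x = probe
for _ in range(5):                        # synchronous updates to a fixed point
    x = np.sign(W @ x)

print(np.array_equal(x, pattern))         # True: corrupted bits are repaired
```

With a single stored pattern the field is just (pattern . x / N) * pattern, so one update already projects the probe back onto the memory; capacity questions only appear once many patterns compete (see the talk ideas below on this page).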
== What ==


nBDSM is the noiseBridge Deepnet and Statistical Mechanics working group. We meet periodically to learn, teach, and discuss topics at the intersection of AI/deep learning and statistical mechanics. Note that we have a non-trivial overlap with The One, The Only [https://www.noisebridge.net/wiki/DreamTeam Noisebridge DreamTeam].


We're focused on theory. Implementation is fun too, but has its own set of (mostly orthogonal) skills that we'll cover only lightly.
== Prerequisites ==

Our discussions are at upper-division to grad level in machine learning and statistical mechanics. To get something out of them, you should know:

* linear algebra (at the level of D. Lay's book)
* single- and multi-variable calculus, vector calculus, Lagrange multipliers, Taylor expansions (all of Stewart's textbook)
* basics of statistics, including Bayesian statistics
* statistical mechanics (at the level of McGreevy's MIT lecture notes)

There are plenty of other places to learn this material; e.g., you can review probability, stats, and linear algebra from chapters 2 and 3 of Goodfellow.

== Links ==

Check out these cool links:
* A [http://bactra.org/weblog/361.html blog] post about exponential families that demonstrates the sort of intuition we're trying to build.
* [[Media:maxEntChap10.pdf]] Intro to principle of Maximum Entropy
These are my ongoing personal written working notes. They are a mess, but you can at least use them to see what I'm working on.
* [https://1drv.ms/o/s!AmTN0QVCYp0Og-BvczbkIA4xN1FfKg Steve's AI Learning Notes (read only, so people don't draw butts or flowers all over everything)]
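To make the maximum-entropy material above concrete, here is a small sketch of Jaynes's "Brandeis dice" problem (our own illustration, using scipy): among all distributions on a die's faces with a given mean, the entropy maximizer is exponential-family, p_i proportional to exp(lam * i), with the Lagrange multiplier lam pinned down by the mean constraint.

```python
import numpy as np
from scipy.optimize import brentq

# Maximize H(p) = -sum p_i log p_i over p on {1,...,6} subject to
# sum p_i = 1 and sum i p_i = m. The Lagrange-multiplier solution is
# p_i = exp(lam * i) / Z(lam); we solve the mean constraint for lam.

faces = np.arange(1, 7)

def mean_at(lam):
    w = np.exp(lam * faces)
    p = w / w.sum()
    return p @ faces

m = 4.5                                          # constrained mean (> 3.5)
lam = brentq(lambda l: mean_at(l) - m, -5, 5)    # root of the constraint
p = np.exp(lam * faces)
p /= p.sum()

print(np.round(p, 4))    # monotonically tilted toward the high faces
print(p @ faces)         # ~4.5 by construction
```

This is exactly the intuition the exponential-families blog post above is after: the constraint function (here, the face value) becomes the sufficient statistic, and its multiplier becomes the natural parameter.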


== Papers ==
=== Overviews ===

Good large-scale overview of why the stat mech side is important:
* Advani et al. - Statistical mechanics of complex neural systems and high dimensional data - [https://arxiv.org/abs/1301.7115 arXiv:1301.7115]

Less emphasis on the physics, more emphasis on the stat mech <-> statistical inference connection:
* Mastromatteo - On the typical properties of inverse problems in stat mech - [https://arxiv.org/abs/1311.0910 arXiv:1311.0910]

=== Interesting papers ===


* Chen et al. - On the Equivalence of Restricted Boltzmann Machines and Tensor Network States - [https://arxiv.org/abs/1701.04831v1 arXiv:1701.04831v1]
* Mehta et al. - An exact mapping between the Variational Renormalization Group and Deep Learning - [https://arxiv.org/abs/1410.3831 arXiv:1410.3831]; recreation at [https://github.com/fineline179/MEHTA_project fineline179/MEHTA_project]
* Saxe et al. - Exact solutions to the nonlinear dynamics of learning in deep linear neural networks - [https://arxiv.org/abs/1312.6120 arXiv:1312.6120]
* Deng et al. - Quantum Entanglement in Neural Network States - [https://arxiv.org/abs/1701.04844 arXiv:1701.04844]


== Books ==


=== IMPORTANT NOTICE ON PIRACY AND INTELLECTUAL PROPERTY ===
* A great place to find books and articles is [https://www.reddit.com/r/Scholar/comments/3bs1rm/meta_the_libgenscihub_thread_howtos_updates_and/ Library Genesis]. I use these links for books and articles: [https://libgen.unblocked.srl/],  [https://libgen.unblocked.li/scimag].


=== Stat Mech ===
* [https://libgen.unblocked.srl/book/index.php?md5=836019C9EC85D594F380D1D898B59713 Huang's] text is the bronze standard for grad-level stat mech.
* [https://libgen.unblocked.srl/book/index.php?md5=05528E3012BCB03B7E0142343E58D0C6 Chandler's] text is supposedly great for stat mech, although I haven't read it.


=== Stat Mech and Stat Inference ===
* [https://libgen.unblocked.srl/book/index.php?md5=AFEE8F3BEF62DC5C7A191451259AD8EB Engel - Statistical Mechanics of Learning] I haven't looked at this yet, but it seems promising.
* [https://libgen.unblocked.pub/book/index.php?md5=4CFDD37EEDB946D6E944750F746DB72B Bishop - Pattern Recognition and Machine Learning] Great pedagogical introduction to the basics. Good treatment of exponential family.
* [http://www.inference.org.uk/itila/p0.html MacKay - Information Theory, Inference, and Learning Algorithms] Link [https://libgen.unblocked.srl/book/index.php?md5=E70CC484C7FF51073859B15779162C25 here]
* [https://libgen.unblocked.srl/book/index.php?md5=E4B2AB0EF22458F94C835D4D2397034E Goodfellow et al. - Deep Learning]
* [https://libgen.unblocked.srl/book/index.php?md5=3E76F8F5189A047550CA9020D97848E4 Mezard - Information, Physics, and Computation]
* [https://libgen.unblocked.srl/book/index.php?md5=0AF108CB2FEFA5A20F7B186BC2C88656 Jaynes - Probability Theory, The Logic of Science] An excellent introduction to Bayesian reasoning and probability theory in general, from a very pedagogical, opinionated point-of-view. Dives into the motivations behind some information theory as well.



Latest revision as of 19:55, 8 July 2018

== Ideas for future talks ==

Here are some ideas for future talks. If you want to present one of these:

A) Feel free to be as advanced as you like -- assume an audience of graduate students.

but

B) Don't feel pressured to go any faster than you want. If you think you can give a pedagogical 'for dummies' talk in the course of an hour and a half, go for it!

* Derive the capacity of a Hopfield net and understand this limitation intuitively
* Explain the similarity/relationship/identity of Bayesian inference and the maximum entropy formalism
* Deep intuitive dive on Lagrangian duals and what they really do/mean in the context of statistical inference/machine learning/stat mech
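For the first bullet, here is a rough numerical warm-up (our own toy experiment, not a derivation): Hebbian storage degrades as the load alpha = P/N grows, with the classic replica result putting the breakdown near alpha ~ 0.138. We only check one-step stability of the stored patterns at a low and a high load; the actual threshold calculation is what the talk would cover.

```python
import numpy as np

# Fraction of P random stored patterns that are fixed points of one
# synchronous update x <- sign(W x) under Hebbian weights W = X^T X / N.

rng = np.random.default_rng(1)

def stable_fraction(N, P):
    X = rng.choice([-1, 1], size=(P, N))   # P random +/-1 patterns
    W = (X.T @ X) / N                      # Hebbian weight matrix
    np.fill_diagonal(W, 0)                 # no self-connections
    ok = [np.array_equal(np.sign(W @ x), x) for x in X]
    return float(np.mean(ok))

print(stable_fraction(500, 10))    # low load: patterns are fixed points
print(stable_fraction(500, 150))   # high load: crosstalk destabilizes them
```

The intuition to develop in the talk: each stored pattern sees a signal of size 1 plus Gaussian crosstalk of variance roughly alpha from the other patterns, and stability collapses once the crosstalk overwhelms the signal.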