Machine Learning

From Noisebridge
Revision as of 17:40, 16 April 2011

Next Meeting

  • When: Wednesday, 4/20/2011 @ 7:30-9:00pm
  • Where: 2169 Mission St. (back corner classroom)
  • Topic: Random Forests, How Joint Estimation Works in the Kinect
  • Details: A detailed look at random forests, with an overview of how the Kinect uses them for joint estimation.
  • Presenter: Mike S
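The core idea behind random forests can be sketched in a few lines: train many trees, each on a bootstrap sample of the data and a random subset of the features, then combine them by majority vote. Below is a minimal illustrative sketch in Python using one-level trees (decision stumps) for brevity; all names and the toy dataset are invented for this example, and the Kinect's actual joint-estimation pipeline (depth-image features, much deeper trees) is far more elaborate.

```python
import random
from collections import Counter

def best_stump(X, y, feat_idxs):
    """Pick the (feature, threshold) split over the candidate features that
    minimizes misclassification; each side predicts its majority label."""
    best = None
    for f in feat_idxs:
        for t in set(x[f] for x in X):
            left = [yi for x, yi in zip(X, y) if x[f] <= t]
            right = [yi for x, yi in zip(X, y) if x[f] > t]
            if not left or not right:
                continue
            l_lab = Counter(left).most_common(1)[0][0]
            r_lab = Counter(right).most_common(1)[0][0]
            err = sum(yi != l_lab for yi in left) + sum(yi != r_lab for yi in right)
            if best is None or err < best[0]:
                best = (err, f, t, l_lab, r_lab)
    if best is None:  # degenerate sample: predict the majority label everywhere
        lab = Counter(y).most_common(1)[0][0]
        return (0, float("inf"), lab, lab)
    return best[1:]   # (feature, threshold, left_label, right_label)

def train_forest(X, y, n_trees=25, seed=0):
    """Bootstrap sample + random feature subset per tree (bagging + feature
    randomness are what make this a random forest rather than plain bagging)."""
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    k = max(1, int(d ** 0.5))  # sqrt(d) features per tree, the usual default
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]  # bootstrap sample
        feats = rng.sample(range(d), k)             # random feature subset
        forest.append(best_stump([X[i] for i in idx], [y[i] for i in idx], feats))
    return forest

def predict(forest, x):
    """Majority vote over all stumps."""
    votes = [(ll if x[f] <= t else rl) for f, t, ll, rl in forest]
    return Counter(votes).most_common(1)[0][0]

# Toy demo: class 1 occupies the upper-right corner of feature space.
X = [[0.1, 0.2], [0.2, 0.1], [0.3, 0.3], [0.7, 0.8], [0.8, 0.9], [0.9, 0.7]]
y = [0, 0, 0, 1, 1, 1]
forest = train_forest(X, y)
print(predict(forest, [0.8, 0.8]))
```

Each individual stump is a weak, high-variance classifier; averaging many of them trained on different resamples is what drives the variance down.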

Future Talks and Topics

  • Graphical Models, Tony
  • Boltzmann Machines (Mike S, May 2011)
  • Boosting and Bagging (Thomas, unscheduled)
  • CS229 second problem set
  • RPy?

Mailing List



Software Tools

Presentations and other Materials

Topics to Learn and Teach

NBML Course - Noisebridge Machine Learning Curriculum (work-in-progress)

CS229 - The Stanford Machine Learning Course @ Noisebridge

  • Supervised Learning
    • Linear Regression
    • Linear Discriminants
    • Neural Nets/Radial Basis Functions
    • Support Vector Machines
    • Classifier Combination [1]
    • A basic decision tree builder, recursive and using entropy metrics
  • Reinforcement Learning
    • Temporal Difference Learning
  • Math, Probability & Statistics
    • Metric spaces and what they mean
    • Fundamentals of probabilities
    • Decision Theory (Bayesian)
    • Maximum Likelihood
    • Bias/Variance Tradeoff, VC Dimension
    • Bagging, Bootstrap, Jackknife [2]
    • Information Theory: Entropy, Mutual Information, Gaussian Channels
    • Estimation of Misclassification [3]
    • No Free Lunch Theorem [4]
  • Machine Learning SDKs
    • OpenCV ML component (SVMs, trees, etc.)
    • Mahout, a machine learning package that runs on Hadoop clusters.
    • Weka, a collection of data mining tools and machine learning algorithms.
  • Applications
    • Collective Intelligence & Recommendation Engines
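The "basic decision tree builder, recursive and using entropy metrics" topic above can be made concrete: at each node, choose the feature/threshold split that maximizes information gain (the drop in label entropy), recurse on each side, and stop when a node is pure or no split is informative. A small illustrative Python sketch; the function names and toy data are invented for this example.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_split(X, y):
    """Return the (feature, threshold) pair with the highest information
    gain, or None if no split reduces entropy."""
    base, best, best_gain = entropy(y), None, 0.0
    for f in range(len(X[0])):
        # Skip the largest value: splitting there leaves the right side empty.
        for t in sorted(set(x[f] for x in X))[:-1]:
            left = [yi for x, yi in zip(X, y) if x[f] <= t]
            right = [yi for x, yi in zip(X, y) if x[f] > t]
            gain = base - (len(left) * entropy(left) +
                           len(right) * entropy(right)) / len(y)
            if gain > best_gain:
                best, best_gain = (f, t), gain
    return best

def build_tree(X, y):
    """Recursively grow a tree: leaves are bare labels, internal nodes are
    (feature, threshold, left_subtree, right_subtree) tuples."""
    if len(set(y)) == 1:
        return y[0]                        # pure node -> leaf
    split = best_split(X, y)
    if split is None:                      # no informative split left
        return Counter(y).most_common(1)[0][0]
    f, t = split
    li = [i for i in range(len(X)) if X[i][f] <= t]
    ri = [i for i in range(len(X)) if X[i][f] > t]
    return (f, t,
            build_tree([X[i] for i in li], [y[i] for i in li]),
            build_tree([X[i] for i in ri], [y[i] for i in ri]))

def classify(tree, x):
    """Walk from the root to a leaf."""
    while isinstance(tree, tuple):
        f, t, left, right = tree
        tree = left if x[f] <= t else right
    return tree

# Toy demo: the two classes are separable on either feature.
X = [[1, 5], [2, 6], [3, 1], [4, 2]]
y = ["hi", "hi", "lo", "lo"]
tree = build_tree(X, y)
print(classify(tree, [1.5, 9]))
```

Note that greedy information gain is myopic: on XOR-style data every single split has zero gain, so this builder stops at a majority-label leaf even though a two-level tree would fit the data; that limitation is part of why ensembles (bagging, random forests) of such trees are popular.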

Meeting Notes