mike at mindmech.com
Wed Feb 9 22:50:34 PST 2011
Hey, did anyone meet up tonight? I've been on
a kind of silent hiatus for the past two weeks,
just taking a break and gathering thoughts.
I was inspired by talk on the discuss mailing list
about the motorized wheelchair that has the
kinect hooked up to it. I haven't seen it but it
sounds pretty cool!
The idea it inspired was this: why not build an
interactive distributed sensory cortex for Noisebridge?
What I specifically mean is to hook up low-cost computers
with webcams and speakers at different places around the
space, and train different machine learning classifiers and
deep neural nets to recognize what's going on around them.
Deep neural nets allow people to "look inside" what the
net is "thinking". So while the distributed learner figures
things out, it can also spit out a representation (images for
webcam input, sounds for microphone input) of what it's
"thinking". I don't mean it to be neural net specific, there
are plenty of opportunities for other types of learners to
be integrated into the system.
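To make the "look inside" idea concrete, here's a minimal sketch (my own illustration, not part of the project yet): a tiny NumPy autoencoder whose decoder turns the hidden "thought" vector back into an image, which is the kind of representation each node could display. The layer sizes and synthetic training patches are placeholders; real webcam frames would replace them.

```python
import numpy as np

# Tiny one-hidden-layer autoencoder: a learner whose hidden state can be
# "decoded" back into an image, so people can see what it is "thinking".
rng = np.random.default_rng(0)

n_pixels, n_hidden = 64, 8          # e.g. 8x8 grayscale webcam patches
W_enc = rng.normal(0, 0.1, (n_pixels, n_hidden))
W_dec = rng.normal(0, 0.1, (n_hidden, n_pixels))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def encode(x):
    return sigmoid(x @ W_enc)       # hidden "thought" vector

def decode(h):
    return sigmoid(h @ W_dec)       # image of what the net is "thinking"

# Train on synthetic patches with plain gradient descent on squared error.
X = rng.random((500, n_pixels))
lr = 0.5
for _ in range(200):
    H = encode(X)
    X_hat = decode(H)
    err = X_hat - X                           # reconstruction error
    d_dec = err * X_hat * (1 - X_hat)         # backprop through the decoder
    d_enc = (d_dec @ W_dec.T) * H * (1 - H)   # ...and through the encoder
    W_dec -= lr * H.T @ d_dec / len(X)
    W_enc -= lr * X.T @ d_enc / len(X)

# After training, decoding the hidden state yields a displayable image.
thought = decode(encode(X[:1]))
print(thought.shape)  # (1, 64) -- reshape to 8x8 to show on a screen
```

The same encode/decode split works for microphone input: decode the hidden vector into a short audio buffer instead of a patch of pixels.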
The idea is in its infancy, but if people are interested, could
I present a cleaned-up version at next week's meeting? I think
it would be a fun project for us to work on and would contribute
to the space.