jareddunne at gmail.com
Wed Feb 9 23:09:18 PST 2011
This is a great idea! I'm into it. I'll be there and bring some ideas of
On Feb 9, 2011 10:50 PM, "Mike Schachter" <mike at mindmech.com> wrote:
Hey, did anyone meet up tonight? I've been on
a kind of silent hiatus for the past two weeks,
just taking a break and gathering thoughts.
I was inspired by talk on the discuss mailing list
about the motorized wheelchair that has the
Kinect hooked up to it. I haven't seen it but it
sounds pretty cool!
The idea it inspired was this: why not build an
interactive distributed sensory cortex for Noisebridge?
What I specifically mean is to hook up low-cost computers
with webcams and speakers at different places around the
space, and train different machine learning classifiers and
deep neural nets to recognize what's going on around them.
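A node in that setup could be sketched roughly like this: grab a frame, squash it down to a small feature vector, and hand it to a trained classifier. Everything below is illustrative, not a committed design; the synthetic "dark room" vs. "lit room" frames stand in for real webcam input, and the toy nearest-centroid classifier stands in for whatever learner we'd actually use.

```python
import numpy as np

rng = np.random.default_rng(0)

def features(frame):
    """Downsample a grayscale frame into a small feature vector
    by averaging over 8x8 blocks."""
    h, w = frame.shape
    crop = frame[: h - h % 8, : w - w % 8]
    small = crop.reshape(crop.shape[0] // 8, 8, crop.shape[1] // 8, 8)
    return small.mean(axis=(1, 3)).ravel()

class NearestCentroid:
    """Toy classifier: label a frame by the closest class mean."""
    def fit(self, X, y):
        self.labels = sorted(set(y))
        y = np.array(y)
        self.centroids = np.array(
            [X[y == c].mean(axis=0) for c in self.labels])
        return self
    def predict(self, x):
        d = np.linalg.norm(self.centroids - x, axis=1)
        return self.labels[int(np.argmin(d))]

# Stand-in for webcam input: "dark room" vs. "lit room" frames.
dark = rng.normal(40, 5, size=(20, 64, 64))
lit = rng.normal(200, 5, size=(20, 64, 64))
X = np.array([features(f) for f in np.concatenate([dark, lit])])
y = ["dark"] * 20 + ["lit"] * 20

clf = NearestCentroid().fit(X, y)
print(clf.predict(features(rng.normal(45, 5, size=(64, 64)))))  # prints "dark"
```

In the real thing the classifier would be trained on what actually happens around each node, but the shape of the pipeline (capture, featurize, classify) would look the same.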
Deep neural nets allow people to "look inside" what the
net is "thinking". So while the distributed learner figures
things out, it can also spit out a representation (images for
webcam input, sounds for microphone input) of what it's
"thinking". I don't mean for it to be neural-net-specific;
there are plenty of opportunities for other types of learners
to be integrated into the system.
The idea is in its infancy, but if people are interested, could
I present a cleaned-up version at next week's meeting? I think
it would be a fun project for us to work on and would contribute
to the space.
ml mailing list
ml at lists.noisebridge.net