Ultrasonic Hearing & Echo Location
Inspired by this Wired article on people learning to echolocate by clicking their palates: wouldn't it be great to be able to use echolocation yourself?
Expanding on the idea: most species that echolocate use high frequencies, because spatial resolution improves with decreasing wavelength (i.e. with increasing frequency). From my (Eric Boyd's) email to the cyborg list:
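To put rough numbers on that wavelength argument, here's a quick sketch (assuming sound travels at about 343 m/s in room-temperature air): the wavelength sets roughly the smallest feature an echo can resolve, so ultrasonic frequencies buy you millimetre-scale resolution where audible clicks only manage centimetres.

```python
# lambda = v_sound / f : shorter wavelength = finer spatial resolution.
# 343 m/s is an assumption (dry air, ~20 C).
SPEED_OF_SOUND = 343.0  # m/s

def wavelength_mm(freq_hz):
    """Wavelength in millimetres for a sound of the given frequency in air."""
    return SPEED_OF_SOUND / freq_hz * 1000.0

for f in (2_000, 40_000, 60_000):
    print(f"{f:>6} Hz -> {wavelength_mm(f):6.1f} mm")
```

So a 40 kHz ping (about 8.6 mm wavelength) resolves detail roughly twenty times finer than a 2 kHz audible click.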
Having thought about it more, I am keen on an "echo location" kit. Imagine some electronics which generates an ultrasonic noise (40kHz? 60kHz?), then receives it (via a normal microphone?), frequency-shifts the sound back into the human hearing range, and uses ear-buds to display it to you. If the frequency shift is done correctly, you could even pre-process the sound data to help "amplify" the difference that bounces off close objects create in the sound. This leaves the real data processing to the brain, of course - it's still going to be just a bunch of sounds, not a map of what's around you. But I actually think this could be way superior - who knows what kind of patterns your brain could pull out of the sound if you just wore the electronics for a week? I have no idea how complicated the frequency-shift math might be, but I think the electronics for this should be doable using Arduino-class hardware?
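The frequency-shift math needn't be too scary: bat detectors commonly do it by heterodyning, i.e. multiplying the incoming signal by a local oscillator and low-pass filtering away the sum frequency, leaving the difference frequency in the audible band. A minimal sketch of that idea (the 192 kHz sample rate, 38 kHz oscillator frequency, and crude moving-average filter are all illustrative assumptions, not a finished design):

```python
import numpy as np

FS = 192_000      # sample rate high enough to capture 40-60 kHz (assumption)
LO_FREQ = 38_000  # local oscillator: shifts a 40 kHz ping down to 2 kHz

def heterodyne(signal, fs=FS, lo=LO_FREQ):
    """Shift ultrasound into the audible band by multiplying with a
    local oscillator, then low-pass filtering away the sum frequency."""
    t = np.arange(len(signal)) / fs
    mixed = signal * np.cos(2 * np.pi * lo * t)  # yields |f - lo| and f + lo
    # crude moving-average low-pass to suppress the f + lo image (sketch only)
    kernel = np.ones(8) / 8
    return np.convolve(mixed, kernel, mode="same")

# quick check: a synthetic 40 kHz ping should come out near 2 kHz
t = np.arange(0, 0.01, 1 / FS)
ping = np.sin(2 * np.pi * 40_000 * t)
audible = heterodyne(ping)
spectrum = np.abs(np.fft.rfft(audible))
freqs = np.fft.rfftfreq(len(audible), 1 / FS)
print(f"dominant frequency: {freqs[spectrum.argmax()]:.0f} Hz")
```

A real build would want a proper low-pass filter and a fixed-point version of this loop, but the core operation really is just one multiply and one filter per sample, which is why it seems plausible on Arduino-class hardware.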
I also have some thoughts about the armature. According to Song of the Mouse, mice make many noises in the ultrasonic region. I think it's only natural that a device which allows you to hear mice should be made from Disney Mouse Ears! This also means it's located conveniently near your ears, where it must display its data anyway...
As I see it, the ultrasonic noise-generating device is actually a separate thing entirely, expanding the original "ultrasonic hearing" device into an echolocation device. So in terms of working towards a prototype, first you build the ultrasonic hearing rig, then you build the emitter and tie the two together to get the ranging information.
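Once the emitter and receiver are tied together, the ranging information falls out of simple time-of-flight arithmetic: the echo travels out and back, so distance is half the round-trip time multiplied by the speed of sound. A toy illustration (343 m/s is again an assumed room-temperature value):

```python
# d = v * t / 2 : the ping travels to the object and back.
SPEED_OF_SOUND = 343.0  # m/s in air (assumption: ~20 C)

def echo_distance_m(round_trip_s):
    """Distance to a reflector given the echo's round-trip time in seconds."""
    return SPEED_OF_SOUND * round_trip_s / 2

# an echo returning 12 ms after the click bounced off something ~2 m away
print(f"{echo_distance_m(0.012):.2f} m")
```

Conveniently, the brain wouldn't even need this formula spelled out: after the heterodyne stage, the click-to-echo delay survives as an audible gap, so range is exactly the kind of pattern a week of wearing the rig might teach you to hear directly.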