These videos demonstrate some of the ongoing human-robot interaction work at BioMimetic Systems. Our device, the Smart Neural Acoustic Processor (SNAP), is mounted on the head of each robot. In this configuration, the SNAP generates simulated neural firings from microphone inputs and analyzes those firings to cue robot behaviors.
In these videos, the SNAP, integrated with an automatic speech recognition (ASR) system, cues autonomous behaviors on the robots.
Skippy Outside (2 MB AVI) -- Note that the passing children do not influence the result of the turn. Please excuse the poor lighting.
Skippy and Natasha One (18 MB WMV) and Skippy and Natasha Two (14 MB WMV) -- Both robots run identical hardware and software; the only difference is the unique name assigned to each, which is a simple text-file change. The right side is from Natasha's viewpoint.
MIT OCW and OpenMP
The images below were generated using my C++ implementation of homework #3 (ray tracing) from course 6.837 (Computer Graphics), extended to support transparent shadows. Additionally, I used OpenMP to parallelize the ray tracing across CPU cores, which decreased run time by approximately 65%. The model geometries themselves were provided with the MIT OCW materials.
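The parallelization above works because every pixel in a ray-traced image can be shaded independently. A minimal sketch of the pattern follows; trace_ray here is a hypothetical stand-in for the real per-pixel shading routine, and the schedule choice is an assumption, not the actual assignment code.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical stand-in for the real per-pixel shading routine.
static float trace_ray(int x, int y) {
    return static_cast<float>(x * 31 + y); // placeholder "radiance"
}

// Render the image by parallelizing the outer loop over rows.
// Each pixel writes to a distinct slot, so no locking is needed.
// The pragma is simply ignored if the compiler lacks OpenMP support.
std::vector<float> render(int width, int height) {
    std::vector<float> image(static_cast<std::size_t>(width) * height);
    #pragma omp parallel for schedule(dynamic)
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
            image[static_cast<std::size_t>(y) * width + x] = trace_ray(x, y);
    return image;
}
```

Dynamic scheduling helps here because rays that hit reflective or transparent geometry cost more than rays that miss everything, so rows take uneven amounts of time.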
I made a small MATLAB script to count cells in an image, in order to speed up a manual process being done in a biomedical engineering laboratory. By leveraging my own previous image processing work and the Image Processing Toolbox, the script took little time to develop.
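The usual approach to this kind of counting is to threshold the image and count connected bright components (in MATLAB, functions like bwlabel do the labeling). The sketch below shows the same idea in self-contained C++; the threshold value, 4-connectivity, and the countCells name are all illustrative assumptions, not the actual script.

```cpp
#include <cstdint>
#include <utility>
#include <vector>

// Count connected bright regions ("cells") in a grayscale image.
// Sketch of threshold-then-label counting; the threshold and the
// choice of 4-connectivity are assumptions for illustration.
int countCells(const std::vector<std::vector<std::uint8_t>>& img,
               std::uint8_t threshold) {
    if (img.empty()) return 0;
    const int h = static_cast<int>(img.size());
    const int w = static_cast<int>(img[0].size());
    std::vector<std::vector<bool>> seen(h, std::vector<bool>(w, false));
    int count = 0;
    for (int sy = 0; sy < h; ++sy) {
        for (int sx = 0; sx < w; ++sx) {
            if (seen[sy][sx] || img[sy][sx] < threshold) continue;
            ++count; // first pixel of a new component
            // Flood-fill the component so it is not counted again.
            std::vector<std::pair<int, int>> stack{{sy, sx}};
            seen[sy][sx] = true;
            while (!stack.empty()) {
                auto [y, x] = stack.back();
                stack.pop_back();
                const int dy[] = {1, -1, 0, 0}, dx[] = {0, 0, 1, -1};
                for (int d = 0; d < 4; ++d) {
                    int ny = y + dy[d], nx = x + dx[d];
                    if (ny >= 0 && ny < h && nx >= 0 && nx < w &&
                        !seen[ny][nx] && img[ny][nx] >= threshold) {
                        seen[ny][nx] = true;
                        stack.push_back({ny, nx});
                    }
                }
            }
        }
    }
    return count;
}
```

Real cell images usually also need preprocessing (background subtraction, a size filter to reject noise specks), which is where the Image Processing Toolbox saves most of the effort.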