
More information:
http://www.sfgate.com/cgi-bin/article.cgi?file=/c/a/2011/12/27/MNHU1MDLEU.DTL

Some teams used vision systems to identify where pieces were, but none attempted to distinguish a rook from a knight. Instead, they relied on remembering where pieces were last placed in order to identify and move them. The bots quickly ran into snags: their vision systems often misread moves. One entry, from robotics company RoadNarrows, used a commercially available fixed robotic arm normally found in light industrial applications, with no vision at all. The winner was a team of researchers at the University at Albany, in New York, which fielded a mobile robot with an arm attached. Despite the many variables introduced by moving a robot around, the droid's vision system managed to keep track of the board and pieces as it moved about.
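The remember-where-pieces-were strategy can be sketched in a few lines: if the vision system only reports which squares are occupied, the software infers a move by diffing successive board states and carries piece identities forward from the last known position. This is an illustrative sketch, not any team's actual code.

```python
# Hypothetical sketch of the position-tracking approach described above:
# the vision system reports only occupied squares, and the software
# infers which piece moved by diffing successive board states.

def infer_move(before, after):
    """Given two sets of occupied squares, return (from_sq, to_sq)."""
    vacated = before - after          # squares the mover left
    entered = after - before          # squares newly occupied
    if len(vacated) == 1 and len(entered) == 1:
        return vacated.pop(), entered.pop()   # ordinary move
    if len(vacated) == 1 and not entered:
        # Capture: the mover landed on an already-occupied square.
        # Without piece recognition, game state is needed to resolve it.
        return vacated.pop(), None
    raise ValueError("ambiguous board change")

# Identities are carried forward from the last known state.
pieces = {"e2": "white pawn", "e7": "black pawn"}
frm, to = infer_move(set(pieces), {"e4", "e7"})
pieces[to] = pieces.pop(frm)   # the white pawn is now known to be on e4
```

The fragility the teams hit is visible here: one misread frame corrupts the carried-forward state, and every later identification inherits the error.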
By modifying the equipment, the researchers were able to create slow-motion movies showing what appears to be a bullet of light moving from one end of the bottle to the other. The pulses of laser light enter through the bottom and travel to the cap, generating a conical shock wave that bounces off the sides of the bottle as the bullet passes. The streak tube scans and captures light in much the same way a cathode ray tube emits and paints an image on the inside of a computer monitor. Each horizontal line is exposed for just 1.71 picoseconds (trillionths of a second), enough time for the laser beam to travel less than half a millimeter through the fluid inside the bottle. To create a movie of the event, the researchers record about 500 frames in just under a nanosecond, or a billionth of a second. Because each individual movie has a very narrow field of view, they repeat the process many times, shifting the camera's view vertically to build a complete scene that shows the beam moving from one end of the bottle, bouncing off the cap and then scattering back through the fluid. If a bullet were tracked in the same fashion moving through the same fluid, the resulting movie would last three years.
More information:
http://www.nytimes.com/2011/12/13/science/speed-of-light-lingers-in-face-of-mit-media-lab-camera.html?_r=1
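The figures in the story above check out with simple arithmetic. The sketch below assumes a water-like fluid with refractive index about 1.33 (the article does not name the liquid):

```python
# Back-of-the-envelope check of the camera figures. The refractive
# index of the fluid is an assumption (water-like, n ~= 1.33).
C = 299_792_458            # speed of light in vacuum, m/s
n = 1.33                   # assumed refractive index of the fluid
v = C / n                  # light's speed inside the bottle, m/s

exposure = 1.71e-12        # seconds per horizontal scan line
distance = v * exposure    # metres the pulse covers per exposure
print(f"{distance * 1000:.3f} mm")          # ~0.385 mm: under half a millimetre

frames = 500
print(f"{frames * exposure * 1e9:.3f} ns")  # ~0.855 ns: just under a nanosecond
```

Both numbers agree with the article: the pulse advances less than half a millimetre per exposed line, and 500 such frames span just under a nanosecond.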
Researchers predict a world where we can 'print' DNA, and even 'decode' it. A literal virus - injected into a 'host' in the guise of a vaccine, say - could be used to control behaviour. Synthetic biology will lead to new forms of bioterrorism. Bio-crime today is akin to computer crime in the early Eighties: few initially recognized the problem, but it grew exponentially.
Because it is built on open web technology, the system can run in any web browser, handle JavaScript, Java applets and Flash, and be used not only on a PC but also on mobile devices such as smartphones and tablet computers. The user interacts with the system by speaking directly to an anthropomorphic agent that employs speech recognition, speech synthesis and facial image synthesis. For example, a user can recite a telephone number, which is recorded and sent via the browser to a session manager on the server housing the MMI system. There the data is processed by the speech recognition software and passed to a scenario interpreter.
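The browser-to-server flow described above can be sketched as a small pipeline. All names and return values here are illustrative stand-ins; the MMI system's actual interfaces are not described in the article.

```python
# Illustrative sketch of the described flow: browser audio -> session
# manager -> speech recognizer -> scenario interpreter. The component
# names and behaviours are assumptions for demonstration only.

def speech_recognizer(audio):
    """Stand-in for the real recognizer: audio bytes -> text."""
    return "0312 345 678" if audio else ""

def scenario_interpreter(text):
    """Decides the next dialogue step from the recognized text."""
    return {"action": "confirm_number", "value": text}

class SessionManager:
    """Receives data from the browser and routes it through the pipeline."""
    def handle(self, audio):
        text = speech_recognizer(audio)       # speech -> text
        return scenario_interpreter(text)     # text -> next scenario step

reply = SessionManager().handle(b"...recorded phone number...")
```

Keeping recognition and scenario logic on the server is what lets any browser-equipped device act as a thin client, as the paragraph above notes.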
Over the years, the program has grown more sophisticated, with robots now able to chat on 25 topics across 2,000 available conversations. The robots can detect the 800 most common errors made by people learning English, Lee said, and know all the irregular verbs, provide different tenses, explain grammatical terms and give advice on how to learn English. Users still have to type their questions rather than speak, although he said users with speech recognition software can talk into the microphone.
The team of undergraduates at Leeds University has devised a solution that combines a computer-generated virtual simulation with a hand-held haptic feedback device. The system works by varying the feedback pressure on the user's hand as the density of the tissue being examined changes. In tests, team members simulated tumours in a human liver using a soft block of silicone embedded with ball bearings; users were able to locate these lumps through haptic feedback alone. Engineers hope this will one day allow a surgeon to feel for lumps in tissue during surgery. The project has just been declared one of four top student designs in a global competition run by US technology firm National Instruments.
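The feedback principle above can be sketched simply: resistance felt by the user scales with the local tissue density, so a stiff inclusion (a ball bearing standing in for a tumour) pushes back noticeably harder. The names, units and linear force model below are assumptions for illustration, not the Leeds team's implementation.

```python
# Minimal sketch of density-driven haptic feedback. The linear model
# and constants are assumptions; real devices use calibrated tissue
# models and device-specific force APIs.

BASELINE_DENSITY = 1.0     # arbitrary units for healthy soft tissue
GAIN = 5.0                 # extra newtons of force per unit of excess density

def feedback_force(local_density):
    """Force (N) the haptic device applies for the probed density."""
    return max(0.0, GAIN * (local_density - BASELINE_DENSITY))

soft = feedback_force(1.0)    # healthy tissue: no extra resistance
lump = feedback_force(2.4)    # dense inclusion: the user feels it push back
```

The key design point is that the mapping is continuous: the hand feels density gradients, not just a binary lump/no-lump signal, which is what lets a user sweep the probe and localize an inclusion.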