28 October 2010

BCI Eavesdrops on a Daydream

New research points to the ability to snoop on people's visual imagination, although it is still a long way from the full-fledged dream-reading technologies popularized in this summer's blockbuster movie Inception. Scientists from Germany, Israel, Korea, the United Kingdom, and the United States have performed experiments in which they were able to monitor individual neurons in a human brain associated with specific visual memories. They then taught people to will one visual memory onto a television monitor to replace another. The results suggest that scientists have found a neural mechanism equivalent to imagination and daydreaming, in which the mental creation of images overrides visual input. And if technology someday advances to enable reading the electrical activity of many thousands or millions of individual neurons (as opposed to the dozens typically available with today's hard-wired methods), scientists might begin to access snippets of real daydreams or actual dreams. The researchers inserted microwires into the brains of patients with severe epilepsy as part of a presurgery evaluation to treat their seizures.

The microwires were threaded into the medial temporal lobe (MTL), a region of the brain associated with both visual processing and visual memory. According to researchers at Caltech, a typical patient might have 64 microwires cast into his MTL, like fishing lines into the ocean. Soon after the patients' surgery, the researchers interviewed the subjects about places they'd recently visited or movies or television shows they'd recently seen. Then, on a display, they would show images of the actors or visual landmarks the subjects had described. Slides of the Eiffel Tower, for instance, or Michael Jackson, who had recently died at the time of the experiment, would appear on a screen. Any image that reliably caused voltage spikes in one or more of the microwires would become one of the subject's go-to images. An estimated 5 million neurons in our brain encode any given concept: many neurons fire all together when you think of Michael Jackson. But each neuron also codes for numerous other people, ideas, or images, which is partly how we associate one memory with another thought, place, idea, or person.
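The screening step described above can be caricatured as a simple spike-count test. The counts and thresholds below are made up for illustration; the actual study used proper statistical tests on firing rates.

```python
import numpy as np

def responsive_images(spike_counts, baseline_rate, min_ratio=2.0, min_fraction=0.8):
    """Flag images that reliably drive a neuron above its baseline firing rate.

    spike_counts maps an image name to a list of spike counts, one per
    presentation (hypothetical data). "Reliable" here means most
    presentations exceed a multiple of the baseline rate.
    """
    selected = []
    for image, counts in spike_counts.items():
        counts = np.asarray(counts, dtype=float)
        hits = counts > min_ratio * baseline_rate
        if hits.mean() >= min_fraction:
            selected.append(image)
    return selected

counts = {
    "Eiffel Tower":    [9, 11, 10, 12, 8],  # strong, consistent response
    "Michael Jackson": [2, 14, 1, 3, 2],    # occasional burst only
    "beach":           [1, 2, 1, 0, 2],     # near baseline
}
print(responsive_images(counts, baseline_rate=2.0))
```

With these numbers only the Eiffel Tower would become a "go-to" image: it is the only stimulus that beats twice the baseline rate on at least 80% of presentations.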

More information:

http://www.youtube.com/user/NatureVideoChannel?feature=mhump/a/u/0/bqkUbiUkR5k

http://spectrum.ieee.org/biomedical/bionics/braincomputer-interface-eavesdrops-on-a-daydream/?utm_source=techalert&utm_medium=email&utm_campaign=102810

26 October 2010

Learning Neural Mechanisms

Learning from competitors is a critically important form of learning for animals and humans. A new study has used brain imaging to reveal how people and animals learn from failure and success. The team from Bristol University scanned the brains of players as they battled against an artificial opponent in a computer game. In the game, each player took turns with the computer to select one of four boxes whose payouts simulated the ebb and flow of natural food sources. Players were able to learn from their own successful selections, but their competitor's successes failed completely to increase their neural activity. Instead, it was their competitor's unexpected failures that generated this additional brain activity.

Such failures generated both reward signals in the brains of the players and learning signals in regions involved in inhibiting responses. This suggests that we benefit from our competitors' failures by learning to inhibit the actions that lead to them. Surprisingly, when players were observing their competitor make selections, the players' brains were activated as if they were performing these actions themselves. Such 'mirror neuron' activity occurs when we observe the actions of other humans, but here the players knew their opponent was just a computer and no animated graphics were used. Previously, it has been suggested that the mirror neuron system supports a type of unconscious mind-reading that helps us, for example, judge others' intentions.
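The study's own computational model is not described here, but the asymmetry it reports can be sketched with a toy value-update rule, assuming (hypothetically) that a player's own rewarded choices are reinforced while a competitor's observed failures are actively inhibited:

```python
def update_values(values, actor, action, reward, alpha=0.2):
    """Toy update for the four-box game (illustrative, not the study's model).

    A player's own rewarded choice raises that box's value; a competitor's
    unrewarded choice lowers it, modelling learned inhibition of the action.
    The competitor's successes deliberately produce no update.
    """
    if actor == "self" and reward > 0:
        values[action] += alpha * (reward - values[action])
    elif actor == "competitor" and reward == 0:
        values[action] -= alpha * values[action]
    return values

boxes = {0: 0.5, 1: 0.5, 2: 0.5, 3: 0.5}
update_values(boxes, "self", 1, reward=1.0)        # own success: value rises
update_values(boxes, "competitor", 2, reward=0.0)  # their failure: value falls
update_values(boxes, "competitor", 3, reward=1.0)  # their success: no change
print(boxes)
```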

More information:


21 October 2010

Lightweight Mobile AR Navigation

A lightweight pair of augmented reality glasses that overlay the world with digital content, such as directions or a travel guide, has debuted in Japan. The headset, created by Olympus and phone-maker NTT Docomo, uses augmented reality software on an attached phone. A virtual tour of Kyoto was used as the first demonstration of the technology. While AR glasses are nothing new, these are among the first to add a miniature projecting display without causing too much encumbrance to the wearer. Researchers at the two companies said they had managed to whittle an earlier "AV Walker" prototype down from 91g to no more than 20g. The retinal display projects text and images directly into the user's peripheral vision, allowing the wearer to maintain eye contact with whatever they are observing normally.

As the glasses are attached to a smartphone with AR software, an acceleration sensor and a direction sensor, the AR Walker knows approximately what you are looking at and provides augmented information relevant to where you may be. The display can also be used to give directions with arrows, and if a person lifts their head up to the sky a weather forecast is automatically projected into their peripheral vision. Augmented reality apps for smartphones such as Layar and Wikitude are already having some success as guides to our immediate surroundings. But as this usually involves holding up the mobile and pointing its camera in the direction you are looking, AR Walker and its like have the added benefit of accessing information about your surroundings without altering your natural behaviour. According to the developers, a release date for the AR glasses has yet to be determined.
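On the software side, deciding which overlay to show largely reduces to comparing the direction sensor's heading against the bearings of nearby points of interest. A minimal sketch, with hypothetical place names and bearings:

```python
def visible_pois(heading_deg, pois, fov_deg=30.0):
    """Return points of interest within the wearer's field of view.

    heading_deg would come from the phone's direction sensor; pois maps a
    name to its bearing in degrees (hypothetical data for illustration).
    """
    half = fov_deg / 2.0
    out = []
    for name, bearing in pois.items():
        # Wrap the angular difference into [-180, 180) before comparing.
        diff = (bearing - heading_deg + 180.0) % 360.0 - 180.0
        if abs(diff) <= half:
            out.append(name)
    return out

# Looking roughly north (350 degrees): only the temple falls in view.
print(visible_pois(350.0, {"temple": 355.0, "station": 90.0, "shrine": 15.0}))
```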

More information:

http://www.bbc.co.uk/news/technology-11494729

19 October 2010

Vital Signs On Camera

You can check a person’s vital signs — pulse, respiration and blood pressure — manually or by attaching sensors to the body. But a student in the Harvard-MIT Health Sciences and Technology program is working on a system that could measure these health indicators just by putting a person in front of a low-cost camera such as a laptop computer’s built-in webcam. So far, the graduate student has demonstrated that the system can indeed extract accurate pulse measurements from ordinary low-resolution webcam imagery, and is now working on extending the system so that it can also measure respiration and blood-oxygen levels. The system measures slight variations in brightness produced by the flow of blood through blood vessels in the face. Public-domain software is used to identify the position of the face in the image, and then the digital information from this area is broken down into the separate red, green and blue portions of the video image.

In tests, the pulse data derived from this setup were compared with the pulse determined by a commercially available FDA-approved blood-volume pulse sensor. The big challenge was dealing with movements of the subject and variations in the ambient lighting. But researchers were able to adapt signal-processing techniques originally developed to extract a single voice from a roomful of conversations, a method called Independent Component Analysis, in order to extract the pulse signal from the ‘noise’ of these other variations. The system produced pulse rates that agreed to within about three beats per minute with the rates obtained from the approved monitoring device, and was able to obtain valid results even when the subject was moving a bit in front of the camera. In addition, the system was able to get accurate pulse signals from three people in the camera’s view at the same time.
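The core of the pipeline can be sketched in simplified form. The real system runs ICA across the red, green and blue traces to separate the pulse from motion and lighting noise; the sketch below skips ICA and simply takes the dominant frequency of one detrended channel within the plausible heart-rate band, on synthetic data:

```python
import numpy as np

def estimate_bpm(channel_trace, fps, lo=0.75, hi=4.0):
    """Estimate pulse rate from a face region's mean channel brightness.

    Detrends the trace, then returns the strongest frequency in the
    0.75-4 Hz band (45-240 beats per minute), converted to bpm.
    """
    x = np.asarray(channel_trace, dtype=float)
    x = x - x.mean()                          # remove the DC brightness level
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    power = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= lo) & (freqs <= hi)      # restrict to heart-rate band
    peak = freqs[band][np.argmax(power[band])]
    return 60.0 * peak                        # Hz -> beats per minute

# Synthetic 30 s trace at 30 fps: a 72 bpm pulse buried in noise.
fps, bpm = 30.0, 72.0
t = np.arange(0, 30, 1.0 / fps)
rng = np.random.default_rng(0)
trace = 0.05 * np.sin(2 * np.pi * (bpm / 60.0) * t) + 0.02 * rng.standard_normal(t.size)
print(estimate_bpm(trace, fps))  # close to 72
```

Restricting the search to the physiologically plausible band is what lets such a crude peak-picker ignore slow lighting drift and high-frequency sensor noise.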

More information:

http://web.mit.edu/newsoffice/2010/pulse-camera-1004.html

16 October 2010

Mobile Health Monitoring

Imec and Holst Centre, together with TASS software professionals, have developed a mobile heart monitoring system that allows you to view your electrocardiogram on an Android mobile phone. The innovation is a low-power interface that transmits signals from a wireless ECG (electrocardiogram, or heart monitoring) sensor system to an Android mobile phone. With this interface, imec, Holst Centre and TASS are the first to demonstrate a complete Body Area Network (BAN) connected to a mobile phone, enabling reliable long-term ambulatory monitoring of various health parameters such as cardiac performance (ECG), brain activity (EEG), muscle activity (EMG), etc. The system will be demonstrated at the Wireless Health Conference in San Diego (US, October 5-7).

The aging population, combined with the increasing need for care and the rising costs of healthcare, has become a challenge for our society. Mobile health, which integrates mobile computing technologies with healthcare delivery systems, will play a crucial role in solving this problem by delivering more comfortable, more efficient and more cost-efficient healthcare. Body Area Networks (BANs) are an essential component of mHealth. BANs are miniaturized sensor networks consisting of lightweight, ultra-low-power, wireless sensor nodes which continuously monitor physical and vital parameters. They provide long-term monitoring while maintaining user mobility and comfort. For example, patients who are no longer compelled to stay in a hospital could be monitored at home.

More information:

http://www2.imec.be/be_en/press/imec-news/wirelesshealthnecklaceinterface.html

12 October 2010

Pin-Size Tracking Device

Optical gyroscopes, also known as rotation sensors, are widely used as a navigational tool in vehicles from ships to airplanes, measuring the rotation rates of a vehicle on three axes to evaluate its exact position and orientation. Researchers at Tel Aviv University's School of Physical Engineering are now scaling down this crucial sensing technology for use in smartphones, medical equipment and more futuristic technologies. Working in collaboration with Israel's Department of Defense, researchers have developed nano-sized optical gyroscopes that can fit on the head of a pin. These gyroscopes will have the ability to pick up smaller rotation rates, delivering higher accuracy while maintaining smaller dimensions.

At the core of the new device are extremely small semiconductor lasers. As the device starts to rotate, the properties of the light produced by the lasers change, including the light's intensity and wavelength. Rotation rates can be determined by measuring these differences. These lasers are a few tens of micrometers in diameter, as compared to a conventional gyroscope, which measures about 6 to 8 inches. The device itself, when finished, will look like a small computer chip. Measuring a millimeter by a millimeter (0.04 inches by 0.04 inches), about the size of a grain of sand, the device can be built onto a larger chip that also contains other necessary electronics.
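The underlying physics is the Sagnac effect: in a rotating ring laser, counter-propagating beams acquire a beat frequency proportional to the rotation rate, delta_f = 4*A*Omega / (lambda*P) for a ring of area A and perimeter P. A back-of-the-envelope calculation with illustrative dimensions (not the Tel Aviv device's actual specifications):

```python
import math

def sagnac_beat_hz(radius_m, wavelength_m, omega_rad_s):
    """Beat frequency of a circular ring-laser gyroscope (Sagnac effect).

    delta_f = 4*A*Omega / (lambda*P); for a circle this simplifies to
    2*r*Omega / lambda, so the signal shrinks with the ring radius.
    """
    area = math.pi * radius_m ** 2
    perimeter = 2 * math.pi * radius_m
    return 4 * area * omega_rad_s / (wavelength_m * perimeter)

# A 20-micrometer-radius ring at 1.55 um wavelength, rotating at 1 rad/s:
print(sagnac_beat_hz(20e-6, 1.55e-6, 1.0))  # roughly 26 Hz
```

Because the beat frequency scales with ring radius, shrinking a gyroscope normally shrinks its signal as well, which is what makes a pin-head-sized device with higher accuracy a notable claim.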

More information:

http://www.aftau.org/site/News2?page=NewsArticle&id=13047

04 October 2010

Cars As Traffic Sensors

Data about road and traffic conditions can come from radio stations’ helicopters, the Department of Transportation’s roadside sensors, or even, these days, updates from ordinary people with cell phones. But all of these approaches have limitations: Helicopters are costly to deploy and can observe only so many roads at once, and it could take a while for the effects of congestion to spread far enough that a road sensor will detect them. MIT’s CarTel project is investigating how cars themselves could be used as ubiquitous, highly reliable mobile sensors. Members of the CarTel team recently presented a new algorithm that would optimize the dissemination of data through a network of cars with wireless connections.

Researchers at Ford are already testing the new algorithm for possible inclusion in future versions of Sync, the in-car communications and entertainment system developed by Ford and Microsoft. For the last four years, CarTel has been collecting data about the driving patterns of Boston-area taxicabs equipped with GPS receivers. On the basis of those data, the CarTel researchers have been developing algorithms for the collection and dissemination of information about the roadways. Once the algorithms have been evaluated and refined, the CarTel researchers plan to test them in an additional real-world experiment involving networked vehicles. The new algorithm is among those that the group expects to test.
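The article does not describe the algorithm itself. As a generic illustration of opportunistic inter-vehicle data exchange (not CarTel's actual method), one round of gossip among cars in radio range might look like:

```python
def gossip_round(positions, known, radius=1.0):
    """One opportunistic exchange round between nearby cars.

    Generic gossip sketch: whenever two cars are within radio range,
    each learns the other's traffic reports. positions maps a car to
    (x, y) coordinates; known maps a car to its set of reports.
    """
    cars = list(positions)
    for i, a in enumerate(cars):
        for b in cars[i + 1:]:
            (ax, ay), (bx, by) = positions[a], positions[b]
            if (ax - bx) ** 2 + (ay - by) ** 2 <= radius ** 2:
                merged = known[a] | known[b]
                known[a], known[b] = set(merged), set(merged)
    return known

# Three cars spaced 0.8 units apart; only car0 has seen the jam, yet the
# report hops car-to-car within a single round.
positions = {"car0": (0.0, 0.0), "car1": (0.8, 0.0), "car2": (1.6, 0.0)}
known = {"car0": {"jam@exit4"}, "car1": set(), "car2": set()}
gossip_round(positions, known)
print(known["car2"])
```

An optimized dissemination algorithm would additionally prioritize which reports to forward under limited bandwidth, which is the kind of problem the CarTel work addresses.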

More information:

http://web.mit.edu/newsoffice/2010/cars-sensors-0924.html

01 October 2010

Feelings by Phone

A system which enables psychologists to track people’s emotional behaviour through their mobile phones has been successfully road-tested by researchers. ‘EmotionSense’ uses speech-recognition software and phone sensors in standard smartphones to assess how people's emotions are influenced by factors such as their surroundings, the time of day, or their relationships with others. It was developed by a University of Cambridge-led team of academics, including both psychologists and computer scientists. They will report the first successful trial of the system today at the Association for Computing Machinery's conference on Ubiquitous Computing in Copenhagen. Early results suggest that the technology could provide psychologists with a much deeper insight into how our emotional peaks - such as periods of happiness, anger or stress - are related to where we are, what we are doing or who we are with. EmotionSense uses the recording devices which already exist in many mobile phones to analyse audio samples of the user speaking.

The samples are compared with an existing speech library (known as the ‘Emotional Prosody Speech and Transcripts Library’) which is widely used in emotion and speech processing research. The library consists of actors reading a series of dates and numbers in tones representing 14 different emotional categories. From here, the samples are grouped into five broader categories: "Happy" emotions (such as elation or interest); "Sadness"; "Fear"; "Anger" (which includes related emotions such as disgust); and "Neutral" emotions (such as boredom or passivity). The data can then be compared with other information which is also picked up by the phone. Built-in GPS software enables researchers to cross-refer the audio samples with the user's location, Bluetooth technology can be used to identify who they were with, and the phone also records data about who they were talking to and at what time the conversation took place. The software is also set up so that the analysis is carried out on the phone itself. This means that data does not need to be transmitted elsewhere and can easily be discarded after analysis to maintain user privacy.
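The grouping step amounts to a lookup from fine-grained prosody labels to the five broad categories. The label names below are hypothetical stand-ins, since the library's exact 14-category set is not listed here:

```python
# Hypothetical fine-grained labels mapped to the five broad groups.
BROAD = {
    "elation": "Happy", "interest": "Happy", "happiness": "Happy",
    "sadness": "Sadness", "despair": "Sadness",
    "panic": "Fear", "anxiety": "Fear",
    "hot anger": "Anger", "cold anger": "Anger", "disgust": "Anger",
    "boredom": "Neutral", "passivity": "Neutral",
}

def broad_category(fine_label):
    """Collapse a fine-grained prosody label into one of the five groups."""
    return BROAD.get(fine_label.lower(), "Neutral")

print(broad_category("elation"), broad_category("disgust"))
```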

More information:

http://www.admin.cam.ac.uk/news/dp/2010092804