26 November 2014

VR Self-Compassion

Self-compassion can be learned using avatars in immersive virtual reality, finds new research led by UCL. This innovative approach reduced self-criticism and increased self-compassion and feelings of contentment in naturally self-critical individuals. The scientists behind the MRC-funded study say it could be applied to treat a range of clinical conditions, including depression. The team of psychologists and computer scientists from UCL, the University of Barcelona and the University of Derby designed a method to improve people's compassion towards themselves by creating a unique self-to-self situation using avatars and computer gaming technology. Virtual reality has previously been used to treat psychological disorders including phobias and post-traumatic stress disorder, but this research focused on a new application: promoting emotional well-being.


In the study, 43 healthy but self-critical women experienced a life-size virtual body substituting for their own, giving a first-person view of a virtual room through the eyes of the avatar. The participants were all trained to express compassion towards a distressed virtual child while in their adult virtual body. As they talked to the crying child, it appeared to listen and respond positively to the compassion. After a few minutes, 22 of the participants were transferred into the virtual child's body, and from this perspective they saw their original virtual adult body deliver their own compassionate words and gestures to them. The remaining 21 participants observed their original virtual adult body express compassion to the child from a third-person perspective. The participants' mood, state and personality traits were assessed before and after the experiment using validated tests.

More information:

25 November 2014

Brain Reaction to VR

UCLA neurophysicists have found that space-mapping neurons in the brain react differently to virtual reality than they do to real-world environments. Their findings could be significant for people who use virtual reality for gaming, military, commercial, scientific or other purposes. The pattern of activity in a brain region involved in spatial learning is completely different in a virtual world than in the real world. Since so many people are using virtual reality, it is important to understand why the differences are so large. The scientists were studying the hippocampus, a region of the brain involved in diseases such as Alzheimer's, stroke, depression, schizophrenia, epilepsy and post-traumatic stress disorder. The hippocampus also plays an important role in forming new memories and creating mental maps of space. For example, when a person explores a room, hippocampal neurons become selectively active, providing a cognitive map of the environment. The mechanisms by which the brain makes those cognitive maps remain a mystery, but neuroscientists have surmised that the hippocampus computes distances between the subject and surrounding landmarks, such as buildings and mountains. In a real maze, however, other cues, such as smells and sounds, can also help the brain determine spaces and distances.


To test whether the hippocampus can form spatial maps using only visual landmarks, the researchers devised a non-invasive virtual reality environment and studied how hippocampal neurons in the brains of rats reacted in a virtual world where smells and sounds were unavailable as cues. They placed a small harness around each rat and put it on a treadmill surrounded by a virtual world on large video screens in an otherwise dark, quiet room. The scientists measured the rats' behavior and the activity of hundreds of neurons in their hippocampi. They also measured the rats' behavior and neural activity when they walked in a real room designed to look exactly like the virtual reality room. The scientists were surprised to find that the results from the virtual and real environments were entirely different. In the virtual world, the rats' hippocampal neurons seemed to fire completely randomly, as if the neurons had no idea where the rat was, even though the rats appeared to behave perfectly normally in both the real and virtual worlds. Mathematical analysis showed that neurons in the virtual world were calculating how far the rat had walked, regardless of where it was in the virtual space. They also found that although the rats' hippocampal neurons were highly active in the real-world environment, more than half of those neurons shut down in the virtual space.
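The published analysis is more involved, but a toy sketch can show the distinction the team is drawing: a cell tuned to position gives a sharp tuning curve when its firing is binned by location, while a cell tracking distance walked only looks sharp when firing is binned by path length. Everything below (the simulated track, the two model cells, the peak-to-mean index) is illustrative, not the study's actual method.

```python
# Illustrative only: telling a position-tuned "place cell" apart from a
# distance-tuned cell on a simulated linear track.
import numpy as np

rng = np.random.default_rng(0)

steps = 5000
speed = rng.uniform(0.05, 0.35, steps)   # variable running speed (m per step)
dist = np.cumsum(speed)                  # cumulative distance walked (m)
pos = np.abs((dist % 4.0) - 2.0)         # fold into back-and-forth laps on a 2 m track

# Model "place cell": fires near the 1.5 m mark, whichever way the rat is heading.
place_rate = np.exp(-((pos - 1.5) ** 2) / (2 * 0.15 ** 2))
# Model "distance cell": fires every pi metres of path, regardless of location
# (an irrational period keeps it decorrelated from track position).
dist_phase = dist % np.pi
dist_rate = np.exp(-((dist_phase - np.pi / 2) ** 2) / (2 * 0.15 ** 2))

def tuning_sharpness(rate, variable, bins=20):
    """Bin mean firing rate by `variable`; the peak/mean of the tuning curve
    is a crude index of how strongly the cell cares about that variable."""
    edges = np.linspace(variable.min(), variable.max(), bins + 1)
    idx = np.digitize(variable, edges[1:-1])
    curve = np.array([rate[idx == b].mean() for b in range(bins)])
    return curve.max() / curve.mean()

for name, rate in (("place cell   ", place_rate), ("distance cell", dist_rate)):
    print(name,
          "position tuning %.1f" % tuning_sharpness(rate, pos),
          "distance tuning %.1f" % tuning_sharpness(rate, dist_phase))
```

The place cell scores high on position tuning and near-flat on distance tuning; the distance cell shows the reverse, which is the signature the study reports for neurons in the virtual world.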

More information:

24 November 2014

Magic Tricks Created Using AI

Researchers working on artificial intelligence at Queen Mary University of London have taught a computer to create magic tricks. The researchers gave a computer program the outline of how a magic jigsaw puzzle and a mind-reading card trick work, as well as the results of experiments into how humans understand magic tricks, and the system created completely new variants on those tricks which can be delivered by a magician. The magic tricks created were of the type that use mathematical techniques rather than sleight of hand or other theatrics, and such tricks are a core part of many magicians' repertoires.
 

The tricks proved popular with audiences, and the magic puzzle was put on sale in a London magic shop. The card trick is available as an app called Phoney in the Google Play Store. Computer intelligence can process much larger amounts of information and run through all the possible outcomes in a way that is almost impossible for a person to do on their own. So while a member of the audience might have seen a variation on a trick before, the AI can use psychological and mathematical principles to create many different versions and keep audiences guessing.
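The article does not spell out the researchers' algorithm, but a toy example gives the flavour of how a program can exhaustively explore a trick's possibility space. The sketch below brute-forces variants of the classic 21-card trick, a self-working mathematical trick; the parameter ranges and the pile pick-up convention are my own assumptions for illustration, not Queen Mary's system.

```python
# Toy illustration: search for self-working variants of the 21-card trick.
# Deal the deck round-robin into piles, have the spectator indicate which
# pile holds their card, sandwich that pile in the middle, and repeat. A
# variant "works" if the card always ends at the same position, no matter
# where it started. (Convention assumed: each pile keeps its deal order
# when picked up, first-dealt card on top.)
from itertools import product

def final_position(start, n, piles, rounds):
    """Track one card through `rounds` deal-and-reassemble cycles."""
    pos = start
    per_pile = n // piles
    middle = piles // 2               # chosen pile goes in the middle slot
    for _ in range(rounds):
        row = pos // piles            # card's depth within its pile
        pos = middle * per_pile + row # `middle` whole piles sit above it
    return pos

def is_self_working(n, piles, rounds):
    """True if every starting position converges to one final position."""
    return len({final_position(s, n, piles, rounds) for s in range(n)}) == 1

# Run through the parameter space for workable variants.
for n, piles, rounds in product(range(9, 40), (3, 5, 7), (2, 3, 4)):
    if n % piles == 0 and is_self_working(n, piles, rounds):
        print(f"{n} cards, {piles} piles, {rounds} rounds "
              f"-> card ends at position {final_position(0, n, piles, rounds) + 1}")
```

Among other variants, the search confirms the classic result: 21 cards, 3 piles, 3 rounds always leaves the chosen card 11th from the top.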

More information:

16 November 2014

'Local' Clock in the Brain

All animals, from ants to humans, have internal 'circadian' clocks that respond to changes in light and tell the body to rest and go to sleep, or wake up and become active. A master clock found in a part of the brain called the suprachiasmatic nucleus (SCN) is thought to synchronise many 'local' clocks that regulate aspects of our metabolism, for example in the liver. But until now scientists have not had sufficient evidence to demonstrate the existence of these local clocks in the brain or how they operate. In a new study looking at mice, researchers at Imperial College London and the MRC Laboratory of Molecular Biology in Cambridge have investigated a local clock found in another part of the brain, outside the SCN, known as the tuberomammillary nucleus (TMN).
 

The TMN is made up of histaminergic neurons, which are inactive during sleep but release a compound called histamine during waking hours, promoting wakefulness. The researchers deleted a well-known 'clock' gene, Bmal1, from the histaminergic neurons and found that the mice produced higher levels of the enzyme that makes histamine and were awake for much longer periods than usual. The mice also experienced more fragmented and shallower sleep, and much slower recovery after a period of sleep deprivation. This work suggests that local body clocks play a key role in ensuring that sleeping and waking processes work properly: when a local clock was disrupted, the whole sleep-wake system malfunctioned.
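As a back-of-the-envelope caricature (my illustration, not the study's model), the finding can be sketched as a master rhythm that normally gates histamine synthesis through the local TMN clock; delete the local clock and synthesis stays high around the clock, so the simulated animal spends far more time above a wake threshold. All weights and thresholds below are arbitrary.

```python
# Toy caricature: wake-promoting histamine drive with and without a
# functioning local TMN clock.
import math, random

random.seed(1)

def histamine_level(hour, local_clock_intact):
    scn = 0.5 + 0.5 * math.sin(2 * math.pi * hour / 24)  # master 24 h rhythm
    # The local clock normally turns histamine synthesis down at "night";
    # with Bmal1 deleted it never does.
    gate = scn if local_clock_intact else 1.0
    return 0.8 * gate + 0.2 * scn + random.gauss(0, 0.1)

def fraction_awake(local_clock_intact, hours=24 * 14, wake_threshold=0.6):
    """Fraction of simulated hours the drive exceeds the wake threshold."""
    awake = sum(histamine_level(h, local_clock_intact) > wake_threshold
                for h in range(hours))
    return awake / hours

print("intact local clock:", round(fraction_awake(True), 2))   # roughly 0.45
print("Bmal1 deleted:     ", round(fraction_awake(False), 2))  # close to 1.0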

More information:

12 November 2014

VR Echolocation in Humans

A pair of researchers at Ludwig-Maximilians-Universität München in Germany has found that echolocation in humans involves more than just the ears. Their work describes how echolocation is thought to work in humans compared with other animals, and reports the results of a study conducted using volunteers and a virtual reality system. Echolocation is a means of determining the location of a nearby object by emitting sounds and then listening to the echoes that bounce back off it. Bats are perhaps most famous for their echolocation abilities, but many other animals, including humans, have some degree of the ability as well. The researchers note that several studies have recently been conducted to discover just how well humans can use sound to navigate terrain when they are unable to see. They also note, however, that none of the studies conducted to date has been able to quantify the ability, which tends to muddy the results. In their study, they sought to do just that.
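The arithmetic behind the technique is simple: the echo's delay covers the round trip to the obstacle and back, so the one-way distance is half the delay times the speed of sound. A minimal sketch (the 12 ms example is mine, not a figure from the study):

```python
# The round-trip arithmetic behind click-based echolocation.
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C

def distance_from_echo(delay_seconds):
    """One-way distance to a reflecting surface, given click-to-echo delay."""
    return SPEED_OF_SOUND * delay_seconds / 2

# An echo arriving 12 ms after the click puts the wall about 2 m away.
print(f"{distance_from_echo(0.012):.2f} m")  # -> 2.06 m
```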


To find out how good people are at echolocation and what parts of the body are involved, they enlisted the assistance of eight sighted students, each of whom was asked to wear a blindfold and to make clicking noises while making their way through a long corridor. Over several weeks, each learned to differentiate between the sounds echoed back to them, which allowed them to gauge the distance to the walls and eventually to walk easily through the corridor with no other assistance. Once they had mastered the real corridor, each volunteer was asked to sit at a VR workstation that simulated a walk through the same corridor and to use the same clicks as before. In the simulation, the researchers varied the experience and found that the volunteers lost most of their echolocation ability when they were restricted from moving: they ran into walls that were easily avoided when they were allowed to move freely. By moving echolocation into a simulated environment, the researchers believe they have finally found a way to quantify echolocation ability in humans.

More information:

09 November 2014

Direct Brain Interface Between Humans

University of Washington researchers have successfully replicated a direct brain-to-brain connection between pairs of people as part of a scientific study following the team's initial demonstration a year ago. In the newly published study, which involved six people, researchers transmitted signals from one person's brain over the Internet and used them to control the hand motions of another person within a split second of sending the signal. At the time of the first experiment in August 2013, the UW team was the first to demonstrate two human brains communicating in this way. The new study brings the brain-to-brain interfacing paradigm from an initial demonstration to something closer to a deliverable technology: the researchers have now replicated their methods and shown that they work reliably with walk-in participants.


The research team combined two kinds of non-invasive instruments and fine-tuned software to connect two human brains in real time. The process is fairly straightforward. One participant is hooked to an electroencephalography (EEG) machine that reads brain activity; a signal derived from that activity is sent via the Web to the second participant, who wears a swim cap with a transcranial magnetic stimulation (TMS) coil placed near the part of the brain that controls hand movements. Using this setup, one person can send a command to move the hand of the other simply by thinking about that hand movement. The UW study involved three pairs of participants. Each pair included a sender and a receiver with different roles and constraints. They sat in separate buildings on campus about half a mile apart and were unable to interact with each other in any way except through the link between their brains.
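In software terms the pipeline has a simple shape. The sketch below is a minimal stand-in, not the UW team's code: it assumes the sender's motor imagery has already been reduced to a scalar signal, the names fire_tms_pulse and receiver.example.org are hypothetical, and a real system would need the EEG driver, signal processing and stimulator safety interlocks this omits.

```python
# Minimal sketch of an EEG -> network -> stimulator trigger pipeline.
import socket

TRIGGER = b"\x01"

def sender(eeg_samples, host="receiver.example.org", port=9000, threshold=2.5):
    """Ship a one-byte trigger whenever the (already band-filtered) EEG
    signal crosses threshold -- a stand-in for real motor-imagery decoding."""
    with socket.create_connection((host, port)) as conn:
        for sample in eeg_samples:
            if sample > threshold:
                conn.sendall(TRIGGER)

def receiver(port=9000):
    """Wait for triggers and fire the stimulator on each arrival."""
    with socket.create_server(("", port)) as srv:
        conn, _ = srv.accept()
        with conn:
            while conn.recv(1) == TRIGGER:
                fire_tms_pulse()

def fire_tms_pulse():
    # Hypothetical stub: a real system would pulse the TMS coil positioned
    # over the hand area of the receiver's motor cortex.
    print("TMS pulse -> receiver's hand moves")
```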

More information: