27 December 2016

Playing Computer Game using Direct Brain Stimulation

University of Washington researchers have taken a first step in showing how humans can interact with virtual realities via direct brain stimulation. They described the first demonstration of humans playing a simple, two-dimensional computer game using only input from direct brain stimulation -- without relying on any usual sensory cues from sight, hearing or touch. The subjects had to navigate 21 different mazes, choosing whether to move forward or down based on whether they sensed a visual stimulation artifact called a phosphene, perceived as a blob or bar of light. To signal which direction to move, the researchers generated a phosphene through transcranial magnetic stimulation, a well-known technique that uses a magnetic coil placed near the skull to directly and noninvasively stimulate a specific area of the brain. The five test subjects made the right moves in the mazes 92 percent of the time when they received the input via direct brain stimulation, compared to 15 percent of the time when they lacked that guidance.


The simple game demonstrates one way that novel information from artificial sensors or computer-generated virtual worlds can be successfully encoded and delivered noninvasively to the human brain to perform useful tasks. It employs a technology commonly used in neuroscience to study how the brain works (transcranial magnetic stimulation) to instead convey actionable information to the brain. The test subjects also got better at the navigation task over time, suggesting that they were able to learn to better detect the artificial stimuli. The initial experiment used binary information to let the game players know whether there was an obstacle in front of them in the maze. In the real world, even that type of simple input could help blind or visually impaired individuals navigate. Theoretically, any of a variety of sensors on a person's body -- from cameras to infrared, ultrasound, or laser rangefinders -- could convey information about what is surrounding or approaching the person in the real world to a direct brain stimulator that gives that person useful input to guide their actions.
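The one-bit protocol described above can be illustrated with a toy sketch. This is not the study's software; the mapping here (phosphene means obstacle ahead, so move down; no phosphene means the path is clear, so move forward) and the function names are illustrative assumptions:

```python
# Toy sketch of the binary maze protocol: one phosphene cue per step.
def decide_move(phosphene_perceived: bool) -> str:
    """Map the binary stimulus to a maze action.
    Assumed convention: a phosphene signals an obstacle ahead (move
    down); its absence signals a clear path (move forward)."""
    return "down" if phosphene_perceived else "forward"

def navigate(obstacles: list) -> list:
    """Simulate one run through a maze, one binary cue per step."""
    return [decide_move(signal) for signal in obstacles]

print(navigate([False, True, False]))  # ['forward', 'down', 'forward']
```

The point of the sketch is how little bandwidth the channel needs: a single bit per decision is enough to guide navigation, which is why the same idea could plausibly relay obstacle warnings from body-worn sensors.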

More information:

26 December 2016

Apple VR Project

Patent filings show Apple has been exploring virtual reality and augmented reality technologies for more than 10 years, but with virtual and augmented reality exploding in popularity, Apple's dabbling may be growing more serious and could lead to an actual product or feature in the not-too-distant future. Apple is rumored to have a secret research unit comprising hundreds of employees working on AR and VR, exploring ways the emerging technologies could be used in future Apple products. In recent months, VR/AR hiring has ramped up and Apple has acquired multiple AR/VR companies.


There are dozens of possibilities for VR/AR technology in Apple products, ranging from augmented reality features within Maps, Snapchat-style camera filters, and other apps to virtual 3D interfaces for the iPhone to full-on virtual reality headsets. With products like the Oculus Rift and Microsoft HoloLens garnering significant interest, Apple has been inspired to test its own headset. Apple is said to be working on developing several prototype virtual reality headsets, but little is known about the company's work beyond the fact that prototypes exist.

More information:

22 December 2016

Lumus AR Display

Augmented reality wearable display maker Lumus has completed a $45 million Series C funding round, the company announced. Lumus makes optical engines that power augmented reality solutions. The company is working on optical technology for see-through wearable displays. Its augmented reality display technology is used by consumer electronics and smart-eyewear manufacturers to implement their augmented reality offerings.


Lumus was founded in 2000 with a mission to create optics that transform the way people interact with their reality. Earlier this year, the company announced it had raised $15 million in funding led by Shanda Group and Crystal-Optech. An additional $30 million came from Taiwanese tech companies Quanta and HTC, along with other strategic investors. The current investment aligns with HTC's natural extension into augmented reality.

More information:

18 December 2016

Your Brain Could Sync With Someone Else's

I’m sure we’ve all felt that we’ve clicked with someone or been on the same wavelength. Our everyday language is full of these kinds of expressions, but is it just a manner of speaking? Researchers have found that human brains can literally tune into each other through a process called brain coupling.


They looked at brain scans of one person telling a story and another person listening to it. Even though one person was listening and the other was speaking, the activity patterns of the two brains turned out to be remarkably similar. What’s even more amazing is that the more similar the brainwaves were, the better the understanding was between the two.

More information:

15 December 2016

VR Treatment May Improve Impaired Hands

Virtual reality training may improve motor skills in damaged limbs, new Tel Aviv University research suggests. Patients suffering from hemiparesis (weakness or paralysis of one of two paired limbs) undergo physical therapy, but this therapy is challenging, exhausting, and usually has a fairly limited effect. The results suggest that training with a healthy hand through a virtual reality intervention offers a promising way to restore mobility and motor skills in an impaired limb. According to the research statement, 53 healthy participants completed baseline tests to assess the motor skills of their hands. The participants then strapped on virtual reality headsets that showed simulated versions of their hands.


The virtual reality technology presented the participants with a 'mirror image' of their hands: when they moved their real right hand, their virtual left hand would move too. In the first experiment, participants completed a series of finger movements with their right hands while the screen showed their virtual left hands moving instead. In the next experiment, participants placed motorized gloves on their left hands, which moved their fingers to match the motions of their right hands. Again, the headsets presented the virtual left hands moving instead of their right hands. Improvements occurred when the virtual reality screen showed the left hand moving while in reality the motorized glove moved the hand.
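The 'mirror image' display reduces, at its core, to reflecting the tracked right-hand pose across the body midline so it renders as a left hand. The sketch below assumes a simple coordinate convention (midline at x = 0) and made-up point values; it is an illustration of the geometry, not the study's software:

```python
# Reflect tracked right-hand points across the x = 0 midline plane,
# so right-hand motion renders as left-hand motion in the headset.
def mirror_pose(right_hand_xyz):
    """Negate the x coordinate of each (x, y, z) tracking point."""
    return [(-x, y, z) for (x, y, z) in right_hand_xyz]

# A fingertip moving rightward (+x) appears to move leftward (-x)
# on the virtual left hand.
print(mirror_pose([(0.2, 1.1, 0.5), (0.25, 1.1, 0.5)]))
# [(-0.2, 1.1, 0.5), (-0.25, 1.1, 0.5)]
```

Only the horizontal axis flips; height and depth are unchanged, which is what makes the mirrored virtual hand move plausibly enough for the brain to accept it as its own.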

More information: