19 February 2016

Oculus Launch Avatar Lip Sync Plugin for Unity

Oculus have released a dedicated Unity plugin which aims to automatically detect speech in an audio stream and transform it into virtual character lip movements. It seems that the substantial investment in research and development at Oculus over the last couple of years is leading to some welcome, if somewhat unexpected, developments coming out of their labs.

At Unity’s 2016 Vision Summit, Oculus unveiled a lip sync plugin dedicated to producing lifelike avatar mouth animations generated by analysing an audio stream. The new plugin for the Unity engine analyses a canned or live audio stream, such as microphone-captured live voice chat, and converts it into potentially realistic lip animations for an in-world avatar.
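Oculus haven’t published the plugin’s internals, but the general idea of lip sync from audio is to map short windows of the signal to mouth-shape (viseme) weights that drive the avatar’s blend shapes. The sketch below is purely illustrative, not the Oculus algorithm: it uses two crude per-frame features, RMS energy for jaw openness and zero-crossing rate as a stand-in for spectral brightness, and all function and weight names are invented for the example.

```python
import math

def frame_features(samples, frame_size=160):
    """Split a mono signal into frames; per frame compute RMS energy
    (loudness) and zero-crossing rate (rough spectral brightness)."""
    frames = []
    for start in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[start:start + frame_size]
        rms = math.sqrt(sum(s * s for s in frame) / frame_size)
        zcr = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0) / frame_size
        frames.append((rms, zcr))
    return frames

def viseme_weights(samples, frame_size=160, silence_rms=0.01):
    """Map each audio frame to crude mouth-shape weights in [0, 1]:
    'jaw_open' tracks loudness, 'lips_wide' tracks brightness.
    Frames quieter than silence_rms keep the mouth closed."""
    weights = []
    for rms, zcr in frame_features(samples, frame_size):
        if rms < silence_rms:
            weights.append({"jaw_open": 0.0, "lips_wide": 0.0})
        else:
            weights.append({
                "jaw_open": min(1.0, rms * 4.0),
                "lips_wide": min(1.0, zcr * 2.0),
            })
    return weights

# Example: a loud 200 Hz tone followed by silence, sampled at 16 kHz.
tone = [0.5 * math.sin(2 * math.pi * 200 * t / 16000) for t in range(1600)]
silence = [0.0] * 1600
w = viseme_weights(tone + silence)
```

A real system would run phoneme or formant analysis rather than these two features, and would smooth the weights over time before applying them to the avatar each frame, but the shape of the pipeline, audio window in, viseme weights out, is the same.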

More information:

18 February 2016

Leap Motion’s Orion Improves Finger Tracking

Dubbed Orion, the third version of Leap Motion’s software stack comes with a number of major improvements, from lower latency to faster, more reliable tracking. The improvements stem from a goal set in late 2014, when Leap Motion decided to pivot their development from interaction with flat screens to interaction with VR. There is an inherent disconnect when you try to pick something up naturally and don’t feel it there, one very similar to the Uncanny Valley in robotics: you know you are getting there, but you are just far enough away that you notice. Audio helps with this, however, and Leap Motion has attempted to incorporate ‘audio haptics’, a synesthesia-like effect that comes from pairing a visceral, well-timed sound effect with the visuals on screen.

These effects helped add to the immersion, but the real star was the fact that my hands no longer went all cattywampus when I made a fast or difficult gesture. Even when Leap fails (which it still will if you try), it doesn’t fail in the same catastrophic way as it used to; they have smoothed out the edge cases. For example, overlapping your hands (something the software still isn’t able to resolve) leads to the bottom hand being hidden rather than going crazy. This actually helps increase the level of immersion, because it is far less noticeable than a hand dancing off into the corner. The system also handles other difficult situations far better, such as tracking a hand against another surface, like a pair of pants or a desk, something that the previous edition of the stack struggled with.

More information:

17 February 2016

Music in the Brain

We know we love music, and we know that love must have something to do with how our brains work, but for most of human history we haven’t had many credible explanations for what’s going on. Science has since discovered more about the relationship between music and the brain, and we’ve posted about some of those fascinating discoveries as they have come out.

A study from MIT’s McGovern Institute for Brain Research has revealed exactly which parts of our brains respond specifically to music. The researchers have explained their process, which involved putting subjects into an MRI scanner, playing them various sounds, and then studying which regions of the brain lit up in response to music rather than to other sounds.

More information:

15 February 2016

Mathematical Model Shows How Brains Make Complex Decisions

Getting to the bottom of how our brains work is a fascinating challenge for scientists, and new research promises to shed more light on the inner workings of our minds - through a complex mathematical model. Scientists in the UK say they've constructed ‘the first biologically realistic mathematical model’ that matches the way the brain makes complex decisions. Not only can this model predict behaviour, it is also capable of predicting actual neural activity. It simulates the way the human mind goes through the decision-making process, as well as the ways in which we learn from our mistakes and adapt for the future. The team's findings could eventually help us better understand a multitude of conditions, from obsessive compulsive disorder to Parkinson's disease.

The intricate mathematical algorithm written by the researchers was compared against experimental data and accurately captured behavioural choice probabilities as well as predicted choice reversal. The model shows how a network of neurons, when connected in a certain way, identifies the best decision in any given situation, as well as the future cumulative reward. Significantly, the model also demonstrates how synapses can adapt and reshape themselves depending on what has or hasn't worked in the past - the same behaviour we see in humans and animals every day. The researchers found that in goal-based decision-making, synapses connecting the neurons together 'embed' the knowledge of how situations follow on from one another, depending on the actions chosen and the immediate rewards.
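The researchers' biologically realistic model is not public code, but the ideas described above, weights that embed how situations follow from one another and what cumulative reward each action leads to, are the same ingredients as in textbook reinforcement learning. As a hedged illustration only (this is standard tabular Q-learning, not the team's model), here is a sketch in which a table of weights plays the role of the synapses:

```python
import random

def q_learning(transitions, rewards, n_states, n_actions,
               episodes=500, alpha=0.1, gamma=0.9, epsilon=0.1, seed=0):
    """Tabular Q-learning. Q[s][a] plays the role of synaptic weights that
    gradually embed which situation follows each action and how much
    cumulative future reward that action leads to."""
    rng = random.Random(seed)
    Q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s = 0  # every episode starts in state 0
        while s is not None:
            # Mostly exploit the current weights, occasionally explore.
            if rng.random() < epsilon:
                a = rng.randrange(n_actions)
            else:
                a = max(range(n_actions), key=lambda a: Q[s][a])
            s_next = transitions[s][a]          # None ends the episode
            r = rewards[s][a]
            future = 0.0 if s_next is None else max(Q[s_next])
            # Update the weight toward immediate reward + discounted future.
            Q[s][a] += alpha * (r + gamma * future - Q[s][a])
            s = s_next
    return Q

# Toy task: from state 0, action 0 leads to state 1 and then a big reward,
# while action 1 ends the episode immediately with a small reward.
transitions = [[1, None], [None, None]]
rewards = [[0.0, 0.2], [1.0, 0.0]]
Q = q_learning(transitions, rewards, n_states=2, n_actions=2)
```

After training, the weights prefer the delayed larger reward over the immediate smaller one, and swapping the reward table and continuing training would reverse that preference, which is the kind of choice reversal the article mentions.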

More information:

13 February 2016

Editing Memories

Are there any memories you'd like to permanently remove from your head? Or what if you could alter unpleasant memories so they're no longer upsetting? Or create entirely new memories of events that never occurred? It sounds like the stuff of science fiction, but according to a documentary, scientists have discovered how to do just that, and more.

For much of human history, memory has been seen as a tape recorder that faithfully registers information and replays it intact. But now, researchers are discovering that memory is far more malleable, always being written and rewritten, not just by us but by others. They are discovering the precise mechanisms that can explain and even control our memories.

More information:

09 February 2016

AR/VR - The Future of Perception

At the moment, Augmented Reality and Virtual Reality are taught primarily in association with game development. And while both help bring interactive games vividly to life, neither is strictly (or solely) a game development technology. In fact, both AR and VR have numerous applications across many fields and industries, including education and healthcare. It won’t be long before schools offer degrees in Augmented and Virtual Reality for many related and newly created careers: haptic (touch) and scent development and design, gestural linguistics, hardware and software engineering, VR/AR therapies, architecture and the arts.

It is expected that AR/VR will change the way we think and feel. Experiences will be rendered in holographic form and will interact with us in our living rooms, just as we will in theirs. Reality is defined as the state or quality of having existence or substance. Going forward, our individual and collective imagination will co-exist with us in real time. Not only will augmentation change what we see and hear; we’ll be able to augment what we touch, feel, even smell. The new definition of reality will be an integration of what we currently perceive as ‘reality’ with anything and everything we can imagine becoming real.

More information: