26 February 2017

BCIs Allow Fast and Accurate Typing For Patients

A clinical research publication led by Stanford University investigators has demonstrated that a brain-to-computer hookup can enable people with paralysis to type via direct brain control at the highest speeds and accuracy levels reported to date. The report involved three study participants with severe limb weakness - two from amyotrophic lateral sclerosis, also called Lou Gehrig’s disease, and one from a spinal cord injury. They each had one or two baby-aspirin-sized electrode arrays placed in their brains to record signals from the motor cortex, a region controlling muscle movement. These signals were transmitted to a computer via a cable and translated by algorithms into point-and-click commands guiding a cursor to characters on an onscreen keyboard.
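The pipeline described above - motor cortex firing rates in, point-and-click cursor commands out - can be pictured as a decoder running on each time-bin of neural activity. The sketch below is purely illustrative, assuming a simple linear read-out: the channel count, weights, and click threshold are invented, and the study's actual algorithms are considerably more sophisticated.

```python
import numpy as np

N_CHANNELS = 96  # a typical electrode count for one array (assumption)

rng = np.random.default_rng(seed=1)
W_vel = rng.normal(scale=0.01, size=(2, N_CHANNELS))  # invented velocity weights
w_click = rng.normal(scale=0.01, size=N_CHANNELS)     # invented click weights

def decode(firing_rates, click_threshold=0.5):
    """Translate one time-bin of per-channel firing rates into (vx, vy, click)."""
    vx, vy = W_vel @ firing_rates              # linear map to 2-D cursor velocity
    click = bool(w_click @ firing_rates > click_threshold)  # threshold read-out
    return vx, vy, click

# Example: decode one bin of simulated spiking activity.
rates = rng.poisson(lam=20, size=N_CHANNELS).astype(float)
vx, vy, click = decode(rates)
```

In a real system the weights would be fitted during a calibration session rather than drawn at random, and the decoded velocity would steer the onscreen cursor toward keyboard characters.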

Each participant, after minimal training, mastered the technique sufficiently to outperform the results of any previous test of brain-computer interfaces, or BCIs, for enhancing communication by people with similarly impaired movement. Notably, the study participants achieved these typing rates without the automatic word-completion assistance that is common in today’s electronic keyboard applications and would likely have boosted their performance. One participant was able to type 39 correct characters per minute, equivalent to about eight words per minute. This point-and-click approach could be applied to a variety of computing devices, including smartphones and tablets, without substantial modifications, the Stanford researchers said.
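The quoted rates use the standard typing-speed convention of five characters per word, which is how 39 correct characters per minute works out to roughly eight words per minute:

```python
CHARS_PER_WORD = 5  # standard typing-speed convention

def cpm_to_wpm(chars_per_minute):
    """Convert a characters-per-minute typing rate to words per minute."""
    return chars_per_minute / CHARS_PER_WORD

wpm = cpm_to_wpm(39)  # 7.8, i.e. about eight words per minute
```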

More information:

21 February 2017

Emotions Are Cognitive, Not Innate

Emotions are not innately programmed into our brains but are, in fact, cognitive states resulting from the gathering of information, researchers from New York University and the City University of New York revealed. They argue that conscious experiences, regardless of their content, arise from one system in the brain. The differences between emotional and non-emotional states lie in the kinds of inputs that are processed by a general cortical network of cognition, a network essential for conscious experiences. As a result, the brain mechanisms that give rise to conscious emotional feelings are not fundamentally different from those that give rise to perceptual conscious experiences.

While emotions, or feelings, are the most significant events in our lives, there has been relatively little integration of theories of emotion and emerging theories of consciousness in cognitive science. Existing work posits that emotions are innately programmed in the brain’s subcortical circuits. As a result, emotions are often treated as different from cognitive states of consciousness, such as those related to the perception of external stimuli. On this view, emotions are not a response to what our brain takes in from our observations but, rather, are intrinsic to our makeup. However, after taking into account existing scholarship on both cognition and emotion, the researchers conclude that emotions are “higher-order states” embedded in cortical circuits.

More information:

20 February 2017

VR Changes eCommerce

While e-commerce has revolutionised the way many goods are sold, offering customers a greater level of convenience and forcing many bricks-and-mortar stores to completely change their business model, it is still far from the perfect experience. The fact is that e-commerce hasn’t been able to completely replace the physical store because there are some things that people just won’t buy online. Who would buy a brand new car, for example, or even a house, online without ever having seen it in the real world? But thanks to advances in VR technology, this will all change - and this future is closer than you might think. Most smartphones will have VR capabilities built in within the next couple of years, maybe even sooner, and VR headsets such as Oculus are becoming more affordable all the time.

This means forward-looking e-commerce players will be able to exploit VR to create completely new experiences, changing their relationship with customers and enabling the sale of goods never thought possible through online channels. While some retailers are beginning to experiment with in-store VR systems - with only limited success - the real potential for this technology lies outside of physical stores and in our own homes. Augmented reality systems are more suited to the inside of a store, while VR is very much something people can enjoy in their own homes and other safe spaces. And while VR will offer a much more lifelike online shopping experience - a better feel for what a product looks like and its actual physical dimensions - it can add far more to e-commerce than this.

More information:

16 February 2017

How the Brain Maintains Useful Memories

Researchers from the University of Toronto, Canada, have discovered a reason why we often struggle to remember the smaller details of past experiences. They found that there are specific groups of neurons in the medial prefrontal cortex (mPFC) of a rat’s brain – the region most associated with long-term memory. These neurons develop codes to help store relevant, general information from multiple experiences while, over time, losing the more irrelevant, minor details unique to each experience. The findings provide new insight into how the brain collects and stores useful knowledge about the world that can be adapted and applied to new experiences. Memories of recent experiences are rich in incidental detail but, with time, the brain is thought to extract important information that is common across various past experiences. They predicted that groups of neurons in the mPFC build representations of this information over the period when long-term memory consolidation is known to take place, and that this information has a larger representation in the brain than the smaller details.

To test their prediction, the team studied how two different memories with overlapping associative features are coded by neuron groups in the mPFC of rat brains, and how these codes change over time. Rats were given two experiences with an interval between each: one involving a light and tone stimulus, and the other involving a physical stimulus. This gave them two memories that shared a common stimulus relationship. The scientists then tracked the neuron activity in the animals’ brains from the first day of learning to four weeks following their experiences. This experiment revealed that groups of neurons in the mPFC initially encode both the unique and shared features of the stimuli in a similar way. Further experiments also revealed that the brain can adapt the general knowledge gained from multiple experiences immediately to a new situation. In conclusion, the researchers showed that groups of neurons develop coding to store shared information from different experiences while, seemingly independently, losing selectivity for irrelevant details.
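The coding change described above can be pictured with a toy population-vector model: an early memory representation carries both the shared association and the experience-specific details, while a later representation keeps the shared component and loses the unique one. All numbers below are invented for illustration; this is not the study's analysis pipeline.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two population vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

rng = np.random.default_rng(seed=7)
shared = rng.normal(size=50)  # component coding the shared stimulus association
unique = rng.normal(size=50)  # component coding experience-specific detail

day1 = shared + unique          # early memory: both features encoded
week4 = shared + 0.1 * unique   # weeks later: selectivity for detail has faded

sim_shared = cosine(week4, shared)  # stays high
sim_unique = cosine(week4, unique)  # drops toward zero
```

Comparing the late-stage vector against each component shows it remains aligned with the shared association while losing alignment with the unique details, mirroring the consolidation effect the researchers report.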

More information:

12 February 2017

Archaeology Turns to Virtual Reality

A Melbourne-based VR company has secured nearly $1 million in seed funding from investors. Lithodomos (which means stonemason in Ancient Greek) develops VR content that re-creates ancient architecture, allowing people to see what archaeological sites once looked like.

Their first commercial project will be working with Spain's University of Cordoba and the Ministry of Economy, Industry and Competitiveness to re-create part of the Roman settlement of Mellaria in the Guadiato Valley, in the country's north.

More information: