29 December 2017

Multi-Dimensional Universe in Brain Networks

For most people, it is a stretch of the imagination to understand the world in four dimensions, but a new study has discovered structures in the brain with up to eleven dimensions - ground-breaking work that is beginning to reveal the brain's deepest architectural secrets. Using algebraic topology in a way it has never been used before in neuroscience, a team from the Blue Brain Project has uncovered a universe of multi-dimensional geometrical structures and spaces within the networks of the brain.


The research shows that these structures arise when a group of neurons forms a clique: each neuron connects to every other neuron in the group in a very specific way that generates a precise geometric object. The more neurons there are in a clique, the higher the dimension of the geometric object. Objects of five, six or more dimensions are too complex for most of us to comprehend, which is where algebraic topology comes in: a branch of mathematics that can describe systems with any number of dimensions.
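The clique-to-simplex correspondence is simple to see in code. A minimal sketch (the toy network and neuron ids below are made up for illustration, not Blue Brain data): a group of k fully connected neurons forms a (k-1)-dimensional simplex, so enumerating cliques of increasing size reveals the dimensions present in the network.

```python
from itertools import combinations

# Toy "neuron network": undirected connections between neuron ids.
# A clique of k neurons corresponds to a (k-1)-dimensional simplex.
edges = {(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3), (3, 4)}

def is_clique(nodes):
    """A clique requires every pair of neurons in the group to be connected."""
    return all((a, b) in edges or (b, a) in edges
               for a, b in combinations(nodes, 2))

def cliques_of_size(k, num_neurons=5):
    """Brute-force enumeration; fine for toy networks, not for real connectomes."""
    return [c for c in combinations(range(num_neurons), k) if is_clique(c)]

# The 4-clique (0, 1, 2, 3) forms a 3-dimensional simplex (a tetrahedron).
for k in range(2, 6):
    for c in cliques_of_size(k):
        print(f"clique {c} -> {k - 1}-dimensional simplex")
```

For real networks of thousands of neurons this brute-force search is hopeless; dedicated clique-finding algorithms (and directed variants, since synapses have direction) are what make the actual analysis feasible.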

More information:

17 December 2017

Teleoperating Robots With VR

Many manufacturing jobs require a physical presence to operate machinery. But what if such jobs could be done remotely? Researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) presented a virtual-reality (VR) system that lets you teleoperate a robot using an Oculus Rift headset. The system embeds the user in a VR control room with multiple sensor displays, making it feel like they are inside the robot's head. By using gestures, users can match their movements to the robot's to complete various tasks. The researchers even imagine that such a system could help employ increasing numbers of jobless video-gamers by gamifying manufacturing positions. The team demonstrated their VR control approach with the Baxter humanoid robot from Rethink Robotics, but said that the approach can work on other robot platforms and is also compatible with the HTC Vive headset.


The system mimics the homunculus model of mind, the idea that there's a small human inside our brains controlling our actions, viewing the images we see and understanding them for us. While it's a peculiar idea for humans, for robots it fits: inside the robot is a human in a control room, seeing through its eyes and controlling its actions. Using Oculus' controllers, users can interact with controls that appear in the virtual space to open and close the hand grippers to pick up, move, and retrieve items. A user can plan movements based on the distance between the arm's location marker and their hand while looking at the live display of the arm. To make these movements possible, the human's space is mapped into the virtual space, and the virtual space is then mapped into the robot space to provide a sense of co-location. To test the system, the team first teleoperated Baxter to do simple tasks like picking up screws or stapling wires.
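The chain of mappings described above can be sketched as two successive coordinate transforms. The frame names, scales, and offsets below are hypothetical calibration values for illustration, not CSAIL's actual pipeline:

```python
# Minimal sketch of the co-location idea: a hand position measured in the
# human's frame is mapped into the virtual control room, and from there
# into the robot's workspace, so user and robot appear to share one space.

def transform(point, scale, offset):
    """Apply a uniform scale followed by a per-axis translation."""
    return tuple(scale * p + o for p, o in zip(point, offset))

# Assumed calibration (made up): human metres -> virtual units -> robot metres.
HUMAN_TO_VIRTUAL = dict(scale=1.0, offset=(0.0, 0.0, 1.2))   # seat the user in the control room
VIRTUAL_TO_ROBOT = dict(scale=0.8, offset=(0.3, 0.0, -0.2))  # fit the arm's reach

def human_to_robot(hand_pos):
    virtual = transform(hand_pos, **HUMAN_TO_VIRTUAL)
    return transform(virtual, **VIRTUAL_TO_ROBOT)

print(human_to_robot((0.5, 0.1, 0.9)))  # robot-space target for the gripper
```

A real system would use full rigid-body transforms (rotation as well as scale and translation), but the composition of frame mappings works the same way.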

More information:

16 December 2017

Brain Cells That Respond To Sound

Some expectant parents play classical music for their unborn babies, hoping to boost their children’s cognitive capacity. While some research supports a link between prenatal sound exposure and improved brain function, scientists had not identified any structures responsible for this link in the developing brain. A new study by University of Maryland School of Medicine (UMSOM) scientists, along with colleagues from the University of Maryland College Park, is the first to identify a mechanism that could explain an early link between sound input and cognitive function, often called the 'Mozart effect'. Working with an animal model, the researchers found that a type of cell in the brain’s primary processing area during early development, long thought to have no role in transmitting sensory information, may conduct such signals after all. Working with young ferrets, researchers observed sound-induced nerve impulses in subplate neurons, which help guide the formation of neural circuits much as scaffolding helps a construction crew erect a new building. This is the first time such impulses have been seen in these neurons. During development, subplate neurons are among the first neurons to form in the cerebral cortex - the outer part of the mammalian brain that controls perception, memory and, in humans, higher functions such as language and abstract reasoning.


The role of subplate neurons is thought to be temporary. Once the brain’s permanent neural circuits form, most subplate neurons disappear. Researchers assumed that subplate neurons had no role in transmitting sensory information, given their transient nature. Scientists had thought that mammalian brains transmit their first sensory signals in response to sound after the thalamus, a large relay center, fully connects to the cerebral cortex. Studies from some mammals demonstrate that the connection of the thalamus and the cortex also coincides with the opening of the ear canals, which allows sounds to activate the inner ear. This timing provided support for the traditional model of when sound processing begins in the brain. However, researchers had struggled to reconcile this conventional model with observations of sound-induced brain activity much earlier in the developmental process. Until they directly measured the response of subplate neurons to sound, the phenomenon had largely been overlooked. By identifying a source of early sensory nerve signals, the current study could lead to new ways to diagnose autism and other cognitive deficits that emerge early in development. Early diagnosis is an important first step toward early intervention and treatment. The next step is to begin studying in more detail how subplate neurons affect brain development.

More information:

14 December 2017

A Worm Brain in a Lego Robot Body

The brain is really little more than a collection of electrical signals. If we can learn to catalogue those then, in theory, you could upload someone's mind into a computer, allowing them to live forever as a digital form of consciousness. But it's not just science fiction. Sure, scientists aren't anywhere near achieving such a feat with humans, but there are few better examples than the time an international team of researchers managed to do just that with the roundworm Caenorhabditis elegans. C. elegans is a little nematode that has been extensively studied by scientists - we know all its genes and its nervous system has been analysed many times.


In 2014, a collective called the OpenWorm project mapped all the connections between the worm's 302 neurons and managed to simulate them in software. The ultimate goal of the project was to completely replicate C. elegans as a virtual organism. But they managed to simulate its brain, and then they uploaded that into a simple Lego robot. The Lego robot stands in for the worm's limited set of body parts: a sonar sensor that acts as a nose, and motors that replace the worm's motor neurons on each side of its body. Amazingly, without any instructions being programmed into the robot, the virtual C. elegans brain controlled and moved it.
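The sensor-to-motor idea can be sketched in a few lines. This is a drastically simplified illustration, assuming a made-up three-neuron 'connectome' with invented names and weights, not the real 302-neuron C. elegans wiring map the OpenWorm project simulated:

```python
# Toy sketch: a sonar reading excites a "sensory" neuron, activity spreads
# over weighted connections, and the resulting activity of left/right
# "motor" neurons drives the robot's wheel motors. All values are invented.

CONNECTOME = {                      # presynaptic -> {postsynaptic: weight}
    "nose_sensor": {"inter_a": 1.0, "inter_b": 0.5},
    "inter_a":     {"motor_left": 0.8, "motor_right": -0.4},
    "inter_b":     {"motor_left": -0.3, "motor_right": 0.9},
}

def step(activity):
    """One synchronous update: each neuron sums its weighted inputs."""
    nxt = {}
    for pre, targets in CONNECTOME.items():
        for post, w in targets.items():
            nxt[post] = nxt.get(post, 0.0) + w * activity.get(pre, 0.0)
    return nxt

def motor_command(sonar_distance_cm):
    """Closer obstacles excite the nose sensor more; propagate two steps."""
    activity = {"nose_sensor": max(0.0, 1.0 - sonar_distance_cm / 50.0)}
    for _ in range(2):
        activity = {**activity, **step(activity)}
    return activity.get("motor_left", 0.0), activity.get("motor_right", 0.0)

left, right = motor_command(sonar_distance_cm=10.0)
print(f"motor speeds: left={left:.2f} right={right:.2f}")
```

The striking part of the real experiment is that nothing like a steering rule was ever written down: behaviour such as stopping and turning at obstacles emerged purely from the mapped connection weights.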

More information:

11 December 2017

Starbucks AR Experience

Starbucks has ordered up a venti cup of AR to make the visit more interactive. The company's new Reserve Roastery in Shanghai will be the first Starbucks location to employ AR powered by Alibaba Group's scene recognition platform to integrate the on-site and online customer experiences. Shoppers can point the cameras of their mobile devices at points of interest and use the Roastery's web app or Alibaba's Taobao app to access AR content about Starbucks and coffee.

 

The AR platform also provides a digital menu that displays details of the coffee bars, brewing techniques, and more. Access to the experience is more convenient for users of the Taobao app. Using its location-tracking feature, the app notices when users enter the location, and then serves up site content. Moreover, Starbucks has added an element of gamification to the experience, challenging customers to collect digital badges and earn a custom Roastery filter.

More information:

10 December 2017

AR Twitter

News travels fast on Twitter, making it one of the most powerful social media channels for disseminating or collecting information. Now, you can immerse yourself in the data firehose of Twitter in augmented reality (AR). The Twitter client gives users the ability to perform a few basic functions in AR, including zooming into a virtual wall of tweets by moving their iPhone (or iPad) closer to it, and tapping on tweets to extract them from the wall and view replies.


Users can also compose tweets, retweet and like tweets, reply to tweets, view mentions, and even perform searches. However, the app is missing a few features, such as the ability to follow links to other sites, one of the key features on Twitter. Also, while pictures are visible, I wasn't able to view videos or gifs. This is a great peek at the future of immersive social media. But, as it stands today, it's more of a gimmick than a utility.

More information: