22 April 2017

Understanding Dreaming

Scientists have unpicked the regions of the brain involved in dreaming, in a study with significant implications for our understanding of the purpose of dreams and of consciousness itself. What’s more, changes in brain activity have been found to offer clues as to what the dream is about. Dreaming had long been thought to occur largely during rapid eye-movement (REM) sleep, a period of slumber involving fast brain activity similar to that of waking. But dreams have also been reported during non-REM sleep, leaving scientists scratching their heads as to the true hallmark of dreaming.


It had seemed a mystery that dreaming can be both present and absent in these two different sleep stages. However, now it seems the puzzle has been solved. In addition, the team found that dreaming about faces was linked to increased high-frequency activity in the region of the brain involved in face recognition, with dreams involving spatial perception, movement and thinking similarly linked to the regions of the brain that handle such tasks when awake. Experts have hailed the significance of the research, saying it could help to solve the conundrum of what dreams are for, and even the nature of human consciousness.

More information:

21 April 2017

The Dark Secret at the Heart of AI

Last year, a strange self-driving car was released onto the quiet roads of Monmouth County, New Jersey. The experimental vehicle, developed by researchers at the chip maker Nvidia, didn’t look different from other autonomous cars, but it was unlike anything demonstrated by Google, Tesla, or General Motors, and it showed the rising power of artificial intelligence. The car didn’t follow a single instruction provided by an engineer or programmer. Instead, it relied entirely on an algorithm that had taught itself to drive by watching a human do it. Getting a car to drive this way was an impressive feat. But it’s also a bit unsettling, since it isn’t completely clear how the car makes its decisions. Information from the vehicle’s sensors goes straight into a huge network of artificial neurons that process the data and then deliver the commands required to operate the steering wheel, the brakes, and other systems. The result seems to match the responses you would expect from a human driver.


The system is so complicated that even the engineers who designed it may struggle to isolate the reason for any single action. And you can’t ask it: there is no obvious way to design such a system so that it could always explain why it did what it did. The mysterious mind of this vehicle points to a looming issue with artificial intelligence. The car’s underlying AI technology, known as deep learning, has proved very powerful at solving problems in recent years, and it has been widely deployed for tasks like image captioning, voice recognition, and language translation. There is now hope that the same techniques will be able to diagnose deadly diseases, make million-dollar trading decisions, and do countless other things to transform whole industries. But this won’t happen—or shouldn’t happen—unless we find ways of making techniques like deep learning more understandable to their creators and accountable to their users. Otherwise it will be hard to predict when failures might occur—and it’s inevitable they will. That’s one reason Nvidia’s car is still experimental.
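The opacity described above can be made concrete with a toy model. The sketch below is illustrative only, not Nvidia's actual system (which uses deep convolutional networks trained on camera footage); the network shape, names and random weights here are invented. It shows the basic pipeline the article describes: sensor data flows through layers of artificial neurons and comes out as a steering command, with no individual weight that "explains" the decision.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_net(n_in=64, n_hidden=16):
    """Random weights; in a real system these are learned from human driving data."""
    return {
        "W1": rng.normal(0, 0.1, (n_hidden, n_in)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0, 0.1, (1, n_hidden)),
        "b2": np.zeros(1),
    }

def steering_command(net, sensor_frame):
    """Map a flattened sensor frame directly to a normalized steering angle."""
    h = np.tanh(net["W1"] @ sensor_frame + net["b1"])       # hidden activations
    return float(np.tanh(net["W2"] @ h + net["b2"])[0])     # value in (-1, 1)

net = init_net()
frame = rng.normal(size=64)            # stand-in for one flattened camera frame
angle = steering_command(net, frame)   # the "decision": a single number
```

Even in this tiny example, the output is a joint product of over a thousand weights; scaled up to millions of parameters, as in real deep-learning systems, tracing any single decision back to a human-readable reason becomes the hard problem the article describes.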

More information:

20 April 2017

Facebook BCIs

Today Facebook revealed it has a team of 60 engineers working on building a brain-computer interface that will let you type with just your mind, without invasive implants. The team plans to use optical imaging to scan your brain a hundred times per second, detect the words you are silently speaking in your head, and translate them into text. The goal is to eventually allow people to type at 100 words per minute, five times faster than typing on a phone.
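The "five times faster" figure implies an assumed phone-typing baseline, which a quick back-of-the-envelope check recovers:

```python
# Sanity check of the claim above: a 100 wpm target that is "5X faster
# than typing on a phone" implies a phone baseline of 100 / 5 words per minute.

target_wpm = 100
speedup = 5
implied_phone_wpm = target_wpm / speedup

print(implied_phone_wpm)  # 20.0
```

A baseline of about 20 words per minute is the implicit assumption behind the comparison.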


Eventually, brain-computer interfaces could let people control augmented reality and virtual reality experiences with their minds instead of a screen or controller. The team only began working on the brain-typing project six months ago, but it is now collaborating with UC San Francisco, UC Berkeley, Johns Hopkins Medicine, Johns Hopkins University’s Applied Physics Laboratory and Washington University School of Medicine in St. Louis.

More information:

19 April 2017

Xiaomi's VR Headset

Xiaomi announced its new VR headset, the 'Mi VR Play 2', which costs CNY 99 ($14 USD). It is a successor to the original Mi VR Play headset released in 2016. The Mi VR Play 2 is an entry-level headset that attaches to a smartphone to deliver VR apps, games, movies, and TV shows.


The new headset comes with an improved, cloth-like material that sits comfortably on the face. Xiaomi’s headset will become available for purchase in China starting April 19. There is no announcement yet on whether it will be available in the US or Europe.

More information:

10 April 2017

Squid and Octopus Can Edit Their Brain

Octopuses and squid have confirmed their reputation as Earth-bound aliens with the discovery that they can edit their own genetic instructions. Unlike other animals, cephalopods – the family that includes octopuses, squid and cuttlefish – do not obey the commands of their DNA to the letter. Instead, they sometimes interfere with the code as it is being carried by a molecular messenger. This has the effect of diversifying the proteins their cells can produce, leading to some interesting variations. The system may have produced a special kind of evolution based on RNA editing rather than DNA mutations and could be responsible for the complex behaviour and high intelligence seen in cephalopods, some scientists believe. RNA, a close cousin of DNA, is used to transfer software-like instructions from the genes to protein-making machinery in cells.


Scientists discovered that more than 60 per cent of RNA transcripts in the squid brain are re-coded by editing. In other animals, ranging from fruit flies to humans, such re-coding events occur only a fraction of 1 per cent of the time. Similarly high levels of RNA editing were identified in three other smart cephalopod species, two octopuses and one cuttlefish. The mechanics of cephalopod RNA editing are still being investigated. When do the animals turn it on, and under what environmental influences? The trigger could be something as simple as temperature changes or as complicated as experience, a form of memory. Octopuses and other cephalopods have a number of characteristics that have caused experts to compare them with aliens, including instantaneous colour-changing camouflage, blue blood, and an ability to see polarised light.
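A concrete, hypothetical example helps show what a single RNA edit can do. Cephalopod re-coding is of the A-to-I type: an adenosine (A) in the RNA transcript is chemically converted to inosine, which the protein-making machinery reads as guanosine (G). One edited letter can therefore change which amino acid a codon encodes (the codon assignments below are from the standard genetic code; the edit itself is an invented illustration, not data from the study):

```python
# Hypothetical illustration of A-to-I RNA editing (inosine is read as G).
# CODON_TABLE entries follow the standard genetic code.

CODON_TABLE = {"AUA": "Ile", "GUA": "Val", "AAA": "Lys", "GAA": "Glu"}

def edit_a_to_i(codon, position):
    """Replace the adenosine at `position` with G, as the ribosome reads inosine."""
    assert codon[position] == "A", "A-to-I editing only acts on adenosine"
    return codon[:position] + "G" + codon[position + 1:]

original = "AUA"                   # encodes isoleucine
edited = edit_a_to_i(original, 0)  # "GUA", which encodes valine

print(CODON_TABLE[original], "->", CODON_TABLE[edited])  # Ile -> Val
```

Applied across more than 60 per cent of a brain's transcripts, edits like this let cephalopods produce protein variants their DNA alone does not specify.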

More information: