14 February 2018

BCIs Identify Songs People Listen To

It's not quite mind reading, but it's close: scientists have been able to identify songs people are listening to just by using fMRI scans of their brains, which measure blood flow and brain activity. The research promises to help us understand both how the mind reacts to music and how future brain interfaces could be developed to help people who can't communicate in the usual way (e.g. those with locked-in syndrome). Further down the line we could even be composing songs using our thoughts, according to the international team of researchers behind the study, though that kind of sci-fi concept is still some time away. The experiments relied on an encoding-decoding model, in which a computer system monitored the brain activity patterns caused by particular songs – which parts of the brain lit up and when – and then tried to identify the right song again from the fMRI data alone. Six volunteers were played 40 pieces of music spanning classical, rock, pop, jazz and other genres. Software hooked up to the fMRI scanner was trained to match brain activity against musical features including tonality, dynamics, rhythm and timbre. Once the training was complete, some of the tunes were repeated, and the computer system had to identify which song was being played. When given a straightforward A-or-B choice, it picked the right one up to 85 percent of the time.
 

The experiment was then widened so the software had to pick the right song out of ten possible options, with only the listener's brain scan data to go on. This time, the computer got it right 74 percent of the time. Among the study's other findings was that listeners showed no clear 'hemispheric preference' for musical processing – there was no bias towards the left- or right-hand side of the brain. While this isn't the first time scientists have tried to map songs against brain activity, this experiment goes into greater depth, with a wider and more varied playlist than previous research used. Further down the line, the researchers say, this kind of technique could be used to work out which hooks and melodies people like best, and why some people can fall in love with a song while it leaves others cold. The study is part of a wider effort to understand the effect music has on us, with recent research looking at how certain music boosts productivity, and how changes in brain activity can actually alter our taste in music. Eventually, the technique could even be applied to help people who suffer from auditory hallucinations, though we're going to need a lot more data before that can happen.
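The encoding-decoding idea described above can be sketched in a few lines. The sketch below is a hypothetical illustration on synthetic data, not the study's actual pipeline: it fits a linear "encoding" model from per-song musical features to simulated voxel responses, then "decodes" a scan by picking the candidate song whose predicted brain pattern correlates best with the observed one. All names, dimensions and noise levels are assumptions.

```python
# Hypothetical encoding-decoding sketch on synthetic data (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n_songs, n_features, n_voxels = 40, 4, 200

# Per-song musical features (stand-ins for tonality, dynamics, rhythm, timbre)
features = rng.normal(size=(n_songs, n_features))
# Simulated fMRI responses: an unknown linear feature-to-voxel map plus noise
true_weights = rng.normal(size=(n_features, n_voxels))
responses = features @ true_weights + 0.5 * rng.normal(size=(n_songs, n_voxels))

# Encoding step: least-squares fit of voxel responses from musical features
weights, *_ = np.linalg.lstsq(features, responses, rcond=None)

def identify(observed, candidate_features):
    """Decoding step: return the index of the candidate song whose
    predicted voxel pattern correlates best with the observed scan."""
    preds = candidate_features @ weights
    corrs = [np.corrcoef(observed, p)[0, 1] for p in preds]
    return int(np.argmax(corrs))

# Two-alternative test, like the paper's A-or-B choice
print(identify(responses[7], features[[7, 13]]))  # picks index 0, the true song
```

In this toy setup the two-alternative decoding is nearly perfect; the real study's 85 percent (two choices) and 74 percent (ten choices) reflect the much noisier signal in genuine fMRI data.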

More information:

13 February 2018

Wearable Scanners Will Read Our Minds

This year, a San Francisco-based start-up hopes to demonstrate a scanning device that could revolutionise the diagnosis of cancer and heart disease and, eventually, read our minds. The new device will do the same job as a Magnetic Resonance Imaging (MRI) machine, but Openwater, the start-up, promises it will be cheaper and more accurate. Using infrared light, the handheld gadget can scan five or six inches deep into the body, resolving features down to a micron (roughly the size of a neuron). The tool can be used to spot a tumour by detecting the surrounding blood vessels and to see where arteries are clogged. One day, it could follow the flow of oxygenated blood to different areas of the brain, tracking our thoughts and desires. The device benefits from three scientific breakthroughs. First, display-screen pixels have shrunk to almost the wavelength of light, allowing the device to detect small changes in the body and beam them back at high resolution.


Second, the device makes use of physics that has been known for 50 years but until now has been confined largely to research labs: the ability to assess the scattering of light, and so map how waves interfere with each other. Third, developments in neuroscience help us understand where the brain is active by looking at where oxygenated blood flows. Researchers have already been able to use MRI to guess what people are looking at. The University of California, Berkeley paid graduate students to lie in MRI machines and watch YouTube videos for hundreds of hours, recording how their brains behaved depending on what they saw. Then, it showed the students new video clips and was able to roughly reconstruct the images they saw. Openwater is aiming to show brain activity at a far higher level of detail, making this kind of mind-reading more precise. If the company can produce a mass-market consumer product, it would also give neuroscientists far more data for building brain maps.

More information:

09 February 2018

Intel's Vaunt Smart Glasses

The most important parts of Intel’s new Vaunt smart glasses are the pieces that were left out. There is no camera to creep people out, no button to push, no gesture area to swipe, no glowing LCD screen, no weird arm floating in front of the lens, no speaker, and no microphone. From the outside, the Vaunt glasses look just like eyeglasses.


When you’re wearing them, you see a stream of information on what looks like a screen, but it is actually being projected onto your retina. All of the electronics in Vaunt sit inside two little modules built into the stems of the eyeglasses, located entirely up near the face of the frames so that the rest of the stems can flex a little, just like any other regular pair of glasses.

More information:

05 February 2018

Amazon Patents Worker-Tracking Wristbands

Amazon has been granted patents for wristbands that would track a worker’s hands ultrasonically and use haptic feedback to guide performance, which sounds like something straight out of Black Mirror. The new patents are ostensibly for wristbands that Amazon employees would wear, working in conjunction with ultrasonic devices strategically placed around Amazon’s warehouses. If the worker’s hands move towards the wrong item, the bracelet will buzz, much like an invisible fence used for dog training.


Not only do the tracking devices monitor inventory, but they also make sure workers are performing at optimum speed. The original patents were filed back in 2016 and granted to Amazon on January 30 of this year. This isn’t a good look for Amazon, which has already been accused of intolerable working conditions at its warehouses, including enforcing timed bathroom breaks and using packing timers to make sure workers were operating at top speed. However, it is worth mentioning that many patents never actually become reality.

More information:

04 February 2018

Vision, Sensory and Motor Testing Could Predict Performance in Baseball Players

New research from Duke Health suggests baseball scouts looking for a consistent, conscientious hitter may find clues not only in their performance on the field, but also in front of a computer screen. In a study of 252 baseball professionals published in the journal Scientific Reports, Duke researchers found that players with higher scores on a series of vision and motor tasks, completed on large touch-screen machines called Nike Sensory Stations, had better on-base percentages, more walks and fewer strikeouts (collectively referred to as plate discipline) compared to their peers. The players were on U.S. major and minor league teams. They used the touch-screen stations to complete nine exercises, many of them resembling two-dimensional video games in which users track or touch flat shapes as they scoot across the screen.


The tasks test a person's ability to glean information from a faint object or in a split second, plus skills such as reaction time and hand-eye coordination. The researchers found that, overall, better performance on the tasks predicted better batting performance for measures of plate discipline, such as on-base percentage, strikeout rate and walk rate, but not for slugging percentage or pitching statistics. In particular, high scores on a perception-span task, which measured the player's ability to remember and recreate visual patterns, were associated with an increased ability to get on base. High scores in hand-eye coordination and reaction time were associated with an increased ability to draw walks, while better scores in spatial recognition, such as the ability to shift attention between near and far targets, were associated with fewer strikeouts.
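The associations reported above are, at heart, correlations between task scores and batting statistics. The sketch below illustrates that kind of analysis on synthetic data; the variable names, effect sizes and numbers are assumptions for demonstration only, not the study's actual figures or methods.

```python
# Illustrative association analysis on synthetic data: correlate a
# sensory-task score with a plate-discipline measure (on-base percentage).
import numpy as np

rng = np.random.default_rng(1)
n_players = 252  # same sample size as the study, data entirely simulated

# Hypothetical perception-span scores and an OBP that partly depends on them
perception_span = rng.normal(size=n_players)
obp = 0.320 + 0.015 * perception_span + 0.010 * rng.normal(size=n_players)

# Pearson correlation between task score and on-base percentage
r = np.corrcoef(perception_span, obp)[0, 1]

# Least-squares slope: expected OBP change per unit of task score
slope, intercept = np.polyfit(perception_span, obp, 1)
print(round(r, 2), round(slope, 3))
```

A real analysis would also control for confounds such as league level and at-bats, but the core step, regressing a batting statistic on a sensory-task score, looks much like this.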

More information:

03 February 2018

AR Helps Surgeons Reconnect Blood Vessels

Using augmented reality (AR) in the operating theatre could help surgeons to improve the outcome of reconstructive surgery for patients. In a series of procedures carried out by a team at Imperial College London at St Mary's Hospital, researchers have shown for the first time how surgeons can use Microsoft HoloLens headsets while operating on patients undergoing reconstructive lower limb surgery.


The Imperial team used the technology to overlay images from CT scans, including the position of bones and key blood vessels, onto each patient's leg, in effect enabling the surgeon to 'see through' the limb during surgery. According to the team trialling the technology, the approach can help surgeons locate and reconnect key blood vessels during reconstructive surgery, which could improve outcomes for patients.

More information: