25 February 2018

AR Sees Inside Patient's Body

Augmented reality (AR) technologies that blend computer-generated images and data from MRI and CT scans with real-world views are making it possible for doctors to see under the skin of their patients to visualize bones, muscles, and internal organs without having to cut open a body. Experts say AR will transform medical care by improving precision during operations, reducing medical errors, and giving doctors and patients alike a better understanding of complex medical problems. It could help doctors determine exactly where to make injections and incisions. In medical emergencies, it could be used to display life-saving information for AR-equipped paramedics and other first responders. ProjectDR is an AR system that can map internal medical scans into three-dimensional images overlaid on a patient’s body, either with a video projector or via AR smart glasses.
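As a rough illustration of the geometry behind such an overlay (not ProjectDR's actual code), the sketch below applies a rigid registration transform to map a landmark from CT/MRI scan coordinates into the patient/projector frame; the transform values and landmark coordinates are invented for the example.

```python
# A hypothetical sketch of the core step behind a projector-based AR overlay:
# mapping a point from CT/MRI scan coordinates into the patient/projector frame
# with a rigid transform. The transform and landmark values are invented;
# a real system estimates them by registering tracked markers on the patient.
import numpy as np

def make_rigid_transform(rotation_deg: float, translation_mm) -> np.ndarray:
    """Build a 4x4 homogeneous transform: rotate about the z-axis, then translate."""
    theta = np.radians(rotation_deg)
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = translation_mm
    return T

def scan_to_body(points_scan: np.ndarray, T: np.ndarray) -> np.ndarray:
    """Map Nx3 scan-space points (mm) into body/projector space."""
    homogeneous = np.hstack([points_scan, np.ones((len(points_scan), 1))])
    return (homogeneous @ T.T)[:, :3]

# Hypothetical usage: place a vertebra landmark from a CT volume onto the patient.
T = make_rigid_transform(rotation_deg=12.0, translation_mm=[40.0, -15.0, 250.0])
landmark_ct = np.array([[10.0, 22.5, 118.0]])  # landmark in scan space, mm
print(scan_to_body(landmark_ct, T))
```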
 

AR promises to close the long-standing gap between the data doctors glean from scans and other diagnostic tests and the flesh-and-blood patients they treat. In a cutting-edge use of AR in medicine, doctors at Imperial College and St. Mary's Hospital in London have been wearing Microsoft's HoloLens AR glasses during reconstructive surgery on patients who suffered severe leg injuries in traffic accidents. Doctors often repair such injuries with flaps of tissue taken from elsewhere on the body; connecting a flap to blood vessels at the site of the wound lets fresh, oxygen-carrying blood reach the new tissue and keep it alive. Surgeons have typically used a handheld scanner to locate the major blood vessels near the wound, but the AR system lets them find those vessels directly by highlighting them in the 3D virtual image displayed in the headset.

More information:

24 February 2018

EEG Reconstructs Perceived Images

A new technique developed by neuroscientists at the University of Toronto Scarborough can, for the first time, digitally reconstruct images of what people perceive from brain activity recorded with electroencephalography (EEG). For the study, test subjects hooked up to EEG equipment were shown images of faces. Their brain activity was recorded and then used to digitally recreate the image in each subject's mind with machine learning algorithms. This is not the first time researchers have reconstructed images from neuroimaging data: the approach was pioneered by Nestor, who previously reconstructed facial images from functional magnetic resonance imaging (fMRI) data, but it is the first time EEG has been used.


And while techniques like fMRI, which measures brain activity by detecting changes in blood flow, can capture finer detail of what is going on in specific areas of the brain, EEG has greater practical potential because it is more common, portable, and inexpensive by comparison. EEG also has greater temporal resolution, meaning it can track, millisecond by millisecond, how a percept develops over time. The study validates that EEG can support this type of image reconstruction, something many researchers doubted was possible given its apparent limitations. From a neurotechnological standpoint, EEG-based reconstruction has great theoretical and practical potential, especially since the equipment is relatively inexpensive and portable. In the future, it could help people who are unable to communicate verbally.
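To make the general idea concrete – and this is a hypothetical sketch on synthetic data, not the U of T Scarborough pipeline – image reconstruction of this kind can be framed as learning a regression from EEG feature vectors to image pixels:

```python
# A hypothetical sketch (synthetic data, not the study's pipeline): treat image
# reconstruction as a regression from EEG feature vectors to image pixels.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_trials, n_eeg_features, n_pixels = 200, 64, 16 * 16

# Fake ground truth: each image evokes EEG activity through an unknown linear map.
images = rng.random((n_trials, n_pixels))
true_map = rng.normal(size=(n_pixels, n_eeg_features))
eeg = images @ true_map + 0.1 * rng.normal(size=(n_trials, n_eeg_features))

# Train the decoder (EEG features -> pixel intensities) on the first 150 trials.
decoder = Ridge(alpha=1.0).fit(eeg[:150], images[:150])

# Reconstruct the held-out images and score the pixel-wise correlation.
reconstructed = decoder.predict(eeg[150:])
corr = np.corrcoef(reconstructed.ravel(), images[150:].ravel())[0, 1]
print(f"held-out reconstruction correlation: {corr:.2f}")
```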

More information:

14 February 2018

BCIs Identify Songs People Listen To

It's not quite mind reading, but it's close: scientists have been able to identify songs people are listening to just by using fMRI scans of their brains, which measure blood flow and brain activity. The research promises to help us understand both how the mind reacts to music and how future brain interfaces could be developed to help people who can't communicate in the usual way (e.g. those with locked-in syndrome). Further down the line we could even be composing songs using our thoughts, according to the international team of researchers behind the study, though that kind of sci-fi concept is still some way off. The experiments relied on an encoding-decoding model: a computer system monitored the brain activity patterns caused by particular songs – which parts of the mind lit up and when – and then tried to identify the right song again from the fMRI data alone. Six volunteers were played 40 pieces of music spanning classical, rock, pop, jazz, and other genres. Software hooked up to the fMRI scanner was trained to relate brain activity to musical features including tonality, dynamics, rhythm, and timbre. When the analysis was complete, some of the tunes were replayed and the computer system had to guess which songs they were. Given a straightforward A-or-B choice, it picked the right song up to 85 percent of the time.
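To see how an encoding-decoding identification test of this kind can work in principle – a minimal sketch on synthetic data, not the study's actual model – one can fit a linear encoding model from musical features to voxel activity and then pick the candidate song whose predicted activity best matches a new scan:

```python
# A minimal, hypothetical sketch of encoding-decoding identification on synthetic
# data: 1) fit a linear encoding model from per-song musical features (tonality,
# rhythm, etc.) to voxel activity; 2) given a new scan, predict the activity each
# candidate song should evoke and pick the best-matching candidate.
import numpy as np

rng = np.random.default_rng(1)
n_songs, n_features, n_voxels = 40, 6, 500

features = rng.random((n_songs, n_features))       # per-song musical features
weights = rng.normal(size=(n_features, n_voxels))  # unknown true response map
scans = features @ weights + 0.5 * rng.normal(size=(n_songs, n_voxels))

# Encoding model: least-squares fit of features -> voxel activity.
W_hat, *_ = np.linalg.lstsq(features, scans, rcond=None)

def identify(observed_scan, candidate_ids):
    """Return the candidate whose predicted scan best correlates with the data."""
    scores = [np.corrcoef(features[i] @ W_hat, observed_scan)[0, 1]
              for i in candidate_ids]
    return candidate_ids[int(np.argmax(scores))]

# Two-alternative forced choice: a fresh noisy scan of song 7, candidates 7 vs 21.
new_scan = features[7] @ weights + 0.5 * rng.normal(size=n_voxels)
print(identify(new_scan, [7, 21]))  # prints 7 far more often than chance
```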
 

The experiment was then widened so the software had to pick the right song out of ten possible options, using only the listener's brain scan data. This time, the computer got it right 74 percent of the time. Among the study's other findings was that listeners showed no 'hemispheric preference' for musical processing – there was no bias towards the left or right side of the brain. While this isn't the first time scientists have tried to map songs against brain activity, this experiment used a wider and more varied selection of music than previous research and analysed it in greater depth. Further down the line, the researchers say, this kind of technique could be used to work out which hooks and melodies people like best, and why one person can fall in love with a song that leaves another cold. The study is part of a wider effort to understand the effect music has on us, with recent research looking at how certain music boosts productivity and how changes in brain activity can actually alter our taste in music. Eventually, the technique could even be applied to help people who suffer from auditory hallucinations, though much more data will be needed before that can happen.

More information:

13 February 2018

Wearable Scanners Will Read Our Minds

This year, a San Francisco-based start-up hopes to demonstrate a scanning device that could revolutionise the diagnosis of cancer and heart disease and, eventually, read our minds. The new device will do the same job as a magnetic resonance imaging (MRI) machine, but Openwater, the start-up, promises it will be cheaper and more accurate. Using infrared light, the handheld gadget can scan five or six inches deep into the body, resolving features down to a micron (about the size of a neuron). The tool can be used to spot a tumour by detecting the surrounding blood vessels and to see where arteries are clogged. One day, it could follow the flow of oxygenated blood to different areas of the brain, tracking our thoughts and desires. The device benefits from three scientific breakthroughs. First, display-screen pixels have shrunk to almost the wavelength of light, letting the device detect small changes in the body and image them at high resolution.


Second, the device makes use of physics that has been known for 50 years but has largely been confined to research labs: assessing the scattering of light in order to map how waves interfere with each other. Third, developments in neuroscience make it possible to tell where the brain is active by looking at where oxygenated blood flows. Researchers have already been able to use MRI to guess what people are looking at. The University of California, Berkeley paid graduate students to lie in MRI machines and watch YouTube videos for hundreds of hours, recording how their brains behaved depending on what they saw. Researchers then showed the students new video clips and were able to roughly replicate the images they saw. Openwater aims to show brain activity at a far higher level of detail, making this kind of mind-reading more precise. If the company can produce a mass-market consumer product, it would also give neuroscientists far more data for building brain maps.

More information:

09 February 2018

Intel's Vaunt Smart Glasses

The most important parts of Intel’s new Vaunt smart glasses are the pieces that were left out. There is no camera to creep people out, no button to push, no gesture area to swipe, no glowing LCD screen, no weird arm floating in front of the lens, no speaker, and no microphone. From the outside, the Vaunt glasses look just like eyeglasses.


When you're wearing them, you see a stream of information on what looks like a screen, but the image is actually being projected onto your retina. All of the electronics in Vaunt sit inside two little modules built into the stems of the eyeglasses, located entirely up near the face of the frames so that the rest of the stems can flex a little, just like any other regular pair of glasses.

More information:

05 February 2018

Amazon Patents Worker-Tracking Wristbands

Amazon has been granted patents for wristbands that would use ultrasonic tracking to follow a worker's hands and monitor performance, with haptic feedback to steer them, which sounds like something straight out of Black Mirror. The patents describe wristbands that Amazon employees would wear, working in conjunction with ultrasonic devices strategically placed around Amazon's warehouses. If a worker's hands move towards the wrong item, the bracelet buzzes, much like an invisible fence used for dog training.
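A minimal sketch of the feedback loop the patent describes – with invented bin names and coordinates, and assuming the wrist position has already been estimated from the ultrasonic devices – might look like this:

```python
# A hypothetical sketch of the feedback loop described in the patent: given a
# wrist position (assumed already estimated from the ultrasonic devices) and the
# bin the worker was directed to, buzz if the hand is closer to any other bin.
# All bin names and coordinates are invented for illustration.
import numpy as np

BIN_POSITIONS = {                       # shelf-bin centres in metres (made up)
    "bin_A": np.array([0.0, 1.2, 1.0]),
    "bin_B": np.array([0.6, 1.2, 1.0]),
    "bin_C": np.array([1.2, 1.2, 1.0]),
}

def should_buzz(wrist_pos: np.ndarray, target_bin: str) -> bool:
    """Trigger haptic feedback when the wrist is nearest a bin other than the target."""
    nearest = min(BIN_POSITIONS,
                  key=lambda b: np.linalg.norm(wrist_pos - BIN_POSITIONS[b]))
    return nearest != target_bin

# Worker assigned bin_A reaches toward bin_C: the bracelet buzzes.
print(should_buzz(np.array([1.1, 1.15, 1.0]), target_bin="bin_A"))  # True
```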


Not only do the tracking devices monitor inventory, but they also make sure workers are performing at optimum speed. The original patents were filed back in 2016 and granted to Amazon on January 30 of this year. This isn't a good look for Amazon, which has already been accused of intolerable working conditions at its warehouses, including enforcing timed bathroom breaks and using packing timers to keep workers operating at top speed. However, it is worth mentioning that many patents never actually become reality.

More information: