27 October 2015

Tractor Beam Uses Acoustic Holograms to Move Objects

Scientists had already built devices that can push tiny objects over small distances using light, but a new version of the tractor beam uses sound instead. Researchers have figured out a way to manipulate a small physical object using what they call acoustic holograms.


The researchers used a grid of small loudspeakers to levitate, rotate, and otherwise manipulate a tiny ball in mid-air. They did this by programming the speakers to emit high-intensity sound waves that create a kind of force field around the object.
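The core idea behind these acoustic holograms is phased-array beamforming: each speaker's output is phase-delayed so the emitted waves interfere constructively at a chosen point and largely cancel elsewhere. Below is a minimal Python sketch of the simplest case, computing per-speaker phase offsets that focus an ultrasound array on a single point. The array geometry and 40 kHz frequency are illustrative assumptions, and the actual tractor-beam traps layer more elaborate phase signatures (such as twin and vortex traps) on top of this basic focusing step.

```python
import numpy as np

SPEED_OF_SOUND = 343.0                  # m/s in air
FREQ = 40e3                             # 40 kHz ultrasound (assumed, typical for levitation)
K = 2 * np.pi * FREQ / SPEED_OF_SOUND   # wavenumber

def focus_phases(speaker_positions, focal_point):
    """Phase offset per speaker so every wave arrives at the focus in phase."""
    distances = np.linalg.norm(speaker_positions - focal_point, axis=1)
    return (-K * distances) % (2 * np.pi)

# Hypothetical 8x8 speaker grid with 1 cm pitch, lying in the z = 0 plane.
xs, ys = np.meshgrid(np.arange(8) * 0.01, np.arange(8) * 0.01)
speakers = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])

# Focus 5 cm above the centre of the array.
phases = focus_phases(speakers, np.array([0.035, 0.035, 0.05]))
print(phases.round(2))
```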


26 October 2015

Understanding How the Brain Multitasks

If you've ever had to cook dinner and prepare for the next day's work meeting while also listening to a friend complain over the phone, then you know all too well the importance of multitasking. But what actually goes on inside our brains that allows us to strategically focus on one task over another? That remained largely a mystery, at least until recently. Earlier this week, researchers at New York University identified one small region of the brain, the thalamic reticular nucleus (TRN), as the one that controls our ability to multitask. Working as a task switchboard, the TRN enables our brains to focus on the sensory stimulus that is most vital at any given moment. With a better understanding of how the process works, researchers hope to use the information to study disorders in which multitasking breaks down or sensory filtering goes awry: autism, schizophrenia, and ADHD, for example.

The ability to multitask is a vital part of life, needed for everyday functions like driving, cooking, or even socializing with a group of friends. At any given time, our brains are bombarded with a multitude of sensory information, and we are forced to decide what is important in that instant, focus on it, and tune out everything else. Researchers have known about this process for years, but they weren't sure exactly how it worked, because they couldn't devise a reliable experiment for identifying which parts of the brain were involved.
 

By putting laboratory mice through a game-like experiment, they were able to show that different neurons within the TRN regulate which senses the brain should focus on and which should be set aside. The experiment involved training mice to respond to a specific sensory stimulus, either light or sound. If the mice observed and followed the correct stimulus, they received milk as a reward. At the same time, the researchers would attempt to distract the mice with the opposite stimulus (mice trained to respond to light would be distracted with sound, for example). In real time, the researchers recorded the electrical signals coming from the TRN neurons in the mice's brains. They were also able to inactivate various parts of the neural network, specifically the prefrontal cortex, which selects certain stimuli over others. When the mice were trained to pay attention to a particular sound and ignore light, the TRN neurons that control vision were highly active, meaning that they were suppressing visual signals so the mice could focus more intensely on the sound. The opposite happened when the mice were trained to follow the light to receive their milk reward. Further, when the researchers inactivated the prefrontal cortex, the area of the brain responsible for higher-level functioning, using a laser beam, the TRN's neural signaling went completely out of whack. This suggests that the prefrontal cortex directs how the TRN filters incoming sensory information.
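As a way to picture the circuit described above, here is a deliberately simplified toy model in Python (my own illustration, not the study's model or analysis): the prefrontal cortex sets which TRN subpopulation is active, the active TRN neurons damp the to-be-ignored modality, and silencing the PFC leaves the gate effectively random. All numbers are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

def trn_gate(task, visual, auditory, pfc_intact=True):
    """Toy gate: the PFC tells the TRN which modality to suppress."""
    suppress_visual = (task == "attend_sound")
    if not pfc_intact:
        # With the PFC silenced, TRN signaling "goes out of whack":
        # model it as a coin flip instead of task-appropriate gating.
        suppress_visual = bool(rng.integers(2))
    if suppress_visual:
        return 0.1 * visual, auditory   # visual TRN neurons active -> vision damped
    return visual, 0.1 * auditory       # auditory TRN neurons active -> sound damped

print(trn_gate("attend_sound", visual=1.0, auditory=1.0))  # -> (0.1, 1.0)
print(trn_gate("attend_light", visual=1.0, auditory=1.0))  # -> (1.0, 0.1)
```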


25 October 2015

Holographic Doctor House Calls

In early October, the University of Southern California's Center for Body Computing hosted a conference to discuss the futuristic technologies that could work their way into medicine and health care. During the event, the center announced a new type of health clinic, called the USC Virtual Care Clinic. There's a lot of talk around virtual reality for gaming, entertainment, and even communication, but the Center for Body Computing is betting that today's consumer technologies (wearable sensors, VR, artificial intelligence) will make for more accessible and more personalized health care in the future.


Implantable sensors could help doctors gather more data. And during the Body Computing Conference this October, the center's director introduced another intriguing technology: the hologram house call. Attendees were shown a demonstration in which a hologram was beamed to a patient in Dubai to speak "face-to-face" about potential diagnoses. This could bring doctors to patients in areas far from research institutions, and allow physicians to monitor patients over longer periods of time. There will still, of course, be procedures a hologram can't perform, like surgery.


21 October 2015

Algorithm Predicts Human Behavior Better Than Humans

Humans are better at understanding fellow humans than machines are. But a new MIT study suggests an algorithm can predict someone's behavior faster and more reliably than humans can. Researchers at MIT created the Data Science Machine to search data for patterns and choose which variables are the most relevant. It's fairly common for machines to analyze data, but humans are typically required to choose which data points matter for the analysis. Across three competitions against human teams, the machine made more accurate predictions than 615 of 906 teams. And while the humans worked on their predictive algorithms for months, the machine took two to twelve hours to produce each of its competition entries.


When one competition asked teams to predict whether a student would drop out of an online course during the next ten days, based on the student's interactions with course resources, there were many possible factors to consider. Teams might have looked at how late students turned in their problem sets, or whether they spent any time looking at lecture notes. Instead, the two most important indicators turned out to be how far ahead of a deadline the student began working on a problem set, and how much time the student spent on the course website. Neither statistic was directly collected by MIT's online learning platform, but both could be inferred from the data that was available.
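To make that inference step concrete, here is a small hypothetical pandas sketch showing how both indicators could be derived from a raw interaction log. The event schema, column names, and values are invented for illustration; the Data Science Machine synthesizes and ranks candidate features like these automatically rather than having them hand-coded.

```python
import pandas as pd

# Invented clickstream: one row per student interaction with the course site.
events = pd.DataFrame({
    "student": ["a", "a", "b", "b", "b"],
    "timestamp": pd.to_datetime([
        "2015-03-01 10:00", "2015-03-02 09:00",
        "2015-03-02 08:00", "2015-03-02 12:00", "2015-03-03 18:00"]),
    "event": ["view_pset", "submit_pset", "view_pset", "view_notes", "submit_pset"],
})
deadline = pd.Timestamp("2015-03-04 23:59")

# Indicator 1: how far ahead of the deadline each student first opened the pset.
first_open = events[events["event"] == "view_pset"].groupby("student")["timestamp"].min()
lead_time_hours = (deadline - first_open).dt.total_seconds() / 3600

# Indicator 2: a crude proxy for time on the course site -- interaction count.
activity = events.groupby("student").size()

features = pd.DataFrame({"lead_time_hours": lead_time_hours, "n_events": activity})
print(features)
```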


18 October 2015

Artificial Skin That Can Send Pressure Sensations to Brain Cells

Stanford engineers have created a plastic skin that can detect how hard it is being pressed and generate an electric signal to deliver this sensory input directly to a living brain cell. The researchers spent a decade trying to develop a material that mimics skin's ability to flex and heal, while also serving as the sensor net that sends touch, temperature, and pain signals to the brain. Ultimately, the team's lead researcher wants to create a flexible electronic fabric embedded with sensors that could cover a prosthetic limb and replicate some of skin's sensory functions.


The heart of the technique is a two-ply plastic construct: the top layer provides the sensing mechanism, and the bottom layer acts as the circuit, transporting electrical signals and translating them into biochemical stimuli compatible with nerve cells. The top layer features a sensor that can detect pressure over the same range as human skin, from a light finger tap to a firm handshake. The researchers scattered billions of carbon nanotubes through the waffled plastic; putting pressure on the plastic squeezes the nanotubes closer together, enabling them to conduct electricity.
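As a rough mental model of that mechanism (with made-up numbers, not the Stanford team's calibration), the sensor can be sketched in a few lines of Python: conductance rises as pressure packs the nanotubes closer together, and that conductance is mapped to an output signal strength that a downstream circuit could, in principle, pass to a nerve cell.

```python
import numpy as np

def conductance(pressure_kpa, g_max=1.0, k=0.15):
    """Toy model: pressure squeezes the nanotubes together, so
    conductance rises with pressure and saturates at g_max."""
    return g_max * (1.0 - np.exp(-k * pressure_kpa))

def output_signal_hz(pressure_kpa, max_hz=200.0):
    """Hypothetical mapping from conductance to a pulse rate --
    a firmer press produces a faster output signal."""
    return max_hz * conductance(pressure_kpa)

for p in (1, 10, 50):  # roughly: light tap -> press -> firm handshake
    print(f"{p:>3} kPa -> {output_signal_hz(p):6.1f} Hz")
```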


12 October 2015

Placebo Effect Works in Video Games Too

Even in virtual worlds, life is what you make of it. A study has found that gamers have more fun when they think a video game has been updated with fancy new features – even when that's not true. A professor of human-computer interaction at the University of York, UK, wondered whether the placebo effect translates to the world of video games after watching a TV programme about how a sugar pill had improved cyclists' performance. To test the idea, the researchers asked 21 people to play two rounds of Don't Starve, an adventure game in which the player must collect objects using a map in order to survive. In the first round, the researchers told the players that the map would be randomly generated.
 

In the second, they said the map would be controlled by an adaptive AI that could change it based on the player's skill level. After each round, the players filled out a survey. In fact, neither round used AI – both versions of the game were identically random. But when players thought they were playing with an adaptive AI, they rated the game as more immersive and more entertaining. Some thought the game was harder with AI, others found it easier, but no one found the two versions equally challenging. A second experiment with 40 new subjects, using a different design, confirmed the effect. This time, half of the players were put in a control group and told that the game was random, while the other half were told the game had built-in AI.
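The between-subjects follow-up boils down to comparing the two groups' ratings. Here is a hypothetical Python sketch of the shape of such a comparison, run on entirely synthetic ratings; the study's real data and statistics are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic 1-7 immersion ratings for two groups of 20 players each:
# one told the map is random, one told an adaptive AI controls it.
told_random = rng.integers(3, 7, size=20)  # control group
told_ai = rng.integers(4, 8, size=20)      # "adaptive AI" group

t_stat, p_value = stats.ttest_ind(told_ai, told_random)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```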
