28 June 2018

Realistic Robots or Not?

With Google’s AI assistant able to make phone calls and androids populating households in games and films, the line between machine and man is getting scarily blurred. Just a few weeks ago, Google demonstrated that its home-assistant robot is capable of holding an unsettlingly natural conversation with a human being over the phone to book a haircut or make a restaurant reservation, complete with “ums” and “ahs” to make the listener believe they are talking to a real person. In one such game, household androids that have been mistreated by humans start rebelling, eventually banding together to demand rights. 

It is not an original premise, but video games now look so lifelike that they make a good litmus test for how comfortable you feel with the idea of a human-like android. The game’s characters, played by human actors, are almost indistinguishable from real people. In Japan, where animist belief perhaps makes people more comfortable with the idea that spirit can reside in something that isn’t human, robots are already being used as shop assistants, in care homes and in schools. Japan is the world leader in robotics, and demand is high for robots that could help fill a shortfall in nursing care. In Europe, by contrast, people are generally uncomfortable with the idea of an android performing roles that require interaction with humans.

More information:

25 June 2018

Baiae Underwater AR Testing

Between 18 and 22 June 2018, underwater augmented reality testing was performed at Baiae in Italy for the iMareCulture project.

The aim of the testing was to assess the effectiveness of different filtering algorithms for real-time tracking under poor visibility conditions.
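The specific filtering algorithms tested at Baiae are not detailed here; as a rough illustration of the idea, contrast enhancement such as histogram equalization is a common preprocessing filter applied before visual tracking in turbid water, where pixel intensities crowd into a narrow band. A minimal pure-Python sketch:

```python
# Illustrative sketch only: histogram equalization as a preprocessing filter
# for low-contrast underwater imagery (not the project's actual algorithms).

def equalize_histogram(image, levels=256):
    """Remap 8-bit grayscale pixel values so their histogram is flatter."""
    flat = [p for row in image for p in row]
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    # Cumulative distribution function of pixel intensities.
    cdf, total = [], 0
    for count in hist:
        total += count
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    n = len(flat)
    # Classic equalization: scale the CDF to span the full intensity range.
    lut = [round((c - cdf_min) / max(n - cdf_min, 1) * (levels - 1))
           for c in cdf]
    return [[lut[p] for p in row] for row in image]

# A murky, low-contrast image: values crowded between 100 and 106.
murky = [[100, 102], [104, 106]]
enhanced = equalize_histogram(murky)  # values now spread across 0..255
```

After equalization, the narrow 100–106 band is stretched across the full 0–255 range, which gives feature trackers far stronger gradients to lock onto.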

More information:

24 June 2018

VR Reduces Kids' Fear of Needles

Needle phobia is one of the most common fears among children, who are exposed to needles on numerous occasions throughout childhood as they receive vaccines. This causes many children fear, anxiety and pain, and in some cases needle phobia and needle anxiety may even cause parents to delay scheduled visits to the doctor. A pediatrician, Dr. Rudnick, has come up with an innovative solution that uses a virtual reality headset to distract children from their fear, anxiety and pain. He is the first to conduct a pilot study, published in the journal Pain Management, using this technique in a pediatric setting. He got the idea for the study from an 8-year-old patient who came to his office with a virtual reality headset. The child placed the goggles on his head as Rudnick proceeded to give him an injection; much to Rudnick's delight, the child didn't even flinch. 

The objective of the study was to test the feasibility, efficiency and usefulness of using virtual reality headsets as a means to decrease fear and pain associated with immunizations in pediatric patients. The study focused on fear and pain, both anticipated and actual, as reported by the child and their caregiver. For the study, researchers used a 3D virtual reality headset and a smartphone app inserted into the goggles, giving the children the choice of a roller coaster ride, a helicopter ride or a hot-air balloon ride. Once the headset was in place, a single injection was administered, with the headset kept on until the immunization was completed about 30 seconds later. Study participants, aged 6 to 17, completed pre- and post-questionnaires evaluating fear using the McMurtry Children's Fear Scale and pain using the Wong-Baker pain scale.

More information:

17 June 2018

IEEE CGA 2018 Paper

Recently we published a paper in IEEE Computer Graphics and Applications entitled 'Simulation of Underwater Excavation using Dredging Procedures'. The article presents a novel system for simulating underwater excavation techniques using immersive VR. 

The focus is not on simulating swimming but on excavating underwater while following established archaeological methods and techniques. In particular, the use of dredging procedures was implemented through a realistic real-time simulation of sand.
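The paper's sand model is not reproduced here; as a toy illustration of what real-time granular simulation involves, a falling-sand cellular automaton captures the basic behaviour in a few lines. Everything below is an illustrative assumption, not the system described in the article:

```python
# Toy falling-sand cellular automaton: a minimal sketch of real-time
# granular behaviour, NOT the sand model from the paper.
# Grid cells are sand (1) or empty (0); each tick, grains fall straight
# down, or slide diagonally when the cell below is occupied.
import random

def step(grid):
    """Advance the sand grid by one tick, scanning bottom row upward."""
    h, w = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for y in range(h - 2, -1, -1):        # skip bottom row; it cannot fall
        for x in range(w):
            if new[y][x] != 1:
                continue
            below = y + 1
            if new[below][x] == 0:        # fall straight down
                new[below][x], new[y][x] = 1, 0
            else:                          # otherwise try sliding diagonally
                for dx in random.sample([-1, 1], 2):
                    nx = x + dx
                    if 0 <= nx < w and new[below][nx] == 0:
                        new[below][nx], new[y][x] = 1, 0
                        break
    return new

# Drop a single grain in a 4x3 grid and let it settle on the floor.
grid = [[0, 1, 0],
        [0, 0, 0],
        [0, 0, 0],
        [0, 0, 0]]
for _ in range(3):
    grid = step(grid)   # after 3 ticks the grain rests on the bottom row
```

A production simulator for dredging would of course use a particle- or physics-based model running on the GPU, but the same local rules (fall, then slide to form slopes) underlie the piling behaviour of sand.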

More information:

16 June 2018

AI Detects Pose Through Walls

X-ray vision has long seemed like a far-fetched sci-fi fantasy, but over the last decade a team from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) has continually gotten us closer to seeing through walls. Their latest project, 'RF-Pose', uses artificial intelligence (AI) to teach wireless devices to sense people's postures and movement, even from the other side of a wall. The researchers use a neural network to analyze radio signals that bounce off people's bodies, and can then create a dynamic stick figure that walks, stops, sits and moves its limbs as the person performs those actions.
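The internals of RF-Pose are not reproduced here, but heatmap-based pose estimators in general emit one confidence map per body keypoint, and the stick figure is obtained by locating each map's peak. A minimal sketch of that final decoding step, with hypothetical joint names and tiny hand-made maps:

```python
# Sketch of the decoding step common to heatmap-based pose estimation:
# the network emits one 2D confidence map per body keypoint, and the
# highest-confidence cell in each map gives that joint's position.
# The maps and joint names below are illustrative, not RF-Pose output.

def decode_keypoints(heatmaps):
    """Map each joint name to the (row, col) of its confidence peak."""
    pose = {}
    for joint, hm in heatmaps.items():
        best, best_rc = float("-inf"), (0, 0)
        for r, row in enumerate(hm):
            for c, v in enumerate(row):
                if v > best:
                    best, best_rc = v, (r, c)
        pose[joint] = best_rc
    return pose

# Two tiny confidence maps: head peaks top-centre, hip peaks lower down.
maps = {
    "head": [[0.1, 0.9, 0.1],
             [0.0, 0.2, 0.0],
             [0.0, 0.0, 0.0]],
    "hip":  [[0.0, 0.0, 0.0],
             [0.1, 0.3, 0.1],
             [0.2, 0.8, 0.2]],
}
pose = decode_keypoints(maps)   # {'head': (0, 1), 'hip': (2, 1)}
```

Connecting the decoded joints with line segments is what produces the dynamic stick figure described above.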

The team says that the system could be used to monitor diseases like Parkinson's and multiple sclerosis (MS), providing a better understanding of disease progression and allowing doctors to adjust medications accordingly. It could also help elderly people live more independently, while providing the added security of monitoring for falls, injuries and changes in activity patterns. The team is currently working with doctors to explore multiple applications in healthcare. Besides healthcare, the team says that RF-Pose could also be used for new classes of video games where players move around the house, or even in search-and-rescue missions to help locate survivors.

More information:

14 June 2018

AI Does Household Chores

Computer scientists have been working on teaching machines to do a wider range of tasks around the house. Recently, researchers demonstrated 'VirtualHome', a system that can simulate detailed household tasks and then have artificial agents execute them, opening up the possibility of one day teaching robots to do such tasks. The team trained the system using nearly 3,000 programs of various activities, each further broken down into subtasks for the computer to understand. 

A simple task like making coffee, for example, also includes the step of grabbing a cup. The researchers demonstrated VirtualHome in a 3D world inspired by the Sims video game. The AI agent can execute 1,000 of these interactions in the Sims-style world, across 8 different scenes including a living room, kitchen, dining room, bedroom, and home office. The project was co-developed by researchers from CSAIL, the University of Toronto, McGill University and the University of Ljubljana.
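The idea of an activity being "broken down into subtasks" can be sketched as a program: a named task expands into a sequence of atomic (action, object) steps that an agent executes one by one. The step names and expansion table below are illustrative assumptions, not VirtualHome's actual data:

```python
# Sketch of a VirtualHome-style "program": a household activity written as
# a sequence of atomic (action, object) steps. The expansion table is an
# illustrative assumption, not the project's dataset.

SUBTASKS = {
    "make_coffee": [
        ("walk", "kitchen"),
        ("grab", "cup"),            # even simple tasks carry implicit steps
        ("switch_on", "coffee_maker"),
        ("pour", "coffee"),
    ],
}

def expand(task):
    """Expand a named activity into the atomic steps an agent executes."""
    return SUBTASKS[task]

def render(steps):
    """Format steps in a script-like syntax for the simulated agent."""
    return [f"[{action.upper()}] <{obj}>" for action, obj in steps]

script = render(expand("make_coffee"))
# ['[WALK] <kitchen>', '[GRAB] <cup>', '[SWITCH_ON] <coffee_maker>',
#  '[POUR] <coffee>']
```

Spelling tasks out at this granularity is what lets an embodied agent (or eventually a robot) execute instructions that humans would leave implicit.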

More information: