29 October 2019

Ocean Engineering Article

Recently, HCI Lab researchers and colleagues from the iMareCulture EU project published a peer-reviewed paper in Ocean Engineering entitled “Underwater augmented reality for improving the diving experience in submerged archaeological sites”. The Mediterranean Sea has a vast maritime heritage whose exploitation is made difficult by the many limitations imposed by the submerged environment. Archaeological diving tours, in fact, suffer from the impossibility of providing divers with an exhaustive explanation of the submerged remains. Furthermore, low visibility conditions, due to water turbidity and biological colonization, can make it very confusing for tourists to find their way around an underwater archaeological site.


The paper investigates the feasibility and potential of the underwater Augmented Reality (UWAR) technologies developed in the iMARECulture project for improving the experience of divers who visit the Underwater Archaeological Park of Baiae (Naples). It presents two UWAR technologies that adopt hybrid tracking techniques to perform an augmented visualization both of the actual conditions and of a hypothetical 3D reconstruction of the archaeological remains as they appeared in the past. The first integrates marker-based tracking with inertial sensors, while the second adopts a markerless approach that integrates acoustic localization and visual-inertial odometry. Results show that these technologies could contribute to a better comprehension of the underwater site.
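The paper itself should be consulted for the actual tracking pipelines, but the general idea behind such a hybrid — combining a fast, relative estimate that drifts (visual-inertial odometry) with a slower absolute fix (acoustic localization) — can be illustrated with a complementary filter. The function below is a 1-D toy sketch under our own assumptions, not the authors' implementation:

```python
def fuse_position(prev_est, vio_delta, acoustic_fix=None, alpha=0.9):
    """Complementary-filter sketch: dead-reckon with the visual-inertial
    displacement, then pull the estimate toward the absolute acoustic fix
    (when one is available) to bound drift. Quantities are 1-D scalars
    here for simplicity; a real system would fuse full 3D poses."""
    predicted = prev_est + vio_delta          # fast but drifts over time
    if acoustic_fix is None:
        return predicted
    # weighted blend: alpha trusts the prediction, (1 - alpha) the fix
    return alpha * predicted + (1.0 - alpha) * acoustic_fix
```

Between acoustic fixes the diver's pose is propagated purely by odometry; each fix then corrects the accumulated drift without introducing the jitter of the raw acoustic measurement.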

More information:


28 October 2019

VS-Games 2019 Paper

On the 6th of September 2019, we presented a co-authored paper entitled "Comparison of Trajectories and Quaternions of Folk Dance Movements Using Dynamic Time Warping". The paper was presented at the 11th International Conference on Virtual Worlds and Games for Serious Applications (VS-Games 2019), held in Vienna, Austria, and sponsored by the EU project Terpsichore. The paper outlined a methodology for obtaining ground-truth data for the digital capturing and analysis of folk dances.


Professional dancers, male and female, were recorded performing folk dances, alone and with a partner, using an optical motion capture system. Two cases were considered for dancing in pairs: in one, only one person wore the suit with passive markers; in the other, both wore suits. Three-dimensional marker trajectories and quaternions were compared using dynamic time warping and multidimensional dynamic time warping. Initial results show that dances performed in pairs are the most similar and could be used in applications for learning purposes.
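Dynamic time warping, the comparison method used above, aligns two sequences by minimising cumulative distance over all monotonic alignments, so sequences that perform the same movement at slightly different speeds still compare as similar. A minimal one-dimensional sketch of the classic DTW recurrence (illustrative only — the paper compares 3D marker trajectories and quaternions, and this is not the authors' code):

```python
def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = minimal cumulative cost of aligning a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])          # local distance
            cost[i][j] = d + min(cost[i - 1][j],       # insertion
                                 cost[i][j - 1],       # deletion
                                 cost[i - 1][j - 1])   # match
    return cost[n][m]
```

For multidimensional DTW, the local distance `d` would be replaced by, e.g., the Euclidean distance between 3D marker positions, or an angular distance between quaternions.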

More information:

20 October 2019

Radio Waves See Through Walls and in Darkness

Researchers at MIT have found a way to teach a radio vision system to recognize people’s actions by training it with visible-light images. The new radio vision system can see what individuals are up to in a wide range of situations where visible-light imaging fails. The basic idea is to record video images of the same scene using visible light and radio waves. Machine-vision systems are already able to recognize human actions from visible-light images, so the next step is to correlate those images with the radio images of the same scene. The difficulty is in ensuring that the learning process focuses on human movement rather than other features, such as the background.


The researchers introduced an intermediate step in which the machine generates 3D stick-figure models that reproduce the actions of the people in the scene. In this way the system learns to recognize actions in visible light and then to recognize the same actions taking place in the dark or behind walls, using radio waves. The obvious applications are in scenarios where visible-light imaging fails (e.g. in low-light conditions and behind closed doors). One problem with visible-light images is that people are recognizable, which raises privacy issues, but a radio system does not have the resolution for facial recognition. Identifying actions without recognizing faces does not raise the same privacy fears.

More information:

19 October 2019

Skin-On Interface for Mobile Phones

Researchers at Telecom Paris in France have devised an artificial skin for interactive devices that responds to touch. The skin is able to detect a variety of gestures, including sliding, stretching and rotation. The artificial skin is programmed to associate different gestures with certain emotions. Sudden hard pressure on the skin is associated with anger and tapping is a means of seeking attention, while sustained contact and stroking are associated with providing comfort. 


The team developed two prototypes: one with a creepily realistic textured layer that resembles human skin and another with a more uniform surface. The artificial skin is made of three layers: a layer of stretchable copper wire sandwiched between two layers of silicone. Pressure on the skin changes the electric charge of the system. The team created a phone case, computer touch pad and smart watch to demonstrate how the artificial skin works.
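The article doesn't detail the sensing electronics, but pressure sensing of this kind is typically read out as a grid of raw values compared against a resting baseline. A hypothetical sketch of locating the strongest touch on such a grid (the function name, data layout and threshold are all our own assumptions, not the researchers' design):

```python
def detect_touch(readings, baseline, threshold=5):
    """Return the (row, col) of the strongest pressure reading above the
    resting baseline, or None if no cell exceeds the noise threshold.
    'readings' and 'baseline' are equal-size 2-D grids of raw values."""
    best, loc = threshold, None
    for i, row in enumerate(readings):
        for j, value in enumerate(row):
            delta = value - baseline[i][j]   # change caused by pressure
            if delta > best:
                best, loc = delta, (i, j)
    return loc
```

A real skin would additionally track how the touch location and intensity evolve over time to distinguish sliding, stretching and rotation gestures.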

More information:

18 October 2019

ARKit and Motion Tracking

Apple’s ARKit already has many of the fundamentals in place to help developers create augmented reality experiences. This year’s release of ARKit 3 added RealityKit and Reality Composer, tools focused on easing the process of adding virtual objects and environments to real world spaces. Now it appears that Apple’s next step could be virtual people. Until last month, U.K.-based iKinema was focused on providing 3D motion animation tools to movie and game developers, enabling virtual characters to exhibit highly believable body movements. The company’s flagship RunTime software enables easy but realistic kinematic simulations of the entire human body, including locomotion and other procedural animations, winning deals with Google, Microsoft, and numerous game studios.

Following legal filings in the U.K. that showed an Apple attorney becoming an iKinema director in mid-September, and a subsequent change of address to the same location as Apple Europe, Apple confirmed today to The Financial Times that it has acquired iKinema, although the company famously doesn’t confirm specifics about most of its smaller technology purchases. Since the purchase wouldn’t make sense as a way to keep supplying various Apple competitors with kinematic tools, there has to be another motivation. Bringing realistic human motion to ARKit and a wide variety of Apple AR-capable platforms makes the most sense, and there’s a particularly interesting set of AR applications that Apple could target: AR avatars.

17 October 2019

Sound Shirt For Deaf People

The SoundShirt is a haptic wearable device that allows deaf users to feel music on their skin. It was designed by fashion technology company CuteCircuit. The SoundShirt brings music to life using haptic actuators built into the material: 30 micro-actuators embedded in the fabric of the garment. These actuators translate the sound, in real time, into a tactile language that is unique to each piece of music being performed.


To provide a comfortable experience for the wearer, the SoundShirt is made of specially developed smart and stretchy textiles. There is no need for wires; instead, all of the conductive pathways within the garment are composed of woven conductive textiles that are seamlessly integrated into it. The visual design is a metaphor for the relationship between vibrations and sound waves modulating at different frequencies.
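CuteCircuit hasn't published how its tactile language is generated, but the shape of the problem — turning an audio frame into 30 actuator drive levels — can be sketched in a few lines. The toy below simply maps the mean amplitude of equal segments of one frame to intensity levels; a real design would more likely split the mix into frequency bands or per-instrument channels. Everything here is an illustrative assumption, not the SoundShirt's actual algorithm:

```python
def band_intensities(samples, n_actuators=30, max_level=255):
    """Map one audio frame to per-actuator drive levels (0..max_level).
    'samples' is a list of floats in [-1, 1]; the frame is split into
    n_actuators equal segments and each segment's mean absolute
    amplitude becomes one actuator's intensity."""
    chunk = max(1, len(samples) // n_actuators)
    levels = []
    for k in range(n_actuators):
        seg = samples[k * chunk:(k + 1) * chunk]
        energy = sum(abs(s) for s in seg) / len(seg) if seg else 0.0
        levels.append(min(max_level, int(energy * max_level)))
    return levels
```

Running this over successive frames of a live performance would yield a stream of 30-element intensity vectors, one per frame, to feed the micro-actuators.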

More information:

06 October 2019

VR Vibrating Floor from Microsoft

Microsoft has filed a patent for a floor mat that could prevent you from crashing into furniture while you're exploring new worlds in virtual reality. It's also an indicator the company is still interested in bringing VR to the Xbox ecosystem, after it axed virtual reality plans for Xbox One. While some VR systems warn users when they're straying outside of their safe play space, there's still a chance they could hit surrounding objects, potentially damaging items or injuring themselves. 


Some VR users already employ a floor mat to give them a tactile sense of their space, as Variety notes, but Microsoft explained its mat could include markers a VR headset would scan to establish or adjust a safe zone. The application also discusses markers for a start position you'd stand on before hopping into VR as well as pressure sensors for the mat. In addition, Microsoft suggests it could provide haptic feedback through vibrations.

More information: