29 March 2021

IEEE Access 2021 Article

Last week, a co-authored open-access journal paper based on the iMareCulture EU project was published in IEEE Access, sponsored by the IEEE Computer Society. The paper is entitled “Evaluating the Potential of Augmented Reality Interfaces for Exploring Underwater Historical Sites” and presents two novel solutions for underwater augmented reality: a compact marker-based system for small areas, and a more complex acoustic system for large areas.

Both systems were deployed at an underwater cultural heritage site and evaluated by ten divers in experiments examining their perception and recall, interest, and user experience. For comparison, the same study was also performed with non-divers assessing the marker-based system on land. The results show that both systems allow divers to encounter new and exciting moments, and they provide valuable insights for underwater augmented reality applications.

More information:

https://ieeexplore.ieee.org/document/9355135

26 March 2021

IV 2020 Article

Recently, I published a co-authored conference paper based on the Terpsichore EU project at the 24th International Conference Information Visualisation (IV). The paper is entitled “Assessing the Learning of Folk Dance Movements Using Immersive Virtual Reality” and presents how digital technologies can help preserve cultural heritage. It introduces a virtual reality application that can assist the learning process of folk dances.

Three different assisting approaches are presented and evaluated with 30 healthy participants. An animated avatar of a professional dancer is shown in immersive virtual reality, and participants were asked to imitate the movements to learn the dance. Their movements were recorded using a passive optical motion capture system and afterwards compared to the recordings of the professional dancers. The results indicate that participants who were provided with feedback achieved better performance.
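The paper's exact comparison metric is not described here, but a common way to compare a participant's recorded motion against a reference performance is dynamic time warping (DTW), which aligns two sequences that may differ in timing. A minimal sketch, assuming each recording is a per-frame array of flattened 3D joint positions:

```python
import numpy as np

def movement_similarity(reference, performance):
    """Score how closely a recorded performance matches a reference
    motion using dynamic time warping (DTW).

    Both inputs have shape (frames, features), e.g. flattened 3D joint
    positions per motion-capture frame. Returns the accumulated
    alignment cost normalised by a path-length bound; lower means
    closer to the reference.
    """
    n, m = len(reference), len(performance)
    # Pairwise Euclidean distances between every frame of each sequence.
    cost = np.linalg.norm(
        reference[:, None, :] - performance[None, :, :], axis=2
    )
    # Classic DTW accumulated-cost recurrence.
    acc = np.full((n + 1, m + 1), np.inf)
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            acc[i, j] = cost[i - 1, j - 1] + min(
                acc[i - 1, j], acc[i, j - 1], acc[i - 1, j - 1]
            )
    # Normalise so sequences of different lengths are comparable.
    return acc[n, m] / (n + m)
```

An identical recording scores 0, and the score grows as the participant's joint trajectories drift from the professional dancer's; in practice one would also normalise for skeleton size and root position before comparing.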

More information:

https://ieeexplore.ieee.org/document/9373105

23 March 2021

Wrist-based Interaction from Facebook Reality Labs

Facebook Reality Labs (FRL) Research is building an interface for AR that will not force people to choose between interacting with their devices and interacting with the world around them. The team is developing natural, intuitive ways to interact with always-available AR glasses, believing this will transform the way we connect with people near and far. They are trying to create neural interfaces that let people control a machine directly, using the output of the peripheral nervous system — specifically the nerves outside the brain that animate the hand and finger muscles. A wrist-based wearable has the additional benefit of easily serving as a platform for compute, battery, and antennas while supporting a broad array of sensors.

The missing piece was finding a clear path to rich input, and a potentially ideal solution was electromyography (EMG). EMG uses sensors to translate the electrical motor nerve signals that travel through the wrist to the hand into digital commands that can control the functions of a device. These signals let you communicate crisp one-bit commands to your device, a degree of control that is highly personalizable and adaptable to many situations. The signals through the wrist are so clear that EMG can detect finger motion of just a millimeter. That means input can be effortless. Ultimately, it may even be possible to sense just the intention to move a finger.
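The FRL article does not publish its signal-processing pipeline, but the idea of turning an EMG trace into a "crisp one-bit command" can be illustrated with a standard amplitude-envelope detector: rectify the signal, smooth it, and fire when it crosses a calibration threshold. A minimal sketch, with the sampling rate, window length, and threshold all assumed values:

```python
import numpy as np

def detect_click(emg, fs=1000.0, window_s=0.05, threshold=0.3):
    """Turn a raw surface-EMG trace into a one-bit 'click' command.

    An illustrative toy detector, not FRL's actual method:
    1. Remove the DC offset and rectify the signal.
    2. Smooth it into an amplitude envelope with a moving average.
    3. Report a command if the envelope crosses the threshold.

    emg: 1-D array of raw EMG samples; fs: sampling rate in Hz.
    Returns True if a muscle-activation burst was detected.
    """
    rectified = np.abs(emg - np.mean(emg))   # remove offset, rectify
    win = max(1, int(window_s * fs))         # smoothing window (samples)
    kernel = np.ones(win) / win
    envelope = np.convolve(rectified, kernel, mode="same")
    return bool(np.any(envelope > threshold))
```

A real system would calibrate the threshold per user (the "highly personalizable" aspect mentioned above) and classify richer gestures rather than a single binary event, but the envelope-plus-threshold stage is the conceptual core of mapping motor-nerve activity to a discrete command.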

More information:

https://tech.fb.com/inside-facebook-reality-labs-wrist-based-interaction-for-the-next-computing-platform/