30 October 2020

Special Issue Augmented Reality for Robotics and Artificial Intelligence

Augmented Reality (AR) is one of the most promising enabling technologies and will become a key factor in future industries thanks to its capability to enhance the perception of the real world by integrating virtual objects and information. The spread of AR techniques will be amplified, especially in the industrial sector, by the rapid advances of modern technologies related to Artificial Intelligence (AI). AI, in fact, includes machine and deep learning algorithms, as well as statistical models capable of performing tasks without explicit instructions, which can be used in AR applications to enhance the perception and understanding of the real world.

The combination of AR and AI technologies could lead to the development of new, effective, and natural human-robot interaction techniques. The goal of this Research Topic is to bring together state-of-the-art achievements in AR, Robotics, and AI. In particular, AI enables scene understanding features that can enrich AR applications with context-aware data directly superimposed over the real-world view. This augmented visualization is particularly useful in robotics applications where humans are engaged in remote control or cooperative work. Some interesting applications have recently been demonstrated in the context of autonomous or semi-autonomous vehicles.

More information:

https://www.frontiersin.org/research-topics/17301/augmented-reality-for-robotics-and-artificial-intelligence

27 October 2020

Sensors 2020 Article

Recently, I published an open-access journal paper with colleagues from iMareCulture in Sensors. The paper is entitled “Virtual Reality with 360-Video Storytelling in Cultural Heritage: Study of Presence, Engagement, and Immersion” and presents a combined subjective and objective evaluation of an application mixing an interactive VR experience with 360° storytelling. The user experience was investigated with both subjective and objective evaluation methods to test the hypothesis that a modern immersive archaeological VR application presenting cultural heritage from a submerged site would sustain high levels of presence, immersion, and general engagement.


Participants rated the VR experience positively on the questionnaire scales for presence, immersion, and subjective judgement. Positive ratings also extended to the psychological states linked to the experience (engagement, emotions, and the state of flow), and the experience was mostly free from difficulties linked to adapting to the VR technology (becoming accustomed to the head-mounted display and controllers, VR sickness). EEG results are in line with past studies examining brain responses to virtual experiences, while new results in the beta band suggest that EEG is a viable tool for future studies of presence and immersion in VR.

More information:

https://www.mdpi.com/1424-8220/20/20/5851

26 October 2020

VR Universal Law of Touch

Scientists have used seismic waves to create a universal scaling law for the sense of touch, paving the way for hyper-realistic virtual reality. The ‘universal law of touch’ theory was created by researchers at the University of Birmingham, who used mathematical modelling of touch receptors in humans and other animal species. The researchers studied a type of seismic wave known as Rayleigh waves, which are created by the impact of two objects. By applying the mathematics of earthquakes to model how vibrations travel through the skin, the team discovered that vibration receptors beneath the skin respond to Rayleigh waves in the same way regardless of age, gender, or even species.

The University of Birmingham researchers form part of the European consortium H-Reality, which is already using the theory to develop next-generation VR technologies. The ambition of the group is to imbue virtual objects with a physical presence, providing a revolutionary, untethered, virtual-haptic reality. It is one of several efforts to create digital worlds that feel indistinguishable from reality, with Bristol-based startup Ultraleap creating haptic feedback hardware capable of simulating virtual touch. Applications range from video games and chat rooms to remote surgery and industrial setups that allow workers to control dangerous machinery remotely.

More information:

https://www.independent.co.uk/life-style/gadgets-and-tech/virtual-reality-simulation-vr-touch-b986186.html

21 October 2020

HCII 2020 Article

Recently, I published a conference paper with colleagues from iMareCulture at the International Conference on Human-Computer Interaction, in the section Late Breaking Papers: Virtual and Augmented Reality. The paper is entitled “Underwater Search and Discovery: From Serious Games to Virtual Reality” and was published as part of the Lecture Notes in Computer Science book series (LNCS, volume 12428). The paper presents search techniques for discovering artefacts in the form of two different educational games.

The first one is a classical serious game that assesses two maritime archaeological methods for searching for and discovering artefacts: circular search and compass search. Evaluation results with 30 participants indicated that the circular search method is the more appropriate of the two. Based on these results, an immersive virtual reality search and discovery simulation was implemented. To educate users about the underwater site formation process, digital storytelling videos are shown when an artefact is discovered.
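As an aside, and purely as an illustration rather than the implementation used in the paper, the circular search method can be sketched as a diver sweeping concentric rings of increasing radius around a datum point. A minimal, hypothetical waypoint generator (all names and parameters are assumptions for illustration):

```python
import math

def circular_search_waypoints(cx, cy, ring_spacing, n_rings, points_per_ring=12):
    """Generate waypoints for a circular search pattern: concentric
    rings of increasing radius swept around a datum point (cx, cy)."""
    waypoints = []
    for ring in range(1, n_rings + 1):
        radius = ring * ring_spacing
        for i in range(points_per_ring):
            angle = 2 * math.pi * i / points_per_ring
            waypoints.append((cx + radius * math.cos(angle),
                              cy + radius * math.sin(angle)))
    return waypoints

# Three rings, 2 m apart, 12 waypoints each
wps = circular_search_waypoints(0.0, 0.0, ring_spacing=2.0, n_rings=3)
print(len(wps))  # 36
```

The paper, of course, evaluates the method with human participants in a game, not with a scripted sweep; the sketch only shows the geometry of the pattern.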

More information:

https://link.springer.com/chapter/10.1007%2F978-3-030-59990-4_15

20 October 2020

Special Issue Deep-Learning Approaches for High Dynamic Range Sensing and Imaging

High dynamic range (HDR) imaging is a well-established technology that enables the acquisition, storage, manipulation, delivery, and evaluation of a higher dynamic range than is available in traditional 8-bit-per-color-channel technology. This has brought several advantages, such as more realistic color reproduction, more detail in bright and dark areas, and better contrast. It has revolutionized the way we now experience entertainment, as well as the way we use images and videos in the image processing and computer vision fields.

We are seeing widespread use of HDR technology in the entertainment sector, and are also starting to see its use in industrial applications. At the same time, we are witnessing a paradigm shift in the image-processing area, where traditional techniques are being surpassed by more flexible deep-learning-based approaches. In the last few years, this trend has also reached the HDR imaging field. It has brought a number of challenges that need to be addressed in order to make deep-learning-based HDR approaches more robust and resilient to unseen and/or noisy data.
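To make the dynamic-range compression mentioned above concrete, here is a minimal sketch of the classic Reinhard global tone-mapping operator, one of the traditional techniques that deep-learning approaches now compete with. It assumes plain luminance values rather than a full color pipeline:

```python
import numpy as np

def reinhard_tonemap(hdr_luminance):
    """Classic Reinhard global operator L / (1 + L): compresses
    unbounded HDR luminance monotonically into [0, 1) so it can be
    quantized for 8-bit display."""
    hdr = np.asarray(hdr_luminance, dtype=np.float64)
    return hdr / (1.0 + hdr)

# An HDR scene spanning roughly five orders of magnitude
scene = np.array([0.01, 1.0, 100.0, 1000.0])
ldr = reinhard_tonemap(scene)
print(np.round(ldr, 4))  # every value now fits in [0, 1)
```

Real pipelines add white-point handling, per-channel color treatment, and gamma encoding; the point here is only that unbounded scene luminance is mapped monotonically into a displayable range.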

More information:

https://www.mdpi.com/journal/sensors/special_issues/HDR

16 October 2020

FOVE Eye-Tracking Headset

While eye-tracking hasn’t yet entered the consumer field, the technology has been making great strides in enterprise use cases, from foveated rendering to user analytics. In the medical realm, the technology can aid the diagnosis of eye conditions or dizziness, for example. There are three editions of the FOVE0 headset, differing only in the installed software. For medical researchers, there’s the new FOVE Pro software upgrade, offering the ability to measure eye torsion as well as the contours of the eye. A new system has been implemented to allow calibration of one eye at a time, ideal for those who work with patients with strabismus or amblyopia.

For companies, there’s the FOVE Enterprise upgrade designed for use at scale. New features include single-point calibration, which is faster than FOVE’s traditional method, and support for the NVIDIA Jetson Xavier NX embedded computing platform, reducing the cost of deployments. Both the Enterprise and Pro versions are paid upgrades. The standard FOVE0 software is also getting an update (this one is free): FOVE is adding official support for Ubuntu Linux, including all VR features, such as the FOVE Compositor and eye-tracking. For developers, FOVE’s SDKs for C, C++, C#, and Python have seen major API updates to expose the new features, along with the plugins for the Unity and Unreal engines.
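FOVE’s actual calibration procedure is not public, but the idea behind single-point calibration can be illustrated with a hypothetical sketch: the user fixates one known target, and the tracker estimates a constant offset to apply to subsequent raw gaze samples. None of the names below are FOVE API calls; they are assumptions for illustration only:

```python
def single_point_calibration(measured_gaze, target):
    """Estimate a constant gaze offset from one known fixation target.
    measured_gaze: averaged (x, y) gaze reading while the user fixates
    the target; target: the target's true (x, y) position."""
    return (target[0] - measured_gaze[0], target[1] - measured_gaze[1])

def apply_offset(raw_gaze, offset):
    """Correct a raw gaze sample with the estimated offset."""
    return (raw_gaze[0] + offset[0], raw_gaze[1] + offset[1])

# While fixating a target at screen centre (0.5, 0.5), the tracker
# reported (0.47, 0.53): estimate the offset and correct the sample.
offset = single_point_calibration((0.47, 0.53), (0.5, 0.5))
corrected = apply_offset((0.47, 0.53), offset)
print(corrected)  # ≈ (0.5, 0.5)
```

A single point can only correct a constant bias; multi-point calibration (FOVE’s traditional method) can additionally fit scale and rotation, which is why it is slower but more accurate.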

More information:

https://www.vrfocus.com/2020/10/fove-launches-v1-0-of-its-eye-tracking-headset

05 October 2020

Doll Play Improves Empathy and Social Skills in Kids

A team of researchers from Cardiff University has used neuroscience for the first time to explore the impact doll play has on children. In an 18-month study, the team monitored the brain activity of 33 children, aged between four and eight, as they played with dolls. They found that doll play activated parts of the brain that allow children to develop empathy and social information processing skills, even when they were playing alone. They saw far less activation of this part of the brain when the children played with tablet computers on their own. Researchers used an emerging neuroimaging technology called functional near-infrared spectroscopy (fNIRS) to scan brain activity while the children moved freely around. They found that the pSTS, a region of the brain associated with social information processing such as empathy, was activated even when children played with dolls on their own, regardless of gender. Dolls encourage children to create their own little imaginary worlds, as opposed to, say, problem-solving or building games. They encourage children to think about other people and how they might interact with each other.

In the study the play was split into different sections so the Cardiff team could capture the brain activity relating to each kind of play separately: playing with the dolls on their own; playing with the dolls together with a research assistant; playing with the tablet game on their own; and playing with the tablet game along with a research assistant. The dolls used included a diverse range of Barbies and sets. Tablet play was carried out using games that allow children to engage in open and creative play (rather than rule- or goal-based games) to provide a similar play experience to doll play. The study found that when children played alone with dolls, they showed the same levels of activation of the pSTS as they do when playing with others. When the children were left to play tablet games on their own there was far less activation of the pSTS, even though the games involved a considerable creative element.

More information:

https://neurosciencenews.com/doll-play-empathy-17118/

02 October 2020

Reverb G2 Omnicept Edition HMD

HP recently revealed the Reverb G2 Omnicept Edition, a virtual reality headset for businesses that captures biometric data for various uses. The Omnicept headset can track your pupil movement, mouth movement, and your heart rate. The HP Reverb G2 Omnicept Edition is based on the upcoming Reverb G2 VR headset and shares most of the core features, including the display resolution, speakers, and general form factor. The Omnicept Edition has a few upgrades that the basic model does not offer, including a wipeable PU leather cushion and a ratcheting adjustment system for the head strap.


The Omnicept Edition includes eye-tracking sensors from Tobii, which enable gaze-based interactions. The headset is also compatible with NVIDIA's foveated rendering technology, improving rendering performance while enhancing localized image fidelity with supersampling. The face camera continuously records the user's mouth, and developers can use the captured data to animate avatars with natural human expressions in real time. The facial expressions this camera captures can also tell a lot about the user's mood, especially when combined with their heart rate.
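As a conceptual illustration of the foveated rendering mentioned above (a simplified sketch, not NVIDIA's actual implementation), shading effort can be made to fall off with distance from the gaze point: full rate inside the foveal region, reduced rate in the periphery, with a blend in between. All radii and rates below are made-up example values:

```python
import math

def shading_rate(pixel, gaze, inner_radius, outer_radius):
    """Conceptual foveated-rendering falloff: full-rate shading (1.0)
    within inner_radius of the gaze point, quarter rate (0.25) beyond
    outer_radius, and a linear blend in between."""
    d = math.dist(pixel, gaze)
    if d <= inner_radius:
        return 1.0
    if d >= outer_radius:
        return 0.25
    t = (d - inner_radius) / (outer_radius - inner_radius)
    return 1.0 + t * (0.25 - 1.0)

print(shading_rate((960, 540), (960, 540), 100, 400))  # 1.0 at the gaze point
print(shading_rate((0, 0), (960, 540), 100, 400))      # 0.25 in the far periphery
```

In practice the rates come in discrete hardware-supported steps (e.g. variable rate shading tiles) rather than a continuous falloff, but the gaze-dependent gradient is the same idea.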

More information:

https://www.tweaktown.com/news/75425/hp-reverb-g2-omnicept-edition-has-face-camera-heart-beat-sensor/index.html