
16 July 2025

Interactive Media for Cultural Heritage

Recently, the latest edited book I co-authored with colleagues from CYENS – Centre of Excellence and the University of Cyprus was published in the Springer Series on Cultural Computing. The book is entitled ‘Interactive Media for Cultural Heritage’ and presents the full range of interactive media technologies and their applications in Digital Cultural Heritage. It offers a forum for interaction and collaboration among the interactive media and cultural heritage research communities.

[Image: the book cover of ‘Interactive Media for Cultural Heritage’]

The aim of this book is to provide a point of reference for the latest advancements in the different fields of interactive media applied in Digital Cultural Heritage research, ranging from visual data acquisition, classification, analysis and synthesis, 3D modelling and reconstruction, to new forms of interactive media presentation, visualization and immersive experience provision via extended reality, collaborative spaces, serious games and digital storytelling.

More information:

https://link.springer.com/book/10.1007/978-3-031-61018-9

28 November 2024

Lollipop-Shaped Device for VR Taste

Researchers at the City University of Hong Kong have developed a new interface to simulate taste in virtual reality and other extended reality (XR) applications. The lollipop-shaped, lickable device can produce nine different flavors: sugar, salt, citric acid, cherry, passion fruit, green tea, milk, durian, and grapefruit. Each flavor is produced by food-grade chemicals embedded in a pocket of agarose gel. When a voltage is applied to the gel, the chemicals are carried to the surface in a liquid that mixes with saliva on the tongue, much like licking a real lollipop. Increasing the voltage produces a stronger flavor.

Initially, the researchers tested several methods for simulating taste, including electrostimulating the tongue. The other methods each came with limitations, such as being too bulky or less safe, so the researchers opted for chemical delivery through a process called iontophoresis, which moves chemicals and ions through hydrogels and has a low electrical-power requirement. With a maximum of 2 volts, the device stays well below the 30 V threshold commonly cited as the point at which a substantial shock becomes possible.
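
To make the voltage–flavor relationship concrete, here is a minimal Python sketch of a hypothetical driver that scales a desired per-flavor intensity to a drive voltage clamped at the reported 2 V maximum. Only the flavor list, the 2 V ceiling, and the 30 V safety figure come from the article; the linear mapping and the drive_voltages helper are assumptions for illustration, not the researchers' actual control scheme.

```python
# Hypothetical sketch: scale per-flavor intensities (0..1) to drive voltages,
# clamped to the 2 V device maximum (well under the 30 V safety threshold).
FLAVORS = ["sugar", "salt", "citric acid", "cherry", "passion fruit",
           "green tea", "milk", "durian", "grapefruit"]
MAX_VOLTAGE = 2.0  # reported device maximum, in volts


def drive_voltages(intensities: dict[str, float]) -> dict[str, float]:
    """Map requested flavor intensities to clamped per-channel voltages."""
    voltages = {}
    for flavor, level in intensities.items():
        if flavor not in FLAVORS:
            raise ValueError(f"unknown flavor: {flavor}")
        level = min(max(level, 0.0), 1.0)       # clamp intensity to [0, 1]
        voltages[flavor] = round(level * MAX_VOLTAGE, 3)
    return voltages


# Example: a mostly-cherry lick with a hint of citric acid.
print(drive_voltages({"cherry": 0.8, "citric acid": 0.3}))
```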

More information:

https://spectrum.ieee.org/virtual-reality-taste

29 September 2024

MTI 2024 Article

Recently, I co-authored an open-access journal paper published in Multimodal Technologies and Interaction (MDPI). The paper is entitled “Extended Reality Educational System with Virtual Teacher Interaction for Enhanced Learning”. The paper introduces an interactive XR intelligent assistant featuring a virtual teacher that interacts dynamically with PowerPoint presentations using OpenAI’s ChatGPT API.

It incorporates multilingual speech-to-text and text-to-speech capabilities, a custom lip-syncing solution, eye gaze, head rotation, and gestures. Panoramic images can be used as a skybox, giving the illusion that the AI assistant is located somewhere else. Findings from three pilot studies indicate that the proposed technology has considerable potential as an additional tool for enhancing the learning process.
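
As a rough illustration of the kind of pipeline the paper describes, the sketch below shows one way a slide's text could be turned into a spoken teacher explanation with the OpenAI Python client: a chat-completion call to generate the explanation, followed by a text-to-speech call to produce audio for the avatar to lip-sync against. The prompt wording, model choices, and function names are my own assumptions for illustration, not the paper's implementation.

```python
# Illustrative sketch (not the paper's code): generate and voice an
# explanation of one PowerPoint slide using OpenAI's chat and TTS endpoints.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def explain_slide(slide_text: str, language: str = "English") -> str:
    """Ask the chat model for a short, teacher-style explanation of a slide."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": f"You are a virtual teacher. Answer in {language}."},
            {"role": "user",
             "content": f"Explain this slide to students:\n{slide_text}"},
        ],
    )
    return response.choices[0].message.content


def speak(text: str, out_path: str = "explanation.mp3") -> str:
    """Convert the explanation to audio for the avatar to lip-sync against."""
    audio = client.audio.speech.create(model="tts-1", voice="alloy", input=text)
    audio.write_to_file(out_path)
    return out_path


if __name__ == "__main__":
    explanation = explain_slide("Newton's second law: F = ma")
    speak(explanation)
```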

More information:

https://www.mdpi.com/2414-4088/8/9/83

22 September 2024

Apple Vision Pro’s Eye Tracking Sees What People Type

A group of six computer scientists has revealed a new attack against Apple’s Vision Pro mixed reality headset in which exposed eye-tracking data allowed them to decipher what people entered on the device’s virtual keyboard. The attack allowed the researchers to reconstruct passwords, PINs, and messages that people typed with their eyes.

The researchers did not gain access to Apple’s headset to see what its users were viewing. Instead, they worked out what people were typing by remotely analyzing the eye movements of the virtual avatar created by the Vision Pro. This avatar can appear in Zoom calls, Teams, Slack, Reddit, Tinder, Twitter, Skype, and FaceTime.
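
Conceptually, the attack treats the avatar's eye movements as a pointer hovering over the virtual keyboard. The Python sketch below is purely a toy illustration of that idea: snapping gaze fixation points to the nearest key on an assumed 2D keyboard layout. It is not the researchers' method, which has to infer fixations, keyboard pose, and timing from the avatar's rendered eyes; the key coordinates here are made up.

```python
# Toy illustration only: snap gaze fixation points to the nearest key on an
# assumed normalised keyboard layout and read off the typed string.
import math

# Hypothetical key centres in normalised keyboard coordinates (x, y).
KEY_CENTRES = {
    "q": (0.05, 0.1), "w": (0.15, 0.1), "e": (0.25, 0.1), "r": (0.35, 0.1),
    "a": (0.10, 0.3), "s": (0.20, 0.3), "d": (0.30, 0.3),
}


def nearest_key(gaze_point: tuple[float, float]) -> str:
    """Return the key whose centre is closest to the gaze point."""
    return min(KEY_CENTRES,
               key=lambda k: math.dist(KEY_CENTRES[k], gaze_point))


def decode_fixations(fixations: list[tuple[float, float]]) -> str:
    """Map a sequence of fixation points to a typed string."""
    return "".join(nearest_key(p) for p in fixations)


# Example: three fixations that land near 'w', 'a', 's'.
print(decode_fixations([(0.16, 0.12), (0.09, 0.28), (0.21, 0.33)]))
```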

More information:

https://www.wired.com/story/apple-vision-pro-persona-eye-tracking-spy-typing/