04 December 2019

Entertainment Computing 2019 Article

Recently, a collaborative paper between De Montfort University, Coventry University and Masaryk University (HCI Lab) was published in the Elsevier journal Entertainment Computing. The paper is entitled “Assessing the perceived realism of agent grouping dynamics for adaptation and simulation” and focuses on applications targeting human interaction. It presents a novel method that uses psychophysics to assess the perceived realism of behavioural features of virtual crowds.


Focus is given to the grouping dynamics feature, whereby crowd composition, in terms of group frequency and density, is evaluated across thirty-six conditions based on crowd data captured from three pedestrianised real-world locations. The study, conducted with seventy-eight healthy participants, allowed perceptual thresholds to be calculated and identified the configurations that appear most real to human viewers. Results suggest that viewers have more perceptual flexibility when group frequency and density are increased rather than decreased.
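As a rough illustration of how perceptual thresholds of this kind are commonly estimated (the function, stimulus levels and response rates below are illustrative assumptions, not the paper's actual procedure or results), a psychometric curve can be fitted to the proportion of "looks real" judgements across conditions and the threshold read off from its midpoint:

    # Minimal sketch: fit a logistic psychometric function to hypothetical
    # "looks real" response rates and report the 50% threshold.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(x, alpha, beta):
        # alpha = threshold (level at which 50% of viewers judge the crowd real)
        # beta  = slope of the psychometric function
        return 1.0 / (1.0 + np.exp(-beta * (x - alpha)))

    # Hypothetical stimulus levels (group density relative to the captured data)
    # and the fraction of participants judging each condition as real.
    levels = np.array([0.25, 0.5, 0.75, 1.0, 1.5, 2.0])
    p_real = np.array([0.15, 0.35, 0.55, 0.80, 0.90, 0.95])

    (alpha, beta), _ = curve_fit(logistic, levels, p_real, p0=[1.0, 1.0])
    print(f"Estimated perceptual threshold: {alpha:.2f}x the captured group density")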

More information:

27 November 2019

Connect Your Brain to a Smartphone

Neuralink is trying to build an interface that enables someone's brain to control a smartphone or computer, and to make the process as safe and routine as LASIK surgery. So far, Neuralink has only experimented on animals. In these experiments, the company used a surgical robot to embed a tiny probe into a rat brain, with about 3,100 electrodes distributed over some 100 flexible wires or threads, each of which is significantly thinner than a human hair.


This device can record the activity of neurons, which could help scientists learn more about the functions of the brain, specifically in the realm of disease and degenerative disorders. The device has also been tested on at least one monkey, which was able to control a computer with its brain. Neuralink's experiments involve embedding a probe into the animal's brain through invasive surgery, using a sewing-machine-like robot that drills holes into the skull.

More information:

24 November 2019

VR Interface Allows Long Distance Touch

This new interface is a type of haptic device, a technology that remotely conveys tactile signals. A common example is video game controllers that vibrate when the player’s avatar takes a hit. Some researchers think more advanced, wearable versions of such interfaces will become a vital part of making virtual and augmented reality experiences feel like they are really happening. Researchers developed a vibrating disk, only a couple of millimeters thick, that can run on very little energy. These actuators (a term for devices that give a system physical motion) need so little energy that they can be powered by near-field communication, a wireless method of transferring small amounts of power that is typically used for applications like unlocking a door with an ID card.


The resulting product looks like a lightweight, soft patch of fabric-like material that can flex and twist like a wet suit, maintaining direct contact with the wearer’s skin as their body moves. It consists of thin layers of electronics sandwiched between protective silicone sheets. One layer contains the near-field communication technology that powers the device. This can activate another layer: an array of actuators, each of which can be activated individually and tuned to different vibration frequencies to convey a stronger or weaker sensation. This stack of electronics, slightly thinner than a mouse pad, culminates in a tacky surface that sticks to the skin. Researchers have tested prototype patches of different shapes and sizes to fit on various parts of the body.
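As a toy illustration of the individually addressable actuator array described above (the class names, frequency band and intensity-to-frequency mapping are assumptions made for illustration, not the researchers' actual firmware or API), such a patch could be driven roughly as follows:

    # Illustrative sketch: map a desired sensation intensity (0..1) for each
    # actuator in a patch to a drive frequency, assuming stronger sensations
    # correspond to higher vibration frequencies within a fixed band.
    from dataclasses import dataclass

    @dataclass
    class Actuator:
        row: int
        col: int
        freq_hz: float = 0.0  # currently commanded vibration frequency (0 = off)

    def set_pattern(actuators, intensity_map, f_min=50.0, f_max=300.0):
        # Assign each actuator a frequency proportional to its requested intensity.
        for act in actuators:
            intensity = intensity_map.get((act.row, act.col), 0.0)
            act.freq_hz = f_min + intensity * (f_max - f_min) if intensity > 0 else 0.0

    # Example: a 2x2 patch where the top-left actuator vibrates strongly
    # and the bottom-right actuator vibrates weakly.
    patch = [Actuator(r, c) for r in range(2) for c in range(2)]
    set_pattern(patch, {(0, 0): 0.9, (1, 1): 0.3})
    for act in patch:
        print(act)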

More information:

20 November 2019

VRST 2019 Paper

On the 14th of November 2019, we presented a co-authored paper entitled "A Mobile Augmented Reality Interface for Teaching Folk Dances". The paper was presented at the 25th ACM Symposium on Virtual Reality Software and Technology (VRST 2019), held in Sydney, Australia, and the work was supported by the EU project Terpsichore. The paper presents a prototype mobile augmented reality interface for assisting the process of learning folk dances.


As a case study, a folk dance was digitized based on recordings from professional dancers. To assess the effectiveness of the technology, it was comparatively evaluated against a large back-projection system under laboratory conditions. Sixteen participants took part in the study; their movements were captured using a motion capture system and then compared with the recordings from the professional dancers. Experimental results indicate that AR has the potential to be used for learning folk dances.
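As a rough sketch of how a learner's capture might be compared against a professional dancer's reference recording (the resampling strategy and error metric below are illustrative assumptions, not necessarily the analysis used in the paper):

    # Minimal sketch: resample two motion-capture sequences to a common length,
    # then compute the mean per-joint position error between them.
    import numpy as np

    def mean_joint_error(learner, reference, n_samples=100):
        # learner, reference: arrays of shape (frames, joints, 3), positions in metres.
        def resample(seq):
            idx = np.linspace(0, len(seq) - 1, n_samples)
            lo = np.floor(idx).astype(int)
            hi = np.ceil(idx).astype(int)
            frac = (idx - lo)[:, None, None]
            return (1 - frac) * seq[lo] + frac * seq[hi]  # linear interpolation

        a = resample(np.asarray(learner, dtype=float))
        b = resample(np.asarray(reference, dtype=float))
        return np.mean(np.linalg.norm(a - b, axis=-1))  # average error in metres

    # Example with random data standing in for real captures (20 joints in 3D).
    rng = np.random.default_rng(0)
    learner = rng.normal(size=(240, 20, 3))      # learner capture, 240 frames
    reference = rng.normal(size=(300, 20, 3))    # professional recording, 300 frames
    print(f"Mean joint position error: {mean_joint_error(learner, reference):.3f} m")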

More information: