16 December 2018

Relationship Between Eyes and Mental Workload

The demands of workplace productivity have grown at a breakneck pace, and multitasking has become a common way to cope with the expectation that tasks be completed almost immediately. Previous studies on workload and productivity have considered physical aspects, such as how much a person walks or carries, but not a person's state of mind. Now, MU College of Engineering researchers have discovered that a person's eyes may offer a solution. They compared data from a workload metric developed by NASA for its astronauts with observations of pupillary response from participants in a lab study. Using a simulated control room of an oil and gas refinery, the researchers watched, through motion-capture and eye-tracking technology, as participants reacted to unexpected changes, such as alarms, while simultaneously monitoring gauges on two displays.
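The article does not name the metric, but the workload measure NASA developed for its crews is typically the NASA Task Load Index (NASA-TLX), which combines six subjective ratings into a weighted score. As a minimal sketch of how such a score is computed, assuming NASA-TLX and entirely hypothetical numbers:

```python
# Minimal sketch of a NASA-TLX weighted workload score. The metric is an
# assumption (the article only says "a workload metric developed by NASA").
# Ratings are 0-100; weights come from 15 pairwise comparisons, so they sum to 15.
ratings = {            # hypothetical ratings from one participant
    "mental_demand": 70, "physical_demand": 20, "temporal_demand": 65,
    "performance": 40, "effort": 60, "frustration": 55,
}
weights = {            # hypothetical tallies from the pairwise comparisons
    "mental_demand": 5, "physical_demand": 1, "temporal_demand": 4,
    "performance": 2, "effort": 2, "frustration": 1,
}
assert sum(weights.values()) == 15   # C(6, 2) comparisons in total

tlx = sum(ratings[d] * weights[d] for d in ratings) / 15
print(f"Weighted NASA-TLX workload: {tlx:.1f} / 100")
```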


During the scenario's simple tasks, the participants' eye-search behaviors were predictable. As the tasks became more complex and unexpected changes occurred, their eye behaviors became more erratic. By applying a measure called the fractal dimension to the data from this lab study, the researchers discovered a negative relationship between the fractal dimension of pupil dilation and a person's workload. This showed that pupil dilation could be used to indicate the mental workload of a person in a multitasking environment. The researchers hope this finding offers better insight into how systems should be designed to avoid mentally overloading workers and to build a safer working environment. One day it could give employers and educators alike a tool to determine how much stress a person can experience before they become fatigued and their performance begins to degrade.
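The article does not spell out which fractal-dimension estimator was used; one common choice for physiological time series is the Higuchi fractal dimension. The sketch below applies it to a hypothetical pupil-diameter trace (the `pupil` array is made-up data, not from the study):

```python
import numpy as np

def higuchi_fd(x, k_max=10):
    """Estimate the Higuchi fractal dimension of a 1-D time series."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    log_lengths = []
    for k in range(1, k_max + 1):
        lk = []
        for m in range(k):
            idx = np.arange(m, n, k)   # subsample: x[m], x[m+k], x[m+2k], ...
            if len(idx) < 2:
                continue
            length = np.abs(np.diff(x[idx])).sum()    # curve length at scale k
            norm = (n - 1) / ((len(idx) - 1) * k)     # Higuchi normalisation
            lk.append(length * norm / k)
        log_lengths.append(np.log(np.mean(lk)))
    # The fractal dimension is the slope of log L(k) against log(1/k)
    log_inv_k = np.log(1.0 / np.arange(1, k_max + 1))
    slope, _ = np.polyfit(log_inv_k, log_lengths, 1)
    return slope

# Hypothetical pupil-diameter trace (mm) sampled by an eye tracker
rng = np.random.default_rng(0)
pupil = 3.5 + 0.05 * np.cumsum(rng.standard_normal(600))
print(f"Higuchi fractal dimension: {higuchi_fd(pupil):.2f}")
```

For a random-walk-like trace such as this one the estimator lands near 1.5; smoother, more predictable traces score closer to 1, which is consistent with the reported negative relationship between fractal dimension and workload.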

More information:

13 December 2018

Video Games May Improve Empathy in Middle Schoolers

A fantastical scenario involving a space-exploring robot crashing on a distant planet is the premise of a video game developed for middle schoolers by University of Wisconsin-Madison researchers to study whether video games can boost kids' empathy, and to understand how learning such skills can change neural connections in the brain.


Results reveal for the first time that, in as few as two weeks, kids who played a video game designed to train empathy showed greater connectivity in brain networks related to empathy and perspective taking. Some also showed altered neural networks commonly linked to emotion regulation, a crucial skill that this age group is beginning to develop. The researchers obtained functional magnetic resonance imaging (fMRI) scans of participants in the laboratory, examining connections among areas of the brain, including those associated with empathy and emotion regulation.
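The article does not detail the analysis, but fMRI "connectivity" between brain areas is commonly quantified as the correlation between regional activity time courses. A minimal sketch under that assumption, using hypothetical data:

```python
import numpy as np

# Hypothetical BOLD time series (200 time points x 3 regions of interest),
# standing in for areas linked to empathy and emotion regulation.
rng = np.random.default_rng(1)
t = 200
shared = rng.standard_normal(t)                 # signal common to ROIs A and B
bold = np.column_stack([
    shared + 0.5 * rng.standard_normal(t),      # ROI A
    shared + 0.5 * rng.standard_normal(t),      # ROI B (coupled with A)
    rng.standard_normal(t),                     # ROI C (independent)
])

# Functional connectivity as the Pearson correlation between ROI time courses
fc = np.corrcoef(bold, rowvar=False)
print(np.round(fc, 2))   # expect a high A-B entry, near-zero entries for C
```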

More information:

11 December 2018

Flexible Electronic Skin for HCI

Electronic skin could be used for many applications, including prosthetic devices, wearable health monitors, robotics and virtual reality. A major challenge is transferring ultrathin electrical circuits onto complex 3D surfaces and then having the electronics be bendable and stretchable enough to allow movement. Some scientists have developed flexible electronic tattoos for this purpose, but their production is typically slow, expensive and requires clean-room fabrication methods such as photolithography. Researchers patterned a circuit template onto a sheet of transfer tattoo paper with an ordinary desktop laser printer. 


They then coated the template with silver paste, which adhered only to the printed toner ink. On top of the silver paste, the team deposited a gallium-indium liquid metal alloy that increased the electrical conductivity and flexibility of the circuit. Finally, they added external electronics, such as microchips, with a conductive glue made of vertically aligned magnetic particles embedded in a polyvinyl alcohol gel. They transferred the electronic tattoo to various objects and demonstrated several applications of the new method, such as controlling a robot prosthetic arm, monitoring human skeletal muscle activity and incorporating proximity sensors into a 3D model of a hand.

More information:

10 December 2018

Edge Hill Invited Talk

On the 5th of December 2018, I delivered an invited talk at the General Research Seminars held at the Department of Psychology, Edge Hill University, UK. I showcased how VR and AR technologies can be used for a number of emerging applications.


I first introduced the main concepts and principles behind VR and AR. In the second part, I demonstrated a number of novel applications from domains ranging from cultural heritage to health and psychology, originating from several research projects.

More information:

29 November 2018

GCH 2018 Paper

A few weeks ago, HCI Lab researchers and colleagues from the iMareCulture EU project published a peer-reviewed paper at the Eurographics Workshop on Graphics and Cultural Heritage, held in Vienna, Austria. The paper is entitled "Improving Marker-Based Tracking for Augmented Reality in Underwater Environments".

 
The paper introduces a new method, based on white balancing, that enhances underwater images to improve marker detection. The method was compared with several image-enhancement methods and their performance was evaluated. Results show that our method improves detection in underwater environments while keeping the computation time low.
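The paper's exact white-balancing algorithm is not reproduced here; as a rough illustration of the idea, the sketch below applies a simple gray-world white balance and runs marker detection before and after, using OpenCV's ArUco module (version 4.7+ API) as a stand-in marker system:

```python
import cv2
import numpy as np

aruco = cv2.aruco
dictionary = aruco.getPredefinedDictionary(aruco.DICT_4X4_50)

# Synthesize a test frame: an ArUco marker (with a white quiet zone)
# given a blue-green "underwater" cast by attenuating red.
marker = aruco.generateImageMarker(dictionary, 7, 200)
marker = cv2.copyMakeBorder(marker, 40, 40, 40, 40,
                            cv2.BORDER_CONSTANT, value=255)
frame = cv2.cvtColor(marker, cv2.COLOR_GRAY2BGR).astype(np.float32)
frame *= np.array([1.0, 0.8, 0.3])            # BGR: keep blue, cut red
frame = frame.astype(np.uint8)

def gray_world_balance(bgr):
    """Gray-world white balance: rescale each channel so its mean matches
    the overall mean, countering the colour cast of underwater images."""
    img = bgr.astype(np.float32)
    means = img.reshape(-1, 3).mean(axis=0)   # per-channel means (B, G, R)
    img *= means.mean() / means
    return np.clip(img, 0, 255).astype(np.uint8)

balanced = gray_world_balance(frame)

# Detect markers before and after enhancement; on real underwater footage
# the colour cast degrades detection far more than in this clean synthetic frame.
detector = aruco.ArucoDetector(dictionary)
_, ids_raw, _ = detector.detectMarkers(frame)
_, ids_wb, _ = detector.detectMarkers(balanced)
print("markers found:", 0 if ids_raw is None else len(ids_raw),
      "->", 0 if ids_wb is None else len(ids_wb))
```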

More information:

23 November 2018

Visual Heritage Expo Demo 2018

On 13-14 November 2018, I gave a presentation entitled "Underwater Virtual Reality Excavation" at the Visual Heritage Expo 2018 in Vienna, Austria. Members of the HCI Lab delivered a live demonstration, which many participants tried out. The demo presented an implementation of an immersive virtual environment for underwater archaeology in the form of a serious game. Its main focus was on expanding the player's knowledge by teaching them about maritime archaeology through two archaeological procedures: tagging and dredging.


The dredging procedure was accomplished by implementing and extending an existing voxel-based sand simulation approach and rasterizing it on the GPU using the marching cubes algorithm. The extension consisted of adding a custom dredging simulation step that removes sand from the grid, along with an alternative approach to sand slippage based on a custom heuristic. The game was developed for the HTC Vive head-mounted display and uses a room-scale experience, allowing the user to move freely in the virtual environment.
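The actual implementation is a GPU voxel simulation; the CPU sketch below only illustrates the two ideas named above, a dredging step that removes sand from the grid and a simple slippage heuristic, on a toy column-height field (all names and parameters are hypothetical, not the game's code):

```python
import numpy as np

class SandGrid:
    """Toy column-height sand field (the real simulation is voxel-based
    and rasterized with marching cubes on the GPU; this is a CPU sketch)."""

    def __init__(self, heights):
        self.h = np.asarray(heights, dtype=float)

    def dredge(self, x, y, radius=1, amount=0.5):
        """Dredging step: remove sand in a small neighbourhood of (x, y)."""
        x0, x1 = max(x - radius, 0), min(x + radius + 1, self.h.shape[0])
        y0, y1 = max(y - radius, 0), min(y + radius + 1, self.h.shape[1])
        self.h[x0:x1, y0:y1] = np.maximum(self.h[x0:x1, y0:y1] - amount, 0.0)

    def slip(self, max_slope=1.0, rate=0.25):
        """Slippage heuristic: where a column exceeds a neighbour by more
        than max_slope, move a fraction of the excess downhill.
        (np.roll wraps at the edges, which is fine for a toy demo.)"""
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            shifted = np.roll(self.h, (dx, dy), axis=(0, 1))
            flow = rate * np.clip(self.h - shifted - max_slope, 0.0, None)
            self.h -= flow                                      # erode source
            self.h += np.roll(flow, (-dx, -dy), axis=(0, 1))    # deposit downhill

grid = SandGrid(np.full((8, 8), 4.0))
grid.dredge(4, 4, radius=1, amount=2.0)   # player dredges a hole
for _ in range(5):
    grid.slip()                           # surrounding sand slumps inward
print(np.round(grid.h, 2))
```

Each `slip` pass conserves sand while relaxing steep edges, which is roughly the visual behaviour one wants when the dredge carves a pit and the surrounding material gradually slumps in.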

More information: