29 November 2018

GCH 2018 Paper

A few weeks ago, HCI Lab researchers and colleagues from the iMareCulture EU project published a peer-reviewed paper at the Eurographics Workshop on Graphics and Cultural Heritage, held in Vienna, Austria. The paper is entitled "Improving Marker-Based Tracking for Augmented Reality in Underwater Environments".

 
The paper introduces a new method based on white balancing that enhances underwater images in order to improve marker detection. The method was compared against several existing image enhancement methods and their performance was evaluated. The results show that our method improves detection in underwater environments while keeping computation time low.
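
The enhancement algorithm itself is described in the paper; as a rough, hypothetical illustration of the overall pipeline (white-balance the frame, then detect markers), the sketch below applies a simple gray-world white balance before running OpenCV's ArUco detector. The gray-world correction and the input file name are assumptions for illustration only, not the method proposed in the paper.

```python
# Illustrative pipeline only: enhance an underwater frame with a simple
# gray-world white balance, then look for ArUco markers (requires
# opencv-contrib-python for the aruco module).
import cv2
import numpy as np

def gray_world_white_balance(bgr):
    """Scale each colour channel so its mean matches the overall mean."""
    img = bgr.astype(np.float32)
    channel_means = img.reshape(-1, 3).mean(axis=0)    # mean of B, G, R
    target = channel_means.mean()                      # global gray level
    balanced = img * (target / channel_means)          # per-channel gain
    return np.clip(balanced, 0, 255).astype(np.uint8)

frame = cv2.imread("underwater_frame.jpg")             # hypothetical input image
enhanced = gray_world_white_balance(frame)

aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
gray = cv2.cvtColor(enhanced, cv2.COLOR_BGR2GRAY)
corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
print("markers detected:", 0 if ids is None else len(ids))
```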

More information:

23 November 2018

Visual Heritage Expo Demo 2018

On 13-14 November 2018, I gave a presentation entitled "Underwater Virtual Reality Excavation" at the Visual Heritage Expo 2018 in Vienna, Austria. Members of the HCI Lab delivered a live demonstration, which many participants tried out. The demo presented an implementation of an immersive virtual environment for underwater archaeology in the form of a serious game. The main focus is on expanding the player’s knowledge by teaching them about maritime archaeology and having them perform two archaeological procedures: tagging and dredging.


The dredging procedure was accomplished by implementing and extending an existing voxel-based sand simulation approach and rendering it on the GPU using the marching cubes algorithm. The extension of the simulation consisted of adding a custom dredging step that removes sand from the grid, together with an alternative approach to sand slippage based on a custom heuristic. The game is developed for the HTC Vive head-mounted display and offers a room-scale experience, allowing the user to move freely in the virtual environment. A simplified sketch of the dredging idea is shown below.
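
As a rough, hypothetical illustration of what a dredging step on a voxel grid might look like (not the actual GPU implementation used in the demo), consider a density grid from which a spherical nozzle removes sand each frame; the surface would then be re-extracted with marching cubes.

```python
# Toy dredging step on a voxel sand grid. In the real demo this runs on the
# GPU and the resulting density field is re-meshed with marching cubes; this
# only shows the idea of removing density around a nozzle tip.
import numpy as np

sand = np.ones((64, 64, 64), dtype=np.float32)   # 1.0 = full voxel, 0.0 = empty

def dredge(sand, nozzle, radius=3.0, rate=0.2):
    """Reduce sand density inside a sphere centred on the nozzle (voxel coords)."""
    coords = np.indices(sand.shape, dtype=np.float32)                 # (3, X, Y, Z)
    offsets = coords - np.asarray(nozzle, dtype=np.float32).reshape(3, 1, 1, 1)
    inside = (offsets ** 2).sum(axis=0) < radius ** 2
    sand[inside] = np.maximum(sand[inside] - rate, 0.0)               # clamp at empty
    return sand

# One simulated frame with the nozzle touching the top of the sand bed.
sand = dredge(sand, nozzle=(32, 32, 63))
```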

More information:

19 November 2018

VR Simulation of Black Hole

The black hole at the centre of our galaxy, Sagittarius A*, has been visualised in virtual reality for the first time. Scientists at Radboud University in the Netherlands and Goethe University in Germany used recent astrophysical models of Sagittarius A* to create a series of images that were then combined into a 360-degree virtual reality simulation of the black hole, which can be viewed on widely available VR consoles.

Researchers suggest that this virtual reality simulation could be useful for studying black holes. Traveling to a black hole in our lifetime is impossible, so immersive visualizations like this can help us understand more about these systems from where we are. They also suggest that the virtual reality simulation could help encourage the general public, including children, to take an interest in astrophysics.

More information:

11 November 2018

Oculus Patent for Light Field Cameras for Eye Tracking

Facebook’s Oculus has patented an eye tracking technique that uses light field cameras inside the headset. Most previous eye tracking systems used a regular or infrared camera combined with an IR illuminator to keep the eyes lit. A light field camera differs from a regular camera in that it also captures the direction in which light is travelling. This directional information can be used to recover the depth of the image, and thus the 3D shape of the eye, instead of just its color and brightness. Knowing the 3D shape of the eye lets the system work out where the pupil is relative to the eye itself, and thus produce a more accurate estimate of the user’s gaze direction than the apparent 2D shape of the pupil alone.
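
The patent details how the light field data is actually processed; as a purely geometric illustration of why a 3D model of the eye helps, the toy sketch below derives a gaze direction from an assumed eyeball centre and pupil centre in 3D. Both points and the overall simplification are assumptions, not the patented method.

```python
# Geometric toy: if the eyeball centre and pupil centre are known in 3D,
# the optical axis (a rough proxy for gaze direction) is the unit vector
# from one to the other.
import numpy as np

def gaze_direction(eyeball_center, pupil_center):
    """Unit vector pointing from the eyeball centre through the pupil."""
    v = np.asarray(pupil_center, dtype=float) - np.asarray(eyeball_center, dtype=float)
    return v / np.linalg.norm(v)

# Hypothetical coordinates in millimetres, in headset camera space.
print(gaze_direction([0.0, 0.0, 0.0], [2.1, -0.5, 11.8]))
```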


Eye tracking can greatly enhance the feeling of social presence in multiplayer VR, but its most promising use case is foveated rendering. With foveated rendering, only the region you are looking at is drawn at full resolution, while the rest of the scene in your peripheral view is rendered in low detail. This works because human vision is only sharp at the very center of the visual field: look at some text in the room you are in right now, then look just a few feet to the side of it and try to read it again. Foveated rendering should one day enable much higher resolution VR headsets without requiring an expensive top-of-the-line graphics card, so finding a way to make it work reliably is crucial to the future of VR. The toy example below illustrates the idea.
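
Production systems do this at the GPU level (e.g. variable rate shading or multi-resolution buffers); the sketch below only illustrates the concept by assigning each screen tile a level of detail based on its distance from the gaze point. The tile size and distance thresholds are arbitrary illustrative values, not figures from any shipping headset.

```python
# Toy foveation map: level-0 tiles (near the gaze point) get full shading
# resolution, higher levels get progressively coarser shading.
import numpy as np

def foveation_map(width, height, gaze_xy, tile=32):
    """Return a 2D array of detail levels per screen tile (0 = full detail)."""
    tiles_x, tiles_y = width // tile, height // tile
    lod = np.zeros((tiles_y, tiles_x), dtype=np.int32)
    for ty in range(tiles_y):
        for tx in range(tiles_x):
            cx, cy = (tx + 0.5) * tile, (ty + 0.5) * tile   # tile centre in pixels
            dist = np.hypot(cx - gaze_xy[0], cy - gaze_xy[1])
            lod[ty, tx] = 0 if dist < 150 else 1 if dist < 400 else 2
    return lod

lod = foveation_map(1920, 1080, gaze_xy=(960, 540))   # gaze at screen centre
```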

More information:

08 November 2018

iMareCulture 24M Project Meeting

From the 6th to the 7th of November 2018, the 24-month (annual) meeting of the iMareCulture EU project took place at the Istituto Superiore per la Conservazione ed il Restauro in Rome, Italy.


During the meeting, important issues regarding the progress of the project were discussed, including (but not limited to) virtual reality serious games, underwater augmented reality and storytelling.

More information:

04 November 2018

Digital Actors for Making Movies After Death

Visual-effects company Digital Domain, which has worked on major pictures like Avengers: Infinity War and Ready Player One, has also taken on individual celebrities as clients, though it hasn’t publicized the service. The suite of services the company offers actors includes a range of scans that capture their famous faces from every conceivable angle, making it simpler to re-create them in the future. Using hundreds of custom LED lights arranged in a sphere, the system records numerous images in seconds, capturing what a person’s face looks like when lit from every angle, right down to the pores.


The lights can also emit different colors, emulating a variety of outdoor conditions in which the digital human may be placed. This allows for more detailed and accurately colored, shaded, and reflective skin; the scans essentially capture how subdermal blood flow changes across the face. Even with all the advances in CGI, digitally re-created people do not look perfect yet, so detailed and sometimes frame-by-frame adjustments have to be made when details are not picked up by the model generated from the live actor. But the special effects continue to improve, and with more actors preserving their digital likeness at a young age, the process could become easier. The sketch below illustrates the relighting principle behind this kind of capture.
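
The underlying principle is shared with light-stage capture in general: because light transport is linear, photographs taken one light at a time can be recombined to relight a face under a new environment. The sketch below shows that recombination step in its simplest form; the array shapes and weights are illustrative assumptions, not Digital Domain's pipeline.

```python
# Relighting as a weighted sum of one-light-at-a-time (OLAT) captures.
# olat: (n_lights, H, W, 3) photos, one per LED; weights: (n_lights, 3) RGB
# intensity of each LED under the target lighting environment.
import numpy as np

def relight(olat, weights):
    """Linear combination of OLAT images -> relit image of shape (H, W, 3)."""
    return np.einsum('lc,lhwc->hwc', weights, olat)

# Tiny synthetic example: 4 lights and an 8x8 image, with a warm key light.
rng = np.random.default_rng(0)
olat = rng.random((4, 8, 8, 3), dtype=np.float32)
weights = np.array([[1.0, 0.8, 0.6],      # warm key light
                    [0.2, 0.2, 0.3],      # cool fill light
                    [0.0, 0.0, 0.0],      # switched off
                    [0.1, 0.1, 0.1]], dtype=np.float32)
relit = relight(olat, weights)
print(relit.shape)   # (8, 8, 3)
```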

More information: