25 February 2020

Applied Sciences 2020 Article

Recently, the HCI Lab at Masaryk University, Brno, Czech Republic published a scientific paper in the Special Issue Virtual Reality and Its Application in Cultural Heritage of Applied Sciences, published by MDPI. The paper is entitled “Investigating the Learning Process of Folk Dances Using Mobile Augmented Reality”. It presents a prototype mobile augmented reality application for assisting the process of learning folk dances. In the first stage, one folk dance was digitized using motion tracking technology based on recordings from professional dancers.


Avatar representations are synchronized with the digital representation of the dance. To assess the effectiveness of mobile augmented reality, it was comparatively evaluated against a large back-projection system under laboratory conditions. Twenty healthy participants took part in the study, and their movements were captured using a motion capture system and then compared with the recordings from the professional dancers. Experimental results indicate that augmented reality (AR) has the potential to be used in the learning process.
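A minimal sketch of how a learner's captured movements might be scored against the professional recordings (the paper's actual comparison metric is not described here; the frame layout, function names, and simple per-joint Euclidean average are illustrative assumptions):

```python
from math import sqrt

def frame_distance(frame_a, frame_b):
    """Mean Euclidean distance between corresponding joints of two poses.

    Each frame is a list of (x, y, z) joint positions in the same order."""
    total = 0.0
    for (xa, ya, za), (xb, yb, zb) in zip(frame_a, frame_b):
        total += sqrt((xa - xb) ** 2 + (ya - yb) ** 2 + (za - zb) ** 2)
    return total / len(frame_a)

def sequence_score(learner, expert):
    """Average per-frame distance: lower means the learner tracks the
    expert more closely. Assumes both motion-capture sequences are
    time-aligned and of equal length."""
    dists = [frame_distance(a, b) for a, b in zip(learner, expert)]
    return sum(dists) / len(dists)
```

In practice the two recordings would first need temporal alignment (learners rarely dance at exactly the expert's tempo), but the per-joint distance above conveys the basic idea of the comparison.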

More information:

22 February 2020

Sony Patents VR Controller With Finger Tracking

A recently published patent from Sony Interactive Entertainment suggests that the company is working on a new VR motion controller, similar to the controllers for the Valve Index. The patent for a Controller Device outlines an input mechanism for a home-use game machine that detects movement of a user’s hand. Not only that, but the kit features a plurality of sensor units that detect the fingers of the user. These can detect the proximity or contact of a finger and output a finger detection signal indicating the state of proximity or contact.


The sensor units mentioned above are placed where the fingers would rest, just like the sensors on the Index controllers or those embedded in an Oculus Touch controller. When the fingers are wrapped around the sensors, the controllers can relay that information to a given VR game and render the user’s in-game hands as making a fist or grabbing an object. When the sensors can’t find a finger, they’ll assume it’s extended outwards and reflect that in-app. This could indicate that Sony is looking to implement this type of finger tracking into new motion controllers.
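Illustratively, the per-finger detection signals described in the patent could be mapped to an in-game hand pose along these lines (the signal range, threshold, and function names are assumptions for the sketch, not anything specified in the patent):

```python
def hand_pose(finger_signals, contact_threshold=0.8):
    """Classify each finger as 'curled' or 'extended' from a proximity
    signal in [0, 1] (1.0 = finger touching the sensor). When a sensor
    reports no finger nearby, the finger is assumed extended, as the
    patent describes.

    Returns a per-finger state dict plus an overall gesture label."""
    fingers = ["thumb", "index", "middle", "ring", "pinky"]
    states = {name: ("curled" if signal >= contact_threshold else "extended")
              for name, signal in zip(fingers, finger_signals)}
    if all(s == "curled" for s in states.values()):
        gesture = "fist"
    elif all(s == "extended" for s in states.values()):
        gesture = "open hand"
    else:
        gesture = "partial grip"
    return states, gesture
```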

More information:

19 February 2020

Sandbox VR SDK

Sandbox VR, the location-based VR attraction company, will soon be opening up to third-party developers, as the company will be releasing an SDK for its Sandbox ‘holodeck’ VR attraction platform. Sandbox operates a number of VR locations in major cities across North America as well as Hong Kong, Singapore, Macau, and Jakarta. Combining branded content such as its Star Trek: Discovery experience with in-house developed games, Sandbox offers its experiences in 20-minute gameplay chunks for around $40 per person, accommodating up to six people per session. The company says in a blog post that anyone with the know-how will soon be able to develop new VR experiences for its location-based attractions using its upcoming SDK.


Sandbox’s locations make use of a few technologies that developers likely don’t have, such as the company’s haptic guns and its multi-camera motion capture system. Sandbox says, however, that developers can create content using more modest setups such as an Oculus Rift or HTC Vive. Since professional motion capture can cost thousands of dollars, Beck says the company’s framework is going to abstract away that component and put in placeholders, so developers can still build for VR without these expensive systems, with full confidence that things will translate correctly when deployed to the company’s full-body motion-captured holodeck. Furthermore, the upcoming networking framework will make it possible to create a mocked-up, multi-user development environment for testing and building experiences.
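One way such a placeholder abstraction might look (purely a sketch: the class names, joint set, and fixed-offset skeleton below are invented for illustration and are not Sandbox's actual SDK):

```python
class BodyTracker:
    """Common interface a game targets: joint positions per player."""
    def joints(self, player_id):
        raise NotImplementedError

class PlaceholderTracker(BodyTracker):
    """Development stand-in: derives a coarse skeleton from the headset
    and controller poses a desktop VR kit (e.g. a Rift or Vive) already
    provides, so no motion capture hardware is needed while building."""
    def __init__(self, headset_pose, controller_poses):
        self.headset_pose = headset_pose          # (x, y, z) of the HMD
        self.controller_poses = controller_poses  # [(x, y, z) left, right]

    def joints(self, player_id):
        # Head comes straight from the headset, hands from the
        # controllers; the rest is a fixed-offset guess below the head.
        hx, hy, hz = self.headset_pose
        return {
            "head": (hx, hy, hz),
            "left_hand": self.controller_poses[0],
            "right_hand": self.controller_poses[1],
            "hips": (hx, hy - 0.7, hz),
        }
```

Game logic written against the `BodyTracker` interface could then run unchanged when the placeholder is swapped for the holodeck's full-body motion capture at deployment.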

More information:

11 February 2020

EEG Determines if Antidepressants Work

People getting treated for depression often have to suffer through months of trial-and-error testing of different drugs to see which of them—if any—will help. For a long time, scientists and clinicians have hoped for a biological means of diagnosing depression or predicting which patients will do better on a given treatment. A new study takes a step toward the latter kind of prediction by using the noninvasive technique of electroencephalography (EEG) to find a distinctive signature of who will benefit from one common antidepressant. The study followed more than 300 people with depression as they began taking the drug sertraline (Zoloft) or a placebo. A computer algorithm could distinguish the EEGs of those who fared well on the drug from those who did not. Trained on one group, the algorithm also effectively predicted results in several others.
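As a toy illustration of the train-on-one-group, predict-on-another setup, a classifier could learn a cutoff on a single EEG summary feature (the feature, the threshold rule, and all names below are assumptions for the sketch; the study's actual algorithm works on far richer EEG data):

```python
def fit_threshold(features, responded):
    """Pick the cutoff on one EEG summary feature (e.g. a resting-state
    power measure) that best separates responders from non-responders
    in the training group."""
    best_cut, best_acc = None, -1.0
    for cut in sorted(set(features)):
        preds = [f >= cut for f in features]
        acc = sum(p == r for p, r in zip(preds, responded)) / len(responded)
        if acc > best_acc:
            best_cut, best_acc = cut, acc
    return best_cut

def predict(features, cut):
    """Apply the trained cutoff to a new, held-out group of patients."""
    return [f >= cut for f in features]
```

The key property mirrored here is generalization: the cutoff is fit on one cohort and then evaluated, unchanged, on others.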


The work is preliminary and needs to be confirmed with further studies and expanded to include other treatments, such as different antidepressants, transcranial magnetic stimulation and psychotherapy. Right now doctors give patients whichever antidepressant they like best, and then—for all choices in this class of drugs—they have to wait six to eight weeks to know whether it is working or not. If the drug does not work well, it might be another six weeks before they know whether a different dose or a new drug is more effective. Meanwhile many of the people who seek medication are at risk for suicide or too depressed to function normally. Today about 40 percent of patients will respond to the first drug they are given. In the study, about 65 percent of patients whose EEG signature suggested they would respond well to sertraline did so.

More information:

09 February 2020

AuraRing - Ring Tracking

The University of Washington created AuraRing, a ring and wristband combination that can detect the precise location of someone's index finger and continuously track hand movements. The ring emits a signal that is picked up by the wristband, which can then identify the position and orientation of the ring—and the finger it's attached to. AuraRing is composed of a coil of wire wrapped 800 times around a 3D-printed ring. A current running through the wire generates a magnetic field, which is picked up by three sensors on the wristband. Based on the values the sensors detect, the researchers can continuously identify the exact position of the ring in space. From there, they can determine where the user's finger is located.
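A rough sketch of the underlying geometry: each sensor's measured field magnitude maps to a distance via the dipole field's roughly inverse-cube falloff, and three such distances can then be intersected to locate the ring (the calibration constant `k`, the planar simplification, and the function names are illustrative assumptions, not the team's actual solver):

```python
def distance_from_field(b, k=1.0):
    """A magnetic dipole's field magnitude falls off roughly as 1/r^3,
    so a measured magnitude b maps back to a distance r = (k / b)^(1/3),
    where k is a calibration constant for the coil."""
    return (k / b) ** (1.0 / 3.0)

def trilaterate(p1, p2, p3, r1, r2, r3):
    """2-D trilateration: intersect three distance circles centred on
    the wristband's sensors (positions given in the band's plane).
    Subtracting the circle equations pairwise yields a linear system."""
    x1, y1 = p1; x2, y2 = p2; x3, y3 = p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

The real system also recovers the ring's orientation and works in three dimensions, but the inverse-cube falloff plus multi-sensor intersection captures the core idea.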


With continuous tracking, AuraRing can pick up handwriting, potentially for short responses to text messages—or allow someone to have a virtual reality avatar hand that mimics what they're doing with their actual hand. In addition, because AuraRing uses magnetic fields, it can still track hands even when they are out of sight, such as when a user is on a crowded bus and can't reach their phone. The researchers designed AuraRing to be ready to use as soon as it comes out of the box and not be dependent on a specific user. They tested the system on 12 participants with different hand sizes. The team compared the actual location of a participant's finger to where AuraRing said it was. Most of the time, the system's tracked location agreed with the actual location within a few millimeters.

More information:

08 February 2020

Canon AR HMD MREAL Display MD-20

Canon unveiled its next enterprise-focused AR headset, which aims to replace the MREAL Display MD-10, launched in Japan in mid-2016 for the astounding price of ¥9 million (~$82,300). The PC-tethered AR headset, dubbed MREAL Display MD-20, doesn’t have a release date or price yet, although Canon is showing off the device at this year’s 3D & Virtual Reality Exhibition (IVR), held at Makuhari Messe in Chiba, Japan from February 26th to 28th. The news was first reported by Japanese publication MoguraVR. Like its predecessor, the MREAL Display MD-20 will be sold on the Japanese market, and is couched as a business support tool for manufacturing industries such as automobile makers. The MD-20’s improvements over the MD-10 include a new CMOS sensor with a global shutter.


The MD-20’s display panel, which boasts a 2,560 × 1,600 per-eye resolution, is also said to have an expanded color gamut, and the field of view has been widened by just a few degrees to 70° horizontal and 40° vertical; the MD-10 features a 60° horizontal and 40° vertical FOV. The CMOS sensor is also used for positional tracking by generating a real-time spatial map, although businesses can purchase add-on extras such as visual markers and optical sensors (sold separately). The MD-20 is admittedly still under development, with Canon aiming at further miniaturization and weight reduction in addition to working on its room-scale positioning. Compared to Facebook, Magic Leap, and Microsoft, Canon has been in somewhat of a backseat position when it comes to AR/VR hardware development.

More information:

07 February 2020

Mojo Vision AR Contact Lens

Mojo Vision is revealing a smart contact lens with a tiny built-in display that lets you view augmented reality images on a screen sitting right in front of your eyeballs. The display uses MicroLEDs, a technology expected to play a critical role in the development of next-generation wearables, AR/VR hardware, and heads-up displays (HUDs). MicroLEDs use 10% of the power of current LCD displays and have five to ten times the brightness of OLEDs, which means they enable comfortable viewing outdoors.


Mojo Vision holds patents for the development of an augmented reality (AR) smart contact lens dating back more than a decade. The company is currently demonstrating a working prototype of the device. The Mojo Lens is designed to span a range of consumer and enterprise use cases. Additionally, the company is planning an early application of the product to help people struggling with low vision through enhanced image overlays. This application of the Mojo Lens is designed to provide real-time contrast and lighting enhancements, as well as zoom functionality.

More information: