23 June 2017

How Video Games Change the Brain

A new study reveals that playing video games changes not only how our brains function but also their structure. Scientists have collected and summarized studies examining how video games can shape our brains and behavior. Research to date suggests that playing video games can make the brain regions responsible for attention and visuospatial skills more efficient. The researchers also looked at studies exploring the brain regions associated with the reward system, and how these relate to video game addiction.

The studies show that playing video games can change not only how our brains perform but even their structure. For example, playing video games affects our attention: some studies found that gamers show improvements in several types of attention, such as sustained and selective attention. The brain regions involved in attention are also more efficient in gamers, requiring less activation to sustain attention during demanding tasks. There is also evidence that video games can increase the size and efficiency of brain regions related to visuospatial skills.

More information:

18 June 2017

US Army Is Bringing ‘Tactical AR’ To The Battlefield

When it comes to surviving a firefight, information is everything. That’s exactly why the United States military invests so much time and so many resources into equipping soldiers with state-of-the-art technology designed to keep them informed and updated when deployed in dangerous scenarios. The U.S. Army Research, Development and Engineering Command’s Communications-Electronics Research, Development and Engineering Center is taking a full step into the future by fitting soldiers with advanced augmented reality heads-up displays. The US Army has been using AR in combat for years in the form of simple, monochromatic displays fielded by select individuals.

Tactical Augmented Reality, or TAR, takes the idea much further, providing high-quality HD displays miniaturized to the size of a small eyepiece and capable of communicating large amounts of critical data in real time. The TAR system can effectively provide a soldier with advanced night vision, topographical information, and shared vision between squad members. The futuristic display can also show the GPS locations of both friend and foe, using automatic geo-registration to constantly update the geodetically calibrated reference image needed for accurate tracking against positioning satellites.
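The core geometric step behind placing a squad member's marker in a heads-up display can be pictured with a small sketch. The following Python is not the Army's TAR code; it is a minimal, hypothetical illustration of the idea: convert a target's GPS fix into local east/north offsets from the wearer, then compute the angle left or right of the wearer's heading at which a marker would be drawn.

```python
import math

def to_local_enu(lat0, lon0, lat, lon):
    """Approximate east/north offsets (metres) of (lat, lon) relative to
    the wearer at (lat0, lon0); a flat-Earth approximation valid over
    the short distances relevant to a squad."""
    r = 6_371_000.0  # mean Earth radius in metres
    east = math.radians(lon - lon0) * r * math.cos(math.radians(lat0))
    north = math.radians(lat - lat0) * r
    return east, north

def hud_bearing(lat0, lon0, heading_deg, lat, lon):
    """Angle (degrees) of the target left/right of the wearer's heading;
    positive means the marker sits to the right of the display's centre."""
    east, north = to_local_enu(lat0, lon0, lat, lon)
    bearing = math.degrees(math.atan2(east, north))   # 0 deg = true north
    return (bearing - heading_deg + 180) % 360 - 180  # wrap to [-180, 180)
```

For example, a wearer at the origin facing true north would see a teammate due east of them at 90 degrees right of centre; if the wearer turns to face east, the same teammate moves to the centre of the display.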

More information:

12 June 2017

Leap Motion Update Adds Support For Rift And Vive Controllers

Fully functioning hand-tracking may still be some way off from becoming the standard form of VR input, but Leap Motion is making a big step toward that future today, taking its Interaction Engine to 1.0 and introducing some major new features. The Interaction Engine has been available in early beta since last year, but this full release focuses on what could be a major application for hand-tracking going forward: interfaces. Leap Motion has built a new user interface module that allows developers to create their own accessible menus and systems, navigated a little like Tom Cruise navigates menus in Minority Report. Users reach out to virtual panels to press buttons and alter meters. The company is also adding support for wearables and widgets, enabling wrist-mounted menus and more.

Also updated is the core physics engine, which should make using Leap Motion a much more reliable and immersive experience. Perhaps the most exciting addition, though, is support for Oculus Touch and Vive controllers. Combining the two technologies opens interesting possibilities: Touch already offers basic gesture recognition, but imagine being able to hold a controller and still extend a finger to press a button. The company has also launched a new Graphic Renderer that can curve the user interface and render it in a single draw call, a feature aimed specifically at mobile and standalone headsets. Leap Motion’s hand-tracking technology has existed for years but has found a new lease of life in VR; we’ve already seen the company’s tech integrated into Qualcomm’s reference design for standalone VR headsets.
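The button-press interaction the UI module enables comes down to simple geometry. The sketch below is not Leap Motion's API; it is a hypothetical illustration of one way a "press" could be detected from a tracked fingertip: the press fires when the fingertip crosses the button's plane, inside the button's circular face, by some minimum depth.

```python
import math

def button_pressed(fingertip, center, normal, radius, press_depth=0.01):
    """Return True when the fingertip (x, y, z in metres) has pushed
    through the button plane by at least press_depth, within the
    button's radius. `normal` is a unit vector facing the user."""
    # signed distance of the fingertip from the button plane
    # (positive = in front of the button, negative = pushed through)
    d = sum((f - c) * n for f, c, n in zip(fingertip, center, normal))
    # project the fingertip onto the plane, then measure the radial
    # distance from the button's centre
    proj = [f - d * n for f, n in zip(fingertip, normal)]
    radial = math.dist(proj, center)
    return d <= -press_depth and radial <= radius
```

A real engine would add debouncing and hover states, but the plane-crossing test is the essence of reaching out and pressing a virtual panel.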

More information:

11 June 2017

VR Glove Uses Muscle-Like Chambers To Simulate Touch

New VR gloves designed by engineers at UC San Diego employ soft robotics to deliver tactile feedback to the wearer as they touch and interact with virtual objects. The system is designed to mimic the movement and sensation of muscle with a component called a McKibben muscle. The glove is built from a layer of latex chambers covered on the surface by braided muscles. The entire glove (including the muscles) is connected to a circuit board, and as you interact with virtual objects the chambers inflate and deflate to replicate pressure. It’s a finely tuned process designed to give you the sensation that you’re actually lifting and touching objects, just as you would in the real world. In theory, the gloves could be paired with other technologies, such as a Leap Motion sensor, to simulate a wide range of activities. This type of technology has been used in similar ways before, though not exactly with the muscle structure described above. The KOR-FX and Hardlight Suit, for example, are VR-ready vests that let you feel impacts and pressure on your chest through haptic feedback.

Jointly, these technologies may someday be used to immerse you entirely in a virtual world, whether for entertainment, gaming, or more practical purposes like situational training. Of course, they will remain separate pieces of gear for now, at least until engineers or developers figure out a way to create one seamless outfit or suit. That would require overcoming obstacles like the interconnectivity problems that occur with other kinds of electronics. A full-body suit would need to differentiate between pressure, impact, and muscle simulations on different areas of your body, which would also need to be fine-tuned from a software perspective; video games, for instance, would have to supply information such as what part of the player’s body a bullet hit. The gloves are not a commercially viable product just yet, and they probably won’t be for some time. The team was, however, able to 3D-print a soft glove exoskeleton mold (or case mold, if you will) as proof that mass production is feasible; in other words, they are actively laying the groundwork for a commercial release of such a device.
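On the software side, the feedback loop such a glove needs can be reduced to a small mapping from virtual contact to chamber pressure. The sketch below is illustrative only: the gain and pressure limit are made-up values, not figures from the UC San Diego work.

```python
def target_pressure(contact_depth_mm, k=8.0, p_max=35.0):
    """Map how far a virtual fingertip penetrates an object (mm) to a
    target inflation pressure (kPa) for one McKibben-style chamber.
    k (kPa per mm) and p_max (kPa) are hypothetical tuning values."""
    if contact_depth_mm <= 0:
        return 0.0  # no contact: deflate the chamber fully
    # deeper contact -> firmer resistance, clamped at the chamber's limit
    return min(k * contact_depth_mm, p_max)
```

Each frame, the game reports contact depth per finger, and the circuit board drives the pumps toward the returned pressure; distinguishing body regions in a full suit would mean running many such loops keyed to where the contact occurred.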

More information:

06 June 2017

SONICAM VR Camera Captures 360-Degree Video With 3D Sound
The VR camera called SONICAM enables users to capture both 2D and 3D videos and images in full 360 degrees. It’s a professional spherical VR camera with 9 fish-eye cameras, 64 microphones, 4K HD resolution, and a 360-degree field of view. Combining these features in a single device means users can film any scene vividly, without blind spots or image distortion. A wide range of events and occasions can be captured, such as news coverage, live streaming of sports, broadcasting concerts, producing microfilm for wedding ceremonies, and just capturing normal everyday life. The camera supports H.265/H.264 encoding for live streaming over both RTMP (real-time messaging protocol) and RTSP (real-time streaming protocol). SONICAM’s designers have developed 3D spatial sound technology so that sound is mapped in full 360 degrees to the corresponding video: it records sound from any direction and naturally draws the viewer’s attention to specific areas of the video. Moreover, the array of beamforming microphones automatically reduces ambient noise for a more immersive VR experience. A time-lapse mode is also packed into the spherical VR camera, and users can select different time intervals for customized time-lapse footage.
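SONICAM's exact audio pipeline isn't public, but the beamforming idea itself fits in a few lines. The following is a minimal, illustrative delay-and-sum beamformer, not the camera's algorithm: each microphone channel is shifted by a steering delay (in whole samples) and the channels are averaged, so sound arriving from the steered direction adds coherently while off-axis noise tends to cancel.

```python
def delay_and_sum(channels, delays):
    """Minimal delay-and-sum beamformer.

    channels: list of equal-length sample lists, one per microphone.
    delays:   per-microphone steering delays in samples, chosen so a
              wavefront from the target direction lines up after shifting.
    Returns the averaged, aligned signal.
    """
    n = len(channels[0])
    out = [0.0] * n
    for ch, d in zip(channels, delays):
        for i in range(n):
            j = i - d          # shift this channel by its steering delay
            if 0 <= j < n:
                out[i] += ch[j]
    return [v / len(channels) for v in out]
```

With 64 microphones the steering delays come from the array geometry and the speed of sound; a pulse that reaches one mic a sample earlier than another is realigned by the shifts, so the steered source survives the average at full strength while uncorrelated ambient noise is attenuated.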

Users can perform real-time stitching to produce professional-level panoramic videos and images, achieved by combining SONICAM’s own algorithm with an FPGA (field-programmable gate array). Recorded footage and captured images are also available for instant preview while shooting, so the camera operator can immediately judge whether the material is usable or relevant to their project. SONICAM is a photographer’s dream camera, offering control of ISO, white balance, shutter, a professional gamma setting, sharpness, saturation, contrast, and much more. RAW image format is available for post-production, so users can edit images with their own artistic touch. SONICAM is equipped with many other smart features, including Wi-Fi, GPS, and an accelerometer, and it links to a smartphone app through which users can easily manage captured images and footage. The app, available on both iOS and Android, enables remote shooting and control of the device. The camera itself has only three physical buttons, making it easy to learn its functions. All images and footage you capture are stored on the 128GB SD card included inside the VR camera.

More information: