31 January 2024

Phantom VR Glove

A VR startup based in Colorado has developed a unique haptic glove called the Phantom. Unlike traditional VR gloves that cover the user's entire hand, the Phantom leaves the fingers uncovered, which would enable haptic feedback for hand-tracking applications on headsets such as the Quest 3 or Apple Vision Pro. The glove consists of a device, strapped around the wrist, that houses an electronics module and a battery.

The wrist device connects to five adjustable rings, one worn at the base of each finger, and pairs wirelessly with VR headsets over Bluetooth. The rings stimulate the nerves that run from the fingertips to the brain, and through these stimuli the glove tricks the brain into thinking it is actually touching something. The Phantom would be one of the first devices to provide tactile feedback for hand tracking.
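
As a purely hypothetical illustration of what per-finger haptic control might look like in software, the Python sketch below packs five stimulation levels into a compact message of the kind a wrist module could receive over Bluetooth. The packet layout, header byte, and value range are invented for this example; the actual Phantom protocol has not been published.

    import struct

    def make_haptic_packet(levels):
        # levels: five intensities (0-255), ordered thumb to pinky,
        # one for each finger-base ring.
        if len(levels) != 5 or any(not 0 <= v <= 255 for v in levels):
            raise ValueError("expected five intensities in the range 0-255")
        # One hypothetical header byte marking a stimulation command,
        # followed by one byte per ring.
        return struct.pack("6B", 0x01, *levels)

    # Example: a light touch felt on the index finger only.
    packet = make_haptic_packet([0, 180, 0, 0, 0])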

More information:

https://mixed-news.com/en/this-new-haptic-hand-tracking-glove-could-be-a-perfect-fit-for-the-apple-vision-pro/

29 January 2024

Electronic Skin Senses Like Human Skin

Researchers at Texas A&M University have made strides in developing a 3D-printed electronic skin (e-skin) that mimics the flexibility and sensitivity of human skin. Using nanoengineered hydrogels with electronic and thermal biosensing capabilities, the team created an e-skin that can flex, stretch, and sense like human skin. The e-skin holds significant potential for a range of applications, particularly wearable health devices that monitor signals such as motion, temperature, heart rate, and blood pressure.

The technology can provide users with continuous feedback, helping them improve motor skills and coordination. It also tackles the main challenges in this area: crafting robust materials that emulate the flexibility of human skin, integrating bioelectrical sensing, and using fabrication methods suitable for wearable or implantable devices. To that end, the research team introduced a triple-crosslinking strategy that reduces the stiffness of the hydrogel-based system and facilitates signal transduction, allowing more seamless interaction with the body's tissues.
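
To make the idea of continuous feedback concrete, here is a small, purely illustrative Python sketch of the kind of processing a wearable patch might run: it estimates heart rate from a simulated pulse signal. The signal, sampling rate, and smoothing window are invented for demonstration and are not taken from the published work.

    import numpy as np

    fs = 100                                  # assumed sample rate in Hz
    t = np.arange(0, 10, 1 / fs)              # ten seconds of readings
    pulse = np.sin(2 * np.pi * 1.2 * t)       # simulated pulse, roughly 72 bpm
    signal = pulse + 0.2 * np.random.randn(t.size)   # add sensor noise

    # Smooth with a short moving average, then count upward zero crossings.
    kernel = np.ones(10) / 10
    smoothed = np.convolve(signal, kernel, mode="same")
    beats = np.sum((smoothed[:-1] < 0) & (smoothed[1:] >= 0))
    print("Estimated heart rate:", beats / 10 * 60, "bpm")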

More information:

https://www.techtimes.com/articles/301063/20240127/new-3d-printed-electronic-skin-replicate-flexibility-sensitivity-human.htm

28 January 2024

Camera Sees Through Eyes of Birds and Bees

An interdisciplinary team has developed an innovative camera system that is faster and works in a wider range of lighting conditions than existing systems, allowing it to capture moving images of animals in their natural setting. The researchers introduced hardware and software tools for ecologists and filmmakers that can capture and display animal-perceived colors in motion. Different animal species possess unique sets of photoreceptors that are sensitive to a wide range of wavelengths, from ultraviolet to infrared, depending on each animal's specific ecological needs, and some animals can even detect polarized light. As a result, every species perceives color a bit differently. Honeybees and birds, for instance, are sensitive to UV light, which isn't visible to human eyes. However, the authors contend that current techniques for producing false-color imagery can't quantify the colors animals see while in motion, an important limitation since movement is crucial to how animals communicate and navigate the world around them via color appearance and signal detection.

Multispectral photography takes a series of photos across various wavelengths (including UV and infrared) and stacks them into different color channels to derive camera-independent measurements of color. This method trades some accuracy for better spatial information and is well suited to studying animal signals, but it only works on still objects, so temporal information is lacking. The researchers therefore developed a camera system capable of producing high-precision animal-view videos that capture the full complexity of visual signals as they would be perceived by an animal in a natural setting, combining existing methods of multispectral photography with new hardware and software designs. The camera records video in four color channels simultaneously (blue, green, red, and UV). Once that data has been processed into perceptual units, the result is an accurate video of how a colorful scene would be perceived by various animals, based on what we know about the photoreceptors they possess. The team's system predicts the perceived colors with 92 percent accuracy.
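
As a rough sketch of how this kind of processing can work in general, the Python example below applies a linear transform that maps the camera's four channels to a hypothetical set of honeybee photoreceptor responses and then renders them as a false-color image. The matrix values and channel ordering are invented for illustration and are not the published system's calibration.

    import numpy as np

    # One frame from a four-channel camera (UV, blue, green, red),
    # already linearized and normalized to the range 0-1.
    frame = np.random.rand(480, 640, 4)

    # Hypothetical 3x4 matrix estimating a honeybee's UV, blue, and green
    # photoreceptor catches from the four camera channels (values made up).
    camera_to_bee = np.array([
        [0.9, 0.1, 0.0, 0.0],   # bee UV receptor
        [0.1, 0.8, 0.1, 0.0],   # bee blue receptor
        [0.0, 0.1, 0.8, 0.1],   # bee green receptor
    ])

    # Apply the transform per pixel, then show the three receptor images as a
    # false-color picture (bee UV drawn as blue, bee blue as green, bee green as red).
    bee_catches = frame @ camera_to_bee.T          # shape: 480 x 640 x 3
    false_color = np.clip(bee_catches[..., ::-1], 0.0, 1.0)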

More information:

https://arstechnica.com/science/2024/01/novel-camera-system-lets-us-see-the-world-through-eyes-of-birds-and-bees/