Showing posts with label Biofeedback. Show all posts

08 February 2022

VISIGRAPP 2022 Keynote

On Sunday, 6th February, I gave a keynote at the 17th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2022). The talk was entitled ‘Brain Computer Interfaces for Extended Reality’.

First, I provided an overview of BCIs, and then I presented a series of case studies demonstrating (a) how to perform novel experiments aimed at better understanding human perception and (b) how to use brainwaves to control XR applications.

More information:

https://visigrapp.scitevents.org/KeynoteSpeakers.aspx#2

28 December 2021

CoG 2021 Article III

Recently, I published a co-authored paper at the 3rd IEEE Conference on Games, sponsored by the IEEE Computer Society. The paper, entitled ‘BCIManager: A library for development of brain-computer interfacing applications in Unity’, presented a customizable library that supports the development of brain-computer interfaces (BCIs) with 3D graphics scenes, suitable for virtual reality (VR) BCI feedback, gamified BCI training, using BCIs as game inputs, and similar use cases.

The open-source library forms a layer between the Unity game engine and OpenViBE, providing control of the EEG recording process from applications made with Unity. The main feature of BCIManager is its interface for bi-directional data exchange between Unity and OpenViBE: sending markers (stimulations) from the experimental 3D applications to the EEG recordings, and sending classification results, features, or other data from OpenViBE to the 3D applications.
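BCIManager's own API is not reproduced here, but the marker-sending half of this exchange can be sketched independently of Unity. OpenViBE's TCP Tagging mechanism listens on port 15361 by default and expects each marker as three little-endian unsigned 64-bit integers (flags, stimulation identifier, timestamp); a minimal Python sketch, with the port, stimulation id, and function names chosen for illustration rather than taken from BCIManager, might look like this:

```python
import socket
import struct

def build_marker(stim_id: int, timestamp: int = 0, flags: int = 0) -> bytes:
    # OpenViBE's TCP Tagging expects three little-endian uint64 values:
    # flags, stimulation id, timestamp (0 = timestamp on arrival).
    return struct.pack("<QQQ", flags, stim_id, timestamp)

def send_marker(stim_id: int, host: str = "localhost", port: int = 15361) -> None:
    # One short-lived connection per marker keeps the sketch simple;
    # a real application would keep the socket open for the session.
    with socket.create_connection((host, port)) as sock:
        sock.sendall(build_marker(stim_id))

# Example: 0x8101 is OVTK_StimulationId_Label_01 in OpenViBE's
# standard stimulation table.
# send_marker(0x8101)
```

Sending data in the other direction (classification results back to the 3D scene) follows the same pattern in reverse, with the application reading from a socket that OpenViBE writes to.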

More information:

https://ieeexplore.ieee.org/document/9619123

27 December 2020

Hack Your Dreams

A team of researchers at MIT’s Dream Lab, which launched in 2017, is working on an open-source wearable device that can track and interact with dreams in a number of ways, including, hopefully, giving you new control over the content of your dreams. The team’s radical goal is to prove once and for all that dreams aren’t just meaningless gibberish but can be hacked, augmented, and swayed to our benefit. A glove-like device called Dormio, developed by the Dream Lab team, is outfitted with a host of sensors that can detect which sleep state the wearer is in. When the wearer slips into hypnagogia, a state between conscious and subconscious, the glove plays a pre-recorded audio cue, most often consisting of a single word.


Hypnagogia may be different for different people. Some say they have woken up from hypnagogia reporting strong visual and auditory hallucinations; others can interact with another person while still in the state. The Dream Lab might be on to something with its Dormio glove. For instance, in a 50-person experiment, the speaking glove was able to insert a tiger into people’s sleep by playing a prerecorded message that simply said ‘tiger’. The device is meant to democratize the science of tracking sleep: step-by-step instructions were posted online, with biosignal-tracking software available on GitHub, theoretically allowing anyone to make their own Dormio glove. A similar device built by the Dream Lab relies on smell rather than an audio cue.

More information:

https://futurism.com/mit-scientists-devices-hack-dreams