30 September 2012

VSMM 2012 Paper

A few weeks ago, a paper I co-authored with colleagues from the Interactive Worlds Applied Research Group (iWARG) and the Serious Games Institute (SGI) was presented at the 18th International Conference on Virtual Systems and Multimedia, Virtual Systems in the Information Society (VSMM 2012). The conference took place in Milan, Italy, 2-5 September 2012, and the paper was titled ‘Brain-Controlled Serious Games for Cultural Heritage’.

The paper proposes a prototype system for cultural heritage that uses brain-computer interfaces (BCIs) to navigate and interact with serious games. An interactive serious cultural heritage game was developed in which commercial BCI headsets control virtual agents in the ancient city of Rome. Initial results indicate that brain-computer technologies can be very useful for the creation of interactive serious games.

A draft version of the paper can be downloaded from here.

27 September 2012

AI Cracks History of Art

Art experts could soon be replaced by computers, as scientists at Lawrence Technological University in Detroit have developed software that can identify, evaluate and attribute works of art. The software computes 4,000 numerical image descriptors and analyzes the form, texture and visual content of paintings without any human guidance. The program managed to correctly attribute around 1,000 paintings by 18 modern and 16 classical painters without a single mistake.

The computer automatically divided the 34 well-known painters into groups, showing that it is able to identify painters of the same artistic movements. It placed the High Renaissance artists Raphael, Leonardo da Vinci and Michelangelo close to each other, then separated Baroque painters such as Vermeer, Rubens and Rembrandt. Van Dyck, Dürer and Bruegel were united into another group. Similarly, it separated Gauguin and Cézanne and united Salvador Dalí, Max Ernst and Giorgio de Chirico into one group.
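The grouping idea — representing each painter by a vector of numerical image descriptors and merging painters whose vectors are similar — can be sketched in a few lines. Everything below is illustrative: the descriptor values are made up, and a simple single-linkage clustering stands in for the study's actual analysis.

```python
import math

# Toy stand-ins for the ~4,000 image descriptors used in the study
# (illustrative values only; the real descriptors are computed from
# the paintings themselves).
descriptors = {
    "Raphael":   [0.90, 0.80, 0.10],
    "Leonardo":  [0.85, 0.82, 0.12],
    "Vermeer":   [0.20, 0.30, 0.90],
    "Rembrandt": [0.25, 0.28, 0.85],
}

def distance(a, b):
    """Euclidean distance between two descriptor vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def cluster(painters, threshold=0.3):
    """Greedy single-linkage clustering: merge groups containing any
    pair of painters whose descriptors are closer than the threshold."""
    groups = [[name] for name in painters]
    merged = True
    while merged:
        merged = False
        for i in range(len(groups)):
            for j in range(i + 1, len(groups)):
                if any(distance(painters[a], painters[b]) < threshold
                       for a in groups[i] for b in groups[j]):
                    groups[i].extend(groups.pop(j))
                    merged = True
                    break
            if merged:
                break
    return groups

groups = cluster(descriptors)
```

With these toy vectors the two High Renaissance painters end up in one group and the two Baroque painters in another, mirroring the kind of separation the program found.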

More information:

26 September 2012

Turn Dreams Into Music

Computer scientists have developed a method that automatically composes music out of sleep measurements. The composition service works live on the Web. The software was developed at the University of Helsinki, and automatically composes synthetic music using data related to a person’s own sleep as input.

The software composes a unique piece based on the stages of sleep, movement, heart rate and breathing. The project utilises a sensitive force sensor placed under the mattress. Heartbeats and respiratory rhythm are extracted from the sensor’s measurement signal, and the stages of sleep are deducted from them.
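The service's exact composition algorithm is not described in this post, but the underlying idea — mapping sleep measurements to musical parameters — can be sketched. The stage-to-register mapping and the one-note-per-heartbeat rule below are assumptions for illustration only.

```python
# Hypothetical mapping from sleep measurements to note events; the
# real composition service's algorithm is more elaborate.

# MIDI note choices per sleep stage: lower register for deeper sleep.
STAGE_SCALE = {
    "deep":  [36, 39, 43],
    "light": [48, 51, 55],
    "rem":   [60, 63, 67],
}

def compose(samples):
    """Turn a list of (stage, heart_rate_bpm) samples into
    (midi_note, duration_seconds) events: one note per heartbeat,
    with the pitch drawn from the stage's register."""
    events = []
    for i, (stage, bpm) in enumerate(samples):
        scale = STAGE_SCALE[stage]
        note = scale[i % len(scale)]      # cycle through the scale
        duration = 60.0 / bpm             # one beat per heartbeat
        events.append((note, round(duration, 3)))
    return events

piece = compose([("deep", 50), ("deep", 52), ("rem", 70)])
```

The sketch shows why each night yields a unique piece: the sequence of stages sets the melodic contour while heart rate sets the tempo.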

More information:

24 September 2012

IUCC 2012 Paper

A few months ago, my research student, Mr. Athanasios Vourvopoulos, presented a paper we co-authored, titled ‘Robot Navigation using Brain-Computer Interfaces’, at the 11th International Conference on Ubiquitous Computing and Communications (IUCC-2012). The paper focuses on human-robot interaction through tele-operation with the help of brain-computer interfaces (BCIs). To accomplish that, a working system was developed from off-the-shelf components for controlling a robot in both the real and the virtual world.

The paper reports on users’ adaptation to brain-controlled systems and their ability to control brain-generated events in a closed neuro-feedback loop. Using commercial BCI headsets, the overall cost, set-up time and complexity can be reduced. The system is divided into two prototypes based on the headset used. The first prototype is based on the NeuroSky headset and has been tested with 54 participants. The second is based on the Emotiv headset, which includes more sensors and offers higher accuracy.
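The paper's control scheme is only summarised here; as an illustration, NeuroSky-style headsets report a single 'attention' value (0-100), and a minimal closed-loop controller could gate robot motion on it. The threshold and command names below are hypothetical, not taken from the paper.

```python
# Hypothetical closed-loop controller: a single "attention" value
# (0-100, as reported by NeuroSky-style headsets) gates forward
# motion. The threshold and commands are illustrative only.

ATTENTION_THRESHOLD = 60

def control_step(attention):
    """Map one attention reading to a robot command."""
    if attention >= ATTENTION_THRESHOLD:
        return "FORWARD"
    return "STOP"

def run_session(readings):
    """Simulate the neuro-feedback loop over a stream of readings:
    the user sees the robot move (or not) and adapts their focus."""
    return [control_step(a) for a in readings]

commands = run_session([30, 55, 72, 80, 40])
```

In a real session the readings arrive continuously from the headset's serial/Bluetooth API, and the visible robot response closes the feedback loop that the paper studies.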

A draft version of the paper can be downloaded from here.

21 September 2012

Predict Connections Between Neurons

One of the greatest challenges in neuroscience is to identify the map of synaptic connections between neurons. Called the ‘connectome’, this map is the holy grail that will explain how information flows in the brain. In a landmark paper, published the week of 17 September in the Proceedings of the National Academy of Sciences, EPFL's Blue Brain Project (BBP) identified key principles that determine synapse-scale connectivity by virtually reconstructing a cortical microcircuit and comparing it to a mammalian sample. These principles now make it possible to predict the locations of synapses in the neocortex. A long-standing neuroscientific mystery has been whether neurons grow independently and simply form synapses wherever their branches happen to bump into each other, or whether each neuron's branches are specifically guided by chemical signals to find their targets. To solve the mystery, the researchers examined a virtual reconstruction of a cortical microcircuit to see where the branches bumped into each other. To their great surprise, they found that the locations in the model matched those of synapses found in the equivalent real-brain circuit with an accuracy ranging from 75 to 95 percent.

This means that neurons grow as independently of each other as physically possible and mostly form synapses at the locations where they randomly bump into each other. A few exceptions were also discovered, pointing to special cases where neurons use signals to alter this statistical connectivity. By taking these exceptions into account, the Blue Brain team can now make a near-perfect prediction of the locations of all the synapses formed inside the circuit. The goal of the BBP is to integrate knowledge from all the specialized branches of neuroscience, to derive from it the fundamental principles that govern brain structure and function, and ultimately to reconstruct the brains of different species -- including the human brain -- in silico. To achieve these results, a team from the Blue Brain Project set about virtually reconstructing a cortical microcircuit based on unparalleled data about the geometrical and electrical properties of neurons -- data from nearly 20 years of painstaking experimentation on slices of living brain tissue. Each neuron in the circuit was reconstructed as a 3D model on a powerful Blue Gene supercomputer. About 10,000 virtual neurons were packed into a 3D space in random positions, according to the density and ratio of morphological types found in the corresponding living tissue. The researchers then compared the model back to an equivalent circuit from a real mammalian brain.
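The matching step can be pictured as a simple 'touch detection' test: predict a synapse wherever an axonal branch and a dendritic branch pass within a small touch radius. The coordinates and radius below are toy values, not Blue Brain data, and the real reconstruction works on full 3D morphologies rather than point lists.

```python
import math

# Toy "touch detection": predict a synapse wherever an axonal point
# and a dendritic point come within a touch radius.
TOUCH_RADIUS = 1.0  # illustrative, in arbitrary units

axon     = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
dendrite = [(1.2, 0.3, 0.0), (5.0, 5.0, 5.0)]

def predicted_synapses(axon_pts, dendrite_pts, radius=TOUCH_RADIUS):
    """Return index pairs (i, j) where the two branches 'bump into'
    each other, i.e. points closer than the touch radius."""
    touches = []
    for i, a in enumerate(axon_pts):
        for j, d in enumerate(dendrite_pts):
            if math.dist(a, d) < radius:
                touches.append((i, j))
    return touches

touches = predicted_synapses(axon, dendrite)
```

The study's finding is, in effect, that such purely geometric touches predict 75 to 95 percent of real synapse locations, with a few signal-guided exceptions on top.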

More information: