26 February 2015

Brain Controlled Bionic Reconstruction

Three Austrian men have become the first in the world to undergo a new technique called ‘bionic reconstruction’, enabling them to use a robotic prosthetic hand controlled by their minds. All three men had suffered for many years from brachial plexus injuries and poor hand function as a result of motor vehicle and climbing accidents. The new technique was developed by researchers at the Medical University of Vienna, together with engineers from the Department of Neurorehabilitation Engineering of the University Medical Center Goettingen. It combines selective nerve and muscle transfers, elective amputation, and replacement with an advanced robotic prosthesis using sensors that respond to electrical impulses in the muscles. Following comprehensive rehabilitation, the technique restored a high level of function in all three recipients, aiding in activities of daily living. Brachial plexus avulsion injuries represent an ‘inner amputation’, irreversibly separating the hand from neural control. Existing surgical techniques for such injuries are crude and ineffective and result in poor hand function. Researchers were able to create and extract new neural signals via nerve transfers amplified by muscle transplantation.


These signals were then decoded and translated into solid mechatronic hand function. Before amputation, all three patients spent an average of nine months undergoing cognitive training, first to activate the muscles and then to use the electrical signals to control a virtual hand. Once they had mastered the virtual environment, they practiced using a hybrid hand -- a prosthetic hand attached to a splint-like device fixed to their non-functioning hand. Three months after amputation, the robotic prostheses gave all three recipients substantially better functional movement in their hands, improved quality of life, and less pain. For the first time since their accidents, all three men were able to accomplish various everyday tasks such as picking up a ball, pouring water from a jug, using a key, cutting food with a knife, or using two hands to undo buttons. Brachial plexus injuries occur when the nerves of the brachial plexus -- the network of nerves that originates in the neck region and branches off to form the nerves that control movement and sensation in the upper limbs, including the shoulder, arm, forearm, and hand -- are damaged. Such injuries often occur as a result of trauma from high-speed collisions, especially in motorcycle accidents, and in collision sports.
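As a rough illustration of the kind of control pipeline described above, the sketch below maps two channels of surface muscle (EMG) activity to simple open/close commands for a prosthetic hand. The channel layout, thresholds, and window size are illustrative assumptions only, not the decoding actually used by the Vienna and Goettingen teams.

```python
import numpy as np

# Hypothetical two-channel surface EMG decoder: one channel drives hand
# opening, the other hand closing. Thresholds and window size are
# illustrative assumptions, not values from the study.

WINDOW = 200           # samples per decision window (e.g. 200 ms at 1 kHz)
OPEN_THRESHOLD = 0.15  # normalised RMS level that triggers "open"
CLOSE_THRESHOLD = 0.15

def rms(signal):
    """Root-mean-square amplitude of one EMG window."""
    return float(np.sqrt(np.mean(np.square(signal))))

def decode_window(open_channel, close_channel):
    """Map two EMG windows to a simple prosthesis command."""
    open_level = rms(open_channel)
    close_level = rms(close_channel)
    if open_level > OPEN_THRESHOLD and open_level > close_level:
        return "OPEN_HAND"
    if close_level > CLOSE_THRESHOLD and close_level > open_level:
        return "CLOSE_HAND"
    return "HOLD"

# Example: a burst on the 'close' channel produces a close command.
quiet = 0.02 * np.random.randn(WINDOW)
burst = 0.4 * np.random.randn(WINDOW)
print(decode_window(quiet, burst))  # -> "CLOSE_HAND"
```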

More information:

24 February 2015

Informatics Colloquium Invited Talk

Today, I gave an invited talk at the Faculty of Informatics, Masaryk University, Czech Republic, as part of the ‘Informatics Colloquium’ series. The title of my presentation was ‘Procedural Generation for Interactive Virtual Environments’. The presentation provided an overview of procedural generation methods and techniques for both creating content and modelling human behaviour.


The main focus of my talk was on generating realistic terrain landscapes, as well as buildings and, subsequently, entire cities. As far as human behaviour is concerned, the main emphasis was on accurately modelling crowd behaviour in urban environments. Specific examples were demonstrated in the areas of virtual reconstruction of cultural heritage and interactive computer games.
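As an illustration of the sort of procedural terrain technique the talk covered, the sketch below generates a fractal-like 1D height profile by midpoint displacement. It is a minimal example of the general approach, not code from the presentation, and the parameters are illustrative assumptions.

```python
import random

def midpoint_displacement(n_iterations=8, roughness=0.5, seed=42):
    """Generate a 1D terrain height profile by midpoint displacement.

    A classic procedural-terrain technique: repeatedly insert the midpoint
    of each segment and perturb it by a random offset whose range shrinks
    each iteration, producing fractal-like ridges and valleys.
    """
    random.seed(seed)
    heights = [0.0, 0.0]            # start with a flat two-point profile
    displacement = 1.0
    for _ in range(n_iterations):
        refined = []
        for left, right in zip(heights, heights[1:]):
            mid = (left + right) / 2 + random.uniform(-displacement, displacement)
            refined.extend([left, mid])
        refined.append(heights[-1])
        heights = refined
        displacement *= roughness   # smaller bumps at finer scales
    return heights

profile = midpoint_displacement()
print(len(profile), min(profile), max(profile))
```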

More information:

23 February 2015

Light of Paul Gauguin's Work

French artist Paul Gauguin is well known for his colourful paintings of Tahitian life - such as the painting that sold recently for nearly $300 million - but he was also a highly experimental printmaker. Little is known, however, about the techniques and materials Gauguin used to create his unusual and complex graphic works. Now a team of scientists and art conservators from Northwestern University and the Art Institute of Chicago has used a simple light bulb, an SLR camera and computational power to uncover new details of Gauguin's printmaking process - how he formed, layered and re-used imagery to make 19 unique graphic works in the Art Institute's collection. The new results establish Gauguin's use of materials and process in chronological order, solving the puzzle of how ‘Nativity’ was made. Gauguin created the print using a layering of images created on paper by drawings, transfer of images and two different inks. The surface topography research on ‘Nativity’ and other graphic works by the artist will be part of a major Gauguin exhibit at the Art Institute in 2017.


The ‘Nativity’ findings overturn an earlier theory as to how Gauguin might have produced the print. Researchers reproduced, in an Art Institute lab, what they believed to be Gauguin's process. The printmaking process the research team had identified produced a print very similar to Gauguin's original. To measure the 3D surface of the prints, they used very accessible techniques that can be adopted by art conservators and historians around the world to analyze artworks. In applying these techniques to Gauguin's work, they came up with some interesting answers to questions about his printing process. Researchers studied ‘Nativity’ and 18 other Gauguin mono-prints in the Art Institute's collection. They shone multiple wavelengths of light from different directions onto the prints to investigate the surface of the paper and re-evaluate how Gauguin created his works. The photometric stereo technique allowed the researchers to mathematically separate colour from surface shape, providing a much clearer view of the paper's topography.
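The core idea of photometric stereo can be sketched in a few lines: given several photographs taken under known light directions, and assuming roughly Lambertian reflectance, per-pixel surface normals and albedo follow from a least-squares solve. The function below is a generic illustration of that principle, not the team's actual imaging pipeline.

```python
import numpy as np

def photometric_stereo(intensities, light_dirs):
    """Recover per-pixel surface normals and albedo from images lit from
    known directions (Lambertian reflectance assumed).

    intensities : array of shape (k, h, w), one image per light source
    light_dirs  : array of shape (k, 3), unit light direction vectors
    """
    k, h, w = intensities.shape
    I = intensities.reshape(k, -1)             # (k, h*w) stacked pixel intensities
    L = np.asarray(light_dirs, dtype=float)    # (k, 3) lighting matrix
    # Least-squares solve I = L @ G, where G = albedo * normal per pixel.
    G, *_ = np.linalg.lstsq(L, I, rcond=None)  # (3, h*w)
    albedo = np.linalg.norm(G, axis=0)
    normals = G / np.maximum(albedo, 1e-8)     # unit surface normals
    return normals.reshape(3, h, w), albedo.reshape(h, w)

# Example with synthetic data: four lights, a 64x64 image each.
lights = np.array([[0, 0, 1], [1, 0, 1], [0, 1, 1], [-1, 0, 1]], dtype=float)
lights /= np.linalg.norm(lights, axis=1, keepdims=True)
images = np.random.rand(4, 64, 64)
normals, albedo = photometric_stereo(images, lights)
print(normals.shape, albedo.shape)
```

Separating shape from colour in this way is what lets the researchers study the paper's topography independently of the inks laid on top of it.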

More information:

12 February 2015

Open Source VR Platform Takes on 13 New Partners

An open source virtual reality platform has announced 13 new partners. This is the Open Source Virtual Reality (OSVR) ecosystem, envisioned as the platform that can bring together companies working in a number of areas of virtual reality. OSVR aims to set an open standard for virtual reality input devices, games and output. Its framework offers the potential to unite developers and gamers under a single platform. Because the platform is open source, people working with hardware development kit designs or software plugins -- for example, for motion control, game engines, and stereoscopic video output -- get complete access to what they need. An OSVR white paper makes a case for how this would benefit game developers.


OSVR provides interfaces, as opposed to an API tied to a specific piece of hardware. If multiple devices provide the same type of information, these devices can be interchanged. You can reconfigure the OSVR ‘plumbing’ so that the game continues to work well regardless of where the hand position data is coming from. With OSVR, game developers can focus on what they want to do with the data, as opposed to how to obtain it. In short, OSVR lets you mix and match hardware and software packages. This means that companies that focus on a particular software or hardware component (e.g., a gaze detection module or an eye-tracking camera) are not left out of the VR ecosystem: their expertise can be interconnected with components from others.
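The idea of programming against a device-independent interface rather than a specific piece of hardware can be sketched as follows. The class and function names are illustrative assumptions, not the actual OSVR API; they simply show how two different trackers can be swapped without changing the game code.

```python
from abc import ABC, abstractmethod

class HandTracker(ABC):
    """Generic 'hand position' interface, in the spirit of OSVR's
    device-independent interfaces (names are illustrative only)."""

    @abstractmethod
    def hand_position(self):
        """Return (x, y, z) of the tracked hand in metres."""

class CameraTracker(HandTracker):
    def hand_position(self):
        return (0.10, 1.20, 0.35)   # stub: would come from a depth camera

class GloveTracker(HandTracker):
    def hand_position(self):
        return (0.11, 1.19, 0.34)   # stub: would come from a sensor glove

def update_game(tracker: HandTracker):
    # Game code depends only on the interface, so either device works.
    x, y, z = tracker.hand_position()
    print(f"hand at ({x:.2f}, {y:.2f}, {z:.2f})")

update_game(CameraTracker())
update_game(GloveTracker())
```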

More information:

09 February 2015

Reading Ancient Scrolls

After working for more than 10 years on unlocking an ancient piece of history -- what lies inside damaged Herculaneum scrolls -- University of Kentucky Department of Computer Science researchers are taking the next step in allowing the world to read the scrolls, which cannot be physically opened. In a major development in the venture, they are building software that will visualize the scrolls' writings as they would appear if unrolled. A breakthrough not only in digital imaging techniques, the first-of-its-kind software could also have profound impacts on history and literature. The researchers say that each scroll may well be the only remaining copy of as-yet-unknown literature from the Classical era. Each scroll is 20 to 30 feet long and is estimated to contain at least 3,000 words.


The scrolls aren't your typical 2,000-year-old papyrus manuscripts; they were carbonized in the Mount Vesuvius volcanic eruption of A.D. 79 and later discovered as charred clumps in the Villa of the Papyri in the ancient Italian city of Herculaneum, beginning in 1752. When attempts were made to open them, the artifacts would often shatter beyond repair. To reveal the works inside the remaining intact scrolls, researchers from the Institut de France knew that ‘virtual unrolling’ was the only way. After successfully creating 2D images of two Herculaneum scrolls in 2009 but not being able to detect the ink in them, the researchers believe they have recently identified ink in the scrolls after applying an x-ray method often used in the medical and archaeology communities.
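A simplified picture of what ‘virtual unrolling’ involves: given an X-ray CT cross-section of a rolled scroll, intensity values are sampled along the spiral winding and laid out flat. The sketch below assumes an idealised Archimedean spiral around a known centre, which the real, badly distorted Herculaneum scrolls certainly are not; it illustrates the concept rather than the Kentucky team's software.

```python
import numpy as np

def unroll_slice(volume_slice, centre, turns=10, samples_per_turn=720,
                 start_radius=5.0, pitch=2.0):
    """Sample one CT cross-section along an Archimedean spiral to
    approximate an 'unrolled' strip of a rolled-up scroll.

    Assumes the winding is an ideal spiral around `centre`; real scrolls
    are far more distorted, so this is only a conceptual sketch.
    """
    cy, cx = centre
    h, w = volume_slice.shape
    strip = []
    for i in range(turns * samples_per_turn):
        theta = 2 * np.pi * i / samples_per_turn
        r = start_radius + pitch * theta / (2 * np.pi)
        y = int(round(cy + r * np.sin(theta)))
        x = int(round(cx + r * np.cos(theta)))
        if 0 <= y < h and 0 <= x < w:
            strip.append(volume_slice[y, x])   # intensity along the winding
    return np.array(strip)  # 1D "unrolled" line; stacking slices gives a 2D image

# Example on a synthetic slice:
slice_ = np.random.rand(512, 512)
line = unroll_slice(slice_, centre=(256, 256))
print(line.shape)
```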

More information:

07 February 2015

Robotic System with Emotion and Memory

Researchers at the University of Hertfordshire have developed a prototype of a social robot which supports independent living for the elderly, working in partnership with their relatives or carers. The robot uses a state-of-the-art service platform called Care-O-bot® 3 and works within a smart-home environment. Over the past three years the project team successfully carried out a wide range of studies in the University's Robot House, which included detecting the activity and status of people in a smart-home environment as well as focusing on robots' ability to remember and recall.
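As a loose illustration of what ‘remember and recall’ might mean at the software level, the sketch below stores time-stamped observations about a person and retrieves them on request. The class and field names are hypothetical and are not taken from the ACCOMPANY project or the Care-O-bot platform.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical event memory for a companion robot; names are illustrative.

@dataclass
class Observation:
    timestamp: datetime
    person: str
    activity: str        # e.g. "drinking", "sitting", "left the kitchen"
    location: str

class RobotMemory:
    def __init__(self):
        self._events = []

    def remember(self, person, activity, location):
        """Store a new time-stamped observation."""
        self._events.append(Observation(datetime.now(), person, activity, location))

    def recall(self, person, activity=None):
        """Return remembered observations about a person, optionally filtered."""
        return [e for e in self._events
                if e.person == person and (activity is None or e.activity == activity)]

memory = RobotMemory()
memory.remember("Anna", "drinking", "kitchen")
memory.remember("Anna", "sitting", "living room")
print(len(memory.recall("Anna")))              # 2
print(len(memory.recall("Anna", "drinking")))  # 1
```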


These developments, part of an EU project called ACCOMPANY (Acceptable Robotics Companions for Ageing Years), culminated in three interaction scenarios, which were subsequently evaluated with elderly people and their formal/informal carers across France, the Netherlands and the UK. ACCOMPANY's results demonstrated that a social robot can potentially help to prevent isolation and loneliness, offering stimulating activities whilst respecting autonomy and independence. The project received excellent results from its final review in Brussels.

More information: