28 April 2011

Wii Key to Helping Kids Balance

By cleverly linking five Wii Balance Boards, a team of Rice University undergraduates has combined the appeal of a video game with the utility of a computerized motion-tracking system that can enhance the progress of patients at Shriners Hospital for Children-Houston. The Rice engineering students built the balance training system from components of the popular Nintendo game console. What the kids may see as a fun video game is really a sophisticated way to help them advance their skills.



The Wii Balance Boards lined up between handrails will encourage patients age 6 to 18 to practice their balance skills in an electronic gaming environment. The active handrails, which provide feedback on how heavily patients depend on their arms, are a unique feature. Many of the children targeted for this project have cerebral palsy, spina bifida or amputations. Using the relatively inexpensive game console components improves the potential of this system to become a cost-effective addition to physical therapy departments in the future.
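As a rough illustration of what such a system measures, here is a minimal sketch (not the Rice team's software) of how the four corner load cells of a single Balance Board could be combined into a centre-of-pressure estimate; the sensor spacing values and the source of the readings are assumptions.

```python
# Minimal sketch: estimating centre of pressure (CoP) from the four corner
# load cells of a single Wii Balance Board. Sensor spacing and the origin of
# the readings are assumptions; the Rice system's actual software is not shown.

SENSOR_SPAN_X = 43.3  # cm between left and right load cells (assumed)
SENSOR_SPAN_Y = 23.8  # cm between front and back load cells (assumed)

def centre_of_pressure(top_left, top_right, bottom_left, bottom_right):
    """Return (x, y) CoP in cm relative to the board centre.

    Each argument is the load (kg) reported by one corner sensor.
    """
    total = top_left + top_right + bottom_left + bottom_right
    if total <= 0:
        return 0.0, 0.0  # nobody on the board
    x = (SENSOR_SPAN_X / 2.0) * ((top_right + bottom_right) -
                                 (top_left + bottom_left)) / total
    y = (SENSOR_SPAN_Y / 2.0) * ((top_left + top_right) -
                                 (bottom_left + bottom_right)) / total
    return x, y

# Example: a patient leaning slightly to the right and forward.
print(centre_of_pressure(10.0, 14.0, 9.0, 12.0))
```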

More information:

http://www.sciencedaily.com/releases/2011/04/110412101619.htm

26 April 2011

Romance Is Not Dead

Artists and engineers have come together to demonstrate that digital technology can be romantic as well as practical. Few people mull over a text message, however heartfelt, in the same way as a handwritten declaration of love, but a Newcastle University team is looking to prove that using digital communication doesn't necessarily mean that romance is dead. They have created digital 'Lovers' Boxes' that draw on the aesthetics of traditional wooden jewellery boxes, but actually contain the latest technology to enable couples to record romantic messages for each other. Each box consists of two halves connected by brass hinges, decorated with ornate carvings, with an antique keyhole at the front. A computer with an integrated RFID reader is hidden inside the box.


Other than the screen itself, all visible trappings of digital technology are hidden from view. Once unlocked, the box opens like a book and a screen becomes visible. A wooden passé-partout with rounded edges frames the screen to counter the usual connotations of a digital display. When the key is placed within the box, the RFID tag in its fob triggers a video message stored inside. To avoid evoking the sense of a wooden laptop-like device, the videos created by participants are played not in a typical 16:9 landscape format but in a portrait orientation. The Lovers' Box has been described as akin to 'an interactive storybook or jewellery box', which the participants chose to treat carefully and stow away like a precious family heirloom.
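The behaviour described above amounts to a simple lookup from RFID tag to stored video. The sketch below illustrates that idea only; the tag IDs, file paths and the use of the mpv player are illustrative assumptions, not the Newcastle team's implementation.

```python
# Minimal sketch: an RFID tag read from the key fob selects a recorded message.
# Tag IDs, file paths and the choice of player are hypothetical placeholders.
import subprocess

MESSAGES = {
    "04a1b2c3": "videos/message_for_anna.mp4",
    "04d4e5f6": "videos/message_for_ben.mp4",
}

def on_tag_read(tag_id: str) -> None:
    video = MESSAGES.get(tag_id)
    if video is None:
        return  # unknown key fob: the box stays silent
    # Play full screen; rotate the output to suit the portrait-oriented frame.
    subprocess.run(["mpv", "--fullscreen", "--video-rotate=90", video])
```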

More information:

http://www.sciencedaily.com/releases/2011/04/110426071147.htm

25 April 2011

Emotional Video Gaming

As you wake up, you realise that your building is on fire. Your heart starts pounding and the flames grow higher, but you manage to compose yourself, and as you do so the fire dies down. Smashing a window, you step outside, only to find yourself on a ledge six storeys up. As you break out into a terrified sweat, the perilous route to safety appears to shift nightmarishly before your eyes. Will you escape? Scenarios like this could soon be played out harmlessly in living rooms across the world. That's because the next generation of video game controllers will use players' emotional and physiological states to help shape and navigate their virtual worlds. This style of affective gaming aims to move video games to a new level, way beyond what is available even via motion-based controllers like the Wiimote or Kinect.


Valve Software, the developer behind titles such as Half-Life, sees a player's emotional state as an important part of any game. Researchers have been working with Valve on ways to add emotional feedback to Left 4 Dead 2, a game in which players cooperate to fight off a zombie horde. They spoke about their work at this year's Game Developers Conference in San Francisco. In the regular form of the game, an "AI Director" responds to players' actions by adjusting the game itself. Play well and you'll face tougher opponents; play badly and the game becomes less intense. The researchers are trying to go beyond this rough-and-ready response to the players' behaviour by assessing their emotional state more directly: by recording the physiological responses of play testers, they can get more precise estimates of each player's emotional state.
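To make the idea concrete, here is a minimal sketch of how a director-style system might fold a physiological arousal signal into its intensity setting. The weighting and the normalised arousal input are assumptions; this is not Valve's AI Director.

```python
# Minimal sketch: blending in-game performance with a physiological arousal
# estimate (e.g. normalised skin conductance) to set a 0..1 game intensity.
# The weights and signals are assumptions, not Valve's actual system.

def target_intensity(player_skill: float, arousal: float,
                     skill_weight: float = 0.4, arousal_weight: float = 0.6) -> float:
    """Reward skilled play, but back off when the player's arousal spikes."""
    player_skill = min(max(player_skill, 0.0), 1.0)
    arousal = min(max(arousal, 0.0), 1.0)
    raw = skill_weight * player_skill + arousal_weight * (1.0 - arousal)
    return min(max(raw, 0.0), 1.0)

print(target_intensity(player_skill=0.8, arousal=0.9))  # tense player: ease off
print(target_intensity(player_skill=0.8, arousal=0.2))  # calm player: ramp up
```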

More information:

http://www.newscientist.com/article/mg21028084.700-emotional-video-gaming-makes-the-action-real.html

20 April 2011

Clumsy Avatars

Willy Nilly's Surf Shack offers a cure for the idealized virtual world of Second Life. The online shop, a project of Rensselaer Polytechnic Institute researchers, endows otherwise flawless avatars with real-world foils like clumsiness. A project allowing avatars to visibly age over time is in the works. The shop is one of several projects used to explore humanity in technology. Researchers see the dialogue between perfection and mortality as an important influence in the growing world of games and simulation. While the pitch behind technology is often about achieving perfection (with a smartphone all the answers are at hand, with GPS we never lose our way, in Second Life we are beautiful), the risk is a loss of humanity.


That dialogue and tension lead researchers to believe that the nascent world of gaming and simulation could become a new cultural form as great as literature, art, music, and theater. Other recent projects include ‘Becoming’, a computer-driven video installation in which the attributes of two animated figures -- each inhabiting its own space -- are interchanged. Over time, this causes each figure to take on the attributes of the other, distorted by the structure of their digital information. In ‘Insecurity Camera’, an installation shown at art exhibits around the country, a ‘shy’ security camera turns away at the approach of subjects.
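The attribute exchange described for ‘Becoming’ can be pictured as a gradual blend between two sets of figure attributes. The sketch below is purely illustrative; the attribute names and the linear blending are assumptions, not the installation's code.

```python
# Minimal sketch: two figures gradually trade numeric attributes over time.
# Attribute names and the linear blend are illustrative assumptions.

def exchange(attrs_a: dict, attrs_b: dict, t: float):
    """Blend two attribute sets; at t=0 each figure is itself, at t=1 each is the other."""
    t = min(max(t, 0.0), 1.0)
    new_a = {k: (1 - t) * attrs_a[k] + t * attrs_b[k] for k in attrs_a}
    new_b = {k: (1 - t) * attrs_b[k] + t * attrs_a[k] for k in attrs_b}
    return new_a, new_b

a = {"height": 1.8, "gait_speed": 1.2}
b = {"height": 1.5, "gait_speed": 0.8}
print(exchange(a, b, 0.5))  # halfway through, both figures meet in the middle
```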

More information:

http://www.sciencedaily.com/releases/2011/04/110419151057.htm

19 April 2011

A Brain Computer Model

Scientists have moved a step closer to being able to develop a computer model of the brain after devising a technique to map both the connections and functions of nerve cells in the brain together for the first time. A new area of research, known as 'connectomics', is emerging in neuroscience. With parallels to genomics, which maps our genetic make-up, connectomics aims to map the brain's connections ('synapses'). By mapping these connections -- and hence how information flows through the circuits of the brain -- scientists hope to understand how perceptions, sensations and thoughts are generated and how these functions go wrong in diseases such as Alzheimer's disease, schizophrenia and stroke. Mapping the brain's connections is no trivial task, however: there are estimated to be one hundred billion nerve cells ('neurons') in the brain, each connected to thousands of other nerve cells -- making an estimated 150 trillion synapses.

Researchers led by a Wellcome Trust Research Career Development Fellow at UCL are trying to make sense of this complexity. Nerve cells in different areas of the brain perform different functions, and the team focuses on the visual cortex, which processes information from the eye. For example, some neurons in this part of the brain specialise in detecting edges in images; some will activate upon detection of a horizontal edge, others upon detection of a vertical edge. Higher up in the visual hierarchy, some neurons respond to more complex visual features such as faces: lesions to this area of the brain can prevent people from being able to recognise faces, even though they can still recognise individual features such as the eyes and nose.


In a study published online April 10 in the journal Nature, the team at UCL describe a technique developed in mice which enables them to combine information about the function of neurons with details of their synaptic connections. The researchers looked into the visual cortex of the mouse brain, which contains thousands of neurons and millions of different connections. Using high-resolution imaging, they were able to detect which of these neurons responded to a particular stimulus, for example a horizontal edge. Taking a slice of the same tissue, the researchers then applied small currents to a subset of neurons in turn to see which other neurons responded -- and hence which of these were synaptically connected. By repeating this technique many times, the researchers were able to trace the function and connectivity of hundreds of nerve cells in the visual cortex. The study has resolved the debate about whether local connections between neurons are random -- in other words, whether nerve cells connect sporadically, independent of function -- or whether they are ordered, for example constrained by how a neuron responds to particular stimuli. The researchers showed that neurons which responded very similarly to visual stimuli, such as those which respond to edges of the same orientation, tend to connect to each other much more than those that prefer different orientations. Using this technique, the researchers hope to begin generating a wiring diagram of a brain area with a particular behavioural function, such as the visual cortex. This knowledge is important for understanding the repertoire of computations carried out by neurons embedded in these highly complex circuits. The technique should also help reveal the functional circuit wiring of regions that underpin touch, hearing and movement.
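The final comparison described above can be pictured as a simple statistical test on a connection matrix: do pairs of neurons with similar preferred orientations connect more often than pairs with dissimilar ones? The sketch below runs that comparison on synthetic data; it does not reproduce the UCL study's pipeline or its real measurements.

```python
# Minimal sketch of the analysis implied above: do neurons with similar
# orientation preferences connect more often? All data here are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 200
preferred = rng.uniform(0, 180, n)           # preferred edge orientation (degrees)

# Synthetic "ground truth": connection probability falls with tuning difference.
diff = np.abs(preferred[:, None] - preferred[None, :])
diff = np.minimum(diff, 180 - diff)          # orientation is circular over 180 deg
p_connect = 0.25 * np.exp(-diff / 30.0)
connected = rng.random((n, n)) < p_connect
np.fill_diagonal(connected, False)

similar = diff < 22.5
dissimilar = diff >= 45.0
mask = ~np.eye(n, dtype=bool)

print("P(connect | similar tuning):    %.3f" % connected[similar & mask].mean())
print("P(connect | dissimilar tuning): %.3f" % connected[dissimilar & mask].mean())
```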

More information:

http://www.sciencedaily.com/releases/2011/04/110410181302.htm

18 April 2011

Computer Vision for Healthcare

Researchers believe that there are huge opportunities for integrating computer science, and in particular computer vision, into health care and medical research, making life easier for researchers, physicians and ultimately patients. This is leading to the development of powerful tools to aid in bioengineering research. They are developing a technology that will automate the arduous process of analyzing the vast amount of data necessary for tissue engineering. In this research, they seek to automate blood vessel counting in images and to make the distinction between data collection and analysis clearer.

Tissue engineering is an interdisciplinary field that offers the promise of improving, repairing and/or replacing damaged tissue in the human body. Research in this area involves the development of various biomaterials and processes that facilitate the fabrication of such tissue. In this research project, the focus is on quantifying arteriole formation. An arteriole is one of the small terminal branches of an artery, especially one that connects with a capillary. Collecting this vast amount of data is currently done manually and requires an intensive amount of time and meticulous effort.
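A minimal sketch of what automated vessel counting could look like, using off-the-shelf thresholding and connected-component labelling from scikit-image; the stain assumption, threshold choice and size filter are placeholders, not the project's actual method.

```python
# Minimal sketch: counting candidate arteriole cross-sections in a stained
# image via thresholding and connected-component labelling. The assumption
# that vessels appear darker, and the size filter, are placeholders.
from skimage import io, filters, measure, morphology

def count_arterioles(path: str, min_area: int = 50) -> int:
    image = io.imread(path, as_gray=True)
    thresh = filters.threshold_otsu(image)        # global Otsu threshold
    mask = image < thresh                         # assume vessels stain darker
    mask = morphology.remove_small_objects(mask, min_size=min_area)
    labels = measure.label(mask)
    return labels.max()                           # number of connected components

# print(count_arterioles("tissue_section.png"))
```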


More information:


10 April 2011

Gadget Show Live 2011

On Tuesday, 12th April 2011, iWARG members Athanasios Vourvopoulos and Fotis Liarokapis will present a prototype that uses a brain-computer interface (Neurosky Mindset) to control a Lego Mindstorms NXT robot. The robot performs basic movements based on the attention levels of the user: different attention levels accelerate or decelerate the robot accordingly.
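A minimal sketch of the control loop such a demo implies: an attention value in the headset's 0-100 range is mapped to motor power. The read_attention and drive functions are hypothetical stand-ins for whichever headset and NXT libraries were actually used.

```python
# Minimal sketch: a 0-100 attention value from the headset is mapped to motor
# power. read_attention() and drive() are hypothetical placeholders for the
# real headset and NXT interfaces.
import time

def attention_to_power(attention: int, max_power: int = 100) -> int:
    """Scale a 0-100 attention level to motor power; low attention stops the robot."""
    attention = min(max(attention, 0), 100)
    if attention < 30:           # assumed dead zone: too unfocused to move
        return 0
    return int(max_power * (attention - 30) / 70)

def control_loop(read_attention, drive, period: float = 0.2) -> None:
    while True:
        drive(attention_to_power(read_attention()))
        time.sleep(period)
```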


The Gadget Show Live is the UK's ultimate consumer electronics event and returns for its third year at the NEC, Birmingham this April. The 2011 event will bring together more exhibitors than ever before and is the best place to see, try and buy everything from HD Camcorders and 3D TVs to Games Consoles and In-Car Electronics.

More information:



03 April 2011

Gesture-Controlled Microscope

Researchers at the Institute for Molecular Medicine Finland (FIMM), in collaboration with the Finnish company Multitouch Ltd, have created a hand- and finger-gesture-controlled microscope. The method combines two technologies: web-based virtual microscopy and a giant multitouch display. The result is an entirely new way of performing microscopy: by touching a table- or even wall-sized screen, the user can navigate and zoom within a microscope sample in the same way as with a conventional microscope. Using the touch control it is possible to move from the natural size of the sample to a 1000-fold magnification, at which cells and even subcellular details can be seen.

Biological samples are digitized using a microscopy scanner and stored on an image server. Samples displayed on the screen are then continuously read from the server over the internet and the size of a single sample can be up to 200 gigabytes. The sample viewing experience is like a combination of Google Maps and the user interface from the movie Minority Report. The developers think that the method will revolutionize microscopy teaching: a group of students can stand around the display together with the teacher and examine the same sample. The multitouch microscope can recognize the hands of multiple users at the same time.
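Such a viewer implies Google-Maps-style tiling: only the tiles overlapping the current viewport are fetched from the image server at the current zoom level. The sketch below illustrates that bookkeeping; the tile size and URL scheme are assumptions, not the FIMM/Multitouch server's real interface.

```python
# Minimal sketch of map-style tiling: which tiles must be fetched for a given
# zoom level and viewport? Tile size and URL pattern are assumptions.
TILE = 256  # pixels per tile edge (assumed)

def visible_tiles(view_x, view_y, view_w, view_h, zoom):
    """Yield (zoom, column, row) for every tile overlapping the viewport.

    Viewport position and size are in pixels at the given zoom level.
    """
    first_col, last_col = view_x // TILE, (view_x + view_w - 1) // TILE
    first_row, last_row = view_y // TILE, (view_y + view_h - 1) // TILE
    for row in range(first_row, last_row + 1):
        for col in range(first_col, last_col + 1):
            yield zoom, col, row

def tile_url(zoom, col, row, sample_id="sample-001"):
    # Hypothetical URL scheme; the real server's layout is not public here.
    return f"https://images.example.org/{sample_id}/{zoom}/{col}_{row}.jpg"

for tile in visible_tiles(1024, 2048, 1920, 1080, zoom=6):
    print(tile_url(*tile))
```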


More information: