31 January 2011

Games Help Decision Making

A prototype computer game has been developed to help improve decision-making skills in all aspects of our lives. Supported by the EPSRC, a team at Queen's University Belfast has developed a prototype that could be built on by commercial games manufacturers and turned into an e-learning or training tool for professionals in all walks of life. Alternatively, some of its features could be incorporated into existing computer games that have a strategy element. The team has explored whether people can be trained to make better decisions by improving their ability to recognise and make allowances for their subjective opinions and biases, and to 'factor in' accurately their uncertainty over a decision's likely outcome. Consider an everyday example: you're late for a train. Will you be able to catch it if you run? Or will that just result in the stress of wasted effort? To maximise your chances of reaching the right decision, you need to take into account all the information available to you.

But it also helps if, using this information, you try to make an appraisal of your chances, which will be more accurate if you take into account how you tend to interpret such information, based on previous experience. For example, maybe you know whether you tend to be over- or under-confident in similar situations. In the same way, the prototype game teaches people to take their uncertainty into account and learn from experience when faced with simple choices. In the future, games of this type could be used for both educational and entertainment purposes by public- and private-sector decision-makers, and by private individuals, in order to enhance their decision-making abilities. Over 500 members of the general public, as well as many students from Queen's and Dundalk Institute of Technology, have already tried out the prototype. The results are currently being assessed to establish the extent to which it has taught them to make better decisions.
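The calibration idea described above can be sketched in a few lines of code. This is a minimal illustration of the concept, not the prototype's actual software; the function name and the train example numbers are invented for illustration.

```python
# A minimal sketch of confidence calibration: compare your stated
# confidence with actual outcomes to see whether you tend to be
# over- or under-confident in similar situations.

def calibration_gap(predictions):
    """predictions: list of (stated_probability, outcome) pairs,
    where outcome is 1 if the event happened, 0 otherwise.
    Returns mean confidence minus actual hit rate:
    positive -> overconfident, negative -> underconfident."""
    if not predictions:
        return 0.0
    mean_conf = sum(p for p, _ in predictions) / len(predictions)
    hit_rate = sum(o for _, o in predictions) / len(predictions)
    return mean_conf - hit_rate

# Hypothetical history: you were "90% sure" of catching the train
# five times, but only made it three times.
history = [(0.9, 1), (0.9, 0), (0.9, 1), (0.9, 0), (0.9, 1)]
gap = calibration_gap(history)  # 0.9 - 0.6 = 0.3 -> overconfident
```

A player who sees a persistently positive gap learns to discount their gut confidence; a negative gap suggests they can trust themselves more than they do.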

More information:

http://quiz.worldofuncertainty.org/

http://www.sciencedaily.com/releases/2011/01/110120100945.htm

24 January 2011

Robotic Ghost Knifefish

Researchers at Northwestern University have created a robotic fish that can switch from swimming forward or backward to swimming vertically almost instantaneously by using a sophisticated, ribbon-like fin. The robot -- created after observing and creating computer simulations of the black ghost knifefish -- could pave the way for nimble robots that could perform underwater recovery operations or long-term monitoring of coral reefs. The black ghost knifefish, which hunts at night in rivers of the Amazon basin, finds its prey using a weak electric field around its entire body and moves both forward and backward using a ribbon-like fin on the underside of its body. Observations revealed that the fish uses only one travelling wave along the fin during horizontal motion (forward or backward, depending on the direction of the wave), whereas during vertical motion it uses two waves. One of these moves from head to tail, and the other moves from tail to head. The two waves collide and stop at the center of the fin.

Researchers then created a computer simulation that showed that when these ‘inward counter-propagating waves’ are generated by the fin, horizontal thrust is cancelled and the fluid motion generated by the two waves is funneled into a downward jet from the center of the fin, pushing the body up. The flow structure looks like a mushroom cloud with an inverted jet. The robot is also outfitted with an electro-sensory system that works similarly to the knifefish's, and researchers hope next to improve the robot so it can autonomously use its sensory signals to detect an object and then use its mechanical system to position itself near the object. Humans excel at creating high-speed, low-maneuverability technologies, like airplanes and cars. But studying animals provides a platform for creating low-speed, high-maneuverability technologies -- technologies that don't currently exist. Potential applications for such a robot include underwater recovery operations, such as plugging a leaking oil pipe, or long-term monitoring of oceanic environments, such as fragile coral reefs.
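Why two counter-propagating waves cancel horizontal thrust can be seen from a textbook wave identity, quite apart from the researchers' full fluid simulation: the sum of two equal waves travelling in opposite directions is a standing wave that oscillates in place. The toy check below (not the actual simulation) verifies the identity numerically.

```python
import math

# Two equal travelling waves moving in opposite directions along the fin:
# their sum is the standing wave 2*sin(k*x)*cos(w*t), which has no net
# horizontal propagation -- the fluid momentum must go somewhere else,
# which in the fish's case is a downward jet at the fin's center.

def wave_sum(x, t, k=1.0, w=1.0):
    head_to_tail = math.sin(k * x - w * t)   # wave travelling one way
    tail_to_head = math.sin(k * x + w * t)   # wave travelling the other way
    return head_to_tail + tail_to_head

def standing_wave(x, t, k=1.0, w=1.0):
    return 2.0 * math.sin(k * x) * math.cos(w * t)

# The two expressions agree at every point and time.
for x in (0.0, 0.7, 1.5):
    for t in (0.0, 0.3, 2.0):
        assert abs(wave_sum(x, t) - standing_wave(x, t)) < 1e-12
```

This is only the kinematic part of the story; how the cancelled horizontal momentum is redirected into the mushroom-cloud jet is what the researchers' fluid simulation captures.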

More information:

http://www.sciencedaily.com/releases/2011/01/110119095045.htm

22 January 2011

Computers Understanding Emotions

Having a computer that can read our emotions could lead to all sorts of new applications, including computer games where the player has to control their emotions while playing. Researchers at Bangor University are hoping to bring this reality a little nearer by developing a system that will enable computers to read and interpret our emotions and moods in real time. The work focuses on ‘hands-on’ pattern recognition and machine learning.

The plan is to combine brain wave information collected from a single electrode that sits on the forehead as part of a ‘headset’, a skin conductance response (which will detect tiny changes in perspiration as first indicators of stress) and a pulse signal, reflecting the wearer’s heart rate. This information will form the data fed into a classifier ensemble set to determine which emotion a person is experiencing.
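The fusion step described above can be sketched as a simple majority-vote ensemble. This is an illustrative toy, not the Bangor system: the per-signal classifiers, thresholds and labels below are all invented for the example, and a real classifier ensemble would be trained on labelled data rather than hand-set.

```python
# Hypothetical per-signal base classifiers, one per sensor stream.
# Thresholds and the "stressed"/"calm" labels are invented for illustration.

def eeg_classifier(eeg_level):
    return "stressed" if eeg_level > 0.6 else "calm"

def skin_classifier(conductance):
    return "stressed" if conductance > 0.5 else "calm"

def pulse_classifier(heart_rate):
    return "stressed" if heart_rate > 90 else "calm"

def ensemble_vote(eeg_level, conductance, heart_rate):
    """Fuse the three sensor streams by majority vote across
    the base classifiers."""
    votes = [
        eeg_classifier(eeg_level),
        skin_classifier(conductance),
        pulse_classifier(heart_rate),
    ]
    return max(set(votes), key=votes.count)

# Elevated EEG activity and heart rate outvote the calm skin signal.
state = ensemble_vote(0.7, 0.3, 95)  # -> "stressed"
```

The appeal of an ensemble here is robustness: no single noisy sensor (a loose electrode, a sweaty palm) can flip the overall decision on its own.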

More information:

http://www.bangor.ac.uk/news/full.php.en?nid=3235&tnid=3235

21 January 2011

3D Successful In The Classroom

Biology lessons are a distant memory for me but if they had been anything like the one I've just sat through at Abbey School in Reading, I think I may have remembered a little more. The pupils were looking at how the chest works, via 3D glasses and a 3D-enabled projector. "So cool", "It's huge", "I thought the diaphragm was a flat muscle" and "I didn't realise it wasn't under the ribs" were just a few of the comments made when the girls put on their glasses to examine the model of the thorax in more detail.

If 2009 had a buzzword, it might have been 3D. But despite the hype, there are murmurings that it is a gimmick already getting past its sell-by date. Some reports suggest cinema audiences are starting to tire of 3D movies and, while sales of 3D TVs are increasing, not everyone is impressed with the results. According to net measurement firm Nielsen, only a tiny percentage of households have a 3D TV, with many others saying they have no intention of upgrading. Not so in education, where it seems 3D could have a real future, breathing new life into an ageing curriculum and offering a glimpse of how 21st-century education should be.

More information:

http://www.bbc.co.uk/news/technology-11891753

19 January 2011

Decoding 3D Mobility

A movement disorder can have many origins, such as a birth defect, spinal cord injury, or stroke. Rehabilitation scientists facilitate treatment of mobility disorders by studying the bodily cause of physical impairments and providing a scientific basis for therapies that can improve function. The source of physical impairment is often hidden among the complex interactions of the nervous, muscle, and skeletal systems of the human body. Simulating a patient’s movement in three-dimensional computer models can help uncover the source of the problem, whether it’s the size of a particular muscle or bone or the way these components perform. Computer models also provide a visual platform on which to test whether surgery would improve mobility for a specific patient. Two years ago, researchers introduced a free software program called OpenSim, a biomechanical research platform that simulates biological movement. The program combines data on muscle size and strength, joint motion, and recorded movements of a subject to produce a highly realistic simulation of a specific person’s maneuvering.

The team is now focusing on using OpenSim to understand and treat movement disorders, including cerebral palsy. Many children with cerebral palsy walk in a crouch-like pose, with their knees excessively bent. The cause of the crouch gait, which can be exhausting, painful and even debilitating, varies from patient to patient. In some patients, the hamstring muscles are very tight and short, pulling the knees into a bent position. If the hamstrings are surgically lengthened, these patients may be able to straighten their legs and walk more easily. However, if that surgery is performed on a patient who walks in a crouch gait for a different reason, the procedure could be ineffective or, worse, harmful. By creating a computer model of a patient's movements, researchers can non-invasively explore whether the surgery would be appropriate for a specific patient. If scientists develop a biomechanical model of the hand, for example, they can add it to the center's database, making it available to anyone who might want to build on the work -- by adding it to a model of a wrist or arm, for instance.
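The geometric reason short hamstrings force a crouch can be illustrated with a deliberately crude model, nothing like a full OpenSim musculoskeletal simulation. It assumes a linear moment-arm approximation in which the hamstring's path shortens in proportion to knee flexion; the lengths and moment arm below are hypothetical numbers chosen only to make the arithmetic visible.

```python
# Toy model: hamstring path length ~ straight-leg length - moment_arm * flexion.
# A hamstring too short to span the fully extended knee therefore forces a
# minimum flexion angle -- the crouch. All numbers are hypothetical.

def required_flexion(muscle_max_length, straight_leg_length, moment_arm=0.04):
    """Minimum knee flexion (radians) that a hamstring of the given
    maximum length allows, under the linear moment-arm approximation."""
    deficit = straight_leg_length - muscle_max_length
    if deficit <= 0:
        return 0.0            # muscle is long enough: knee can straighten
    return deficit / moment_arm

# A hamstring 2 cm too short for the 0.45 m straight-knee path forces
# roughly 0.5 rad (about 29 degrees) of permanent knee flexion.
crouch = required_flexion(0.43, 0.45)
```

The same toy logic shows why surgery can fail: lengthening the muscle only removes the crouch if the muscle's length was the binding constraint, which is exactly what a patient-specific simulation is meant to establish before operating.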

More information:

http://www.futurity.org/health-medicine/3-d-steps-up-to-decode-mobility/

18 January 2011

Personal Robots Are Coming

What does 2011 hold for the field of robotics? Plenty, if 2010 is any indication. This will not be the year that mobile, artificially intelligent robot nurses assume the responsibility of caring for the world's growing elderly population, but it does promise to be a pivotal time for the development of the underlying technology that will enable safe and reliable automated elder care, not to mention other services that robots are expected to perform in the coming decade. Thanks to a standardized platform introduced in 2010, roboticists can now collaborate as never before. Last May, a maker of robot hardware and software released a test version of its personal robot platform. The PR2 includes a mobile base, two arms for manipulation, a suite of sensors and two computers, each with eight processing cores, 24 gigabytes of RAM and two terabytes of hard-disk space. The out-of-the-box robot, which costs $400,000, also features an operating system that handles the robot's computation and hardware manipulation functions. Researchers at Georgia Tech's Healthcare Robotics Lab, formed in 2007, are focused on creating robots that can safely and effectively help care for senior citizens. The machines would go beyond current efforts to create bots able to follow the elderly around their homes to provide them with Internet access and remind them to take their medicine.

For starters, the Healthcare Robotics Lab researchers want their robots to be able to open doors and drawers to retrieve objects such as pill bottles while being guided by a laser pointer, radio signals or touch. Sixteen institutions experimented with the PR2 during the latter half of 2010. South Korea's Samsung Electronics is using the PR2 to enhance the company's existing robotics research in a country that hopes to put a robot in every home by 2020. Researchers see the combination of their PR2, named GATSBII, and a free and open-source robot operating system as a way to accelerate their lab's work with the help of a standardized platform and a budding community of roboticists working with the same tools who can now offer more practical advice to one another. The lab's first robot, EL-E, was built in 2007 to perform assistive tasks for sufferers of amyotrophic lateral sclerosis (ALS), which impairs physical motor functions. The researchers have since built two more robots: Dusty, which has a lift tray designed to pick objects up off the floor; and Cody, whose two arms and omnidirectional mobile base resemble those of GATSBII. Unlike GATSBII, Cody is the product of many different manufacturers, including Meka Robotics, which supplied the arms, and Segway, which delivered the omnidirectional mobile base. In the cases of EL-E, Dusty and Cody, researchers designed the robots and then found the parts they needed to actually build them.

More information:

http://www.scientificamerican.com/article.cfm?id=personal-robot-research

04 January 2011

Robot Teachers with Human Face

After years of hype, robot teachers have finally rolled into the classrooms. The Daegu Office of Education introduced 29 robot teachers for English classes in 21 elementary schools - one of the largest rollouts in the world - and the bots strutted their stuff at a demonstration at Hakjung Elementary School yesterday, with about 150 government officials coming to get a look at the technology employed. The 1-meter (3.28 feet) egg-shaped robot, named ‘Engkey’ (an abbreviation of English key), spoke, asked questions and conversed in English with students, and even entertained the crowd by dancing to music. In fact, the robo-teachers aren’t mere chips, wheels and gears. Within each of them, in a sense, is a real human teacher controlling the machine remotely from the Philippines.

The teachers in the Philippines have cameras to record their faces - which show up on a flat panel screen that forms the robo-teacher’s face - and they can also see the Korean students through a camera installed in the robot. Basically, the robot is a rolling Internet link between students and teacher, although the human teacher can also command the robot to make human gestures with its arms and wheels. The education office said Korean teachers can use the robots as assistant teachers for English classes, too. The robots were invented by the Center for Intelligent Robotics under the Korea Advanced Institute of Science and Technology, and the Daegu city government, the Daegu Office of Education and the Ministry of Knowledge and Economy spent roughly 1.6 billion won ($1.39 million) for the units.

More information:

http://joongangdaily.joins.com/article/view.asp?aid=2930207

03 January 2011

Virtual Girl Reads Emotions

The technology behind an attractive maths teaching avatar could one day be used to detect cancer, dispense health advice and single out serious shoppers in stores. The ‘Easy with Eve’ avatar, developed to teach maths to primary school students, uses complex algorithms to detect and respond to expressions and movements. Researchers said the technology had myriad applications. For example, a Unitec/China Medical University project was investigating the technology's potential to detect cancer cells.

The Chinese Government had put $180,000 towards the initiative and a provincial government had donated $10,000. Researchers at Massey University and Unitec were also working together to turn Eve into an intelligent sales assistant that would use customers' facial expressions and gestures captured on a camera to distinguish between serious shoppers and those just browsing. Also, the motion detection technology could be used to help monitor the elderly in their homes – alerting ambulance services if any sudden or unusual movements were detected.

More information:

http://www.stuff.co.nz/technology/digital-living/4474915/Virtual-girl-can-read-nine-emotions/