27 February 2010

3D Cinema and TV

This is the year in which 3D cinema and 3D TV will make the breakthrough. At CeBIT in Hannover, Fraunhofer researchers are presenting technologies and standards that are hastening the progress. 2010 will be the year in which cinema and television make the jump into the third dimension. Blockbusters like James Cameron's Avatar and Ice Age: Dawn of the Dinosaurs have brought in billions in box office ticket sales around the globe. And now the time for 3D movies on television has also come. The industry has announced that the first 3D televisions will be ready for production by summer, and a few games of FIFA World Cup football have already been captured in 3D. Yet before 3D technology becomes standard equipment for the movie screen and the telly, a few questions still require clarification. For instance, how can the recording process and post-processing be optimized, and their costs reduced? Indeed, Cameron's science fiction extravaganza gobbled up 250 million US dollars in the making and required four years of computer work. How can the tools for the post-production of movies be improved? And the sixty-four-thousand-dollar question: 3D glasses, or no 3D glasses?

To address these issues, experts from the film industry, academia and research have joined forces in the consortium ‘PRIME: Production and Projection Techniques for Immersive Media’. Together they are exploring and developing business models and techniques for cinema, television and gaming. 3D films pose tougher challenges than their two-dimensional counterparts, since two images are always needed to create a spatial depiction. For this reason, at least two cameras must be used to record the film, and a 3D screen is needed to display both images: one for the left eye and one for the right. Stereoscopy has evolved into a recording technology for high-resolution home theater. This process demands the utmost precision from the camera crew and post-production, because an individual film has to be produced for each eye, and in editing and post-processing both streams must be processed in absolute synchrony. For the movie theater, a scene is shot with two synchronized MicroHDTV cameras from Fraunhofer IIS. One camera acts as the master -- the digital leader. The second camera records with exactly the same settings, so that calibration, color fidelity and geometry match.
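The article does not spell out how the two camera streams are kept in lockstep, but the general idea can be sketched in a few lines. The Python snippet below is purely illustrative and not the Fraunhofer pipeline: it pairs each master (left-eye) frame with the nearest slave (right-eye) frame by timestamp and flags frames that drift beyond a tolerance. The Frame record and the tolerance value are hypothetical.

```python
# Illustrative sketch (not the Fraunhofer IIS pipeline): keep a master and a
# slave camera stream in lockstep by pairing frames on their timestamps.

from dataclasses import dataclass

@dataclass
class Frame:
    timestamp_ms: float  # capture time reported by the camera (hypothetical field)
    index: int           # running frame number
    eye: str             # "left" (master) or "right" (slave)

def pair_stereo_frames(master, slave, tolerance_ms=2.0):
    """Match each master frame with the closest slave frame; frames without a
    partner within the tolerance are returned separately so an editor can
    drop or re-time them."""
    if not slave:
        return [], list(master)
    pairs, unmatched = [], []
    for m in master:
        best = min(slave, key=lambda s: abs(s.timestamp_ms - m.timestamp_ms))
        if abs(best.timestamp_ms - m.timestamp_ms) <= tolerance_ms:
            pairs.append((m, best))
        else:
            unmatched.append(m)
    return pairs, unmatched
```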

More information:

http://www.sciencedaily.com/releases/2010/02/100226093219.htm

25 February 2010

Brain-Controlled Cursor

Harnessing brain signals to control keyboards, robots or prosthetic devices is an active area of medical research. Now a rare peek at a human brain hooked up to a computer shows that the two can adapt to each other quickly, and possibly to the brain's benefit. Researchers at the University of Washington looked at signals on the brain's surface while patients used imagined movements to control a cursor. The results, published this week in the Proceedings of the National Academy of Sciences, show that watching a cursor respond to one's thoughts prompts brain signals to become stronger than those generated in day-to-day life. Bodybuilders get muscles that are larger than normal by lifting weights, the researchers noted; by interacting with brain-computer interfaces, we get brain activity that is larger than normal. By using these interfaces, patients create super-active populations of brain cells. The finding holds promise for rehabilitating patients after stroke or other neurological damage. It also suggests that a human brain could quickly become adept at manipulating an external device such as a computer interface or a prosthetic limb.

The team of computer scientists, physicists, physiologists and neurosurgeons studied eight patients awaiting epilepsy surgery at two Seattle hospitals. Patients had electrodes attached to the surface of their brains during the week leading up to the surgery and agreed to participate in research that would look at connecting brains to a computer. Asking people to imagine performing a movement -- such as moving their arm -- is commonly done to produce a brain signal that can be used to control a device, but how that process works is poorly understood. Researchers first recorded brain patterns when human subjects clenched and unclenched a fist, stuck out a tongue, shrugged their shoulders or said the word ‘move’. Next, the scientists recorded brain patterns when subjects imagined performing the same actions. These patterns were similar to the patterns for actual action but much weaker, as expected from previous studies. Finally, the researchers looked at the signals produced when subjects imagined performing the action, and those brain signals were used to move a cursor toward a target on a computer screen. After less than 10 minutes of practice, brain signals from imagined movement became significantly stronger than when actually performing the physical motion.
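The decoding method used in the study is not described here, but a common approach in cursor-control experiments of this kind is to turn the strength of a brain signal into cursor speed. The Python sketch below assumes a single recorded channel, a high-gamma frequency band and a linear gain; these are illustrative choices, not the Washington team's parameters.

```python
# A minimal sketch of the general idea, not the published method: estimate the
# power of one brain-signal channel in a chosen frequency band and map it
# linearly to cursor velocity, so stronger imagined-movement activity moves
# the cursor faster.

import numpy as np

def band_power(signal, fs, low_hz=76.0, high_hz=100.0):
    """Average spectral power of `signal` (1-D array, sampled at fs Hz)
    within [low_hz, high_hz]; band limits are illustrative."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= low_hz) & (freqs <= high_hz)
    return spectrum[band].mean()

def cursor_velocity(signal_window, fs, baseline_power, gain=0.5):
    """Cursor speed grows with band power above a resting baseline;
    the gain is a hypothetical scaling factor."""
    power = band_power(signal_window, fs)
    return gain * max(0.0, power - baseline_power)
```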

More information:

http://uwnews.washington.edu/ni/article.asp?articleID=55693

23 February 2010

Advances for Alzheimer's Disease

Researchers at the UAB and Stockholm University have created a computer model of the structural malfunction of the ApoE4 protein when it comes into contact with the amyloid beta molecule, the main cause of Alzheimer's disease. The research, published in PLoS Computational Biology, supports experimental evidence linking ApoE4 with this pathology and opens up new avenues for understanding and fighting the disease. The research proposes a three-dimensional model which simulates the interaction between the amyloid beta peptide and the different forms of Apolipoprotein E (ApoE), and offers a first molecular basis for understanding this phenomenon. Three possible ApoE forms exist in humans: ApoE2, ApoE3 and ApoE4.

ApoE3 is the most common form, while ApoE4 is very closely linked to Alzheimer's disease. The developed model structurally reaffirms the experimental observations which link ApoE4 to this pathology. Researchers have observed that this protein tends to lose its functional structure in the presence of the amyloid beta peptide; this, however, does not occur with the ApoE2 and ApoE3 forms. According to the researchers, these differences are due to subtle divergences between the structures of each form and would explain the different responses of carriers of forms 3 and 4 in the presence of amyloid beta molecules. The loss of structure opens up the possibility of new explorations aimed at better understanding and fighting Alzheimer's disease.

More information:

http://www.uab.es/servlet/Satellite/latest-news/news-detail/new-advance-in-the-study-of-alzheimer-s-disease-1096476786473.html?noticiaid=1266391646189

20 February 2010

Phone Gauges Physical Activity

An iPhone application created by University of Houston researchers is providing first-of-its-kind real-time statistics on physical activity around the world. Those annual rankings of America’s fattest and fittest cities, which rely on government statistics and a host of indirect indicators, may soon have a little more muscle: the information being collected at the University of Houston provides objective data. The Walk n’ Play iPhone application, available free from Apple’s App Store, allows users to keep track of their physical activity and compete with other users. The latest version lets players compare themselves to various profiles that represent a region or a skill level. It helps individuals and groups connect around the concept of daily physical motion, rather like a real-time Twitter where your feet do the tweeting. Anonymous data from Walk n’ Play users are sent to a server at the University of Houston’s Computational Physiology Lab.

The data include each player's physical activity, the intensity of that activity and the player's geographic region. Using this information, researchers can objectively measure physical activity and break the data down by location. Researchers believe that the applications for the technology are far reaching and will yield real-world data that has previously been difficult to collect. The Walk n’ Play app tallies a user’s every movement over the course of a day, including walking and climbing stairs, and translates it into calories burned. The game gives an accurate calorie count thanks to a biomedical calibration process applied to the iPhone’s accelerometer, which senses motion and can be calibrated to estimate metabolic activity. In addition, the Walk n’ Play application brings a health benefit to gamers by creating an element of competition: it allows users to employ the buddy system, whether that buddy is a real-life friend or an avatar representative of a certain population.
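The biomedical calibration itself is not described in the article, but the overall flow from accelerometer samples to a calorie figure can be sketched roughly. In the Python outline below, the counts-to-kilocalorie factor and the resting term are placeholder assumptions, not values from Walk n’ Play.

```python
# A simplified sketch of the idea, not the Computational Physiology Lab's
# calibration: accumulate accelerometer activity over the day and convert it
# into a rough estimate of calories burned.

import math

def activity_counts(samples):
    """Sum of acceleration magnitudes above 1 g for a list of (x, y, z)
    accelerometer samples expressed in units of g."""
    total = 0.0
    for x, y, z in samples:
        magnitude = math.sqrt(x * x + y * y + z * z)
        total += max(0.0, magnitude - 1.0)  # subtract gravity
    return total

def estimated_kcal(samples, kcal_per_count=0.0005, resting_kcal=0.0):
    """Convert accumulated activity counts into a calorie estimate;
    both constants are illustrative placeholders."""
    return resting_kcal + kcal_per_count * activity_counts(samples)
```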

More information:

http://www.cpl.uh.edu/projects/walk-n-play/

http://www.uh.edu/news-events/stories/2010articles/Feb2010/02082010iPhoneApp-Pavlidis.php

12 February 2010

AI in Gaming and Virtual Worlds

On Wednesday, 10th March 2010, the Serious Games Institute (SGI) is organising another workshop, titled ‘Artificial Intelligence in Gaming and Virtual Worlds’. In the near future, many of us will use avatars and be part of virtual experiences embedded with advanced artificial intelligence techniques. This event considers the role that artificial intelligence is playing in shaping how intelligent, machine-driven avatars will behave in the future.

The event will also look at how artificial intelligence is currently being used in educational and health settings, as well as in leading-edge research projects such as the Echoes project. The session will include presentations from Simon Colton (Imperial College London), David Burden (Daden Ltd), Stuart Slater (Wolverhampton University), Vania Dimitrova (Leeds University) and Kaska Porayska-Pomsta (London Knowledge Lab).

More information:

http://www.seriousgamesinstitute.co.uk/events.aspx?item=757

10 February 2010

Augmented Reality Museum Guide

Every visitor would like to embark on a virtual time journey into the past. Researchers have already set the stage for just such a journey, as exemplified by a recent exhibition in the Allard Pierson Museum in Amsterdam, where visitors could take a stroll through historical sites. A flat screen on a rotating column stood beside the many art works, showing an extract of the image on the wall, a gigantic black-and-white photo of the Roman Forum ruins. When the column is rotated to the left, what the viewer sees changes correspondingly. A camera connected to the back of the movable display provides information about the new view appearing on the monitor, in this case the ruins of the Temple of Saturn. At the same time, a digital animation shows what the temple might have looked like when intact. If the screen is rotated further, it displays information, pictures and videos about other ancient buildings, including the Colosseum.

The sophisticated animation is based on software developed by the Fraunhofer Institute for Computer Graphics Research IGD in Darmstadt. The program knows where the center of the camera is pointing and can superimpose the relevant overlay: text, video or animation. The original image always remains clearly visible under the overlays, so that visitors always know where they are on the virtual tour. Experts know this technology as augmented reality. The Fraunhofer IGD software in the museum currently runs on a mini-computer controlled via a touch screen. This handy console clearly indicates a trend towards mobile, virtual guidebooks: when tourists hold their consoles in front of a baroque prince's palace, the relevant customized information will appear immediately on their screens.
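A small sketch may help illustrate the overlay step: given the horizontal angle the camera is pointing at, the software looks up which overlay belongs to that part of the panoramic photograph. The angle ranges and overlay labels below are invented for illustration and are not taken from the Fraunhofer IGD software.

```python
# Illustrative overlay selection, not the Fraunhofer IGD implementation:
# map the camera's yaw angle onto the overlay registered for that view.

OVERLAYS = [
    # (start_deg, end_deg, overlay shown on top of the panorama) -- hypothetical
    (0.0,   40.0, "Temple of Saturn: reconstruction animation"),
    (40.0,  95.0, "Roman Forum: explanatory text"),
    (95.0, 150.0, "Colosseum: video clip"),
]

def overlay_for_angle(yaw_deg):
    """Return the overlay whose angular range contains the camera's yaw,
    or None if no overlay is registered for that view."""
    yaw = yaw_deg % 360.0
    for start, end, overlay in OVERLAYS:
        if start <= yaw < end:
            return overlay
    return None

print(overlay_for_angle(20.0))  # -> "Temple of Saturn: reconstruction animation"
```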

More information:

http://www.sciencedaily.com/releases/2010/02/100210164838.htm

08 February 2010

Brain Computer Headset for Games

Gamers will soon be able to interact with the virtual world using their thoughts and emotions alone. A neuro-headset which interprets the interaction of neurons in the brain will go on sale later this year. It picks up electrical activity from the brain and sends wireless signals to a computer. The brain is made up of about 100 billion nerve cells, or neurons, which emit an electrical impulse when interacting. The headset implements a technology known as non-invasive electroencephalography (EEG) to read the neural activity. Emotiv, a neuro-engineering company, has created a brain-computer interface that reads electrical impulses in the brain and translates them into commands that a video game can accept, controlling the game dynamically. This is the first headset that doesn't require a large net of electrodes or a technician to calibrate or operate it, and it doesn't require gel on the scalp. This area of immersion and control could prove to be the breakthrough gaming has longed for.

The use of electroencephalography in medical practice dates back almost 100 years, but it is only since the 1970s that the procedure has been used to explore brain-computer interfaces. The Epoc technology can be used to give authentic facial expressions to the avatars of gamers in virtual worlds. For example, if the player smiles, winks or grimaces, the headset can detect the expression and translate it to the avatar in game. It can also read the emotions of players and translate those to the virtual world. The $299 headset has a gyroscope to detect movement and wireless capabilities to communicate with a USB dongle plugged into a computer. Emotiv said the headset can detect more than 30 different expressions, emotions and actions. They include excitement, meditation, tension and frustration; facial expressions such as smile, laugh, wink, shock (eyebrows raised) and anger (eyebrows furrowed); and cognitive actions such as push, pull, lift, drop and rotate (on six different axes). Gamers are able to move objects in the world just by thinking of the action.
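How a game consumes such detections is not detailed in the article, but a plausible pattern is to act on a cognitive detection only when its confidence is high enough. The Python sketch below uses a made-up Detection record, threshold and command mapping; it does not reflect Emotiv's actual SDK.

```python
# Illustrative sketch, not Emotiv's API: forward a cognitive detection such as
# "push" or "pull" to the game only when the classifier is confident enough.

from dataclasses import dataclass

@dataclass
class Detection:
    kind: str          # "expression", "emotion" or "cognitive"
    name: str          # e.g. "smile", "frustration", "push"
    confidence: float  # 0.0 .. 1.0 as reported by the (hypothetical) classifier

def game_action(detection, threshold=0.7):
    """Map a sufficiently confident cognitive detection to a game command;
    the mapping and threshold are invented for illustration."""
    if detection.kind != "cognitive" or detection.confidence < threshold:
        return None
    mapping = {
        "push": "MOVE_OBJECT_AWAY",
        "pull": "MOVE_OBJECT_CLOSER",
        "lift": "RAISE_OBJECT",
        "drop": "LOWER_OBJECT",
    }
    return mapping.get(detection.name)

print(game_action(Detection("cognitive", "push", 0.85)))  # -> "MOVE_OBJECT_AWAY"
```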

More information:

http://www.emotiv.com/

http://news.bbc.co.uk/1/hi/technology/7254078.stm

06 February 2010

AngloHigher® Article

A few days ago, a co-authored article titled ‘Press Play: An experiment in Creative Computing using a novel pedagogic approach’ was published online at AngloHigher®, the magazine of global English-speaking higher education. In 2012, the Faculty of Engineering and Computing at Coventry University will be moving to a purpose-built £60 million building. Ahead of this, the Faculty Management team have asked academics to begin thinking radically about the new forms of teaching and learning that might occur in symbiosis with a bespoke, technologically smart building. To improve student retention, engagement and satisfaction, the hope is that the pedagogic approaches deployed in the new facility will respond to the expanded role of technology in teaching and learning, the diversity of the student intake, and the realities of 21st-century student life.

This article describes one response to these issues: an experimental six-week teaching activity conducted by the Faculty's Creative Computing subject group, in collaboration with its 2009 first-year undergraduate intake. The 'six week challenge' met with positive feedback from both students and independent experts. While it generated challenges, these are more than compensated for by the increased, 'hands-on' engagement of students, the integration of different modules so that students can see how they complement each other, and the achievement of higher attainment in a shorter time. A shift in faculty thinking has given us an opportunity to develop a six-week block of 'activity-led learning' which integrates content from different modules. We have used this time to allow the students to learn by discovery, while combining skills that will become important in each of their modules.

The paper can be found online at:

http://www.anglohigher.com/magazines/viewpdf_mag/31/24