27 September 2008

Less Virtual Games

While most massively multiplayer online games (MMOs) are based on fantasy worlds, there is a growing trend for a new kind of game that merges the real world with the virtual. Rather than taking on the persona of a mythical character who goes on quests, players of this new breed of game compete against one another in real sports, based in the real world. At first glance, these games resemble racing simulations, but with unparalleled realism and the ability to race against a large number of people, including professionals, they represent a cut above the rest. iRacing is an internet-based auto racing simulation system in which drivers can race against dozens of other online participants on race tracks modelled on the real thing. But the makers of iRacing are keen to stress that it's more than just a game. iRacing uses laser-scanning technology to accurately replicate real racetracks, while vehicle-handling dynamics are reproduced using a physics engine and tire model so that each car feels different to drive.
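
iRacing's own tire model is proprietary, but the classic Pacejka "magic formula" gives a feel for what such a model computes: lateral force as a nonlinear function of slip angle, with per-car coefficients. Here is a minimal sketch in Python; the coefficient values are illustrative defaults, not real car data.

```python
import math

def pacejka_lateral_force(slip_rad, B=10.0, C=1.9, D=4500.0, E=0.97):
    """Pacejka 'magic formula': lateral tire force (N) vs. slip angle (rad).

    B, C, D, E are the stiffness, shape, peak and curvature factors;
    the numbers here are illustrative, not taken from any real car.
    """
    Bx = B * slip_rad
    return D * math.sin(C * math.atan(Bx - E * (Bx - math.atan(Bx))))

# Different coefficient sets are one way a simulator can make each
# car "feel" different to drive:
for deg in (1, 3, 6, 10):
    print(deg, "deg ->", round(pacejka_lateral_force(math.radians(deg))), "N")
```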

Sky Challenge takes this link to reality a step further, allowing players to race against real jets. High-performance aircraft race through a virtual, computer-generated obstacle course in the sky. The course is stored in onboard computers, and the pilots flying the planes see the series of animated objects through which they must fly on a small screen display. The course is also dynamic: it can lengthen or shorten to penalise or reward competitors, so that if a pilot hits a virtual object, the course for that pilot gets longer. While iRacing opened to the public this summer, Sky Challenge is yet to become available to internet participants; a test event is due to be held next week, on 2 October 2008, over the beaches of Barcelona. In iRacing, drivers are grouped according to skill level so that races are evenly matched. In Sky Challenge, internet participants start by practising alone; once they have learnt the course, they race against other online players, finally earning the chance to take on the real pilots in a real-time race.
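
The article does not describe Sky Challenge's software, but the dynamic-course rule it mentions (hit a virtual gate, fly a longer course) is easy to sketch. The class and numbers below are hypothetical illustrations, not the real system.

```python
BASE_COURSE_M = 8000     # hypothetical course length, in metres
PENALTY_EXTRA_M = 400    # hypothetical extra distance per object hit

class PilotCourse:
    """Per-pilot course that lengthens when virtual objects are hit."""

    def __init__(self, pilot):
        self.pilot = pilot
        self.hits = 0

    def register_hit(self):
        self.hits += 1

    def length_m(self):
        # Each collision with a virtual object adds a fixed detour,
        # mirroring the penalty rule described in the article.
        return BASE_COURSE_M + self.hits * PENALTY_EXTRA_M

course = PilotCourse("pilot-1")
course.register_hit()
print(course.length_m())  # 8400
```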

More information:

http://news.bbc.co.uk/1/hi/technology/7633110.stm

25 September 2008

Virtual RFID-enabled Hospital

Students at the University of Arkansas, and at neighboring high schools, are employing cutting-edge technology to help the health-care industry learn just how RFID can make a difference in the operations of a company or organization. The researchers hope the technology will provide a modeling and simulation environment that lets organizations test RFID implementations, down to such details as the number of RFID readers and tags and where to put them, prior to physical deployment. They have digitally created a hospital in Second Life, a three-dimensional virtual world developed and owned by Linden Lab that millions of people visit to work and play online. The project is connected with the University of Arkansas' Center for Innovation in Healthcare Logistics, in its College of Engineering, as well as with the RFID Research Center, part of the Information Technology Research Institute at the Sam M. Walton College of Business. The Center for Innovation in Healthcare Logistics, which opened in 2007, includes an interdisciplinary team of researchers who investigate supply chain networks and information and logistics systems within the broad spectrum of U.S. health care. Since 2005, the RFID Research Center has conducted studies on the use of radio frequency identification in retail.

The virtual world allows hospitals to model their environments in great detail. On the University of Arkansas' Second Life Island, the students have created a virtual hospital containing operating suites, patient rooms, laboratories, a pharmacy, waiting rooms, stock rooms and bathrooms. The virtual facility also includes furnishings, such as working toilets, sinks, showers, chairs and beds, along with various diagnostic and medical equipment, including electrocardiogram machines, respiratory rate monitors and portable X-ray machines. The avatars (in this case, doctors, nurses, staff members and patients) and various assets are tagged with virtual RFID tags, each with its own unique number, and virtual RFID interrogators are positioned in doorways and at various other places throughout the hospital. Using the tags and readers, the researchers have modeled a variety of business processes. For instance, one process simulates the delivery of equipment and goods to the hospital: a delivery truck drives up to a warehouse, where RFID-tagged items have been placed on a smart pallet (which has scanned the items' RFID tags to create a bill of lading). An avatar loads the smart pallet onto the delivery truck, then drives it to the hospital. Once the truck backs up to a dock, an RFID-enabled robot picks up the pallet, scans the items' tags and transports the goods to the appropriate locations within the hospital.
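
As a toy illustration of the kind of process being modeled (not the team's actual Second Life scripts), here is a minimal event log of tagged assets passing virtual doorway readers; the tag IDs, asset names and locations are made up.

```python
from datetime import datetime

# Hypothetical tag IDs and assets, for illustration only.
TAGS = {"EPC-0001": "portable X-ray", "EPC-0002": "ECG machine"}

class DoorwayReader:
    """A virtual RFID interrogator fixed in a doorway."""

    def __init__(self, location):
        self.location = location
        self.reads = []

    def scan(self, tag_id):
        # Record which tagged asset passed through, and when, so the
        # simulation can reconstruct asset movements through the hospital.
        event = (datetime.now(), tag_id, TAGS.get(tag_id, "unknown"), self.location)
        self.reads.append(event)
        return event

dock = DoorwayReader("loading dock")
or_door = DoorwayReader("operating suite")
dock.scan("EPC-0001")      # pallet unloaded at the dock
or_door.scan("EPC-0001")   # X-ray machine arrives at the operating suite
for read in dock.reads + or_door.reads:
    print(read)
```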

More information:

http://www.rfidjournal.com/article/articleview/4326/1/1/definitions_off

http://vw.ddns.uark.edu/index.php?page=media

23 September 2008

Google's Android Mobile Unveiled

The first mobile telephone using Google's Android software has been unveiled. The T-Mobile G1 handset will be available in the UK in time for Christmas. The first device to run the search giant's operating system features a touch screen as well as a Qwerty keyboard. It will be available for free on T-Mobile tariffs of over £40 a month and includes unlimited net browsing. Other features include a three-megapixel camera, a 'one click' contextual search and a browser that lets users zoom in by tapping the screen. The handset is wi-fi and 3G enabled and has built-in support for YouTube. Users will also have access to the so-called Android Market, where they will be able to download a variety of applications. Google announced its plans for the Android phone software in November 2007, with the declared aim of making it easier to get at the web while on the move.

The idea behind Android is to do for phone software what the open-source Linux software has done for PCs. Developers of phone software can get at most of the core elements of the Android software to help them write better applications. However, in launching Android, Google faces stiff competition from established players such as Nokia, with its Symbian software, and Microsoft, with its Windows Mobile operating system. More recently, Apple has been gaining customers with its much-hyped iPhone. The Android software is squarely aimed at the smartphone segment of the handset market, which adds sophisticated functions to the basic calling and texting capabilities of most phones. Current estimates suggest that only 12-13% of all handsets can be considered smartphones.

More information:

http://news.bbc.co.uk/1/hi/technology/7630888.stm

21 September 2008

From Xbox To T-cells

A team of researchers at Michigan Technological University is harnessing the computing muscle behind leading video games to understand the most intricate of real-life systems. The group has supercharged agent-based modeling, a powerful but computationally massive forecasting technique, by using graphics processing units (GPUs), which drive the spectacular imagery beloved of video gamers. In particular, the team aims to model complex biological systems, such as the human immune response to a tuberculosis bacterium. In one demonstration, a swarm of bright green immune cells surrounds and contains a yellow TB germ. These busy specks look like 3D animations from a PBS documentary, but they are actually virtual T-cells and macrophages, the visual reflection of millions of real-time calculations.

Researchers from the University of Michigan in Ann Arbor developed the TB model and gave it to the Michigan Tech team, which programmed it to run on a graphics processing unit. Agent-based modeling hasn't replaced test tubes, but it is providing a powerful new tool for medical research, and computer models offer significant advantages. It is possible to breed a mouse that's missing a gene and see how important that gene is, but with agent-based modeling researchers can knock out two or three genes at once. In particular, agent-based modeling allows researchers to do something other methodologies can't: virtually test the human response to serious insults, such as injury and infection. While agent-based modeling may never replace the laboratory entirely, it could reduce the number of dead-end experiments.
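
The Michigan Tech simulation itself is not reproduced here; as a purely illustrative sketch of the agent-based idea, the toy model below steps T-cell agents toward a TB germ and lets a mechanism be "knocked out" by setting a parameter to zero, loosely mirroring the gene-knockout experiments described above. On a GPU each agent's update would run in parallel; this NumPy version vectorises the same per-agent step.

```python
import numpy as np

rng = np.random.default_rng(0)
N_TCELLS = 1000
germ = np.array([0.0, 0.0])                        # TB germ at the origin
cells = rng.uniform(-50, 50, size=(N_TCELLS, 2))   # T-cell positions

def step(cells, chemotaxis=0.1, noise=0.5):
    """One simulation tick: every T-cell agent drifts toward the germ.

    Setting chemotaxis=0 'knocks out' the attraction mechanism,
    a crude in-silico analogue of deleting a gene.
    """
    to_germ = germ - cells
    dist = np.linalg.norm(to_germ, axis=1, keepdims=True) + 1e-9
    drift = chemotaxis * to_germ / dist
    return cells + drift + rng.normal(0, noise, size=cells.shape)

for _ in range(500):
    cells = step(cells)
near = np.linalg.norm(cells - germ, axis=1) < 5
print(f"{near.sum()} of {N_TCELLS} T-cells now surround the germ")
```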

More information:

http://www.sciencedaily.com/releases/2008/09/080916155058.htm

16 September 2008

Watch And Learn

In work that could aid efforts to develop more brain-like computer vision systems, MIT neuroscientists have tricked the visual brain into confusing one object with another, thereby demonstrating that time teaches us how to recognize objects. It may sound strange, but human eyes never see the same image twice. An object such as a cat can produce innumerable impressions on the retina, depending on the direction of gaze, angle of view, distance and so forth. Every time our eyes move, the pattern of neural activity changes, yet our perception of the cat remains stable. This stability, called 'invariance,' is fundamental to our ability to recognize objects; it feels effortless, but it is a central challenge for computational neuroscience. A possible explanation is suggested by the fact that our eyes tend to move rapidly (about three times per second), whereas physical objects usually change more slowly. Differing patterns of activity in rapid succession therefore often reflect different images of the same object. Could the brain take advantage of this simple rule of thumb to learn object invariance?

To test this 'temporal contiguity' idea, the researchers let monkeys watch an altered visual world in which two objects swapped identity whenever they appeared at one particular place in the visual field, while recording from neurons in the inferior temporal (IT) cortex, a high-level visual brain area where object invariance is thought to arise. IT neurons "prefer" certain objects and respond to them regardless of where they appear within the visual field. After the monkeys spent time in this altered world, their IT neurons became confused, just as human subjects had in an earlier version of the experiment. A neuron that preferred sailboats, for example, still preferred sailboats at all locations except the swap location, where it learned to prefer teacups. The longer the manipulation, the greater the confusion, exactly as the temporal contiguity hypothesis predicts. Importantly, just as human infants can learn to see without adult supervision, the monkeys received no feedback from the researchers; the changes in their brains occurred spontaneously as they looked freely around the computer screen. The team is now testing the idea further using computer vision systems viewing real-world videos. This work was funded by the NIH, the McKnight Endowment Fund for Neuroscience and a gift from Marjorie and Gerald Burnett.
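
A toy model can make the temporal contiguity rule concrete. The sketch below is an illustrative construction, not the authors' model: a "sailboat neuron" has one response weight per object at each location, and each simulated glance nudges the peripheral response toward the response to whatever the object turned into at the fovea. At the swap location, the sailboat preference migrates to teacups.

```python
import random

OBJECTS = ("sailboat", "teacup")
LOCATIONS = ("fovea", "normal", "swap")
LR = 0.02  # learning rate, illustrative

# Response of a toy "sailboat neuron": one weight per (location, object).
w = {loc: {"sailboat": 1.0, "teacup": 0.0} for loc in LOCATIONS}

random.seed(0)
for _ in range(2000):
    obj = random.choice(OBJECTS)
    loc = random.choice(("normal", "swap"))
    # The eye saccades from the peripheral object to the fovea; in the
    # altered world, object identity is swapped at the swap location.
    foveal_obj = (obj if loc == "normal"
                  else ("teacup" if obj == "sailboat" else "sailboat"))
    # Temporal contiguity rule: the response to the peripheral image
    # drifts toward the response to what it became a moment later.
    w[loc][obj] += LR * (w["fovea"][foveal_obj] - w[loc][obj])

print("normal location:", w["normal"])   # still prefers sailboats
print("swap location:  ", w["swap"])     # now prefers teacups
```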

More information:

http://www.sciencedaily.com/releases/2008/09/080911150046.htm

09 September 2008

High-Resolution GeoEye-1 Satellite

GeoEye-1, the super-sharp Earth-imaging satellite, was launched into orbit on 6 September from Vandenberg Air Force Base on the central California coast. A Delta 2 rocket carrying the satellite lifted off at 11:50 a.m., and video on the GeoEye Web site showed the satellite separating from the rocket moments later on its way to an eventual polar orbit. Its makers say GeoEye-1 has the highest resolution of any commercial imaging system: it can collect images from orbit with enough detail to show home plate on a baseball diamond. The company says the satellite's imaging services will be sold for uses ranging from environmental mapping to agriculture and defence. GeoEye-1 was lifted into a near-polar orbit by a 12-story-tall United Launch Alliance Delta II 7420-10 configuration launch vehicle; the launch vehicle and associated support services were procured by Boeing Launch Services. The company expects to offer imagery and products to customers in mid- to late October. GeoEye-1 was designed and built by General Dynamics Advanced Information Systems.

Designed to take color images of the Earth from 423 miles (681 kilometers) up while moving at about four and a half miles (seven kilometers) per second, the satellite will make 15 Earth orbits per day and collect imagery with its ITT-built imaging system, which can distinguish objects on the Earth's surface as small as 0.41 meters (16 inches) in the panchromatic (black-and-white) mode. The 4,300-pound satellite will also be able to collect multispectral (color) imagery at 1.65-meter ground resolution. While the satellite can collect imagery at 0.41 meters, GeoEye's operating license from NOAA requires re-sampling the imagery to half-meter resolution for all customers not explicitly granted a waiver by the U.S. Government. The satellite will not only be able to see an object the size of home plate on a baseball diamond, but also map the location of an object that size to within about nine feet (three meters) of its true location on the surface of the Earth without the need for ground control points. Together, GeoEye's IKONOS and GeoEye-1 satellites can collect almost one million square kilometers of imagery per day.
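
Those orbit numbers are easy to sanity-check with Kepler's third law. The sketch below assumes a circular orbit at the quoted 681 km altitude and standard values for Earth's radius and gravitational parameter; it reproduces the roughly 15 orbits per day and roughly seven kilometers per second quoted above.

```python
import math

MU_EARTH = 398_600.4418   # km^3/s^2, Earth's gravitational parameter
R_EARTH = 6_371.0         # km, mean Earth radius
ALTITUDE = 681.0          # km, GeoEye-1's quoted altitude

a = R_EARTH + ALTITUDE                             # circular orbit radius
period_s = 2 * math.pi * math.sqrt(a**3 / MU_EARTH)
speed_km_s = math.sqrt(MU_EARTH / a)
orbits_per_day = 86_400 / period_s

print(f"orbital period: {period_s / 60:.1f} min")  # ~98 min
print(f"orbital speed:  {speed_km_s:.2f} km/s")    # ~7.5 km/s
print(f"orbits per day: {orbits_per_day:.1f}")     # ~14.7, i.e. about 15
```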

More information:

http://www.gisdevelopment.net/news/viewn.asp?id=GIS:N_nvjzcqpdlg&Ezine=sept0808&section=News

http://geoeye.mediaroom.com/

02 September 2008

LIDAR Bringing High-Res 3D Data

To make accurate forecasts, meteorologists need data on the vertical distribution of temperature and humidity in the atmosphere. The LIDAR system developed by EPFL can collect these data continuously and automatically up to an altitude of 10 km. On August 26, EPFL officially transferred this custom-developed LIDAR to MeteoSwiss, and from this point on Swiss forecasters will have access to this source of vertical humidity data for the models they use to calculate weather predictions. The project was supported by funding from the Swiss National Science Foundation. The LIDAR system is a relative of the familiar RADAR systems used widely in weather forecasting. Instead of sending radio waves out looking for water droplets, however, the LIDAR sends a beam of light vertically into the sky. The 'echo' here is a reflection of that light from different layers in the atmosphere, and it is used to build an instantaneous vertical profile of temperature and humidity.
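
The core geometry of any LIDAR return is the time-of-flight relation range = c * t / 2, since the light travels out and back. The sketch below (with illustrative echo delays, not EPFL data) converts round-trip delays into the altitudes of the reflecting layers.

```python
C_KM_S = 299_792.458  # speed of light, km/s

def echo_altitude_km(delay_s):
    """Altitude of a reflecting layer from the round-trip echo delay.

    The pulse travels up to the layer and back down, so the one-way
    range is c * t / 2.
    """
    return C_KM_S * delay_s / 2

# Illustrative echo delays (seconds) from three atmospheric layers:
for delay in (6.7e-6, 33e-6, 66.7e-6):
    print(f"delay {delay * 1e6:5.1f} us -> layer at {echo_altitude_km(delay):.1f} km")
```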

Traditional LIDAR systems are more finicky, typically needing to be tuned on a daily basis. The new LIDAR will operate at the Center for Technical Measurements at MeteoSwiss' Payerne weather service. It will provide an ideal complement to the traditional instrumentation already in place: a ground-based measurement network, balloon-launched radio-soundings, radar equipment, remotely sensed wind-speed and temperature measurements, and a station of the Baseline Surface Radiation Network, part of a world-wide network that measures radiation changes at the Earth's surface. The combination of all these measurements will open up new possibilities, and weather forecasting models stand to benefit. The acquisition of the LIDAR brings high-resolution, three-dimensional humidity data to Swiss weather forecasting for the first time.

More information:

http://azooptics.com/Details.asp?newsID=2990

01 September 2008

Serious Virtual Worlds Conference 08

Building on the success of the first Serious Virtual Worlds conference in 2007, this is your invitation to be part of the newly emerging professional community for the serious uses of virtual worlds. Serious Virtual Worlds '08 is the only event focussing on the serious uses of these environments.

SVW'08 will address the live issue of how virtual worlds will cross boundaries, both between the real world and virtual worlds and between one virtual world and another. As people spend increasing amounts of time in virtual worlds, how will they move between these virtual and real spaces? SVW'08 is the only international event that takes on these leading-edge issues, and it addresses them in a compact two-day event.

More information:

http://www.seriousvirtualworlds.net/