Showing posts with label LBS. Show all posts

09 July 2015

Centimeter Accurate GPS

Researchers in the Cockrell School of Engineering at The University of Texas at Austin have developed a centimeter-accurate GPS-based positioning system that could revolutionize geolocation on virtual reality headsets, cellphones and other technologies, making global positioning and orientation far more precise than what is currently available on a mobile device. The researchers' new system could allow unmanned aerial vehicles to deliver packages to a specific spot on a consumer's back porch, enable collision avoidance technologies on cars and allow virtual reality (VR) headsets to be used outdoors. The researchers' new centimeter-accurate GPS coupled with a smartphone camera could be used to quickly build a globally referenced 3-D map of one's surroundings that would greatly expand the radius of a VR game. Currently, VR does not use GPS, which limits its use to indoors and usually a two- to three-foot radius.


Centimeter-accurate positioning systems are already used in geology, surveying and mapping, but the survey-grade antennas these systems employ are too large and costly for use in mobile devices. The breakthrough by Todd Humphreys and his team is a powerful and sensitive software-defined GPS receiver that can extract centimeter accuracies from the inexpensive antennas found in mobile devices -- such precise measurements were not previously possible. The researchers anticipate that their software's ability to leverage low-cost antennas will reduce the overall cost of centimeter accuracy, making it economically feasible for mobile devices. The researchers have spent six years building a specialized receiver, called GRID, to extract so-called carrier-phase measurements from low-cost antennas. GRID currently operates outside the phone, but it will eventually run on the phone's internal processor.
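The centimetre claim comes down to the physics of the carrier wave: a receiver that measures the phase of the ~19 cm L1 carrier, rather than the much coarser ranging code, shrinks its measurement noise by orders of magnitude. A back-of-the-envelope illustration of why (generic carrier-phase arithmetic, not the GRID receiver's actual algorithm):

```python
# Why carrier-phase measurements enable centimetre-level ranging:
# the GPS L1 carrier has a ~19 cm wavelength, and a good receiver can
# measure the fractional phase to roughly 1% of a cycle.
# (Illustrative arithmetic only -- not GRID's implementation.)

C = 299_792_458.0        # speed of light, m/s
L1_FREQ = 1575.42e6      # GPS L1 carrier frequency, Hz

wavelength = C / L1_FREQ  # ~0.19 m

def carrier_phase_range(n_cycles, phase_fraction):
    """Range = (integer cycles + fractional phase) * wavelength, in metres.

    n_cycles is the initially unknown integer ambiguity that carrier-phase
    GPS must resolve; phase_fraction is the measured phase in [0, 1).
    """
    return (n_cycles + phase_fraction) * wavelength

# A 1%-of-a-cycle phase error translates to only ~2 mm of range error:
range_error = 0.01 * wavelength
print(f"wavelength: {wavelength:.3f} m, phase-noise floor: {range_error * 1000:.1f} mm")
```

The hard part, which the fractional measurement alone does not solve, is resolving the integer ambiguity `n_cycles`; doing that robustly with a low-quality phone antenna is what makes the UT Austin result notable.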

21 October 2010

Lightweight Mobile AR Navigation

A lightweight pair of augmented reality glasses that overlays the world with digital content, such as directions or a travel guide, has debuted in Japan. The headset, created by Olympus and phone-maker NTT Docomo, uses augmented reality software on an attached phone. A virtual tour of Kyoto served as the first demonstration of the technology. While AR glasses are nothing new, these are among the first to add a miniature projecting display without causing too much encumbrance to the wearer. Researchers at the two companies said they had managed to whittle an earlier "AV Walker" prototype down from 91g to no more than 20g. The retinal display projects text and images directly into the user's peripheral vision, allowing the wearer to maintain eye contact with whatever they are observing normally.

As the glasses are attached to a smartphone with AR software, an acceleration sensor and a direction sensor, the AR Walker knows approximately what you are looking at and provides augmented information relevant to where you may be. The display can also be used to give directions with arrows, and if a person lifts their head up to the sky, a weather forecast is automatically projected into their peripheral vision. Augmented reality apps for smartphones such as Layar and Wikitude are already having some success as guides to our immediate surroundings. But as these usually involve holding up and pointing the mobile's camera in the direction you are looking, the AR Walker and its like have the added benefit of accessing information about your surroundings without altering your natural behaviour. According to the developers, a release date for the AR glasses has yet to be determined.

More information:

http://www.bbc.co.uk/news/technology-11494729

04 October 2010

Cars As Traffic Sensors

Data about road and traffic conditions can come from radio stations’ helicopters, the Department of Transportation’s roadside sensors, or even, these days, updates from ordinary people with cell phones. But all of these approaches have limitations: Helicopters are costly to deploy and can observe only so many roads at once, and it could take a while for the effects of congestion to spread far enough that a road sensor will detect them. MIT’s CarTel project is investigating how cars themselves could be used as ubiquitous, highly reliable mobile sensors. Members of the CarTel team recently presented a new algorithm that would optimize the dissemination of data through a network of cars with wireless connections.

Researchers at Ford are already testing the new algorithm for possible inclusion in future versions of Sync, the in-car communications and entertainment system developed by Ford and Microsoft. For the last four years, CarTel has been collecting data about the driving patterns of Boston-area taxicabs equipped with GPS receivers. On the basis of those data, the CarTel researchers have been developing algorithms for the collection and dissemination of information about the roadways. Once the algorithms have been evaluated and refined, the CarTel researchers plan to test them in an additional, real-world experiment involving networked vehicles. The new algorithm is among those that the group expects to test.
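The core of any car-to-car dissemination scheme is deciding what two vehicles should exchange when they come into wireless range. A minimal sketch of one such policy, freshest-report-per-road-segment (this is an illustrative toy, not the actual CarTel algorithm, whose prioritisation is more sophisticated):

```python
# Opportunistic data exchange between two cars that meet: each keeps
# only the freshest traffic report per road segment.
# (Illustrative sketch only -- not MIT CarTel's actual algorithm.)

def merge_reports(mine, theirs):
    """Merge two {segment_id: (timestamp, speed_mps)} tables,
    keeping the newer report for each road segment."""
    merged = dict(mine)
    for seg, (ts, speed) in theirs.items():
        if seg not in merged or ts > merged[seg][0]:
            merged[seg] = (ts, speed)
    return merged

# Hypothetical segment names and readings:
car_a = {"mass_ave": (100, 12.0), "memorial_dr": (90, 20.0)}
car_b = {"mass_ave": (105, 8.0),  "main_st":     (95, 15.0)}

shared = merge_reports(car_a, car_b)
# Car A now carries B's fresher mass_ave reading plus the main_st report.
```

In a real vehicular network the interesting questions are which reports to send first during a short radio contact and how to bound staleness, which is what the CarTel work optimises.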

More information:

http://web.mit.edu/newsoffice/2010/cars-sensors-0924.html

23 September 2010

VS-GAMES '11 Conference

The 3rd International Conference in Games and Virtual Worlds for Serious Applications 2011 (VS-GAMES 2011) will be held from 4 to 6 May 2011 at the National Technical University of Athens (NTUA) in Athens, Greece. The emergence of serious, or non-leisure, uses of games technologies and virtual worlds applications has been swift and dramatic over the last few years. VS-GAMES '11 aims to meet the significant challenges of the cross-disciplinary community that works around these serious application areas by bringing the community together to share case studies of practice, to present virtual world infrastructure developments as well as new frameworks, methodologies and theories, and to begin the process of developing shared cross-disciplinary outputs.

We are seeking contributions that advance the state of the art in the technologies available to support the sustainability of serious games. Topics in the areas of environment, military, cultural heritage, health, smart buildings, v-commerce and education are particularly encouraged. Invited speakers include Prof. Carol O'Sullivan, Head of the Graphics, Vision and Visualisation Group (GV2) at Trinity College Dublin, and Prof. Peter Comninos, Director of the National Centre for Computer Animation (NCCA) at Bournemouth University and MD of CGAL Software Limited. The best technical full papers will be published in a special issue of the International Journal of Interactive Worlds (IJIW). The best educational papers will be submitted to the IEEE Transactions on Learning Technologies. The paper submission deadline is 1 November 2010.

More information:

http://www.vs-games.org/

30 July 2010

A Smoother Street View

New street-level imaging software developed by Microsoft could help people find locations more quickly on the Web. The software could also create new space for online advertising. Services like Google Street View and Bing Streetside instantly teleport Web surfers to any street corner from Tucson to Tokyo. However, the panoramic photos these services offer provide only a limited perspective. You can't travel smoothly down a street.

Instead, you have to jump from one panoramic ‘bubble’ to the next--not the ideal way to identify a specific address or explore a new neighborhood. Microsoft researchers have come up with a refinement to Bing Streetside called Street Slide. It combines slices from multiple panoramas captured along a stretch of road into one continuous view. This can be viewed from a distance, or ‘smooth scrolled’ sideways.
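The basic trick behind combining panorama slices into one scrollable view can be sketched in a few lines: crop the centre columns of each panorama captured along the street and lay the crops side by side. A toy illustration with nested lists standing in for images (not Microsoft's Street Slide implementation, which also warps and blends the slices):

```python
# Toy sketch of the "slices from multiple panoramas -> one continuous
# strip" idea. Images are lists of rows; rows are lists of pixels.
# (Illustrative only -- not Microsoft's actual Street Slide code.)

def center_slice(image, slice_width):
    """Return the central `slice_width` columns of an image."""
    width = len(image[0])
    start = (width - slice_width) // 2
    return [row[start:start + slice_width] for row in image]

def build_strip(panoramas, slice_width):
    """Concatenate the centre slices of successive panoramas side by side."""
    strip = [[] for _ in panoramas[0]]          # one output row per input row
    for pano in panoramas:
        for y, row in enumerate(center_slice(pano, slice_width)):
            strip[y].extend(row)
    return strip

# Three tiny 1-row "panoramas", 6 pixels wide; take 2-pixel centre slices:
panos = [[[1, 2, 3, 4, 5, 6]], [[7, 8, 9, 10, 11, 12]], [[13, 14, 15, 16, 17, 18]]]
strip = build_strip(panos, 2)
# strip is [[3, 4, 9, 10, 15, 16]] -- one continuous sideways view
```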

More information:

http://www.technologyreview.com/web/25880/

17 April 2010

Augmented Reality City Visits

Using a combination of personalised location-based services and augmented reality, in which multimedia content is superimposed and merged with real-time images, a team of European researchers and city authorities has created a device to bring a little movie magic to city visits by tourists, cinema lovers, inquisitive local residents and film professionals. The device, which resembles a pair of binoculars with an integrated camera and LCD screen, was tested in San Sebastián, Spain, and Venice, Italy, and is continuing to be developed with a view to creating a commercial product. It uses a hybrid tracking system to provide location-based information, and cities’ wireless communications networks to download and share multimedia content. Though smart phones incorporating features such as location-awareness and augmented reality applications have come onto the market in the three years since the CINeSPACE project began, researchers note that none offer the same immersive experience provided by a dedicated platform and device. Unlike staring at the small screen of a smart phone, the CINeSPACE device is held up to the eye like a pair of binoculars, allowing users to see multimedia content superimposed on a city scene, be it a popular shopping street or an historical square.

Users are guided around a city by an intelligent sensor-fusion system incorporating GPS, WLAN tags, inertia cubes and marker-less optical tracking. Personalised location-aware services tell them where to go and where to stand for the best augmented reality experience. And maps and other multimedia content are provided via a 4.5-inch augmented reality touch panel on the binocular device, with user preferences taken into account when selecting points of interest and content. The project partners say the device could be rented out by local tourism offices. Content may consist of video, photos or audio recordings, stored on a central server of the municipality and downloaded as required, and can come from a variety of sources, including the users themselves. The CINeSPACE system was tested in San Sebastián and Venice last summer, with trial users rating highly the overall concept of the system and the quality of the augmented reality content. Further work, aimed at addressing user feedback regarding the device and interface, has since led to a third prototype being developed by German micro-electro-optical device manufacturer and project partner Trivisio, which is planning to commercialise it.
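Fusing a drifting inertial sensor with an absolute but noisy reference is the classic problem such a device faces. A complementary filter is the simplest standard answer, sketched below; this is a generic technique for illustration, not a description of CINeSPACE's actual fusion of GPS, WLAN tags, inertia cubes and optical tracking, which is far richer:

```python
# Generic complementary filter: trust the gyro for short-term heading
# changes and an absolute measurement (e.g. marker-less optical
# tracking) for long-term drift correction.
# (Illustrative only -- not the CINeSPACE project's implementation.)

def complementary_filter(heading, gyro_rate, optical_heading, dt, alpha=0.98):
    """One fusion step, headings in degrees, gyro_rate in deg/s."""
    predicted = heading + gyro_rate * dt          # dead-reckoned heading
    return alpha * predicted + (1 - alpha) * optical_heading

heading = 90.0                                    # initial estimate, degrees
for _ in range(50):                               # 50 steps at 20 Hz
    heading = complementary_filter(heading, gyro_rate=0.0,
                                   optical_heading=95.0, dt=0.05)
# With a stationary gyro, the estimate drifts smoothly toward the
# optical measurement (95 degrees) instead of jumping to it.
```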

More information:

http://cordis.europa.eu/ictresults/index.cfm?section=news&tpl=article&BrowsingType=Features&ID=91251

24 August 2009

Modified 3D HDTV LCD Screens

For the first time, a team of researchers at the California Institute for Telecommunications and Information Technology (Calit2) at the University of California, San Diego, has designed a 9-panel, 3D visualization display from HDTV LCD flat-screens developed by JVC. The technology, dubbed "NexCAVE," was inspired by Calit2's StarCAVE virtual reality environment and was designed and developed by Calit2 researchers. Although the StarCAVE's unique pentagon shape and 360-degree views make it possible for groups of scientists to venture into worlds as small as nanoparticles and as big as the cosmos, its expensive projection system requires constant maintenance — an obstacle Calit2 researchers DeFanti and Dawe were determined to overcome. They developed the NexCAVE technology at the behest of Saudi Arabia's King Abdullah University of Science and Technology (KAUST), which established a special partnership with UC San Diego last year to collaborate on world-class visualization and virtual-reality research and training activities. The KAUST campus includes a Geometric Modeling and Scientific Visualization Research Center featuring a 21-panel NexCAVE and several other new visualization displays developed at Calit2. Classes at the brand-new, state-of-the-art, 36-million square meter campus start Sept. 5. When paired with polarized stereoscopic glasses, the NexCAVE's modular, micropolarized panels and related software will make it possible for a broad range of UCSD and KAUST scientists — from geologists and oceanographers to archaeologists and astronomers — to visualize massive datasets in three dimensions, at unprecedented speeds and at a level of detail impossible to obtain on a myopic desktop display.

The NexCAVE's technology delivers a faithful, deep 3D experience with great color saturation, contrast and very good stereo separation. The JVC panels' Xpol technology circularly polarizes successive lines of the screen clockwise and anticlockwise, and the glasses deliver the clockwise or anticlockwise images to the appropriate eye. This way, the data appears in three dimensions. Since these HDTVs are very bright, 3D data in motion can be viewed even with the lights in the room on. The NexCAVE's data resolution is also superb, close to human visual acuity (20/20 vision). The 9-panel, 3-column prototype that the team developed for Calit2's VirtuLab has a 6000x1500-pixel resolution, while the 21-panel, 7-column version being built for KAUST boasts a 15,000x1500-pixel resolution. The NexCAVE is also considerably cheaper than the StarCAVE. The 9-panel version cost under $100,000 to construct, whereas the StarCAVE is valued at $1 million. One-third of that cost comes from the StarCAVE's projectors, which burn through $15,000 in bulbs per year. Every time a projector needs to be relamped, the research team must readjust the color balance and alignment, which is a long, involved process. Since the NexCAVE requires no projector, those costs and alignment issues are eliminated. The NexCAVE's tracker (the device used to manipulate data) is also far less expensive — it costs only $5,000 as compared to the StarCAVE's $75,000 tracker, although its range is more limited. NexCAVE's specially designed COVISE software (developed at Germany's University of Stuttgart) combines the latest developments from the world of real-time graphics and PC hardware to allow users to transcend the capabilities of the machine itself. The machine will also be connected via 10 gigabit/second networks, which allows researchers at KAUST to collaborate remotely with UCSD colleagues. The NexCAVE runs on gaming PCs with high-end Nvidia graphics hardware.

More information:

http://www.calit2.net/

http://www.kaust.edu.sa/

http://ucsdnews.ucsd.edu/newsrel/general/08-09NexCave.asp

29 December 2007

Workshop on Virtual Museums Article

Last month, I presented an article titled ‘A Mobile Framework for Tourist Guides’ at the Workshop on Virtual Museums, which was held in conjunction with the VAST 2007 conference. The article presents how the LOCUS multimodal mobile framework can be used for tourist guides covering any open-air heritage exhibition. The main objective of the multimodal heritage system is to provide advanced LBS to mobile users, delivered through a web-browser interface. The mobile system allows tourists to switch between three different presentation guides: map, virtual reality and augmented reality. Localisation of the visitors is established using position and orientation sensors integrated on lightweight handheld devices. To illustrate some of the capabilities of the mobile guide, two case studies were presented: one for the Swiss National Park and one developed for City University.

Using the City University mobile guide, pedestrians can navigate intuitively within the real environment using both position and orientation information on a mobile virtual environment. Additional functionality, such as dynamically switching the camera viewpoint from the pedestrian view to a bird's-eye view, can be accessed from the menu buttons. Another important aspect of the guide is that the digital compass can also be used as a virtual pointer to provide useful information about the surroundings, answering questions such as ‘what is the name of that building?’ or ‘how far is it from me?’. Routing tools have been developed to provide advanced navigational assistance to mobile users based upon the experience of previous users, and so may suggest different routes depending on when the journey is to be taken.
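The virtual-pointer idea can be sketched very simply: given the user's position and compass heading, pick the point of interest that lies closest to the pointing direction within some angular tolerance. All names and data below are hypothetical illustrations; the LOCUS implementation is not described at this level of detail in the post:

```python
# Sketch of a compass-as-virtual-pointer query on a local flat grid
# (adequate over campus scales). Hypothetical POIs and tolerance.
import math

def bearing(from_pt, to_pt):
    """Compass bearing in degrees (0 = north, 90 = east) between (x, y) points."""
    dx, dy = to_pt[0] - from_pt[0], to_pt[1] - from_pt[1]
    return math.degrees(math.atan2(dx, dy)) % 360

def pointed_at(user_pos, user_heading, pois, tolerance=15.0):
    """Return the POI name nearest the pointing direction, or None
    if nothing falls within the angular tolerance."""
    best, best_err = None, tolerance
    for name, pos in pois.items():
        # smallest signed angular difference, folded into [0, 180]
        err = abs((bearing(user_pos, pos) - user_heading + 180) % 360 - 180)
        if err <= best_err:
            best, best_err = name, err
    return best

pois = {"Library": (0, 100), "Great Hall": (100, 0)}
print(pointed_at((0, 0), 2.0, pois))   # "Library" (almost due north)
```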

A draft version of the paper can be downloaded from here.

22 September 2007

Aslib Proceedings Journal Article

Last month, Aslib Proceedings published a journal article I co-authored with a colleague at City University, titled ‘Mixed reality (MR) interfaces for mobile information systems’. The paper presents some of the results obtained from the LOCUS research project. Its purpose was to explore how mixed reality interfaces can be used for the presentation of information on mobile devices. The motivation for this work is the emergence of mobile information systems, where information is disseminated to mobile individuals via handheld devices. The LOCUS project is extending the functionality of the WebPark architecture to allow the presentation of spatially referenced information via these mixed reality interfaces on mobile devices.

In particular, the LOCUS system is built on top of the WebPark mobile client-server architecture which provides the basic functionality associated with LBS including the retrieval of information based upon spatial and semantic criteria, and the presentation of this information as a list or on a map (top images). However, the LOCUS system extends this LBS interface by adding a VR (bottom left image) and an AR interface (bottom right image). We strongly believe that the most suitable interface for mobile information systems is likely to be user and task dependent, however, mixed reality interfaces offer promise in allowing mobile users to make associations between spatially referenced information and the physical world.

The abstract of the paper can be found online at:

http://www.emeraldinsight.com/Insight/viewContentItem.do?contentType=Article&contentId=1626454

Also a draft version can be downloaded from here.

08 July 2007

Location Based Gaming (LBG)

The use of global positioning technology in gaming is giving developers a new area to explore with Location Based Gaming (LBG). These games use various technologies, including GPS, motion tracking, large-scale video projection and Bluetooth. The E911 directive and the rise in distribution of GPS-enabled handsets introduced the idea that location is the next big thing. Geocaching, a real-world treasure hunt, and Pac-Manhattan, a real-world version of the 1980s video game sensation Pac-Man, are examples of a 'little' variation on traditional gaming. Location Based Gaming is a means of playing a video game using technology like the Global Positioning System (GPS) that combines the player's real world with a virtual world on the handset. The physical location becomes part of the game board, allowing players to interact with their physical environment. Players move through the city with handheld or wearable interfaces. Sensors capture information about the players' current context, which the game uses to deliver an experience that changes according to their locations and actions. In collaborative games, this information is transmitted to other players, on the streets or online.

LBG might include tracking a phone as it moves through a city during a treasure hunt, changing the weather in the game to match the weather at the players' location, or monitoring players' direction, velocity and acceleration during a high-intensity “fight”. The location technology also enables bonus features like challenging players close to one's location for the ultimate fight, or seeing comparative scores by vicinity. The net result is a game that interleaves a player's everyday experience of the city with the extraordinary experience of a game. When Sony added GPS functionality to its flagship gaming console, the PSP, it raised the bar for designers. Nintendo, Xbox, Gizmondo and many others were quick to follow suit. Now gaming software developers are expected to come up with games that blur the edges between the virtual world and the real one. Of course, enabling global positioning technology will also materialize the idea of integrating standard navigation features and geo-tagging in gaming devices, but their survival will hugely depend on the ultimate gaming experience they are expected to deliver. Some of the most characteristic LBGs include ‘Wall Street Fighter’, ‘Can You See Me Now?’, ‘Swordfish’, ‘Torpedo Bay’ and ‘Tourality’.

Wall Street Fighter - The latest from YDreams, Wall Street Fighter, powered by KnowledgeWhere's Location Application Platform (LAP), is a location-based game where the world of business serves as the backdrop for some fun fighting antics. The objective of the game is to make it to the top of the business food chain by fighting everybody at the Bonds Office. The location-based features include scenarios that change with your real location, a multiplayer mode that allows the player to challenge players close to his/her location for the ultimate fight, and location-based rankings that show comparative scores by vicinity. The game was a finalist in the NAVTEQ LBS Challenge under the Entertainment & Leisure Applications category.

Can You See Me Now? - Performed by Blast Theory, a UK-based group of adventurous artists working with interactive media, Can You See Me Now? is an artistic performance in the form of a game in which online players are chased across a virtual city by three performers running through the actual city streets. The concept for CYSMN is a chase game, played online and on the streets. Blast Theory's players are dropped at random locations into a virtual map of a city. Tracked by satellites, professional runners appear online next to your player. The runners use handheld computers showing the positions of online players to guide them in the chase. Online players try to flee down the virtual streets, send messages and exchange tactics with other online players. If a runner gets within 5 metres of you, a sighting photo is taken and the game is over. Can You See Me Now? won the Golden Nica for Interactive Arts at the 2003 Prix Ars Electronica and was nominated for a BAFTA Award in 2002.

Swordfish and Torpedo Bay - Blister, a wholly owned subsidiary of Canadian firm Knowledge Where Corp., published a location-based game called Swordfish on the Bell Mobility network across Canada in July 2004, and later on Boost Mobile. To play Swordfish, a location-based fishing game, the player uses his/her mobile phone to find virtual fish and go fishing. Using GPS, Swordfish simulates a deep-sea fishing experience on a mobile phone, turning the player's real world into a virtual ocean. The player has to move around to play this game. Using the GPS technology in the mobile phone, the player's position is determined via a fish finder, so that the player can see where the nearest school of virtual fish is located in relation to his/her current position. The fish finder also provides navigational assistance, giving the direction of the closest school of fish and an optional localized street map of the current location overlaid with virtual schools of fish. Also by Blister, Torpedo Bay is a location-based naval battle game in which the player uses a mobile phone to shoot various aircraft carriers, destroyers and submarines. The game uses the Location Application Platform (LAP), which allows users from multiple carriers and multiple networks to interact within the same gaming environment. To tackle the problem of GPS and A-GPS signal fading, Torpedo Bay implements predictive positioning algorithms that improve the accuracy and availability of GPS fixes within problematic areas. Apart from that, the game uses real map data to assist in locating enemy ships, weapons and health.

Tourality - Currently available in Austria, Tourality is a mobile game that combines sporty outdoor activity with a virtual gaming experience. The challenge before the player is to reach geographically defined spots in reality as fast as possible. The player's movement directly influences the gaming progress. To play Tourality, the player requires a Java-capable mobile phone, a Bluetooth GPS receiver and an internet connection (GPRS/UMTS) from the mobile network operator. The player, equipped with the phone and GPS receiver, has to reach spots before his/her opponents. A spot is a certain point on a virtual map that the player has to reach in reality. The player's real position is transmitted from the Bluetooth GPS receiver to the mobile phone and shown on the display. Tourality shows the positions of all participating players, as well as the spots to reach, on the player's mobile phone, so the player always knows which spots remain and where they are located.
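Several of these mechanics -- challenging players in one's vicinity, a runner "capturing" an online player within 5 metres, reaching a Tourality spot -- reduce to the same primitive: a geodesic distance test between two GPS fixes. A minimal sketch using the haversine formula (a generic illustration; none of these games' actual code is described in the post):

```python
# Generic proximity test for location-based games: great-circle
# distance between GPS fixes, then a radius check.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    R = 6_371_000.0                      # mean Earth radius, m
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def nearby_players(me, players, radius_m=500.0):
    """Names of players within radius_m of my (lat, lon) position."""
    return [name for name, (lat, lon) in players.items()
            if haversine_m(me[0], me[1], lat, lon) <= radius_m]

# Hypothetical players around central London:
players = {"alice": (51.5007, -0.1246), "bob": (51.5079, -0.0877)}
print(nearby_players((51.5014, -0.1419), players, radius_m=2000))   # ["alice"]
```

The same `haversine_m` check with `radius_m=5` would implement the Can You See Me Now? capture rule, GPS accuracy permitting.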

24 January 2007

Mobile Map Interface

Pedestrian navigation and wayfinding for mobile devices is an area of continuous research, with increasing interest not only from the academic community but also beyond it. In this blog, different approaches have been presented in the past (e.g. Virtual Navigator, the MD3DM VR Interface), but these were mainly concerned with virtual reality technology. In parallel to these approaches, a mobile map interface has been developed. The interface is based on 2D visualisation technologies and can display different versions of 2D maps (raster, vector and aerial) as well as textual information. In addition, it can serve as a basic location-based system (LBS) by providing simple search capabilities (e.g. where is the nearest post office?). The main functionality of the interface falls into three categories: map visualisation, map navigation and finding local information, as illustrated below.

The ‘map visualisation’ component provides a simple but effective way of visualising digital maps on mobile devices. To provide a multi-level visualisation framework similar to those found in GIS software, four different types of maps can be displayed. Next, the ‘map navigation’ component allows users to interact with the 2D digital maps by zooming, rotating or moving in eight directions inside the map using the controls embedded in the interface. Hotspots can be used to enlarge the map to cover the whole mobile display screen. Hotspots can also be used as hyperlinks, linking the map with web pages that either contain relevant information about the location (e.g. City University's website) or other types of digital maps, such as Google Maps. The ‘find local information’ component allows users to search for geographical information such as street names and services (e.g. lists of restaurants, bars, etc.) and display information about them on the map.
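The mechanics of the map-navigation component -- eight-direction panning, zooming, and hotspots that act as hyperlinks -- can be sketched as a small viewport class. The structure below is hypothetical (the post does not describe the interface's internals), but it shows the moving parts:

```python
# Sketch of a 2D map viewport: eight-direction pan, zoom, and
# hotspot hit-testing that could trigger a hyperlink.
# (Hypothetical structure -- not the actual interface's code.)

class MapViewport:
    DIRECTIONS = {"N": (0, -1), "S": (0, 1), "E": (1, 0), "W": (-1, 0),
                  "NE": (1, -1), "NW": (-1, -1), "SE": (1, 1), "SW": (-1, 1)}

    def __init__(self, cx=0.0, cy=0.0, zoom=1.0):
        self.cx, self.cy, self.zoom = cx, cy, zoom
        self.hotspots = []                     # (x, y, radius, url)

    def pan(self, direction, step=10.0):
        dx, dy = self.DIRECTIONS[direction]
        self.cx += dx * step / self.zoom       # pan less map when zoomed in
        self.cy += dy * step / self.zoom

    def zoom_in(self, factor=2.0):
        self.zoom *= factor

    def add_hotspot(self, x, y, radius, url):
        self.hotspots.append((x, y, radius, url))

    def hit_test(self, x, y):
        """Return the URL of the first hotspot containing map point (x, y)."""
        for hx, hy, r, url in self.hotspots:
            if (x - hx) ** 2 + (y - hy) ** 2 <= r ** 2:
                return url
        return None

view = MapViewport()
view.add_hotspot(50, 50, 5, "http://www.city.ac.uk/")
view.pan("NE")
print(view.hit_test(52, 48))   # "http://www.city.ac.uk/"
```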

11 November 2006

Location-based services for all

The LBS4all (Location-based services for all) project focuses on providing location-based services for people with mobility problems. In particular, it aims to provide navigational help for three types of users: older adults, visually impaired people and blind people. LBS4all is designed to exploit new mobile computing, positioning and communication technologies in order to aid navigation around urban environments.

The above screenshot illustrates how a user can navigate using the LBS4all software installed on a Windows Mobile 5.0 PDA. A digital compass (inside the rectangular box) is also used to provide orientation information to the user by updating the position on the digital map.

More information:

http://lbs4all.soi.city.ac.uk/