27 February 2007

Windows Mobile 6.0

A few days ago (12 February 2007), at the 3GSM trade show in Barcelona, Spain, Microsoft unveiled Windows Mobile® 6, formerly code-named Crossbow, the long-awaited successor to Windows Mobile 5. Windows Mobile 6 improves usability, adds support for Microsoft® Office features previously available only on PCs, and brings a familiar, rich experience to the small screen that meets the needs of work and life on the go, all with a single device. In brief, some of the most significant new features include the following:
  • HTML support in email
  • Windows Live for Windows Mobile
  • File transfer capability in Windows Live Messenger
  • New versions of mobile Outlook, Word, Excel, and PowerPoint with rich editing
  • Remote wipe capability for stolen and lost devices
  • Call history in contact cards
  • Tight Vista integration
  • "Calendar ribbon" for more easily viewing schedule by day or week
  • New versions of .NET Compact Framework and SQL Server built-in
The first devices using the software aren't expected until spring, however, with the bulk of products using the new operating system likely to come in the second half of the year. In Europe, Orange plans to deliver the SPV E650 smart phone from HTC, and in Japan, SoftBank Mobile Corp. will offer new devices from Toshiba and HTC. In the United States, the popular T-Mobile Dash will be updated with Windows Mobile 6 and be available in the coming months; current T-Mobile Dash owners will also be able to upgrade existing Windows Mobile 5.0 devices to Windows Mobile 6.

Scores of additional mobile operators and device makers from around the globe, including Cingular Wireless (now the new AT&T), Chunghwa Telecom, Dopod International Corp., HP, LG Electronics, Motorola Inc., Palm Inc., Samsung, SingTel, Sprint, Telefónica, Toshiba, Verizon Wireless, Vodafone and Willcom, plan to ship Windows Mobile 6-based devices this year. Many of these partners are expanding large existing portfolios of Windows Mobile-powered smartphones.


21 February 2007

MRGIS

Mixed Reality Geographical Information System (MRGIS) started at City University in January 2005 with the aim of presenting geographical information in a new and educational way. MRGIS is a tangible mixed reality interface designed for Windows-based operating systems that allows users to superimpose three-dimensional geographical information onto the real environment and interact with it in a user-friendly manner. A screenshot illustrating MRGIS augmenting a three-dimensional map of the City University campus is shown below.

MRGIS provides the technology needed to visualise 3D geographical information in real time on computers such as desktops, laptops and tablets. Currently, the multimedia augmentation of virtual information consists of 3D and 2D maps, spatial sound and textual directions. Users can perceive this augmentation either on monitor-based displays (e.g. CRT, LCD or touchscreen) or on head-mounted displays (HMDs). Users can interact in a number of different ways, including tangible, hardware-based and software-based solutions. Tangible interactions are performed using physical marker cards. Hardware interactions are realised through I/O devices such as the keyboard and mouse, or more sophisticated devices such as magnetic and GPS sensors. Software-based interactions are performed using the functionality implemented in the user-friendly graphical user interface (GUI).

06 February 2007

JVRB Article

As part of the LOCUS project, the Journal of Virtual Reality and Broadcasting (JVRB) last month published an article I wrote with colleagues from City University, titled ‘Exploring Urban Environments Using Virtual and Augmented Reality’. The paper proposes the use of a specific mobile architecture for navigation in urban environments. The aim of this work is to assess how virtual and augmented reality interface paradigms can provide enhanced location-based services using real-time techniques in the context of these two different technologies.

The virtual reality interface is based on a faithful graphical representation of the localities of interest, coupled with sensory information on the location and orientation of the user, while the augmented reality interface uses computer vision techniques to capture patterns from the real environment and overlay additional way-finding information, aligned with real imagery, in real time. The knowledge obtained from evaluating the virtual reality navigational experience was used to inform the design of the augmented reality interface. Finally, some initial results of user testing of the experimental navigation system are presented.

The original article may be accessed online from here.