Immersive Games

The EPSRC visiting fellowship allows Professor Yangsheng Wang of the Institute of Automation, Chinese Academy of Sciences, to visit the University of Nottingham over a three-year period (1/6/2005-31/5/2008) to work with researchers in the School of Computer Science & IT on issues relating to Immersive Games - gaming that enables the player to convincingly enter and participate in a virtual world.

Some examples of currently available commercial controllers (vision-based or otherwise) that attempt to enhance a user's in-game experience by introducing a higher level of immersion can be seen on the commercial immersive controllers page.

Summer 2005 visit - FP6.

Meeting with EU advisor Martin Pickard, Professor John Wilson, etc. Seeking project partners: Nottingham, Bournemouth Media School, CNAM Paris, Nokia, Siemens, Romanian University, CAS, Shanda. Proposal submitted.

Summer 2006 visit - FP7.

Meeting with ASAP group, demos.

Meeting with MRL group, demos. Talked with Professor Steve Benford about FP7, and about taking MRL projects to China, e.g., virtual video conferencing and multiplayer games.

Meeting with Vision group, demos. Talked with Dr Bai about various research issues and the future.

Meeting with Dr Mirabelle D'Cruz and Dr Rupert Soar about FP7.

Phone conversations with Professor Jianjun Zhang of Bournemouth Media School regarding collaboration in immersive games.

Prior to the visit, talked with Professor Stephen Natkin of France about strengthening the previous proposal, e.g., partners, connectivity between partners, and research focus, and about mentioning the EPSRC fellowship.

19 September 2006 - EPSRC Public Communication Training Day; all three investigators participated.

12th October 2006 - the Blitz Games company visited us with a view to collaboration.

April 2007 visit - we visited the France and Geneva partners for the FP7 proposal.

April 2007 - Li Bai was offered a visiting fellowship by CAS.

April 2007 - Li Bai was successful in a new EPSRC grant application, which built on this visiting fellowship project. The title of the new grant is "Collaboration for Success in Rehabilitation, Games, and Robotics". The aim is to set up an international special interest group in advanced interfaces for applications. The project includes networking activities and also placement of Professor Wang's researchers in Nottingham.

July 2007 - the NIDE proposal passed the evaluation threshold.

November 2007 - the naturalinterfaces.eu website was set up.


Aims and Objectives

Methodology

Nottingham Immersive Technologies

Publications

Aims and Objectives

To investigate advanced interface methods for games

To investigate AI techniques for games

To explore collaborative research opportunities

Methodology

We are primarily concerned with the research issues in developing tools and advanced interfaces for computer games using an existing game engine. We aim to develop and demonstrate an online game prototype with intelligent game characters that can be played through various forms of interfaces, including gesturing and interacting with virtual objects. The visiting fellowship will have a long-term effect through the involvement of research students from all the participating research groups at Nottingham.

Research Theme One: Advanced Game Interfaces

Normal game interfaces provide graphical icons for the player to click on, menu systems for the player to navigate through, or game control systems for the player to steer or control the characters in the game. These not only inconvenience the player but also fail to 'immerse' the player in the game. In this visiting fellowship we aim to investigate advanced and intuitive game interfaces. The ultimate immersion is for the player to become part of the game and interact with game characters by intuitive means. We will investigate the use of hand tracking to control and interact with game characters, and augmented reality techniques to interact with virtual game objects. Interaction with virtual characters is of huge interest not only for computer games but also for possible future immersive platforms. The decreasing size and cost of virtual/augmented reality and visualisation hardware, coupled with the increasing portable processing power of laptop and handheld computers, has been a driving force for mobile AR systems, which have shown a marked increase in the last few years.

Research Theme Two: Game AI

The aim of this research theme is to investigate approaches that could be used to produce a convincingly intelligent opponent, particularly the use of reasoning capabilities based upon the knowledge and beliefs of the opponent. While there has been research into intelligent agents in games, there remains a need to incorporate strategic capabilities missing in contemporary video games, whilst taking into account the conflicting requirements of efficiency and realism. We will explore how our existing AI and games research (in particular, intelligent agents, artificial life and swarm intelligence, and search and optimisation) may be applied to games.

A desirable property for computer game agents is the ability to blend reactive behaviours with goal-directed behaviour, forming high-level plans whilst still being able to react quickly to immediate danger. One approach is to use an anytime planning agent, in which the planning process can be interrupted at any time and will immediately return a usable result. Adaptability is another important ability. If human players have a weakness in their tactics, they will adjust those tactics rather than letting an opponent repeatedly exploit the weakness. This involves analysing why their plans failed so as to avoid repeating the same mistake. For example, Norling proposes a method for this that combines the belief-desire-intention architecture with recognition-primed decision making. Because of the large number of factors which must be considered when a plan has failed, the search space is restricted by considering only a set of cues provided by the programmer.
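The anytime property described above can be sketched as a planner that keeps the best partial plan found so far and honours a time budget, so interrupting it still yields a usable result. A minimal illustration (the grid world, names, and budget are ours, not from the project):

```python
import time
from collections import deque

def anytime_plan(start, goal, neighbours, budget_s, heuristic):
    """Breadth-first planner that can be cut off at any moment.

    Expands states until the time budget runs out, always remembering
    the path whose end state scores best under `heuristic`, so an
    early interruption still returns a usable partial plan.
    """
    deadline = time.monotonic() + budget_s
    frontier = deque([[start]])
    seen = {start}
    best = [start]                          # best partial plan so far
    while frontier and time.monotonic() < deadline:
        path = frontier.popleft()
        if heuristic(path[-1]) < heuristic(best[-1]):
            best = path
        if path[-1] == goal:
            return path                     # complete plan found in time
        for nxt in neighbours(path[-1]):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return best                             # interrupted: partial plan

# Toy 4x4 grid world: walk from (0, 0) to (3, 3).
GOAL = (3, 3)

def neighbours(p):
    x, y = p
    steps = [(1, 0), (0, 1), (-1, 0), (0, -1)]
    return [(x + dx, y + dy) for dx, dy in steps
            if 0 <= x + dx <= 3 and 0 <= y + dy <= 3]

def manhattan(p):
    return abs(p[0] - GOAL[0]) + abs(p[1] - GOAL[1])

plan = anytime_plan((0, 0), GOAL, neighbours, 0.5, manhattan)
```

With a tiny `budget_s` the planner returns the path that got closest to the goal instead of failing outright, which is the property that lets a game agent act under a frame deadline.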

Research Theme Three: Collaborative research and funding opportunities

This fellowship will have a significant impact on several of our research areas. Though there are researchers at Nottingham working on various aspects of the work described above, we will need a dedicated researcher to work on each of the identified research areas. This means that an important part of this fellowship involves compiling research proposals seeking funding support from various UK and China funding sources. In addition to the research programme outlined above, we will use the opportunity provided by Professor Wang's visit to explore mutual research interests in other areas, such as face recognition. We intend that these initial investigations will form the basis of future research work and the foundation of deeper and sustained research collaboration between the two institutions.

Nottingham Immersive Technologies

Several projects within the Nottingham Vision Research group implement techniques which are directly applicable to immersive gaming technologies. Each project is a vision-based system which could be used to enhance a user's immersive experience within a gaming environment. Some of the projects (such as Martin Tosas' hand tracking system) would make ideal user interfaces for various immersive gaming worlds, whereas other projects (such as the showcased 3D technologies) aim to augment the gaming world with real-world objects and users to aid in the creation of a more immersive gaming environment.

Conference publications

1. Qibin Hou, Li Bai, Xiangsheng Huang, Yangsheng Wang, Mesh Smoothing via Adaptive Bilateral Filtering, 4th International Workshop on Computer Graphics and Geometric Modelling, CGGM'2005, Lecture Notes in Computer Science, Springer-Verlag, Atlanta, USA, May 2005.

2. Linlin Shen, Li Bai, Daniel Bardsley, Yangsheng Wang, Gabor Feature Selection for Face Recognition using Improved AdaBoost Learning, International Workshop on Biometric Recognition Systems (IWBRS2005), Beijing, China, October 2005.

3. Yi Song, Li Bai, Yangsheng Wang, 3D Object Modelling for Entertainment Applications, ACM SIGCHI International Conference on Advances in Computer Entertainment Technology (ACE2006), California, USA, 2006.

4. Shuchang Wang, Yangsheng Wang, Li Bai, Face Decorating System Based on Improved Active Shape Models, ACM SIGCHI International Conference on Advances in Computer Entertainment Technology (ACE2006), California, USA, 2006.

5. Xuetao Feng, Yangsheng Wang, Yong Gao, Li Bai, A Fast Eye Location Method Using Ordinal Features. ACM SIGCHI International Conference on Advances in Computer Entertainment Technology (ACE2006), California, USA, 2006.

6. International Conference on Digital Interactive Media Entertainment & Arts (DIME-ARTS 2006), Bangkok, Thailand, 25th-27th October 2006.

7. Li Bai, Yi Song, Yangsheng Wang, 3D Modelling for Metamorphosis for Animation, Edutainment 2008, the 3rd International Conference on e-Learning and Games, Nanjing, China, June 2008. Lecture Notes in Computer Science, Springer-Verlag.

8. Yi Song, Li Bai, 3D Modelling for Deformable Objects, AMDO 2008, International Conference on Articulated Motion and Deformable Objects, Andratx, Mallorca, Spain, July 2008. Lecture Notes in Computer Science, Springer-Verlag.

9. Jituo Li, Li Bai, Yangsheng Wang, Animating Unstructured 3D Hand Models, IVA08, International Conference on Intelligent Virtual Agents, Tokyo, Japan, September 2008.

Journal publications

1. Jituo Li, Yangsheng Wang, Xia Zhou, Li Bai, Personal Human Modeling from Images, Journal of Computer Aided Design & Computer Graphics, 2008 (in press).

2. Jituo Li, Yangsheng Wang, Li Bai, Rapidly Generating 3D Virtual Human Models from Images in Orthogonal Views, Journal of Computer-Aided Design, impact factor 1.446 (submitted 2008).

External examiner

Exchange of students/researchers: Jituo Li, Xiaolong Zheng

(DIME-ARTS 2006 programme committee)

Black & White Game using Visual Interaction
Hand Tracking
Demo Video 1 Demo Video 2

This work proposes the concept of a virtual touch screen as a means of HCI for the AR environment. Windows, icons, menus, buttons, or other types of GUI controls can be superimposed on the virtual touch screen. Users see these interfaces floating in front of them on an HMD, and can interact with the virtual interfaces directly by touching them. One or more cameras can be used to track the hand and interpret its motion so as to detect 'clicking' and button 'pressing' actions on the virtual screen.

Such HCI systems, used within a gaming environment, will allow the player to interact with game characters in a novel and more natural way than conventional keyboard/mouse or gamepad-based systems. Our system has more advanced features than commercial systems such as the EyeToy.
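Once the tracker reports a fingertip position on the virtual screen plane plus a contact flag, detecting a button 'press' reduces to a point-in-rectangle hit test. The following is our own sketch of that final step (the names and API shape are illustrative, not the project's implementation):

```python
from dataclasses import dataclass

@dataclass
class Button:
    """Axis-aligned rectangle on the virtual screen plane."""
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def detect_press(buttons, fingertip, touching):
    """Return the name of the virtual button being pressed, or None.

    `fingertip` is the tracked (x, y) hand position projected onto the
    virtual screen plane; `touching` is True when the tracker judges
    the fingertip has crossed that plane.
    """
    if not touching:
        return None
    for b in buttons:
        if b.contains(*fingertip):
            return b.name
    return None

# Two illustrative buttons on the virtual screen.
buttons = [Button("ok", 0, 0, 10, 10), Button("quit", 20, 0, 10, 10)]
pressed = detect_press(buttons, (5, 5), True)
```

The hard part, of course, is the hand tracking itself; this sketch only shows how cheap the interface logic becomes once tracking provides reliable plane-crossing events.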

3D Tracking
Demo Video (36,662kb)

Using the same techniques as in the 3D Scanning project it is possible to track the 3D position of a number of world points observed through independent cameras. With such a system, controlling a character in a 3D world would appear more natural, as real-world motion could be transferred directly to in-game movement. Furthermore, the system would be able to recognise a greater range of movement than devices such as the EyeToy, which functions purely in two dimensions. The upcoming Nintendo controller can also detect its own 3D position, suggesting that sensing a user's real-world position and movement, then converting these into the game environment, has a promising future as games attempt to become more immersive.
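Recovering a world point from two calibrated views, which underlies this kind of multi-camera tracking, is commonly done with linear (DLT) triangulation. A minimal sketch under assumed camera parameters (the intrinsics and camera placement below are illustrative, not the project's calibration):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one world point from two views.

    P1, P2: 3x4 camera projection matrices; x1, x2: the point's pixel
    coordinates (x, y) in each image. Returns the 3D point.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)     # null vector of A = homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]

# Illustrative calibration: identical intrinsics, second camera 1 m to the right.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

X_true = np.array([0.5, -0.2, 4.0])
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

With noise-free observations the estimate matches the true point; in practice the two image measurements are noisy and the SVD solution is a least-squares compromise, which is why wider camera baselines give steadier 3D tracks.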

3D Scanning
Demo Video (2,275kb)

The ability to construct accurate and believable 3D models for use within the gaming environment is essential. 3D reconstruction processes would allow game players to place themselves and their friends directly within the gaming world. Familiarity with the appearance of in-game characters would allow a greater degree of immersion than with generic character models.

The technology could also be adapted for expression recognition, which could potentially allow an immersive game to react to the facial expressions of the player.

3D Morphing
Demo Video 1 (964kb) Demo Video 2 (1,516kb)
This work focuses on developing technology to enable dynamic, automatic manipulation of 3D models. By mapping 3D models into a unique parameter space it becomes possible to automatically morph from one 3D model to another without any time-consuming user interaction.

The work has potential applicability to the entertainment and special effects industries as well as to immersive gaming environments. A simple example of how this morphing technology could be used in an immersive environment would be to allow users to tailor their own in-game appearance, based either on in-game cues (for example, eating a lot in the game could make the player's character appear fatter) or on their mood.
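Assuming the models have already been mapped into the shared parameter space so that vertex i of one model corresponds to vertex i of the other (that mapping is the hard part, and is not reproduced here), the morph step itself reduces to per-vertex linear interpolation, sketched below with illustrative shapes:

```python
import numpy as np

def morph(src, dst, t):
    """Blend two vertex arrays that share a common parameterisation.

    src, dst: (N, 3) vertex arrays whose rows correspond; t in [0, 1]
    moves the shape from src (t=0) to dst (t=1).
    """
    return (1.0 - t) * src + t * dst

# Illustrative data: four corresponding vertices, blended a quarter of the way.
src = np.zeros((4, 3))
dst = np.ones((4, 3))
blend = morph(src, dst, 0.25)
```

Animating t from 0 to 1 produces the smooth transition seen in the demo videos; localised edits (such as enlarging the nose) amount to applying the blend to only a subset of the parameter space.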

Demo 1 shows a 3D model of a face morphing between three different subjects. Demo 2 shows a model of a single subject being manipulated, first to increase the "weight" of the subject and then to increase the nose size.