I teach at the School of Computer Science, University of Nottingham.

Student Projects (UG and PGT level)

My projects typically focus on HCI and therefore involve people, but in these times your project should remain practical while following social distancing guidelines. Typical activities in a human-centred computing project include:

  • Requirements gathering through interviews or ethnography, to inform your design,
  • Testing your design, for example by trialling your prototype with users,
  • Evaluation of your design through analysis of system use or user feedback.


Indoor Positioning for Telepresence Robots (with Microsoft Research) 

To start in Autumn 2021

Telepresence robots allow remote users to move around a space, but they do not allow remote manipulation of objects in the environment, such as door handles and lifts.

In this project you will build an indoor positioning system for a telepresence robot located in the University’s new Co-bot Maker Space Lab on Jubilee Campus, where you will have regular access to a Double 3 telepresence robot. You will then connect the positioning system to a web- or proximity-based system to actuate and interact with objects and the building/environment, as well as to trigger digital actions such as sending notifications. This could use a system such as “If This Then That” to allow space users to script rules that trigger events when the robot is in a certain location, or actuation via Bluetooth or proximity beacons such as iTabs. The project thus combines the active research topics of telepresence, indoor positioning, and the Internet of Things.
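To give a flavour of the “If This Then That”-style scripting layer described above, here is a minimal sketch in Python. The zone names, actions, and the `RuleEngine` class are all illustrative placeholders invented for this example, not an existing API; a real system would sit on top of the indoor positioning feed and actual actuators.

```python
# Hypothetical sketch: map named zones to actions that fire when the
# robot enters them. All names here are illustrative, not a real API.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class RuleEngine:
    # zone name -> list of actions to fire on entering that zone
    rules: dict = field(default_factory=dict)
    current_zone: str = ""

    def when_enters(self, zone: str, action: Callable[[], str]) -> None:
        self.rules.setdefault(zone, []).append(action)

    def update_position(self, zone: str) -> list:
        """Fire rules only on a zone transition; return the action results."""
        if zone == self.current_zone:
            return []  # still in the same zone, nothing to trigger
        self.current_zone = zone
        return [action() for action in self.rules.get(zone, [])]

engine = RuleEngine()
engine.when_enters("lab_door", lambda: "actuate: open door")
engine.when_enters("meeting_room", lambda: "notify: robot has arrived")

print(engine.update_position("lab_door"))      # fires the door rule
print(engine.update_position("lab_door"))      # no transition, nothing fires
print(engine.update_position("meeting_room"))  # fires the notification rule
```

Triggering only on zone *transitions* (rather than on every position update) is the kind of design decision the project would need to revisit with real positioning data, which is noisy.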

The project is connected to a PhD project sponsored by Microsoft Research Cambridge. Successful completion of the project may provide an advantage when applying for open positions at Microsoft Research Cambridge after graduation.


Chatty Car – voice interaction for autonomous vehicles

In this project you will create a voice interaction prototype that allows in-car interaction with the autopilot of a (semi-)autonomous vehicle. The prototype is envisioned to provide ‘explainable AI’ and to go beyond current in-car voice control of entertainment and navigation. For instance, the prototype should allow the occupants to query decisions of the autopilot. This is a human-centred design project, not a pure software engineering project; it will therefore involve working with ‘users’, for instance to develop design requirements and to evaluate the system in use.
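As a rough illustration of what “querying decisions of the autopilot” might mean, the toy sketch below matches keywords in an occupant’s question against a logged driving decision. The decision log, the intents, and the keyword matching are all invented for this example; a real prototype would use a proper speech and natural-language-understanding platform rather than keyword spotting.

```python
# Illustrative-only sketch of occupants querying autopilot decisions.
# The log entries and intents are invented; no real vehicle API is used.
decision_log = [
    {"time": "09:02", "action": "slowed down", "reason": "cyclist ahead"},
    {"time": "09:05", "action": "changed lane", "reason": "lane closure"},
]

def answer(query: str) -> str:
    q = query.lower()
    if "why" in q:
        # Explain the most recent decision - the 'explainable AI' angle.
        last = decision_log[-1]
        return f"I {last['action']} because of a {last['reason']}."
    if "what" in q and "doing" in q:
        return f"I am following the planned route; last action: {decision_log[-1]['action']}."
    return "Sorry, I can only explain my recent driving decisions."

print(answer("Why did you change lanes?"))
```

Even this toy version surfaces real design questions for the project: which decision a vague “why?” refers to, and how much detail an explanation should give while the occupant is in the vehicle.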

The project relates to the UKRI-funded national Trustworthy Autonomous Systems Hub (tas.ac.uk), which involves multidisciplinary research across Engineering (esp. Human Factors), Computer Science (e.g., HCI), the Social Sciences, and Law.

Interested MSc students should be studying CS, CS(AI) or HCI and have an interest in Human-Computer Interaction and some experience in voice/chatbot design and development. An interest in ‘explainable AI’ and participation in the Human-AI Interaction module is a bonus.


Voice UX

Voice user interfaces (VUIs) are becoming a pervasive feature on mobile devices, in cars, and in standalone “smart speakers”, yet their user experience (UX) is often lacking. Potential projects can focus on design and evaluation, or on studying and critiquing Voice UX, and may draw on existing platforms such as the Google Speech API, Amazon Alexa Skills, etc. Projects may focus on:

  • User-centred design and evaluation of a VUI of your choice,
  • An in-depth study and critique of a pre-existing VUI, with “implications for design”,
  • An exploration of how to support Voice UX designers’ practices.


Social media sentiment analysis (available again from Autumn 2021)

How trustworthy are autonomous systems such as robots, driverless cars, and contact-tracing apps? This project involves making use of NLP (Natural Language Processing) to analyse the ‘sentiment’ of publicly available statements by the news media and the general public, for example from Twitter or Facebook. Such a project involves:

  • obtaining a data set that includes relevant data for your topic;
  • pre-processing the data, for example to ‘clean’ it;
  • developing an appropriate (context/domain-specific) classification model for the data (e.g., feature selection);
  • analysing the sentiment of the data using machine learning-based techniques.
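The steps above can be sketched end-to-end with a toy Naive Bayes classifier. The tiny labelled data set, the tokeniser, and the class labels are placeholders invented for this sketch; a real project would use a scraped corpus and proper NLP pre-processing and feature selection.

```python
# Minimal, hypothetical sketch of the pipeline steps above.
import math
import re
from collections import Counter

def tokenise(text: str) -> list:
    # Step 2: 'clean' the data - lowercase and keep word characters only.
    return re.findall(r"[a-z']+", text.lower())

def train(examples: list) -> dict:
    # Step 3: a simple bag-of-words model per class (feature selection
    # here is just "every token counts").
    counts = {"pos": Counter(), "neg": Counter()}
    for text, label in examples:
        counts[label].update(tokenise(text))
    return counts

def classify(counts: dict, text: str) -> str:
    # Step 4: score each class with add-one-smoothed log likelihoods.
    vocab = len(set(counts["pos"]) | set(counts["neg"]))
    scores = {}
    for label, c in counts.items():
        total = sum(c.values())
        scores[label] = sum(
            math.log((c[tok] + 1) / (total + vocab)) for tok in tokenise(text)
        )
    return max(scores, key=scores.get)

# Step 1: a stand-in for a real data set of public statements.
data = [
    ("I trust this driverless car, it drives safely", "pos"),
    ("great app, really helpful and safe", "pos"),
    ("this robot is unsafe and I do not trust it", "neg"),
    ("terrible, the app failed and felt creepy", "neg"),
]
model = train(data)
print(classify(model, "safe and helpful"))   # expected: pos
```

In practice the interesting work is in steps 1–3: which statements you collect, how you clean them, and which features you select matter far more than the choice of classifier at this scale.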


Human-Robot Collaboration (not on offer in 2021)

NOTE: For students interested in human-robot collaboration, see the Indoor Positioning project above. 

There is the opportunity for a project centred on human-robot interaction/collaboration, in conjunction with the Nottingham Advanced Robotics Laboratory. The student would have the opportunity to implement their project on a real-life collaborative robot (cobot), such as Baxter. Capabilities include movement, grasping of objects, and Computer Vision. Some project ideas:

  • Voice-based interaction: design a voice interface enabling the robot to understand and execute verbal (spoken) commands, and to talk back;
  • Vision-based interaction: implement Computer Vision to enable the robot to recognise certain objects, together with a way for the user to instruct the robot;
  • A combination of the two above.



Human-AI Interaction (COMP3074)

I am convening this new Level 3, 20-credit module every autumn, starting in 2020/21. It is also open to MSc students in the School of Computer Science with a background in programming.

Previous teaching

I have previously convened Design Ethnography in 2019/20, 2018/19, 2017/18, in 2016 (spring and autumn), and together with Andy Crabtree in the spring semester of 2015.

I have also co-convened Understanding Users in Computer Science (G54MET) together with Andy Crabtree in the autumn semester of 2014.


PhD Supervision

I’m currently co-supervising Andriana Boudouraki, Gustavo Berumen, Teresa Castle-Green, Gisela Reyes Cruz, and Elaine Venancio Santos.

Past students include:

Tommy Nilsson

Martin Porcheron (graduated in 2019).

Huseyin Avsar (graduated in 2017).

Wenchao Jiang (graduated in 2016).
