Research Fellow, University of Nottingham
Joy Egede is a Research Fellow in the Computer Vision Lab at the University
of Nottingham. Her research focuses on understanding human behaviour via
analysis of audio-visual and biosignal data using machine learning and
computer vision.
Previously, her PhD thesis explored the application of traditional and deep-learned algorithms to continuous pain estimation in adults and neonates. This work primarily focused on the analysis of facial expression changes in response to pain stimuli. Her PhD research led to the creation of the large-scale Acute Pain in Neonates database (APN-db) and the Neonatal Face and Limbs Pain Scale (NFLAPS). Details of these can be found in the Projects section.
Currently, she works with the Biomedical Research Centre on developing methods for detecting and interpreting medical conditions from expressive human behaviour, as well as on designing models for user interfaces that adapt content delivery based on social signals read from the user. This includes projects on the automatic objective assessment of comorbid mental health issues and pain, and on using virtual agents to deliver health advice to mothers and mothers-to-be in sub-Saharan Africa.
In addition to her main research responsibilities, she actively takes part in the organisation and promotion of research activities such as conferences and data challenges within the affective computing and computer vision communities. For example, she is one of the organisers of the first-ever international multimodal pain estimation challenge (EMOPAIN2020), to be hosted at the IEEE International Conference on Automatic Face and Gesture Recognition in Argentina in May 2020. She also volunteers her time on programmes aimed at promoting the inclusion of women in science through knowledge transfer, mentoring and personal campaigns.
Acute Pain in Neonates Database (APN-db)
Understanding and interpreting pain mechanisms in newborn babies is a complex yet vital aspect of neonatal healthcare. However, current pain assessment methods in neonatal intensive care units (NICUs) suffer from high subjectivity and a lack of support for continuous, real-time pain monitoring. Technology-assisted methods have been explored to address these problems, but their clinical adoption has been limited by subpar performance, largely due to a shortage of the representative training data needed to drive such life-critical predictive models.
The APN-db project aims to create a large-scale dataset of behavioural and physiological newborn responses to pain stimuli to support the development of automated multimodal pain assessment methods. As the project PI, I have completed the first phase of the project, which involved recording over 200 babies undergoing painful and painless medical procedures at the National Hospital Abuja. This phase was supported by the Horizon Impact Research Grant. See the project page for more details; the related publication is available here.
Interactive Agents with Literacy, Trust and Comprehension-aware Artificial Intelligence (Inter-ALTCAI)
The Inter-ALTCAI project is a Unilever and DFID-Transform funded project which explores the potential of emerging interactive artificial intelligence agent technologies as a route to creating rich, trusting and effective digital interactions that engage users in low-literacy, low-connectivity, low-expertise contexts. The project uses an ethnographic and participatory design approach to address these environmental and culture-specific constraints. Currently, the project is focused on the design, implementation and evaluation of a prototype virtual interactive agent that will give informal health advice to pregnant women and nursing mothers.
I lead both the machine learning and computer vision aspects of the virtual agent's design and also facilitate the ethnographic studies. The participatory design work has involved a trip to Lagos, Nigeria, to run workshop sessions with perinatal mothers and Patent and Proprietary Medicine Vendors (also known as chemists) to gather information of relevance to mothers, identify design considerations and get feedback on an early prototype.
I am currently involved in supervising MSc projects on the application of machine learning, computer vision and social signal processing to healthcare management and information delivery.
In the past, I assisted in teaching and facilitated programming lab sessions for the following modules: Introduction to Computer Engineering (C programming), Software Quality Metrics, Software Engineering Methodologies, and Introduction to Software Engineering and Object-oriented Programming.
Egede, J., Valstar, M., Torres, M.T. and Sharkey, D., 2019, September. Automatic Neonatal Pain Estimation: An Acute Pain in Neonates Database. In 2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII) (pp. 1-7). IEEE.
Egede, J., Song, S., Valstar, M., Williams, A., Olugbade, T., Wang, C., Meng, H., Aung, M., Lane, N. and Berthouze, N., 2020. EmoPain challenge 2020: Multimodal pain evaluation from facial and bodily expressions. arXiv preprint arXiv:2001.07739.
Jaiswal, S., Egede, J. and Valstar, M., 2018, May. Deep Learned Cumulative Attribute Regression. In 2018 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018) (pp. 715-722). IEEE.
Egede, J., Valstar, M. and Martinez, B., 2017, May. Fusing deep learned and hand-crafted features of appearance, shape, and dynamics for automatic pain estimation. In 2017 12th IEEE international conference on automatic face & gesture recognition (FG 2017) (pp. 689-696). IEEE.
Egede, J.O. and Valstar, M., 2017, November. Cumulative attributes for pain intensity estimation. In Proceedings of the 19th ACM International Conference on Multimodal Interaction (pp. 146-153).
Room B49, School of Computer Science
University of Nottingham
Email: joy dot egede at nottingham dot ac dot uk