Join Us
Current Openings
Funded PhD Research Opportunities
We have two fully funded PhD studentships for talented candidates to join us from the 1st of October 2023.
You will receive an annual tax-free stipend based on the enhanced UKRI rate (currently £17,668 + £2223), plus fully funded PhD tuition fees for the four years (Home/UK students only).
The students will work with the Boston Dynamics Spot robot to solve real-life inspection problems in extreme and factory scenarios. We will develop incremental learning methodologies to create context-based policies, not only for navigation but also for error recovery in long-term automation. Human-in-the-loop and teleoperated control methods will be used as the backbone strategy to ensure increasing levels of autonomy during inspection. We will also look at human-robot interaction methodologies for the day-to-day operation of the Boston Dynamics Spot mobile inspection robot.
The two projects will be in collaboration with RACE (https://race.ukaea.uk/) and Reckitt (https://www.reckitt.com/).
For more information and to apply, please visit https://cdt.horizon.ac.uk/apply/current-opportunities/featured-theme-human-robot-interaction/
The deadline to have completed and submitted your application to NottinghamHub is 15 May 2023.
Fully funded PhD Studentships in the Doctoral Training Centre in Artificial Intelligence - NOW CLOSED
We have fully funded interdisciplinary PhD studentships in the Faculty of Science. We are looking for talented and enthusiastic students to join us from the 1st of October 2023.
You will receive an annual tax-free stipend based on the UKRI rate (currently £17,668), plus fully funded PhD tuition fees for the four years (Home/UK students only).
For more information and to apply, please visit https://www.nottingham.ac.uk/computerscience/documents/applicant-guide-2023.pdf
The deadline to have completed and submitted your application to NottinghamHub is 31st March 2023.
The CHART Research Team are offering several projects in collaboration with their partners:
Intelligent sensing and data fusion in a smart environment for human activity recognition to support self-management of long-term conditions
To discuss this project please email: praminda.caleb-solly@nottingham.ac.uk
Given the pressure on health and social care resources, there is a growing incentive to explore methods for self-management of long-term conditions. Smart environments, realised through a range of ambient integrated sensors and service robotics, could help people with long-term conditions improve their quality of life. There is emerging research on intelligent data fusion that combines data from a range of ambient and wearable sensors for modelling and analysing physiological and behavioural data collected over time. This can be used to provide early warning or guidance for patients themselves, or for their healthcare professionals.
The research challenges lie in developing person-specific machine learning models, which are verifiable and robust in the face of noisy real-world sensor data that will change over time, as the person’s condition changes. There is also a gap in knowledge on how best to select and integrate multiple types of sensor data, in a way that preserves the integrity of the different streams of information, while also providing a meaningful representation of the person’s activity.
This research will address the challenges noted, and will also explore the design of interactive systems that can incorporate user input for semantic labelling and modelling, using an active learning approach. Keeping the user in the loop can improve engagement, while offering improved reasoning and confidence in sensor selection and fusion techniques. This research will explore multi-modal approaches for eliciting and integrating user input for semantic labelling, using a combination of supervised, unsupervised and self-learning techniques to address the challenges of noisy data and of reliably tracking changes in long-term conditions over time.
This research will be informed by, and related to, ongoing preclinical work being conducted by members of the interdisciplinary supervisory team, exploring behavioural and physiological changes in response to pregnancy, the ageing process and age-related diseases such as stroke, diabetes and cardiovascular dysfunction.
Prospective PhD applicants are expected to have a degree in Computer Science or Maths with knowledge of Data Science, Machine Learning and AI. This project will require excellent programming skills with evidence of proficient working knowledge in one or more of the following: C++, C, Java, Python, ROS.
Supervisors: Prof Praminda Caleb-Solly (School of Computer Science), Dr Matthew Elmes (School of Biosciences), Prof Claire Gibson (School of Psychology).
Clinical partners: Alison Wildt (National Rehabilitation Centre Clinical Support Manager), Chrishanti Thornton (Extracare Charitable Trust)
For further details and to arrange an interview please contact Prof Praminda Caleb-Solly.
Modelling Human-Robot Interaction in Social Spaces
To discuss this project please email: praminda.caleb-solly@nottingham.ac.uk
Robotics and related AI technologies are rapidly gaining presence in different areas of our everyday life, e.g. cleaning robots vacuuming floors, warehouse robots carrying pallets, robotic vehicles with cruise control. An exciting use of robotics is social and telepresence robots, which are intended to work in public and social contexts, including educational and museum settings, and to provide support for older adults and populations with accessibility issues.
This PhD project will study and quantify human interactions with commercially available robots in different contexts (participants/robots/places/functions) with a view to creating models of human-robot interaction (HRI) in these contexts. These models will help to improve the design of spaces that optimise human-robot interaction, and will also inform the development of best practice guidelines for robot embodiment, interaction strategies and autonomous behaviour.
In line with this goal, this PhD project aims to model sustainable human-robot interaction strategies for socially capable robots designed to function in public spaces. The project will target technological and psycho-sociological challenges related to AI to investigate the following overarching research questions:
How can social and telepresence robots be used to connect groups of remote humans and mediate the interaction between them?
What kind of personalisation methods and input/output modalities are useful to improve the interaction between humans and robots and enable long term sustainability of the communications?
How do the attitudes and perceptions toward robots change in children and adults over time?
Are these attitudes and perceptions affected by cultures, communities and the interaction environments?
This PhD project will benefit from a strong multidisciplinary approach at the interface of Computer Science, Robotics, and Psychology. Applicants are expected to develop technological advancements in AI and Interaction Design, including using machine learning to generate personalised user models for children and adults, adaptive motion planning in social environments, and feedback generation. In addition, the successful student will design, conduct and analyse experiments to investigate the socio-psychological effects of the technologies.
Supervisors: Prof. Praminda Caleb-Solly (School of Computer Science), Dr Emily Burdett (School of Psychology), Dr Ayse Kucukyilmaz (School of Computer Science).
For further details and to arrange an interview please contact Prof. Praminda Caleb-Solly.
The Horizon CDT is looking for 15 talented and enthusiastic students to join the Horizon Centre for Doctoral Training (CDT) in September 2023. The Horizon CDT is an interdisciplinary Centre welcoming applicants from a wide range of backgrounds including computer science, engineering, mathematics, human factors, human-robot interaction, psychology, sociology, business, geography, social science, medicine/health sciences and the arts. Applicants must demonstrate an enthusiasm for transdisciplinary research, with a 2:1 honours degree, or a combination of qualifications and/or experience equivalent to that level.
You will receive a generous enhanced, tax-free stipend of £19,891 per annum (this is the current 2022/23 rate – awaiting confirmation of rate for 2023/24).
For more information and to apply, please visit cdt.horizon.ac.uk/apply/current-opportunities/.
The closing date for this recruitment round has now passed. Please keep a lookout for announcements for upcoming rounds.
The CHART Research Team are offering several projects in collaboration with their partners:
Multimodal interfaces to enable multisensory accessible interaction in remote cultural environments through telepresence robots
To discuss this project please email: praminda.caleb-solly@nottingham.ac.uk
Partner: Screen South https://screensouth.org/
Telepresence robots offer a significant digital opportunity for people to remotely access social, work and cultural spaces, autonomously moving around them, giving a feeling of connection and presence. As such, telepresence robots can be a transformative tool in enabling engagement with museums and galleries, making connections and improving wellbeing. For a number of disabled people, and for those shielding because of lowered immunity caused by long-term conditions, having the choice to access cultural spaces and interact with people and objects through telepresence robots can offer more freedom and flexibility to be ‘present’ in locations.
However, the interfaces used to control telepresence robots can be cumbersome and inaccessible, particularly for those with sensory and/or physical impairments, making it difficult or impossible for them to use these effectively. We are also interested in exploring how combining telepresence robots with other digital devices, such as VR and haptics, can enable truly immersive multisensory experiences that are accessible to a variety of participants.
The aim of this research is to co-design and test a range of different input and output devices and modalities to develop multisensory interfaces that will enable accessible, smooth and enjoyable control and remote interaction. You will explore the integration and use of speech, head and ear-switches, electromyograms, and gaze, amongst other modalities, for control, and visual, haptic and aural modalities for feedback of information to enable rich and creative experiences of the remote space, people and objects. You will study and develop metrics for evaluating usability and user experience for accessible teleoperation using these modalities and custom devices, as well as developing a best practice framework to support future accessible design. The research will also offer the opportunity to draw on disability studies research to understand the lived experience of using telepresence in different contexts, understanding impact on self-efficacy, identity, social relationships and agency in interactions. This research offers several technical and non-technical strands to explore, based on the candidate’s background, skills and experience.
Ambient and Augmented Reality Information Visualisation of Smart Sensor Data for Real-Time Clinical Decision Making
To discuss this project please email: praminda.caleb-solly@nottingham.ac.uk
Partner: Queen’s Medical Centre, University Hospital, Nottingham
In busy clinical environments, particularly where patients have a high level of staff dependency, providing support for clinical staff to improve patient monitoring, triage and management can not only help to ease levels of staff stress, but also potentially improve patient safety. This research will investigate how information to assist with clinical decision making can be presented through creative ambient and/or augmented information displays, and the impact that different modes and modalities have on user cognitive load, attention and efficiency. This research is situated in the use of tangible devices, and ambient and augmented reality displays, exploring topics in information visualisation, sensory substitution, human factors and user experience design. Considering the context of high-pressure environments, such as dementia wards, you will begin the research with a qualitative observational study, scoping the requirements through co-design with clinical and care professionals, before designing, developing and evaluating a range of approaches for representing the required information.
Based on the candidate’s academic background, skills and experience, the research focus can be either on developing intelligent sensing to capture and represent the key information required for decision-making, or design and development of the approaches for displaying it through different means and modalities, or a combination of both.
Intelligent sensing and machine learning to adapt social robot assistance to support independent living
To discuss this project please email: praminda.caleb-solly@nottingham.ac.uk
Partner: Robotics For Good CIC https://www.roboticsforgood.co.uk/
Assistive technologies, such as smart home environments, integrated sensors and service robotics are recognised as emerging tools in helping people with long-term conditions improve their quality of life and live independently for longer. A key aspect of the research into assistive robotics for assisted living is developing contextual and social intelligence for the robot to interact appropriately, safely, and reliably in real-time. This research relates to developing assistive robot behaviour by incorporating both environmental and user data, and behaviour, as part of an overall intelligent control system architecture.
In addition to having a ‘memory’ of previous interactions and situations, assistive robots need access to current information that provides a dynamic world view of the user (including their emotional state), so that they can provide information and responses that are contextually appropriate. Typical activities for which support can be provided include rehabilitation, medication management, cognitive and social stimulation, and nutrition management. Drawing on information from environmental and activity sensors instrumented into a smart home, and on information about the user’s current physical and emotional state, assistive robots can potentially create value by providing interventions that are more socially intelligent in how, and what, advice and support they provide. Creating a more holistic service that prioritises events based on aspects of health and social circumstance requires an adaptable, intelligent learning system. Building on existing research on intelligent control system architectures, the aim of this research will be to design and test modular semantic memory architectures that can be adapted over time. You will investigate optimal combinations of contextual data, comprising implicit (emotional, physiological) and explicit (interaction) user data, as well as behavioural activity data assimilated from a range of wearable and smart home sensors, to develop adaptive, intelligent and emotionally engaging robot behaviour to support independent living.
Learning, user modelling and assistive shared control to support wheelchair users
To discuss this project please email: Ayse.Kucukyilmaz@nottingham.ac.uk
This PhD project will build on the Nottingham Robotic Mobility Assistant, NoRMA (https://github.com/HCRLabRepo/NoRMA), to study triadic learning methodologies for developing effective assistance policies that support wheelchair users in their day-to-day activities.
Long term autonomy and mobile inspection of extreme environments with a quadruped robot
To discuss this project please email: Ayse.Kucukyilmaz@nottingham.ac.uk
This PhD project will be in collaboration with RACE (https://race.ukaea.uk/) and aims to develop incremental learning methodologies to create context-based policies, not only for navigation but also for error recovery in long-term automation. Human-in-the-loop and teleoperated control methods will be used as the backbone strategy to ensure increasing levels of autonomy during inspection. We will look at human-robot interaction methodologies for efficient management and optimisation of parallel tasks encountered in day-to-day operation of the Boston Dynamics Spot mobile inspection robot.
Exploring Bilateral Trustworthiness in Human-Robot Collaboration
To discuss this project please email: Ayse.Kucukyilmaz@nottingham.ac.uk
This PhD studentship will investigate trust from a theory of mind point of view to model a robot’s trustworthiness from the perspective of a human, and vice versa.
Lifelong learning with robotic vacuum cleaners in social spaces: In collaboration with Beko Plc. (https://www.bekoplc.com/), this PhD project will target multiple strands of research in perception, planning, human-in-the-loop learning, and shared control for service robots. The ability to detect and recover from errors during navigation is essential for an autonomous service robot that runs for extended periods of time. In addition, when functioning in human settings, these robots should be programmed to adhere to social cues in a context-dependent manner, to enable not only safe but also acceptable functionality.
Fully Funded Studentships in the School of Computer Science
We offer a range of fully funded PhD Studentships in the School of Computer Science. Please get in touch if you are interested in any of the topics listed below, or in any other topics aligned with the CHART team's research focus that you would like to discuss.
Closing Date: Sunday 12 February 2023 - Now Closed
Reference: SCI2122
Applications are invited from International and Home students for fully-funded PhD studentships offered by the School of Computer Science at the University of Nottingham, starting on 1st October 2023.
The studentships available are fully funded for 3.5 years and include a stipend of (minimum) £16,062 per year and tuition fees.
The topics for the studentships are open, but your research proposal should relate to the interests of one of the CHART research groups' Topics of Interest as listed below.
Entry Requirements:
Qualification Requirement: A 2:1 degree or a master's degree in computer science or another relevant area
International and EU equivalents: We accept a wide range of qualifications from all over the world. For information on entry requirements from your country, see our country pages.
IELTS 6.5 (6.0 in each element)
English language requirements
As well as IELTS (listed above), we also accept other English language qualifications. This includes TOEFL iBT, Pearson PTE, GCSE, IB and O level English.
Application process:
Please check your eligibility against the entry requirements prior to proceeding.
If you are interested in applying, please contact potential supervisors to discuss your research proposal.
If the supervisor wishes to support your application post interview, they will direct you to make an official application through the MyNottingham system. You will be required to state the name of your supervisor and the studentship reference number in your application.
Do not submit your application via the MyNottingham platform without having confirmed the support of a supervisor first. Please email the person/people named next to the topic you are interested in with an up-to-date copy of your CV, your marks transcripts, and a cover email explaining why you would be suitable for the selected PhD topic. We will then be able to advise you on whether to proceed with a formal application on MyNottingham.
Topics
Safety
Analysis of the impact of cognitive loading and distractions during human-robot collaboration for assistive tasks (Praminda Caleb-Solly)
Embodied intelligence and sensing
Intelligent sensing and machine learning to improve the diagnosis and treatment of children with movement disorders (Alex Turner)
Design of smart actuated sensing devices and environments to support cognitive function/diagnostics in assisted living contexts (Praminda Caleb-Solly/Armaghan Moemeni)
Cyber-physical Space in Personalised Ambient Assisted Living (AAL) - Digital Twin/Blockchain/Machine Learning (Armaghan Moemeni)
Intelligent sensing to measure human trust using physiological sensing in virtual reality - for application of cognitive training and support (Armaghan Moemeni)
Accessible Interaction
Enhancing usability of augmented reality interfaces for cognitive support (Praminda Caleb-Solly/Armaghan Moemeni)
Modular robotics
Reconfigurable modular rehabilitation robots to monitor and manage frailty (Praminda Caleb-Solly)
Telepresence and Teleoperation
Multimodal real-time feedback (haptic, auditory, visual) for teleoperation of assistive and rehabilitation tasks (Praminda Caleb-Solly)
Autonomous and tele-manipulation
Improving autonomous complex robot manipulation capabilities that go beyond just grasping (Ayse Kucukyilmaz)
Shared and traded control
Modulation of levels of autonomy in human-robot teamwork through shared and traded autonomy paradigms (Ayse Kucukyilmaz)
Assisted Mobility
Designing and developing learning-based methodologies for wheelchair driving assistance (Ayse Kucukyilmaz)
Enhancing driving performance and safety using AR and haptics technologies in robotic wheelchairs (Ayse Kucukyilmaz)
Multimodal feedback for shared control of Early Years Powered Mobility (children's wheelchairs) to support independent mobility (Praminda Caleb-Solly)
Where to find us
We are located in the Cobot Maker Space at the Nottingham Geospatial Institute on Jubilee Campus, University of Nottingham.