Mobility and Navigation

Physically assistive robots have to deal with the fundamental challenges of mapping and navigation. These include robots that support people's mobility, such as smart wheelchairs and intelligent walking frames; systems that help people get in and out of beds or chairs; and robots that can physically move to provide assistance, whether on wheeled bases or on legs. Such robots also need to adapt their behaviour and mobility to the terrain, to obstacles that may be static or dynamic, and to other people and machines in the environment.

Our research in these areas covers shared autonomy for mobility robots such as wheelchairs, as well as fully autonomous systems for fetch-and-carry tasks. We are researching the use of a range of sensors to provide situational awareness, and the use of information from those sensors to give robot users real-time feedback that supports their decision making.

An example of this is Intelligent Data Fusion to support real-time accessible feedback for shared autonomy in powered wheelchairs for children, such as Designability's Wizzybug. The aim of this research is to design and test a cost-effective sensor architecture incorporating adaptive machine learning to provide real-time accessible feedback in support of shared control. The research considers the requirements arising from sensory impairments and from use of the powered mobility device, in order to define feedback modalities that are non-intrusive yet informative and intuitive enough to support decision making and learning. In a shared control scenario, our research aims to determine which sensors and machine learning techniques are best suited to decision making by the smart wheelchair and by the user in different contexts.
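
As an illustration only, the sketch below shows one common way of framing shared control of this kind: hypothetical distance-sensor readings are fused into an obstacle proximity estimate, which both drives a haptic feedback intensity and shifts control authority between the child's joystick command and an autonomous avoidance command. The function names, sensor ranges and blending rule are assumptions for illustration, not the project's implementation.

    # Illustrative sketch only (assumed names and parameters, not the project's
    # actual implementation): fuse hypothetical distance-sensor readings into an
    # obstacle proximity value, derive a haptic feedback intensity from it, and
    # blend the child's joystick command with an autonomous avoidance command.

    def fuse_proximity(distances_m, max_range_m=2.0):
        """Return proximity in [0, 1]: 0 = path clear, 1 = obstacle at the chair."""
        nearest = min(distances_m) if distances_m else max_range_m
        nearest = max(0.0, min(nearest, max_range_m))
        return 1.0 - nearest / max_range_m

    def shared_control(user_cmd, avoid_cmd, proximity):
        """Blend (speed, turn) commands: the nearer the obstacle, the more
        authority shifts to the avoidance behaviour, while the haptic intensity
        mirrors proximity so the child gets real-time feedback on why the chair
        is intervening."""
        alpha = 1.0 - proximity  # user's share of control authority
        blended = tuple(alpha * u + (1.0 - alpha) * a
                        for u, a in zip(user_cmd, avoid_cmd))
        haptic_intensity = proximity  # e.g. could drive a vibration motor
        return blended, haptic_intensity

    # Example: obstacle 0.5 m ahead; user pushes forward, avoidance steers right.
    proximity = fuse_proximity([0.5, 1.8, 1.6])
    command, haptic = shared_control((1.0, 0.0), (0.2, 0.8), proximity)
    print(command, haptic)  # most of the forward command is replaced by a turn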

Research Topics

  • Vision-based robot localisation and navigation

  • Models of different types of mobility device usage across different scenarios of use

  • Trajectory planning and dynamic simultaneous localisation and mapping

  • Indoor localisation techniques

  • Understanding human-robot proxemics and the social use of space

  • Low-cost sensing solutions to maximise safety and the richness of feedback for control and human decision-making

Related Past Research Projects with CHART team member involvement

Catch Me If You Can was originally an AHRC-funded project developed as a Disability and Community (D4D) workstream, and is now being continued as an internal research project.

The project explored how technological interventions could positively impact the lives of very young disabled children. Through practical and investigative research, Caleb-Solly analysed how technology can support children, what these machines add to their experience when playing, and how they can enhance their sense of social belonging in their communities. From an engineering perspective, the research team also wanted to find out whether they could quantitatively measure the impact of a child having access to a Wizzybug. They settled on attaching a smartphone running a specially created app to the Wizzybug, recording data from the phone's sensors, such as the accelerometer. The app recorded how much the Wizzybug was being used, and when and where it was being used, providing insights into everyday usage as well as a quantitative indication of the level and scope of independent movement. Caleb-Solly is now progressing this work to develop a shared-control Wizzybug and to explore real-time intelligent sensor data processing and data fusion techniques that provide information about the environment around the wheelchair for shared autonomy, and that generate feedback (e.g. haptic feedback) to the child for safe navigation. Feedback could be used to alert the child, and other people nearby, to environmental hazards, or to provide richer information that supports mobility for children with a wider range of impairments. Please get in touch if you are interested in working in this area.
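
As a rough illustration of this kind of usage analysis, the sketch below estimates how many minutes per day a device was in motion from logged accelerometer samples, treating a sample as "in use" when its acceleration magnitude deviates from gravity by more than a small threshold. The data format, sampling rate and threshold are assumptions for illustration; they are not the actual app's logic.

    # Illustrative sketch only (assumed data format and thresholds, not the
    # app's actual logic): estimate minutes of use per day from logged
    # accelerometer samples, counting a sample as "in use" when its magnitude
    # deviates from gravity by more than a small noise threshold.

    from collections import defaultdict
    from datetime import datetime
    from math import sqrt

    GRAVITY = 9.81  # m/s^2

    def daily_usage_minutes(samples, sample_period_s=1.0, threshold=0.5):
        """samples: iterable of (iso_timestamp, ax, ay, az) in m/s^2.
        Returns a dict mapping each date to estimated minutes of movement."""
        usage = defaultdict(float)
        for timestamp, ax, ay, az in samples:
            magnitude = sqrt(ax * ax + ay * ay + az * az)
            if abs(magnitude - GRAVITY) > threshold:  # movement above noise floor
                day = datetime.fromisoformat(timestamp).date()
                usage[day] += sample_period_s / 60.0
        return dict(usage)

    # Example log: two moving samples and one stationary sample.
    log = [
        ("2019-05-01T10:00:00", 3.0, 1.0, 9.9),
        ("2019-05-01T10:00:01", 0.0, 0.0, 9.81),
        ("2019-05-01T10:00:02", 2.0, 0.5, 10.4),
    ]
    print(daily_usage_minutes(log))  # ~0.03 minutes (2 seconds) of movement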

Project Lead: Praminda Caleb-Solly


Related Research Publications

  • Stephenson, A., Eimontaite, I., Caleb-Solly, P. and Alford, C., 2020, July. The impact of a biological driver state monitoring system on visual attention during partially automated driving. In International Conference on Applied Human Factors and Ergonomics (pp. 193-200). Springer, Cham. https://link.springer.com/chapter/10.1007/978-3-030-50943-9_25

  • Caleb-Solly, P., Leake, J., Baines, R., Gaertner, S., Evans, N., Sinclair, K., Battle, S. and Adlam, T., 2019. Low-Cost Sensing and Data Analytics for Understanding Usage Patterns of Early Years Powered Mobility Devices. In 31st Annual Meeting of the European Academy of Childhood Disability (EACD). https://edu.eacd.org/node/647

  • He, J., van Maris, A. and Caleb-Solly, P., 2020, March. Investigating the effectiveness of different interaction modalities for spatial human-robot interaction. In Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction (pp. 239-241). https://dl.acm.org/doi/abs/10.1145/3371382.3378273

  • Caleb-Solly, P., 2016. Person-Environment Interaction. In Active and Assisted Living: Technologies and Applications (pp. 143-162). IET. https://digital-library.theiet.org/content/books/10.1049/pbhe006e_ch8