Ethics and Responsible Research

User concerns regarding an assistive robot. Artwork created by Sam Church for the University of Nottingham; all rights reserved.

Responsible Research and Innovation is ‘a moral imperative. Research and Innovation should be environmentally protective, answering needs, ethical values, and expectations of society and beneficial to the widest range of actors’ (De Saille, 2015 [i]).

In our research on service robotics technologies, we take a Responsible Innovation approach at every stage of developing cyber-physical health and assistive robotics technologies.

According to the EPSRC [ii], a Responsible Innovation approach is one that continuously seeks to:

  • Anticipate – describing and analysing the impacts, intended or otherwise (for example economic, social, environmental), that might arise. This does not seek to predict but rather to support an exploration of possible impacts and implications that may otherwise remain uncovered and little discussed.

  • Reflect – reflecting on the purposes of, motivations for and potential implications of the research, and the associated uncertainties, areas of ignorance, assumptions, framings, questions, dilemmas and social transformations these may bring.

  • Engage – opening up such visions, impacts and questioning to broader deliberation, dialogue, engagement and debate in an inclusive way.

  • Act – using these processes to influence the direction and trajectory of the research and innovation process itself.

[i] De Saille, S., 2015. Innovating innovation policy: the emergence of ‘Responsible Research and Innovation’. Journal of Responsible Innovation, 2(2), pp.152-168.

[ii] EPSRC Framework for Responsible Innovation:

Our research objectives in this area include:

1. Developing contextually relevant and actionable ethical frameworks for cyber-physical health and assistive robotics technologies

2. Reviewing existing ethical guidelines and ensuring appropriateness and relevance to emerging technologies

3. Understanding ethical issues for real-world use of emerging technologies

4. Ensuring equitable use and access to cyber-physical health and assistive robotics technologies

Research Topics

  • Ethical Frameworks for cyber-physical health and assistive robotics technologies in different contexts

  • Ethical risk assessment and analysis approaches and methods

  • Understanding the concerns and needs of stakeholders (end-users, carers, relatives, therapists, clinicians)

  • Transparent and Accessible AI
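The ethical risk assessment topic above can be illustrated with a minimal sketch in the style of a conventional safety risk assessment. The hazard names, the 1–5 scales, and the severity-times-likelihood scoring below are illustrative assumptions, not content taken from BS 8611, the EPSRC framework, or any of the cited work; they only show the general shape such a register might take.

```python
from dataclasses import dataclass

@dataclass
class EthicalHazard:
    """One entry in an illustrative ethical-risk register (hypothetical scales)."""
    name: str
    severity: int    # 1 (negligible) .. 5 (severe) -- assumed scale
    likelihood: int  # 1 (rare) .. 5 (frequent) -- assumed scale
    mitigation: str

    @property
    def risk_score(self) -> int:
        # Simple severity x likelihood scoring, as in many risk matrices
        return self.severity * self.likelihood

def prioritise(register):
    """Rank hazards so the highest-risk items are reviewed first."""
    return sorted(register, key=lambda h: h.risk_score, reverse=True)

# Example hazards loosely echoing themes from the research topics above
register = [
    EthicalHazard("Over-reliance on the robot", 3, 4,
                  "Design to complement, not replace, human care"),
    EthicalHazard("Deception through anthropomorphism", 4, 3,
                  "Make the robot's machine nature transparent to users"),
    EthicalHazard("Privacy intrusion from sensing", 4, 2,
                  "Minimise data collection; give users control over sensors"),
]

for hazard in prioritise(register):
    print(f"{hazard.risk_score:>2}  {hazard.name}: {hazard.mitigation}")
```

In practice such a register would be one small component of a broader assessment involving stakeholder engagement and context-specific judgement, which numeric scores alone cannot capture.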

Related Research Projects

Empowering Future Care Workforces aims to understand how health and social care professionals can benefit from using assistive robotics on their own terms. Empowering health and social care professionals through digital technologies has long been a goal in health and care policy. As governments invest in post-pandemic digital transformation, ensuring workers are empowered and not excluded by technology is more urgent than ever.

UoN Project Lead: Praminda Caleb-Solly

The aim of Accessible AI@Nottingham and our network activities is to build public trust by promoting transparency of AI decision-making. We have designed a series of proactive engagement activities that aim to empower people to be confident in accessing, understanding and exploiting data.

UoN Project Lead: Praminda Caleb-Solly

BS 8611 gives guidance on identifying potential ethical harm and provides guidelines on safe design, protective measures, and information for the design and application of robots. It builds on existing safety requirements for different types of robots: industrial, personal care, and medical. The standard describes ethical hazards associated with the use of robots and provides guidance on eliminating or reducing the associated risks; significant ethical hazards are presented, together with guidance on how to address them for various robot applications. The AMT/10/1 Technical Committee is currently reviewing the standard and updating its content.

UoN Project Member: Praminda Caleb-Solly

Related Publications

  • van Maris, A., Zook, N., Dogramadzi, S., Studley, M., Winfield, A. and Caleb-Solly, P., 2021. A New Perspective on Robot Ethics through Investigating Human–Robot Interactions with Older Adults. Applied Sciences, 11(21), p.10136.

  • Winkle, K., Caleb-Solly, P., Leonards, U., Turton, A. and Bremner, P., 2021, March. Assessing and addressing ethical risk from anthropomorphism and deception in socially assistive robots. In Proceedings of the 2021 ACM/IEEE International Conference on Human-Robot Interaction (pp. 101-109).

  • van Maris, A., Zook, N., Caleb-Solly, P., Studley, M., Winfield, A. and Dogramadzi, S., 2020. Designing ethical social robots—a longitudinal field study with older adults. Frontiers in Robotics and AI, 7, p.1.

  • van Maris, A., Zook, N., Caleb-Solly, P., Studley, M., Winfield, A. and Dogramadzi, S., 2018, August. Ethical considerations of (contextually) affective robot behaviour. In Hybrid Worlds: Societal and Ethical Challenges—Proceedings of the International Conference on Robot Ethics and Standards (ICRES 2018). CLAWAR Association Ltd (pp. 13-19).