Publications and useful links

Here is a list of our current project publications:

Winfield AFT, van Maris A, Salvini P, Jirotka M. (2022). An Ethical Black Box for Social Robots: a draft Open Standard. International Conference on Robot Ethics and Standards (ICRES 2022), Seoul (South Korea), 18-19 July 2022

Access it here


Grieman K. (2022). How to investigate when a robot causes an accident – and why it’s important that we do. The Conversation. Published: March 24, 2022 3.06pm GMT.

Access it here

Winfield AFT, Watson N, Egawa T, Barwell E, Barclay I, Booth S, Dennis LA, Hastie H, Hessami A, Jacobs N, Markovic M, Muttram RI, Nadel L, Naja I, Olszewska J, Rajabiyazdi F, Rannow RK, Theodorou A, Underwood MA, von Stryk O, Wortham RH. (2022) IEEE Standard 7001-2021 for Transparency of Autonomous Systems. The Institute of Electrical and Electronics Engineers, Inc. ISBN 978-1-5044-8311-7.

External link


Winfield AFT, Booth S, Dennis LA, Egawa T, Hastie H, Jacobs N, Muttram RI, Olszewska JI, Rajabiyazdi F, Theodorou A, Underwood MA, Wortham RH, Watson E. (2021) IEEE P7001: A Proposed Standard on Transparency. Frontiers in Robotics and AI 8:665729. https://doi.org/10.3389/frobt.2021.665729

Access it here


Webb, H., Dumitru, M., Van Maris, A., Winkle, K., Jirotka, M. and Winfield, A.F.T. (2021) Role-Play as Responsible Robotics: The Virtual Witness Testimony Role-Play Interview for Investigating Hazardous Human-Robot Interactions. Frontiers in Robotics and AI 8:644336. https://doi.org/10.3389/frobt.2021.644336

Access it here


Winfield A.F.T., Winkle K., Webb H., Lyngs U., Jirotka M., Macrae C. (2021) Robot Accident Investigation: A Case Study in Responsible Robotics. In: Cavalcanti A., Dongol B., Hierons R., Timmis J., Woodcock J. (eds) Software Engineering for Robotics. Springer, Cham. https://doi.org/10.1007/978-3-030-66494-7_6

Access pre-print version here

Winfield, A.F.T., Winkle, K. (2020). RoboTed: a case study in Ethical Risk Assessment. 5th International Conference on Robot Ethics and Standards (ICRES 2020), 28-29 September 2020.

Access arXiv version here


Brandao, M., Jirotka, M., Webb, H. and Luff, K. (2020) Fair navigation planning: a resource for characterizing and designing fairness in mobile robots. Artificial Intelligence, March 2020, 282. https://doi.org/10.1016/j.artint.2020.103259

Access it here


Winkle, K., Jirotka, M., Lyngs, U., Macrae, C., Webb, H. and Winfield, A.F.T. (2020) “What Could Possibly Go Wrong?” Logging HRI Data for Robot Accident Investigations. HRI '20: Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, March 2020, 517-519. https://doi.org/10.1145/3371382.3378296

Access it here

Webb, H., Jirotka, M., Winfield, A.F.T., and Winkle, K. (2019) Human-robot relationships and the development of responsible social robots. ACM Proceedings of the Halfway to the Future Symposium, November 2019, Article 12. https://doi.org/10.1145/3363384.3363396

Access it here

Webb, H., Jirotka, M., Inglesant, P., and Patel, M. (2019) Human Centred Computing approaches to embed Responsible Innovation in HCI. Paper presented at the Responsible Innovation in HCI workshop of the 2019 CHI Conference on Human Factors in Computing Systems.

Access it here

Winfield, A.F.T. and Jirotka, M. (2018). Ethical governance is essential to building trust in robotics and artificial intelligence systems, Phil. Trans. R. Soc. A. 376:20180085. https://doi.org/10.1098/rsta.2018.0085

Access it here


Winfield, A.F.T. and Jirotka, M. (2017). The Case for an Ethical Black Box. In: Gao Y., Fallah S., Jin Y., Lekakou C. (eds) Towards Autonomous Robotic Systems. TAROS 2017. Lecture Notes in Computer Science, vol 10454. Springer, Cham. https://doi.org/10.1007/978-3-319-64107-2_21

Access it here

Other project links

Find us on Twitter @Robotips4

CSI robot. This fun public engagement activity was part of the UK-RAS Network Festival of Robotics 2021

Listen to our fun podcast about the use of the Ethical Black Box, broadcast by Oxford Sparks!

RoboTIPS team member Helena Webb talks to the RAS Network podcast about human-robot collaboration

Alan Winfield’s blog


Other links you might find interesting

Our sister project RoboTIPS International - part of the Responsible Technology Institute at Oxford

Human Centred Computing at Oxford blog


Doctoral student: Daniel Omeiza

Daniel is a doctoral student in the Department of Computer Science at the University of Oxford. He receives his doctoral funding via RoboTIPS, and his work focuses on categorising explanations and developing post-hoc explanations for autonomous driving. Daniel has produced a number of relevant publications:


Assessing and Explaining Collision Risk in Dynamic Environments for Autonomous Driving Safety

Richa Nahata, Daniel Omeiza, Rhys Howard, Lars Kunze

In press - IEEE 2021 International Conference on Intelligent Transportation Systems


Access it here


Towards Accountability: Providing Intelligible Explanations in Autonomous Driving

Daniel Omeiza, Helena Webb, Marina Jirotka, Lars Kunze

In press - IEEE 2021 Intelligent Vehicles Symposium


Access it here


Why Not Explain? Effects of Explanations on Human Perceptions of Autonomous Driving

Daniel Omeiza, Helena Webb, Konrad Kollnig, Marina Jirotka, Lars Kunze

In press - IEEE 2021 International Conference on Advanced Robotics and its Social Impacts


Access it here


Towards Explainable and Trustworthy Autonomous Physical Systems

Daniel Omeiza, Sule Anjomshoae, Konrad Kollnig, Oana-Maria Camburu, Kary Främling, Lars Kunze

CHI 2021 Conference on Human Factors in Computing Systems


Access it here


A Fait Accompli? An Empirical Study into the Absence of Consent to Third-Party Tracking in Android Apps

Konrad Kollnig, Reuben Binns, Pierre Dewitte, Max Van Kleek, Ge Wang, Daniel Omeiza, Helena Webb, Nigel Shadbolt

In press - Proceedings of the 2021 Symposium on Usable Privacy and Security


Access it here


Realizing the Potential of AI in Africa: It All Turns on Trust

Charity Delmus Alupo, Daniel Omeiza, David Vernon

In press - Towards Trustworthy Artificial Intelligence Systems, M. I. Aldinhas Ferreira (Ed.), Springer


Access it here