Marina Jirotka


Marina Denise Anne Jirotka[1] is professor of human-centred computing at the University of Oxford, director of the Responsible Technology Institute, a governing body fellow of St Cross College,[2] a board member of the Society for Computers and Law[3] and a research associate at the Oxford Internet Institute.[4] She leads a team working on responsible innovation across a range of ICT fields, including robotics, AI, machine learning, quantum computing, social media and the digital economy. She is known for her work with Alan Winfield on the 'Ethical Black Box',[5][6] a proposal that robots using AI should be fitted with a type of flight recorder, similar to those used by aircraft, to track the decisions and actions of the AI when it operates in an uncontrolled environment and to aid post-accident investigations.[5][7]

Education

Jirotka obtained her BSc in psychology and social anthropology from Goldsmiths College in 1985 and her Master's in Computing and Artificial Intelligence from the University of South Bank in 1987. Her doctorate in Computer Science, An Investigation into Contextual Approaches to Requirements Capture,[8] was completed at the University of Oxford in 2000.

Career

In 1987 Jirotka was appointed a research fellow in the Social and Computer Sciences Research Group at the University of Surrey. In 1991 she joined the University of Oxford as a senior researcher in the Department of Computer Science, becoming a university lecturer and governing body fellow of St Cross College in 2003. In 2008 she became reader in requirements engineering, and in 2014 she was promoted to professor of human-centred computing.[citation needed]

Research

Jirotka's most recent work centres on the 'ethical black box', a way of making algorithmic decisions explainable after an unexpected event or accident.[5][9] The model for this approach is the aviation industry, which uses flight recorders to provide evidence after an accident. This approach to explainability in robotics developed from Jirotka's earlier work on Responsible Research and Innovation (RRI), a research approach supported by the EU's Horizon 2020 programme[10] and the UK research councils, particularly the EPSRC. RRI uses reflection, stakeholder involvement and anticipatory governance to try to mitigate potential negative effects of research and development. It is considered particularly applicable in ICT disciplines, where failing to consider possible negative outcomes can have serious repercussions. Jirotka contributed significantly to the evolution of RRI through her work on the Framework for Responsible Research and Innovation in ICT (FRRIICT) project, which was adopted for rollout in the UK by the EPSRC.[11]
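
To make the idea concrete, the sketch below shows one way an 'ethical black box' style recorder could work: an append-only log of what a robot sensed, what it decided and what it did, written as events occur so that the record survives for post-accident investigation. This is an illustrative assumption only; the DecisionRecorder class, its fields and the JSON-lines log format are hypothetical and are not taken from Jirotka and Winfield's published design.

    # Hypothetical sketch of an 'ethical black box' style decision log.
    # Class names, fields and the log format are illustrative assumptions,
    # not the design published by Jirotka and Winfield.
    import json
    import time

    class DecisionRecorder:
        """Append-only log of a robot's sensor inputs, decisions and actions."""

        def __init__(self, path):
            self.path = path

        def record(self, sensors, decision, action):
            entry = {
                "timestamp": time.time(),
                "sensors": sensors,      # what the robot perceived
                "decision": decision,    # what the controller chose, and why
                "action": action,        # what the robot actually did
            }
            # One JSON object per line, so investigators can replay events in order.
            with open(self.path, "a") as log:
                log.write(json.dumps(entry) + "\n")

    # Example: recording a single obstacle-avoidance decision.
    recorder = DecisionRecorder("ebb_log.jsonl")
    recorder.record(
        sensors={"lidar_min_distance_m": 0.4},
        decision={"policy": "avoid_obstacle", "confidence": 0.93},
        action={"command": "turn_left", "speed_m_s": 0.2},
    )

In this sketch each record is written at the moment the action is taken, mirroring the way a flight recorder captures events continuously rather than reconstructing them after the fact.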

Jirotka's other projects include:

  • UnBias[12] and ReEnTrust[13] - research projects that examine how algorithms are used online. The projects seek to raise awareness of algorithmic bias and to find ways of reducing it so that public trust in algorithmic decision-making can be justified.
  • RoboTIPS[14] - a project in partnership with the Bristol Robotics Lab looking at how robots can be developed using the 'ethical black box' concept.

Expert opinion

Jirotka has frequently given evidence to Select Committees, Advisory Boards, All-Party Parliamentary Groups and industry bodies. She sits on the Steering Committee of the APPG on Data Analytics[15] and the Advisory Board of the Society for Computers and Law. She regularly appears on expert panels to discuss ethical approaches to innovation[16] and is also an international speaker on the issues arising from a lack of diversity in science.[17]

Novel contributions

Along with members of her team, Jirotka formulated the concept of the Ethical Hackathon.[18] This is a variant of the traditional hackathon that additionally incorporates a focus on ethical issues, such as assessing a project's impact on minority groups or vulnerable users. The concept was trialled during work on the UnBias project and has since been used in a LabHack in Zimbabwe[19] and in training doctoral students at Oxford.

References

  1. "Marina Jirotka". http://www.cs.ox.ac.uk/marina.jirotka/. 
  2. "Professor Marina Jirotka". 13 January 2023. https://www.stx.ox.ac.uk/people/marina-jirotka. 
  3. "SCL: Home". https://www.scl.org/. 
  4. "Dr Marina Jirotka — Oxford Internet Institute". https://www.oii.ox.ac.uk/people/marina-jirotka/. 
  5. Sample, Ian (19 July 2017). "Give robots an 'ethical black box' to track and explain decisions, say scientists". The Guardian. https://www.theguardian.com/science/2017/jul/19/give-robots-an-ethical-black-box-to-track-and-explain-decisions-say-scientists. 
  6. Lant, Karla. "Experts Want Robots to Have an "Ethical Black Box" That Explains Their Decision-Making". Futurism. https://futurism.com/experts-want-robots-to-have-an-ethical-black-box-that-explains-their-decision-making. 
  7. Winfield, Alan F. T.; Jirotka, Marina (2017). "The Case for an Ethical Black Box". Towards Autonomous Robotic Systems (TAROS 2017, Guildford, UK, July 19-21, 2017). Lecture Notes in Computer Science, vol. 10454. Springer. pp. 262–273. doi:10.1007/978-3-319-64107-2_21. ISBN 978-3-319-64106-5. 
  8. Jirotka, Marina (2001). An investigation into contextual approaches to requirements capture (Thesis). University of Oxford.
  9. "Experts Want Robots to Have an "Ethical Black Box" That Explains Their Decision-Making". https://futurism.com/experts-want-robots-to-have-an-ethical-black-box-that-explains-their-decision-making. 
  10. "Responsible research & innovation". 1 April 2014. https://ec.europa.eu/programmes/horizon2020/en/h2020-section/responsible-research-innovation. 
  11. "Framework for Responsible Research and Innovation in ICT - EPSRC website". 18 October 2023. https://epsrc.ukri.org/funding/calls/frrict/. 
  12. EPSRC. "Grant: UnBias: Emancipating Users Against Algorithmic Biases for a Trusted Digital Economy". https://gow.epsrc.ukri.org/NGBOViewGrant.aspx?GrantRef=EP/N02785X/1. 
  13. EPSRC. "Grant: ReEnTrust: Rebuilding and Enhancing Trust in Algorithms". https://gow.epsrc.ukri.org/NGBOViewGrant.aspx?GrantRef=EP/R033633/1. 
  14. EPSRC. "Grant: RoboTIPS: Developing Responsible Robots for the Digital Economy". https://gow.epsrc.ukri.org/NGBOViewGrant.aspx?GrantRef=EP/S005099/1. 
  15. "All-Party Parliamentary Group on Data Analytics launches landmark enquiry into data and technology ethics - Press Releases - Orbit RRI". 27 November 2018. https://www.orbit-rri.org/blog/2018/11/27/party-parliamentary-group-data-analytics-launches-landmark-enquiry-data-technology-ethics/. 
  16. "University of Oxford". https://www.facebook.com/the.university.of.oxford/videos/10160428814050107/. 
  17. "Marina Jirotka". http://www.iopblog.org/author/marina-jirotka/. 
  18. "The Ethical Hackathon encapsulates ethics and design challenge". https://www.cs.ox.ac.uk/news/1495-full.html. 
  19. "School Newsletter 2018 - Projects - School of Anthropology & Museum Ethnography". https://www.anthro.ox.ac.uk/school-newsletter-2018-projects.