
Explaining model decisions through dialogue

As machine learning and AI models underpin many of the recommendation and decision tools used in day-to-day life, these models need to be transparent and explainable. The aim of this research is to explore, develop and evaluate methods that allow an end-user to engage in a dialogue about a recommendation or decision and to better understand the reasoning and rationale behind it. Examples of such models range from health risk assessment to eligibility for credit or loans.
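To make the aim concrete, below is a minimal, purely illustrative sketch of the kind of explanation dialogue the project envisions: an end-user asks why a loan application was declined and receives a counterfactual-style answer. The feature names, thresholds, rules and wording are all hypothetical assumptions standing in for a trained model; they are not part of the advertised project or any specific method.

```python
# Illustrative sketch only: a toy loan-eligibility model with a minimal
# "why was I declined?" explanation, in the spirit of the dialogue-based
# explanations this project would investigate. All feature names, thresholds
# and wording are hypothetical assumptions, not part of the project itself.

from dataclasses import dataclass


@dataclass
class Applicant:
    income: float       # annual income in GBP
    credit_score: int   # score on a 0-999 scale
    debt_ratio: float   # monthly debt payments divided by monthly income


# Hypothetical decision rules standing in for a trained model.
# Each rule: (check on the applicant, human-readable requirement).
RULES = [
    (lambda a: a.income >= 25_000, "an income of at least £25,000"),
    (lambda a: a.credit_score >= 600, "a credit score of at least 600"),
    (lambda a: a.debt_ratio <= 0.4, "a debt-to-income ratio of at most 0.4"),
]


def decide(applicant: Applicant) -> tuple[bool, list[str]]:
    """Return the decision plus the unmet requirements, which become the reasons."""
    unmet = [reason for check, reason in RULES if not check(applicant)]
    return len(unmet) == 0, unmet


def explain(applicant: Applicant) -> str:
    """Answer a 'why?' question with a simple counterfactual-style explanation."""
    approved, unmet = decide(applicant)
    if approved:
        return "The application was approved: all eligibility criteria were met."
    return ("The application was declined. It would have been approved with "
            + " and ".join(unmet) + ".")


if __name__ == "__main__":
    applicant = Applicant(income=22_000, credit_score=640, debt_ratio=0.35)
    print("User:   Why was my loan application declined?")
    print("System: " + explain(applicant))
```

A real system would replace the hand-written rules with a trained model and support a multi-turn dialogue (follow-up questions, "what if?" queries, clarifications) rather than a single canned answer; evaluating such dialogues with end-users is part of the research challenge.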

Skills and Experience

You should have experience of applying AI, Machine Learning, or related techniques (e.g., Natural Language Processing). Skills and experience in software development, UX development, chatbot development or running user studies are an advantage. You should be a highly motivated individual and possess a strong sense of curiosity. The ability to study independently, think critically and collaborate with others is essential.

References

Miller, T., 2019. Explanation in artificial intelligence: Insights from the social sciences. *Artificial Intelligence*, 267.

Sassoon, I., Kökciyan, N., Sklar, E. and Parsons, S., 2019. Explainable argumentation for wellness consultation. In *International Workshop on Explainable, Transparent Autonomous Agents and Multi-Agent Systems*.

Sokol, K. and Flach, P.A., 2018. Glass-Box: Explaining AI decisions with counterfactual statements through conversation with a voice-enabled virtual assistant. In *IJCAI* (pp. 5868-5870).

Sassoon, I., Kökciyan, N., Modgil, S. and Parsons, S., 2021. Argumentation schemes for clinical decision support. *Argument & Computation*.

How to apply

If you are interested in applying for the above PhD topic, please follow the steps below:

  1. Contact the supervisor by email or phone to discuss your interest and find out if you would be suitable. Supervisor details can be found on this topic page. The supervisor will guide you in developing the topic-specific research proposal, which will form part of your application.
  2. Click on the 'Apply here' button on this page and you will be taken to the relevant PhD course page, where you can apply using an online application.
  3. Complete the online application indicating your selected supervisor and include the research proposal for the topic you have selected.

Good luck!

This is a self-funded topic

Brunel offers a number of funding options to research students that help cover the cost of their tuition fees, contribute to living expenses or both. See more information here: https://www.brunel.ac.uk/research/Research-degrees/Research-degree-funding. The UK Government is also offering Doctoral Student Loans for eligible students, and there is some funding available through the Research Councils. Many of our international students benefit from funding provided by their governments or employers. Brunel alumni enjoy tuition fee discounts of 15%.

Meet the Supervisor(s)


Isabel Sassoon - Dr Isabel Sassoon is a Senior Lecturer in Computer Science at Brunel University. Isabel was Brunel Lead Investigator on IMMUNE (Immunity Passport Service Design), a UKRI Arts and Humanities Research Council funded project. IMMUNE's aim is to research the unintended consequences and risks related to immunity passports for COVID-19, with a view to informing their design in a way that mitigates these.

Before joining Brunel, Isabel was a Research Associate on CONSULT (Collaborative Mobile Decision Support for Managing Multiple Morbidities), an EPSRC-funded project in the Department of Informatics at King's College London. The project developed a collaborative mobile decision-support system to help patients with chronic diseases self-manage their treatment by bringing together, and reasoning with, wellbeing sensor data, clinical guidelines and patient data. Prior to that, Isabel was a Teaching Fellow in the Department of Informatics at King's College London, teaching primarily on the Data Science MSc.

Isabel's research interests are in data-driven automated reasoning and its transparency and explainability. Her PhD research developed a computational argumentation-based system to support the appropriate selection of a statistical model given a research objective and the available data, and her current research continues to explore how computational argumentation can assist in model explainability and trust. Before joining King's College London, Isabel worked for more than 10 years as a data science consultant in industry, including 8 years at SAS UK. Isabel read Statistics, Operations Research and Economics at Tel Aviv University and received her PhD in Informatics from King's College London. She is a Fellow of the Royal Statistical Society and an Editorial Board Member of Real World Data Science.

Related Research Group(s)


Intelligent Data Analysis - Concerned with effective analysis of data involving artificial intelligence, dynamic systems, image and signal processing, optimisation, pattern recognition, statistics and visualisation.