
iSee

Intelligent Sharing of Explanation Experience by users for users
Funder: French National Research Agency (ANR)
Project code: ANR-21-CHR4-0004
Funder Contribution: 202,198 EUR
Description

A right to obtain an explanation of the decision reached by a machine learning (ML) model is now enshrined in EU regulation. Different stakeholders (e.g. patients, clinicians, developers, auditors) may have different background knowledge, competencies and goals, and thus require different kinds of explanations. Fortunately, there is a growing armoury of ways of interpreting ML models and explaining their decisions. We use the phrase 'explanation strategy' to refer collectively to interpretable models, methods for visualisation, and algorithms for explaining the predictions of models built by ML. As these explanation strategies mature, practitioners will gain experience that helps them know which strategies to use in which circumstances. Whilst existing libraries provide interfaces to a limited number of explanation strategies, these efforts remain disconnected and offer no easy route to reusability at scale.

Our aim goes well beyond the development of a library. We aim to transform the ML explanation landscape through an open platform that can assist a spectrum of users (knowledge engineers, domain experts, novice users) in the selection and application of appropriate explanation strategies. We hypothesise that episodes of explanation strategy experience can be captured and reused in similar future task settings. The iSee project will show how end-users of AI can capture, share and re-use their explanation experiences with other users who have similar explanation needs.

Our idea is to create a unifying platform, underpinned by case-based reasoning (CBR), in which successful experiences of applying an explanation strategy to an ML task are captured as cases and retained in a case base for future reuse. Our cases will encode knowledge about the decisions made by a user and the effectiveness of the strategy, so that our CBR system can recommend how best to explain ML predictions to other users in similar circumstances. We recognise that explanation strategies can be foundational, of the kind found in the research literature, and these can seed the case base. However, user needs are often multi-faceted. We will show how new cases that capture composite strategies can be composed from foundational ones, by extending the CBR technique of constructive reuse.

Our proposal describes how we will develop an ontology for describing a library of explanation strategies; define metrics to evaluate their acceptability and suitability; and use both in a case representation that captures experiences of using explanation strategies. Cases record the objective and subjective experience that different users have of different ML explanation strategies, so that these experiences can be shared and re-used. We include a number of high-impact use cases, in which we work with real-world users to co-design the representations and algorithms described above, and to evaluate and validate our approach. These use cases will also seed the case base.

Additionally, the iSee project fosters explanation strategy evaluation; promotes experiment reproducibility; exhibits international collaboration; is driven by co-creation of representations and evaluation criteria with our use case partners; meets the best research standards in terms of open access to software and published results; and may ultimately provide a route to much-needed policies, procedures and technologies for certifying compliance with ML regulations and guidelines.
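To make the CBR cycle described above concrete, the following minimal Python sketch shows how an explanation-experience case might be retained in a case base and retrieved for a similar future query. Every name here (ExplanationCase, CaseBase, the chosen attributes, and the exact-match similarity measure) is an illustrative assumption for exposition, not the iSee project's actual case representation, ontology or metrics.

```python
from dataclasses import dataclass


@dataclass
class ExplanationCase:
    """One episode of applying an explanation strategy to an ML task.
    The attribute set is a hypothetical simplification for this sketch."""
    user_role: str        # e.g. "clinician", "developer", "auditor"
    ml_task: str          # e.g. "image classification"
    model_type: str       # e.g. "CNN", "gradient boosting"
    strategy: str         # e.g. "saliency map", "counterfactual"
    effectiveness: float  # user-reported score in [0, 1]


def similarity(query: ExplanationCase, case: ExplanationCase) -> float:
    """Toy global similarity: equally weighted exact match on the task context."""
    attrs = ("user_role", "ml_task", "model_type")
    matches = sum(getattr(query, a) == getattr(case, a) for a in attrs)
    return matches / len(attrs)


class CaseBase:
    """Retain past explanation experiences; retrieve them for similar queries."""

    def __init__(self) -> None:
        self.cases: list[ExplanationCase] = []

    def retain(self, case: ExplanationCase) -> None:
        self.cases.append(case)

    def recommend(self, query: ExplanationCase, k: int = 3) -> list[ExplanationCase]:
        """Return the k past cases that best match the query context,
        breaking ties in favour of the more effective strategy."""
        ranked = sorted(
            self.cases,
            key=lambda c: (similarity(query, c), c.effectiveness),
            reverse=True,
        )
        return ranked[:k]


if __name__ == "__main__":
    cb = CaseBase()
    cb.retain(ExplanationCase("clinician", "image classification", "CNN",
                              "saliency map", effectiveness=0.8))
    cb.retain(ExplanationCase("developer", "tabular regression", "gradient boosting",
                              "SHAP values", effectiveness=0.9))
    # A clinician facing a similar CNN task: the saliency-map experience ranks first.
    query = ExplanationCase("clinician", "image classification", "CNN", "", 0.0)
    print([c.strategy for c in cb.recommend(query, k=1)])  # -> ['saliency map']
```

The two-part sort key (context similarity, then reported effectiveness) is a crude stand-in for the acceptability and suitability metrics the proposal plans to develop; a real system would also need the reuse step, in which composite strategies are composed from the retrieved foundational cases.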
