March 8th - Isabel Rio-Torto

Isabel Rio-Torto (FCUP/INESC TEC)

From Captions to Natural Language Explanations


The growing importance of the Explainable Artificial Intelligence (XAI) field has resulted in the proposal of several methods for producing visual heatmaps of the classification decisions of deep learning models. However, visual explanations alone are not enough, since different end-users have different backgrounds and preferences. Natural language explanations (NLEs) are inherently understandable by humans and can thus complement visual explanations. In the literature, the problem of generating NLEs is usually framed as traditional supervised image captioning, where the model learns to produce human-collected ground-truth explanations. In this talk, the audience is invited to navigate the state of the art in image captioning and NLE generation, from the very first LSTM-based approaches to more recent Transformer-based architectures. The last part of the talk will cover the speaker’s ongoing research on the topic, focusing in particular on the distinction between image captioning and NLE generation and on how to go from one to the other without requiring human-collected NLEs for training.


Bio: Isabel Rio-Torto is a second-year PhD candidate at Faculdade de Ciências da Universidade do Porto (FCUP) and a researcher at Instituto de Engenharia, Sistemas e Computadores, Tecnologia e Ciência (INESC TEC). Her PhD, “Self-explanatory computer-aided diagnosis with limited supervision”, supervised by professors Luís F. Teixeira and Jaime S. Cardoso, tackles the two main issues that prevent deep learning models from being deployed in clinical practice: the need for large amounts of labelled data and the lack of explainability. She is currently working on the generation of natural language explanations for computer vision applications. She is also an invited teaching assistant at Faculdade de Engenharia da Universidade do Porto (FEUP). Isabel previously studied Electrical and Computer Engineering at FEUP, where she developed her master’s thesis, “Producing Decisions and Explanations: A Joint Approach Towards Explainable CNNs”, which won 1st place at the 2020 Fraunhofer Portugal Challenge (MSc category) and the Best 2020 Master Thesis Award from the Associação Portuguesa de Reconhecimento de Padrões (APRP).