Seminars


The Priberam Machine Learning Lunch Seminars are a series of informal meetings held every two weeks at Instituto Superior Técnico (IST), in Lisbon. The series works as a discussion forum involving different research groups, from IST and elsewhere. Its participants are interested in areas such as (but not limited to) statistical machine learning, signal processing, pattern recognition, computer vision, natural language processing, computational biology, neural networks, control systems, reinforcement learning, and anything related (even if vaguely) to machine learning.

The seminars last about one hour (including time for discussion and questions) and revolve around the general topic of Machine Learning. Each speaker is a volunteer who chooses the topic of his/her presentation. Past seminars have included presentations of state-of-the-art research, surveys and tutorials, rehearsals of conference talks, challenging problems posed to the audience for help, and interesting applications of Machine Learning, such as prototypes or finished products.

Presenters can have any background: undergraduates, graduate students, academic researchers, company staff, etc. Anyone is welcome both to attend the seminars and to present one. Occasionally we will have invited speakers. See below for a list of all seminars, including the speakers, titles, and abstracts.

Note: The seminars are held at lunchtime and include delicious free food.

Feel free to join our mailing list, where seminar topics are announced beforehand. You may also visit the group webpage. Anyone can attend the seminars. If you would like to present something, please send us an email.

The seminars are usually held every other Tuesday, from 1 PM to 2 PM, at the IST campus in Alameda. This sometimes changes due to speaker availability, so check regularly!

Tuesday, April 20th 2021, 13h00 - 14h00

António Farinhas (IST/IT)

Visual Attention with Sparse and Continuous Transformations

Location (webinar): Zoom

Abstract:

Visual attention mechanisms have become an important component of neural network models for Computer Vision applications, allowing them to attend to finite sets of objects or regions and identify relevant features. A key component of attention mechanisms is the differentiable transformation that maps scores representing the importance of each feature into probabilities. The usual choice is the softmax transformation, whose output is strictly dense, assigning probability mass to every image feature. This density is wasteful, given that non-relevant features are still taken into consideration, making attention models less interpretable. Until now, visual attention has only been applied to discrete domains; this may lead to a lack of focus, where the attention distribution over the image is too scattered. Inspired by the continuous nature of images, we explore continuous-domain alternatives to discrete attention models. We propose solutions that focus on both the continuity and the sparsity of attention distributions, suitable for selecting compact and sparse regions such as ellipses. The former encourages the selected regions to be contiguous, and the latter is able to single out the relevant features, assigning exactly zero probability to irrelevant parts. We use the fact that the Jacobians of these transformations are generalized covariances to derive efficient backpropagation algorithms for both unimodal and multimodal attention distributions. Experiments on Visual Question Answering show that continuous attention models generate smooth attention maps that seem to relate better to human judgment, while achieving improvements in accuracy over grid-based methods trained on the same data.
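
To make the contrast described in the abstract concrete, here is a minimal NumPy sketch (illustrative only, not taken from the talk or the paper) comparing the dense softmax transformation with sparsemax, one well-known sparse alternative: softmax assigns strictly positive probability to every score, whereas sparsemax can assign exactly zero probability to low-scoring features. The scores below are made up for illustration.

# Illustrative sketch only: dense softmax vs. sparsemax (Martins & Astudillo, 2016),
# a sparse alternative that can assign exactly zero probability to low-scoring features.
import numpy as np

def softmax(z):
    # Dense mapping: every entry receives strictly positive probability.
    e = np.exp(z - z.max())
    return e / e.sum()

def sparsemax(z):
    # Sparse mapping: Euclidean projection of z onto the probability simplex.
    z_sorted = np.sort(z)[::-1]              # scores in descending order
    k = np.arange(1, len(z) + 1)
    cumsum = np.cumsum(z_sorted)
    support = 1 + k * z_sorted > cumsum      # entries kept in the support
    k_max = k[support][-1]
    tau = (cumsum[k_max - 1] - 1) / k_max    # threshold subtracted from the scores
    return np.maximum(z - tau, 0.0)

scores = np.array([2.0, 1.5, 0.1, -1.0])     # toy attention scores
print("softmax:  ", softmax(scores))         # all four entries are strictly positive
print("sparsemax:", sparsemax(scores))       # the last two entries are exactly zero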

--

Bio: António Farinhas is a first-year PhD student at Instituto Superior Técnico (IST), advised by André Martins, with interests in Machine Learning and Natural Language Processing. He previously obtained his MSc degree in Aerospace Engineering at IST. His MSc thesis, advised by André Martins and Pedro Aguiar, focused on continuous visual attention mechanisms and was part of the NeurIPS 2020 paper “Sparse and Continuous Attention Mechanisms”.

Save your place: register now on Eventbrite.