GASP: Gated Attention for Saliency Prediction

Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence (IJCAI-21), pages 584--591, doi: 10.24963/ijcai.2021/81, Aug 2021, Open Access
Saliency prediction refers to the computational task of modeling overt attention. Social cues greatly influence our attention, consequently altering our eye movements and behavior. To emphasize the efficacy of such features, we present a neural model for integrating social cues and weighting their influence. Our model consists of two stages. In the first stage, we detect two social cues by following gaze, estimating gaze direction, and recognizing affect. These features are then transformed into spatiotemporal maps through image processing operations. The transformed representations are propagated to the second stage (GASP), where we explore various techniques of late fusion for integrating social cues and introduce two sub-networks for directing attention to relevant stimuli. Our experiments indicate that fusion approaches achieve better results for static integration methods, whereas non-fusion approaches result in better outcomes when coupled with recurrent models for dynamic saliency prediction. We show that gaze direction and affective representations improve the correspondence between predictions and ground truth by at least 5% compared to dynamic saliency models without social cues. Furthermore, affective representations improve GASP, supporting the necessity of considering affect-biased attention in predicting saliency.

Associated documents: code (https://github.com/knowledgetechnologyuhh/gasp) and video (https://www.youtube.com/watch?v=e4HFTmEgirk).
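The gating idea at the heart of GASP can be illustrated with a short sketch. Below is a minimal, hypothetical PyTorch example of gated late fusion: each social cue map is weighted by a learned gate before the fused maps are reduced to a single saliency map. The module name, channel sizes, and softmax gating are our assumptions for illustration only; they do not reproduce the exact architecture from the paper or the released code.

```python
import torch
import torch.nn as nn

class GatedCueFusion(nn.Module):
    """Illustrative gated late fusion of social cue maps.

    NOTE: a simplified sketch, not the paper's exact module. Each input
    channel holds a spatiotemporal map for one cue (e.g. gaze following,
    gaze direction, affect).
    """

    def __init__(self, num_cues: int, hidden: int = 16):
        super().__init__()
        # Predict one gate per cue from the concatenated cue maps.
        self.gate = nn.Sequential(
            nn.Conv2d(num_cues, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, num_cues, kernel_size=1),
        )
        # Reduce the gated cue maps to a single saliency map.
        self.readout = nn.Conv2d(num_cues, 1, kernel_size=1)

    def forward(self, cue_maps: torch.Tensor) -> torch.Tensor:
        # cue_maps: (batch, num_cues, H, W); gates sum to 1 across cues,
        # so each cue's influence on the fused map is explicitly weighted.
        gates = torch.softmax(self.gate(cue_maps), dim=1)
        fused = gates * cue_maps
        return torch.sigmoid(self.readout(fused))  # (batch, 1, H, W)

# Example: fuse three cue maps (gaze following, gaze direction, affect).
model = GatedCueFusion(num_cues=3)
saliency = model(torch.rand(2, 3, 60, 80))
print(saliency.shape)  # torch.Size([2, 1, 60, 80])
```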

 

@InProceedings{AWW21,
  author    = {Abawi, Fares and Weber, Tom and Wermter, Stefan},
  title     = {GASP: Gated Attention for Saliency Prediction},
  booktitle = {Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence (IJCAI-21)},
  pages     = {584--591},
  year      = {2021},
  month     = {Aug},
  publisher = {International Joint Conferences on Artificial Intelligence},
  doi       = {10.24963/ijcai.2021/81},
}