The paper "Temporal Activity Detection in Untrimmed Videos with Recurrent Neural Networks" was recognized with the Best Poster Award at the NIPS 2016 Workshop on Large Scale Computer Vision Systems. The paper is based on the bachelor's thesis of CFIS-ETSETB student Alberto Montes, who worked with Amaia Salvador and Xavier Giró from the Image Processing Group (GPI), together with Santiago Pascual from the Center for Language and Speech Technologies and Applications (TALP). The presented technique uses deep learning to detect and recognize human activities in videos from YouTube. Alberto analyzed more than 65 million images, which correspond to 660 hours of video content. Other posters presented at the workshop came from Stanford University, UC Berkeley, Yahoo Research and Seoul National University. Facebook and Google, sponsors of the event, awarded Alberto Montes an Oculus Rift, a virtual reality headset. The project and models have been open sourced and are available from the project page. Photos from the bachelor's thesis defence and the NIPS workshop are available here.