Human-Centered Artificial Intelligence and Interactive Machines Laboratory (Aiim Lab)
Our mission is to develop novel artificial intelligence and machine learning algorithms for analyzing, interpreting, and generating human-focused data such as video, audio, text, and wearable signals. The overarching objective of the lab is to understand human behaviour, identity, health, and emotional states with the goal of improving user experience, safety and security, health outcomes, and overall quality of life.
News
[9/2024] Our work on time-series representation learning has been accepted to NeurIPS'24.
[9/2024] I will be serving as an Area Chair for WACV'25, AISTATS'25, and ICLR'25.
[7/2024] Our work on human pose estimation with cross-view and temporal cues has been accepted to ECCV'24.
[4/2024] The Aiim Lab has been renamed from Ambient Intelligence and Interactive Machines to Human-Centered Artificial Intelligence and Interactive Machines.
[3/2024] I have been awarded the Queen's University Prize for Excellence in Research.
[3/2024] I gave invited talks at University of Cambridge and Nokia Bell Labs.
[1/2024] Our work on tuning vision-language models has been accepted to ICLR'24.
[12/2023] Three papers accepted to AAAI'24.
[11/2023] Pritam Sarkar has received the IEEE Research Excellence Award (first place) in the PhD category, and Debaditya Shome has received the IEEE Research Excellence Award (second place) in the Master's category! Congrats to both Aiim members!
[10/2023] Our paper entitled "EEG-based Cognitive Load Classification using Feature Masked Autoencoding and Emotion Transfer Learning" has won the Best Paper Award at ACM ICMI'23.
[10/2023] I have been appointed as an Associate Editor for IEEE Transactions on Affective Computing.
[10/2023] Our work on video self-supervised learning has been accepted to NeurIPS as a Spotlight paper.
[08/2023] Our team, led by Patrick Zhang, is the 2nd Place Winner in the Emotion Physiology and Experience Challenge (EPiC), organized at the International Conference on Affective Computing and Intelligent Interaction (ACII), MIT Media Lab, USA.
[11/2022] Two papers have been accepted to AAAI'23, one focusing on self-supervised learning and the other on audio-video affect and cognitive load analysis.
[07/2022] Two papers have been accepted to ECCV'22 (both oral), one focusing on object detection and the other on object pose estimation.
[06/2022] I have been promoted to the rank of Associate Professor.
[05/2022] I gave a talk at Google Research.
[03/2022] Along with colleagues from Facebook AI, Rice U, Socure, Philips Research, and University of Cambridge, we organized and hosted the AAAI'22 workshop on Human-Centric Self-Supervised Learning (link to workshop website).
[01/2022] I have been appointed as an Associate Editor for IEEE Transactions on Artificial Intelligence.
[12/2021] Our work on smart homes is featured in IEEE Spectrum magazine (link to article).
[11/2021] I was on the Casgrain Lecture on Artificial Intelligence panel.
[10/2021] Pritam Sarkar won the best poster award and Setareh Rahimi won second place at the Robotics and AI Symposium (@ Ingenuity Labs in Queen’s).
[07/2021] Our paper on generating depth images from RGB using a novel teacher-student GAN got accepted to ICCV'21.
Pritam Sarkar was selected for the Postgraduate Affiliate Award from Vector Institute!
Sayantan Das was awarded the Vector Scholarship in Artificial Intelligence!
Our paper on a novel LSTM cell architecture for multi-perspective data was accepted to CVPR'21.
I gave a talk at the Canada Artificial Intelligence, Machine Learning, Data Science, Engineering & Analytics Digital Forum (CAMDEA) on affective computing with deep learning.
I was on a panel at the Neurotech Workshop, discussing the role of AI in neurotech and BCI.
Our work on generating affordable ECG from PPG with a novel GAN architecture got accepted to AAAI'21.
Divij Gupta was awarded the Vector Scholarship in Artificial Intelligence!
Partners and Sponsors: