Chinnadhurai Sankar

Email  /  Google Scholar  /  LinkedIn

I am currently a Research Scientist at Facebook AI working on conversational AI.

[Research] I am broadly interested in deep learning, NLP, dialog systems (open-domain and task-oriented), recurrent architectures, deep RL, and efficient text representations.

[Education] I completed my Ph.D. at the University of Montreal (MILA lab), advised by Prof. Yoshua Bengio. I earned my bachelor's degree at IIT Madras, where I majored in Electrical Engineering and minored in Physics, and my master's degree in ECE at Purdue University.


[Apr 2020] Our recent works on the robustness of LSH-based text representations and on on-device LSH-based Transformers are out on arXiv.

[Mar 2020] Joined Facebook AI as a Research Scientist to work on conversational AI.

[Jan 2020] Two of our research efforts were cited as important conversational AI papers of 2019. Check out this Forbes article.

[Sep 2019] Our paper on modelling chit-chat dialog with discrete attributes using deep RL won the Best Paper Award at SIGDIAL 2019 (oral presentation).

[Aug 2019] Our paper, TaskMaster Dialog Corpus: Toward a Realistic and Diverse Dataset, was accepted to EMNLP 2019 (oral presentation). Check out the Google AI blog post and VentureBeat coverage.

[Jul 2019] Our paper analyzing context representations in neural dialog systems has been accepted to ACL 2019 (oral presentation, Best Paper Award nomination).

[Mar 2019] Our paper on training embedding-less word representations has been accepted to NAACL 2019.

[Feb 2019] Our paper, which proposes a new RL-based data augmentation method for modelling open-domain dialogs, has been accepted to JAIR 2019.

[Jan 2019] Started an internship with Google Brain to work on task-oriented dialog systems. Papers under review at EMNLP 2019 and NeurIPS 2019.

[Jan 2019] Gave a contributed talk at Deep-Dial, AAAI 2019 about our work on modelling chit-chat dialog. The work was also accepted to Conv-AI, NeurIPS 2018.

[Fall 2018] Interned at Google AI, hosted by Sujith Ravi, working on memory-efficient, embedding-less text representations.

[Oct 2018] Our paper, which proposes a new, simple recurrent architecture for modelling long-term dependencies, has been accepted to AAAI 2019 (spotlight presentation).

[Dec 2017] Our dialog bot, MILABOT, won 2nd prize in the NeurIPS 2017 demonstration track.

Recent First-Author Publications

Deep Reinforcement Learning For Modeling Chit-Chat Dialog With Discrete Attributes.
[SIGDIAL 2019, oral, Best Paper Award][arXiv]

TaskMaster-1 Dialog Corpus: Toward a Realistic and Diverse Dataset.
[EMNLP 2019, oral][Google AI blog post][arXiv][data][Media Coverage]

Do Neural Dialog Systems Use the Conversation History Effectively? An Empirical Study.
[ACL 2019, oral, Best Paper Award nomination][arXiv][code]

Transferable Neural Projection Representations.
[NAACL 2019][arXiv]

Towards Non-saturating Recurrent Units for Modelling Long-term Dependencies.
[AAAI 2019, spotlight][arXiv]

Please refer to my Google Scholar page for a detailed list of my publications.