This week we are joined by Kyunghyun Cho. He is an associate professor of computer science and data science at New York University, a research scientist at Facebook AI Research, and a CIFAR Associate Fellow. On top of this, he also co-chaired the recent ICLR 2020 virtual conference.
We talk about a variety of topics in this week's episode, including the recent ICLR conference, energy functions, shortcut learning, and the roles that popular Deep Learning research areas play in answering the question “What is intelligence?”.
Underrated ML Twitter: https://twitter.com/underrated_ml
Kyunghyun Cho Twitter: https://twitter.com/kchonyc
Please let us know who you thought presented the most underrated paper in the form below:
https://forms.gle/97MgHvTkXgdB41TC8
Links to the papers:
“Shortcut Learning in Deep Neural Networks” - https://arxiv.org/pdf/2004.07780.pdf
“Bayesian Deep Learning and a Probabilistic Perspective of Generalization” - https://arxiv.org/abs/2002.08791
“Classifier-agnostic saliency map extraction” - https://arxiv.org/abs/1805.08249
“Deep Energy Estimator Networks” - https://arxiv.org/abs/1805.08306
“End-to-End Learning for Structured Prediction Energy Networks” - https://arxiv.org/abs/1703.05667
“On approximating ∇f with neural networks” - https://arxiv.org/abs/1910.12744
“Adversarial NLI: A New Benchmark for Natural Language Understanding” - https://arxiv.org/abs/1910.14599
“Learning the Difference that Makes a Difference with Counterfactually-Augmented Data” - https://arxiv.org/abs/1909.12434
“Learning Concepts with Energy Functions” - https://openai.com/blog/learning-concepts-with-energy-functions/