118 - Coreference Resolution, with Marta Recasens

NLP Highlights

Science & Medicine

In this episode, we talked about coreference resolution with Marta Recasens, a Research Scientist at Google. We discussed the complexity involved in resolving references in language, the simplified version of the problem that the NLP community has focused on through specific datasets, and the complex coreference phenomena those datasets do not yet capture. We also briefly talked about how coreference is handled in languages other than English, and how some of the notions about modeling coreference that hold for English do not necessarily transfer to other languages. We ended the discussion by talking about large language models and the extent to which they might be good at handling coreference.
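As a rough illustration of the simplified formulation mentioned above, the Python sketch below represents a document as tokens with coreference annotated as clusters of token spans, in the spirit of OntoNotes/CoNLL-2012-style datasets; the sentence, spans, and helper names are invented for illustration and are not taken from the episode:

# A minimal sketch of the dataset-style coreference task: a document is a list
# of tokens, and the annotation is a set of clusters, each cluster holding the
# (inclusive) token spans that refer to the same entity. The example below is
# invented for illustration.

tokens = ["Marta", "joined", "Google", "after", "she", "finished",
          "her", "PhD", ";", "the", "company", "hired", "her", "in", "2013", "."]

# Each cluster is a list of (start, end) token spans, end inclusive.
gold_clusters = [
    [(0, 0), (4, 4), (6, 6), (12, 12)],   # Marta / she / her / her
    [(2, 2), (9, 10)],                    # Google / the company
]

def span_text(span):
    start, end = span
    return " ".join(tokens[start:end + 1])

def resolve(clusters):
    """Map every mention to the first (antecedent) mention of its cluster."""
    links = {}
    for cluster in clusters:
        antecedent = cluster[0]
        for mention in cluster[1:]:
            links[mention] = antecedent
    return links

for mention, antecedent in resolve(gold_clusters).items():
    print(f"{span_text(mention)!r} -> {span_text(antecedent)!r}")

Running this prints links such as 'she' -> 'Marta' and 'the company' -> 'Google', which is the kind of cluster-based output the standard benchmarks evaluate.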


More Episodes  


120 - Evaluation of Text Generation, with Asli Celikyilmaz
Oct 2, 2020

119 - Social NLP, with Diyi Yang
Sep 3, 2020

117 - Interpreting NLP Model Predictions, with Sameer Singh
Aug 12, 2020

116 - Grounded Language Understanding, with Yonatan Bisk
Jul 2, 2020

115 - AllenNLP, interviewing Matt Gardner
Jun 17, 2020