A list of recent papers on natural language understanding (NLU) and spoken language understanding (SLU).
It covers sequence labelling, sentence classification, dialogue act classification, dialogue state tracking, and more.
- A review of NLU datasets for task-oriented dialogue is here.
- There is an implementation of joint training of slot filling and intent detection for NLU, evaluated on ATIS, SNIPS, Facebook's multilingual dataset, the MIT corpus, the E-commerce Shopping Assistant (ECSA) dataset, and the CoNLL-2003 NER dataset.
- Variant networks for different semantic representations
- Robustness to ASR errors
- Zero-shot learning and domain adaptation
- Using Recurrent Neural Networks for Slot Filling in Spoken Language Understanding. Grégoire Mesnil, et al. TASLP, 2015. [Code+data]
- Attention-based recurrent neural network models for joint intent detection and slot filling. Bing Liu and Ian Lane. Interspeech, 2016. [Code1] [Code2]
- Encoder-decoder with Focus-mechanism for Sequence Labelling Based Spoken Language Understanding. Su Zhu and Kai Yu. ICASSP, 2017. [Code]
- Neural Models for Sequence Chunking. Feifei Zhai, et al. AAAI, 2017.
- End-to-end Sequence Labeling via Bi-directional LSTM-CNNs-CRF. Xuezhe Ma and Eduard Hovy. ACL, 2016.
- A Bi-model based RNN Semantic Frame Parsing Model for Intent Detection and Slot Filling. Yu Wang, et al. NAACL, 2018.
- A Self-Attentive Model with Gate Mechanism for Spoken Language Understanding. Changliang Li, et al. EMNLP, 2018. [from Kingsoft AI Lab]
- Joint Slot Filling and Intent Detection via Capsule Neural Networks. Chenwei Zhang, et al. arXiv, 2018.
- BERT for Joint Intent Classification and Slot Filling. Qian Chen, et al. arXiv, 2019.
- A Novel Bi-directional Interrelated Model for Joint Intent Detection and Slot Filling. Haihong E, et al. ACL, 2019.
- A Stack-Propagation Framework with Token-Level Intent Detection for Spoken Language Understanding. Libo Qin, et al. EMNLP-IJCNLP, 2019.
- Improving Slot Filling in Spoken Language Understanding with Joint Pointer and Attention. Lin Zhao and Zhe Feng. ACL, 2018.
- A Hierarchical Decoding Model for Spoken Language Understanding from Unaligned Data. Zijian Zhao, et al. ICASSP, 2019. [SJTU]
- Semantic Parsing for Task Oriented Dialog using Hierarchical Representations. Sonal Gupta, et al. EMNLP, 2018. [from Facebook AI Research]
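Most of the slot-filling papers above frame the task as BIO sequence labelling, with intent detection as a parallel utterance-level classification. As a minimal illustration (the function name and the ATIS-style example are ours, not taken from any particular paper), here is how predicted BIO tags decode into slot/value pairs:

```python
def bio_to_slots(tokens, tags):
    """Decode BIO slot tags into (slot_name, value) pairs.

    Example ATIS-style tagging for "flights from boston to denver":
    ["O", "O", "B-fromloc", "O", "B-toloc"]
    """
    slots, current = [], None  # current = (slot_name, value_tokens)
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                slots.append((current[0], " ".join(current[1])))
            current = (tag[2:], [token])
        elif tag.startswith("I-") and current and tag[2:] == current[0]:
            current[1].append(token)
        else:  # "O" or an inconsistent I- tag closes the open span
            if current:
                slots.append((current[0], " ".join(current[1])))
            current = None
    if current:
        slots.append((current[0], " ".join(current[1])))
    return slots

print(bio_to_slots(
    ["flights", "from", "boston", "to", "denver"],
    ["O", "O", "B-fromloc", "O", "B-toloc"]))
# [('fromloc', 'boston'), ('toloc', 'denver')]
```

Joint models differ in how the encoder is shared and how intent and slot predictions interact, but they all produce (intent, slot tags) output of roughly this shape.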
- Discriminative spoken language understanding using word confusion networks. Matthew Henderson, et al. SLT, 2012. [Data]
- Using word confusion networks for slot filling in spoken language understanding. Xiaohao Yang and Jia Liu. Interspeech, 2015.
- Joint Online Spoken Language Understanding and Language Modeling with Recurrent Neural Networks. Bing Liu and Ian Lane. SIGDIAL, 2016. [Code]
- Robust Spoken Language Understanding with unsupervised ASR-error adaptation. Su Zhu, et al. ICASSP, 2018.
- Neural Confnet Classification: Fully Neural Network Based Spoken Utterance Classification Using Word Confusion Networks. Ryo Masumura, et al. ICASSP, 2018.
- From Audio to Semantics: Approaches to end-to-end spoken language understanding. Parisa Haghani, et al. SLT, 2018. [Google]
- Robust Spoken Language Understanding with Acoustic and Domain Knowledge. Hao Li, et al. ICMI, 2019. [SJTU]
- Adapting Pretrained Transformer to Lattices for Spoken Language Understanding. Chao-Wei Huang and Yun-Nung Chen. ASRU, 2019. [Code]
- Learning ASR-Robust Contextualized Embeddings for Spoken Language Understanding. Chao-Wei Huang and Yun-Nung Chen. ICASSP, 2020.
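Several of the ASR-robustness papers above consume word confusion networks (WCNs): each ASR time step is a set of competing word hypotheses with posterior probabilities. One common way to feed a WCN to a neural model is to replace each step's word vector by the posterior-weighted average of the candidates' embeddings. A toy sketch (the 2-d embeddings and function name are invented for illustration; real systems use learned vectors):

```python
# Toy 2-d word embeddings standing in for learned embeddings.
EMB = {"boston": [1.0, 0.0], "austin": [0.0, 1.0], "to": [0.5, 0.5]}

def wcn_step_embedding(candidates):
    """candidates: list of (word, posterior) for one confusion-network bin.

    Returns the posterior-weighted average of the candidates' embeddings.
    """
    dim = len(next(iter(EMB.values())))
    vec = [0.0] * dim
    for word, prob in candidates:
        for i, x in enumerate(EMB[word]):
            vec[i] += prob * x
    return vec

# ASR is unsure whether the user said "boston" or "austin":
print(wcn_step_embedding([("boston", 0.6), ("austin", 0.4)]))
# [0.6, 0.4]
```

The averaged vector keeps evidence from all hypotheses instead of committing to the (possibly wrong) 1-best word, which is the intuition behind the WCN-based models listed above.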
- A model of zero-shot learning of spoken language understanding. Majid Yazdani and James Henderson. EMNLP, 2015.
- Zero-shot Learning Of Intent Embeddings For Expansion By Convolutional Deep Structured Semantic Models. Yun-Nung Chen, et al. ICASSP, 2016.
- Online Adaptative Zero-shot Learning Spoken Language Understanding Using Word-embedding. Emmanuel Ferreira, et al. ICASSP, 2015.
- Label Embedding for Zero-shot Fine-grained Named Entity Typing. Yukun Ma, et al. COLING, 2016.
- Towards Zero-Shot Frame Semantic Parsing for Domain Scaling. Ankur Bapna, et al. Interspeech, 2017.
- Concept Transfer Learning for Adaptive Language Understanding. Su Zhu and Kai Yu. SIGDIAL, 2018.
- An End-to-end Approach for Handling Unknown Slot Values in Dialogue State Tracking. Puyang Xu and Qi Hu. ACL, 2018.
- Large-Scale Multi-Domain Belief Tracking with Knowledge Sharing. Osman Ramadan, et al. ACL, 2018. [Data]
- Zero-Shot Adaptive Transfer for Conversational Language Understanding. Sungjin Lee, et al. arXiv, 2018. [Microsoft]
- Robust Zero-Shot Cross-Domain Slot Filling with Example Values. Darsh J Shah, et al. ACL, 2019.
- Few-shot classification in Named Entity Recognition Task. Alexander Fritzler, et al. SAC, 2019.
- Few-Shot Text Classification with Induction Network. Ruiying Geng, et al. arXiv, 2019.
- Few-Shot Sequence Labeling with Label Dependency Transfer and Pair-wise Embedding. Yutai Hou, et al. arXiv, 2019.
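A recipe shared by several of the zero-shot papers above is label embedding: embed both the utterance and each label's natural-language name in a common space, then pick the nearest label, so an unseen intent or slot needs only a name, not training examples. A minimal sketch with one-hot bag-of-words vectors standing in for pretrained embeddings (the vocabulary, intent names, and function names are invented for illustration):

```python
import math

VOCAB = ["book", "flight", "play", "music", "a", "some"]

def embed(text):
    """Bag-of-words vector over VOCAB; a stand-in for a real sentence encoder."""
    vec = [0.0] * len(VOCAB)
    for w in text.split():
        if w in VOCAB:
            vec[VOCAB.index(w)] += 1.0
    return vec

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def zero_shot_intent(utterance, intent_names):
    """Pick the intent whose name embedding is closest to the utterance."""
    u = embed(utterance)
    return max(intent_names, key=lambda name: cosine(u, embed(name)))

print(zero_shot_intent("book a flight", ["book flight", "play music"]))
# book flight
```

The papers differ in how the two encoders are trained and regularized, but matching utterances against label representations rather than fixed output classes is the common thread.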
- Domain Attention with an Ensemble of Experts. Young-Bum Kim, et al. ACL, 2017.
- Adversarial Adaptation of Synthetic or Stale Data. Young-Bum Kim, et al. ACL, 2017.
- Fast and Scalable Expansion of Natural Language Understanding Functionality for Intelligent Agents. Anuj Goyal, et al. NAACL, 2018. [from Amazon Alexa Machine Learning]
- Bag of Experts Architectures for Model Reuse in Conversational Language Understanding. Rahul Jha, et al. NAACL, 2018. [from Microsoft Corporation]
- Data Augmentation with Atomic Templates for Spoken Language Understanding. Zijian Zhao, et al. EMNLP-IJCNLP, 2019. [SJTU]
- Prior Knowledge Driven Label Embedding for Slot Filling in Natural Language Understanding. Su Zhu, et al. TASLP, 2020. [SJTU]
- Investigating Meta-Learning Algorithms for Low-Resource NLU tasks. Zi-Yi Dou, Keyi Yu, and Antonios Anastasopoulos. EMNLP, 2019 [short]. [pdf]
- Enhanced Meta-Learning for Cross-lingual Named Entity Recognition with Minimal Resources. Qianhui Wu, Zijia Lin, Guoxin Wang, Hui Chen, Börje F. Karlsson, Biqing Huang, and Chin-Yew Lin. AAAI, 2020. [pdf]
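Template-based data augmentation, in the spirit of the atomic-templates paper above, expands carrier sentences containing slot placeholders into labelled utterances. A toy sketch (the templates, slot values, and placeholder syntax here are invented examples, not the paper's actual format):

```python
import itertools

TEMPLATES = ["play <song> by <artist>", "i want to hear <song>"]
VALUES = {"song": ["yesterday", "hey jude"], "artist": ["the beatles"]}

def expand(template):
    """Yield (utterance, slot_labels) for every combination of slot values."""
    slots = [p[1:-1] for p in template.split() if p.startswith("<")]
    for combo in itertools.product(*(VALUES[s] for s in slots)):
        utt = template
        for slot, val in zip(slots, combo):
            utt = utt.replace("<" + slot + ">", val, 1)
        yield utt, dict(zip(slots, combo))

for utt, labels in expand(TEMPLATES[0]):
    print(utt, labels)
# play yesterday by the beatles {'song': 'yesterday', 'artist': 'the beatles'}
# play hey jude by the beatles {'song': 'hey jude', 'artist': 'the beatles'}
```

Because each generated utterance carries its slot labels by construction, the synthetic data can be mixed directly into supervised training for a new or low-resource domain.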