Interaction-driven speech input: a data-driven approach to the capture of both local and global language constraints
Recommendations
On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? 🦜
FAccT '21: Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency
The past 3 years of work in NLP have been characterized by the development and deployment of ever larger language models, especially for English. BERT, its variants, GPT-2/3, and others, most recently Switch-C, have pushed the boundaries of the possible ...
Pre-train, Prompt, and Predict: A Systematic Survey of Prompting Methods in Natural Language Processing
This article surveys and organizes research works in a new paradigm in natural language processing, which we dub “prompt-based learning.” Unlike traditional supervised learning, which trains a model to take in an input x and predict an output y as P(y|x), ...
Recent Advances in Natural Language Processing via Large Pre-trained Language Models: A Survey
Large, pre-trained language models (PLMs) such as BERT and GPT have drastically changed the Natural Language Processing (NLP) field. For numerous NLP tasks, approaches leveraging PLMs have achieved state-of-the-art performance. The key idea is to learn a ...