Machine Learning and NLP
@ML_NLP
Followers: 55K · Following: 24K · Media: 7 · Statuses: 2K
We share news, discussions, videos, papers, and tutorials related to Machine Learning and NLP. Subscribe on Reddit!
Reddit
Joined July 2014
Are you interested in Machine Learning and NLP? Subscribe and contribute to the Natural Language Processing community on Reddit https://t.co/Ic0TTWXBCl
#NLProc #MachineLearning
reddit.com
Welcome to /r/TextDataMining! We share news, discussions, papers, tutorials, libraries, and tools related to NLP, machine learning and data analysis.
Building a Faster and More Accurate Search Engine on a Custom Dataset with Transformers 🤗 https://t.co/1m4pKR04QE
#NLProc
snrspeaks.medium.com
In this article, we will build a search engine over a large custom dataset with Transformers
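A minimal sketch of the kind of Transformer-based semantic search the linked article describes, using the sentence-transformers library. The model name, toy corpus, and query here are illustrative assumptions, not the author's exact pipeline.

```python
# Minimal semantic-search sketch with sentence-transformers; the linked article
# may use a different model or indexing strategy.
from sentence_transformers import SentenceTransformer, util

# Hypothetical corpus standing in for the article's "custom dataset".
corpus = [
    "BERT is a transformer-based language model.",
    "Word2vec learns static word embeddings.",
    "Regular expressions match patterns in text.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose encoder
corpus_emb = model.encode(corpus, convert_to_tensor=True)

query = "transformer language models"
query_emb = model.encode(query, convert_to_tensor=True)

# Cosine-similarity search over the encoded corpus; top_k controls result count.
hits = util.semantic_search(query_emb, corpus_emb, top_k=2)[0]
for hit in hits:
    print(corpus[hit["corpus_id"]], round(hit["score"], 3))
```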
A visual guide to regular expressions
amitness.com
A mental model of how various components of a regular expression work from the bottom-up.
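In the bottom-up spirit of the guide, a tiny example of composing a pattern from individual components; the pattern and test string are my own illustration, not taken from the article.

```python
# Illustrative only: build up a pattern piece by piece.
import re

# \d      -> digit character class
# {4}     -> quantifier: exactly four of the preceding token
# ( ... ) -> capturing group
# -       -> literal separator
pattern = re.compile(r"(\d{4})-(\d{2})-(\d{2})")  # e.g. an ISO date

match = pattern.search("Released on 2020-09-15 as a preprint.")
if match:
    year, month, day = match.groups()
    print(year, month, day)  # 2020 09 15
```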
GraphGlove: embedding words in non-vector space with unsupervised graph learning https://t.co/yviw9mItwx
#NLProc
arxiv.org
It has become a de-facto standard to represent words as elements of a vector space (word2vec, GloVe). While this approach is convenient, it is unnatural for language: words form a graph with a...
The return of nearest-neighbor models (memory-based learning) to NLP: strong gains on neural MT, especially for domain adaptation https://t.co/PLmVXT6GFP
#NLProc
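The core idea in this nearest-neighbor line of work (kNN-LM / kNN-MT style) is to interpolate the base model's next-token distribution with a distribution built from retrieved neighbors. Below is a toy NumPy sketch of that interpolation step only; the probabilities and the weight lambda are made up for illustration.

```python
# Toy sketch of the interpolation step used in kNN-augmented models.
# The distributions and lambda below are illustrative, not from the paper.
import numpy as np

vocab = ["cat", "dog", "house"]

p_model = np.array([0.6, 0.3, 0.1])   # base NMT next-token distribution
p_knn = np.array([0.1, 0.8, 0.1])     # distribution from retrieved neighbors

lam = 0.5  # interpolation weight (tuned on held-out data in practice)
p_final = lam * p_knn + (1.0 - lam) * p_model

print(dict(zip(vocab, np.round(p_final, 2))))  # {'cat': 0.35, 'dog': 0.55, 'house': 0.1}
```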
How can we make language models less data-hungry? https://t.co/fX8m4Yd2r4
#NLProc
arxiv.org
Language models have emerged as a central component across NLP, and a great deal of progress depends on the ability to cheaply adapt them (e.g., through finetuning) to new domains and tasks. A...
Advancing NLP with efficient projection-based model architectures https://t.co/riHn8jSYeQ
#NLProc
research.google
Posted by Prabhu Kaliamoorthi, Software Engineer, Google Research. Deep neural networks have radically transformed natural language processing (NLP)...
It's not just size that matters: small language models are also few-shot learners https://t.co/at3Ii2R9JE
#NLProc
arxiv.org
When scaled to hundreds of billions of parameters, pretrained language models such as GPT-3 (Brown et al., 2020) achieve remarkable few-shot performance. However, enormous amounts of compute are...
A Comparison of LSTM and BERT for Small Corpus https://t.co/lnbRGKYkVs
#NLProc
Language Interpretability Tool (LIT): a visual, interactive model-understanding tool for NLP models https://t.co/NQxNks6Y9O
#NLProc
github.com
The Learning Interpretability Tool: Interactively analyze ML models to understand their behavior in an extensible and framework agnostic interface. - PAIR-code/lit
Paraphrase Generation as Zero-Shot Multilingual Translation https://t.co/BegzVJEMGM
#NLProc
arxiv.org
Recent work has shown that a multilingual neural machine translation (NMT) model can be used to judge how well a sentence paraphrases another sentence in the same language (Thompson and Post,...
Word2vec Skip-gram Dimensionality Selection via Sequential Normalized Maximum Likelihood https://t.co/7lxWQUBnnz
#NLProc
arxiv.org
In this paper, we propose a novel information criteria-based approach to select the dimensionality of the word2vec Skip-gram (SG). From the perspective of probability theory, SG is considered...
Large-scale Transfer Learning for Low-resource Spoken Language Understanding https://t.co/HO9gmzsq6a
#NLProc
Zero-Shot Learning in Modern NLP https://t.co/UKq8ROLUWG
#NLProc
joeddav.github.io
State-of-the-art NLP models for text classification without annotated data
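The linked post frames zero-shot text classification as natural language inference, which is the approach behind the Transformers zero-shot-classification pipeline. A minimal example, with illustrative candidate labels and the pipeline's default model:

```python
# Zero-shot classification via an NLI model, in the spirit of the linked post.
# The input sentence and candidate labels are illustrative choices.
from transformers import pipeline

classifier = pipeline("zero-shot-classification")

result = classifier(
    "The new transformer model sets state-of-the-art results on summarization.",
    candidate_labels=["machine learning", "sports", "politics"],
)
print(result["labels"][0], result["scores"][0])  # top label and its score
```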
Sentiment Analysis using Deep Learning https://t.co/5RffiRg0T8
#NLProc
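As a quick stand-in for the linked tutorial (whose exact architecture is not shown here and may be an LSTM or other custom network), a pretrained Transformers sentiment pipeline gives deep-learning sentiment predictions in a few lines:

```python
# Quick sentiment inference with a pretrained model; the tutorial itself may
# instead train its own network, so treat this as an illustrative stand-in.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")
print(sentiment("This paper's results are surprisingly strong."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```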
Big Bird: Transformers for Longer Sequences https://t.co/l5isSLk0gW
#NLProc
Numpy Cheat Sheet https://t.co/yJSWtsm59C
#DataScience
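A few of the operations such a cheat sheet typically covers (array creation, reshaping, reductions, broadcasting, boolean masking); the selection below is my own, not the linked sheet's contents.

```python
# A handful of operations a NumPy cheat sheet typically covers.
import numpy as np

a = np.arange(12).reshape(3, 4)   # create and reshape
col_means = a.mean(axis=0)        # reduction along an axis
centered = a - col_means          # broadcast a (4,) row over a (3, 4) array
mask = a % 2 == 0                 # boolean masking
evens = a[mask]

print(centered.shape, evens[:3])  # (3, 4) [0 2 4]
```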
GPT-3, a Giant Step for Deep Learning and NLP https://t.co/77iO8LdDoj
#NLProc
anotherdatum.com
Can intelligence emerge simply by training a big enough language model using lots of data? OpenAI tries to do so, using 175 billion parameters.
Trends in Integration of Vision and Language Research: A Survey of Tasks, Datasets, and Methods https://t.co/9KoptxYimk
#NLProc
arxiv.org
Interest in Artificial Intelligence (AI) and its applications has seen unprecedented growth in the last few years. This success can be partly attributed to the advancements made in the sub-fields...
Advances of transformer-based models for news headline generation https://t.co/lSr38lfQRS
#NLProc