Phat Hoang
@tcs2711
Followers
52
Following
445
Media
0
Statuses
39
#RT @LanguageLog: Vietnamese without diacritics: From Reddit: [Click to embiggen] From Bill Hannas: My [Vietnamese] wife Jennifer read the passages in less time than it took me to read the corresponding English. She says both are completely intelligible…
0
1
2
Curious what a state-of-the-art image classification training pipeline looks like? Check out this video by @kaggle Grandmaster Arthur Kuzin, "Bag of Tricks for Image Classification". Slides: https://t.co/7Wg7uCYne3 Presentation:
0
44
185
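One classic trick covered in such talks is mixup augmentation, which trains on convex combinations of pairs of examples and labels. A minimal NumPy sketch (illustrative; the talk's exact recipe may differ):

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2):
    """Blend two training examples and their (one-hot) labels.
    lam ~ Beta(alpha, alpha) controls how much of each example is used."""
    lam = np.random.beta(alpha, alpha)
    x = lam * x1 + (1 - lam) * x2
    y = lam * y1 + (1 - lam) * y2
    return x, y
```

In a real pipeline this is applied per batch to random pairs; the mixed labels are used with a soft cross-entropy loss.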
Meta-learning, i.e. partially observable, sample-constrained multi-task learning, is must-have knowledge. And it's an area where a Bayesian perspective is helpful (because of learning in the few-shot regime).
Want to learn about meta-learning? Lecture videos for CS330 are now online! https://t.co/taJ5yyIWVQ Topics incl. MTL, few-shot learning, Bayesian meta-learning, lifelong learning, meta-RL & more: https://t.co/mJ1v71huD7 + 3 guest lectures from Kate Rakelly, @svlevine, @jeffclune
0
20
101
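To make the few-shot setup concrete, here is a toy first-order MAML-style meta-update for linear regression tasks — a simplified sketch of the kind of algorithm a meta-learning course covers (function names and hyperparameters are illustrative, not from CS330):

```python
import numpy as np

def grad(w, X, y):
    # gradient of mean squared error for a linear model y_hat = X @ w
    return 2 * X.T @ (X @ w - y) / len(y)

def fomaml_step(w, tasks, inner_lr=0.01, meta_lr=0.05):
    """One first-order MAML meta-update over a batch of regression tasks.
    Each task is (X_support, y_support, X_query, y_query): adapt on the
    support set, then accumulate the query-set gradient at the adapted point."""
    meta_grad = np.zeros_like(w)
    for Xs, ys, Xq, yq in tasks:
        w_adapted = w - inner_lr * grad(w, Xs, ys)   # inner adaptation step
        meta_grad += grad(w_adapted, Xq, yq)          # query-set gradient
    return w - meta_lr * meta_grad / len(tasks)
```

Full MAML differentiates through the inner step; the first-order variant shown here drops those second-order terms.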
Happy to announce that we've released a number of models trained with Noisy Student (a semi-supervised learning method). The best model achieves 88.4% top-1 accuracy on ImageNet (SOTA). Enjoy finetuning! Link: https://t.co/eC3O6tnFsJ Paper: https://t.co/ZYDaef6sdp
11
276
1K
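The Noisy Student recipe is: train a teacher on labeled data, pseudo-label the unlabeled data with it, then train a student on both with noise injected. A toy sketch with a nearest-centroid classifier standing in for the EfficientNet teacher/student models used in the paper:

```python
import numpy as np

def fit_centroids(X, y):
    # class centroids serve as our stand-in "model"
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(centroids, X):
    classes = sorted(centroids)
    d = np.stack([np.linalg.norm(X - centroids[c], axis=1) for c in classes])
    return np.array(classes)[d.argmin(axis=0)]

def noisy_student_round(X_lab, y_lab, X_unlab, noise_std=0.1, seed=0):
    """One self-training round: teacher pseudo-labels unlabeled data,
    student trains on labeled + pseudo-labeled data with input noise."""
    rng = np.random.default_rng(seed)
    teacher = fit_centroids(X_lab, y_lab)
    pseudo = predict(teacher, X_unlab)                        # pseudo-labels
    X_all = np.vstack([X_lab, X_unlab])
    y_all = np.concatenate([y_lab, pseudo])
    X_noisy = X_all + rng.normal(0, noise_std, X_all.shape)   # student noise
    return fit_centroids(X_noisy, y_all)
```

The paper iterates this, making the student the next teacher; its noise is dropout, stochastic depth, and RandAugment rather than the Gaussian input noise used here.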
New dataset: 4.5B parallel sentences in 576 language pairs.
With 4.5B parallel sentences in 576 language pairs, CCMatrix is the largest data set of high-quality, web-based bitexts for training translation models. Now Facebook AI is sharing tools for other researchers to use this corpus for their work. https://t.co/uvBbfjPTk5
7
305
1K
We have a new research award opportunity on the topic of on-device AI! Academic faculty can submit proposals on enabling execution of AI-based capabilities within the constraints of edge devices. Applications are open until February 3.
research.facebook.com
AI has the potential to transform almost everything around us. It can change the way humans interact with the world by making the objects around us "smart" — capable of constantly learning, adapting,...
4
68
186
⚙️Release: CCNet is our new tool for extracting high-quality and large-scale monolingual corpora from CommonCrawl in more than a hundred languages. Paper: https://t.co/zwKGU3tnIv Tool: https://t.co/L7b5frErqk By G. Wenzek, M-A Lachaux, @EXGRV, @armandjoulin
0
56
157
Facebook AI has achieved superhuman performance on English to German translation tasks, according to #WMT2019 organizers. Read more: https://t.co/C5peI4OjfI
#NLP
10
121
400
Only a weekend after @ACL2019_Italy, there are already awesome reviews available on various topics:
- Trends in NLP by @mihail_eric https://t.co/w2onhvmBxx
- Knowledge graphs by @michael_galkin https://t.co/9wMPKLId38
- MT by @noecasas https://t.co/7eMlbhHWrL
👏#acl2019nlp
mgalkin.medium.com
Hello, ACL 2019 has just finished and I attended the whole week of the conference talks, tutorials, and workshops in beautiful Florence…
2
128
366
An amazing poster from a member of @vietaiorg, Bao-Dai at @seamlschool
quick look at this poster by Bao Dai made me realize how uncool AlphaGo was. how dare they didn't use pixel level representation of a board! #seaml2019
1
0
4
ICML 2019 tutorial materials (1/2):
Attention in Deep Learning https://t.co/I8ZqSqZAeu
Meta-Learning: from Few-Shot Learning to Rapid Reinforcement Learning https://t.co/mFH5lCUorS
Never-Ending Learning https://t.co/ktTVP3u1oT
A Primer on PAC-Bayesian Learning https://t.co/NrZk8qpJRR
sites.google.com
Abstract In recent years, high-capacity models, such as deep neural networks, have enabled very powerful machine learning techniques in domains where data is plentiful. However, domains where data is...
0
40
172
Here are the materials for our @NAACLHLT tutorial on Transfer Learning in NLP with @Thom_Wolf @swabhz @mattthemathman: Slides: https://t.co/54KVG0K85z Colab: https://t.co/iqWPtVFSVg Code: https://t.co/bka5EsuYtP
#NAACLTransfer
10
279
812
Interesting developments happened in 2018/2019 in natural language generation decoding algorithms: here's a thread with some papers & code. The two most common decoders for language generation used to be greedy decoding (GD) and beam search (BS). [1/9]
12
277
919
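For context: greedy decoding takes the argmax token at each step, while the 2018/2019 alternatives sample from a truncated distribution. A sketch of greedy decoding and top-p (nucleus) sampling over a single logits vector (illustrative; not taken from the thread's code):

```python
import numpy as np

def softmax(logits):
    e = np.exp(logits - logits.max())
    return e / e.sum()

def greedy(logits):
    # greedy decoding: always pick the most likely token
    return int(np.argmax(logits))

def nucleus_sample(logits, p=0.9, rng=None):
    """Top-p (nucleus) sampling: sample only from the smallest set of
    tokens whose cumulative probability mass reaches p."""
    rng = rng or np.random.default_rng()
    probs = softmax(logits)
    order = np.argsort(probs)[::-1]          # tokens by descending prob
    cum = np.cumsum(probs[order])
    cutoff = np.searchsorted(cum, p) + 1     # smallest prefix with mass >= p
    keep = order[:cutoff]
    kept = probs[keep] / probs[keep].sum()   # renormalize over the nucleus
    return int(rng.choice(keep, p=kept))
```

In a real decoder this runs once per generated token, on the model's logits for the next position.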
This is a photo of an actual living arachnid in Ecuador called the Bunny Harvestman (Metagryne bicolumnata). It’s immediately noticeable for its odd head that looks weirdly like a dog (photo: Andreas Kay)
153
2K
8K
Accepted to #ICML2019! :) (didn't expect this, we got both a strong accept and a strong reject)
Our paper on fine-tuning BERT is out: https://t.co/k3WaeAu0QR :) We found that for BERT a surprisingly small # of params per task is needed. Applications: e.g. multi-task learning and production.
1
12
109
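One common way to fine-tune with few per-task parameters is a bottleneck adapter: freeze the backbone and learn only small down/up projections with a residual connection. A sketch of that idea (illustrative; not necessarily this paper's exact method):

```python
import numpy as np

def adapter(h, W_down, W_up):
    """Bottleneck adapter applied to frozen backbone activations h.
    Only W_down and W_up are trained per task; the residual connection
    means a zero-initialized W_up starts as the identity function."""
    z = np.maximum(h @ W_down, 0)   # down-project + ReLU
    return h + z @ W_up             # up-project + residual
```

With hidden size 768 and bottleneck width 64, each task adds roughly 2 × 768 × 64 ≈ 100K parameters against a ~110M-parameter frozen BERT backbone.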
I wrote a starter pack for the @kaggle Landmark Retrieval 2019 competition. Since the competition data is too large to iterate on quickly, I used the SfM120k dataset from the GeM paper. All the validation and fast NN-search code is already there. The library is @fastdotai
https://t.co/HWsxQvhsNi
github.com
fast.ai starter kit for Google Landmark Retrieval 2019 challenge - ducha-aiki/google-retrieval-challenge-2019-fastai-starter
1
29
120
This is really neat! You take a screenshot of an equation, it gives you the LaTeX code, you can directly modify in the taskbar, copy, paste, done. https://t.co/VMZfoNpasn
87
2K
7K