Hao Tang

@larryniven4

Followers 134 · Following 24 · Media 19 · Statuses 146

Lecturer at the University of Edinburgh working in speech and language processing

Joined December 2011
@Michael_J_Black
Michael Black
2 months
This applies to science and scientific writing. No amount of experiments will make up for a bad story. Don't go do all the work and then try to tell a mediocre story with the results. Michal Irani once told me that she writes the introductions to her papers long before she has
@mrexits
prayingforexits 🏴‍☠️
2 months
Steve Jobs on the art of storytelling
13 · 65 · 451
@EdinburghNLP
EdinburghNLP
11 months
The School of Informatics @InfAtEd at the University of Edinburgh @EdinburghUni is hiring a lecturer or reader in embodied natural language processing. Apply by 31 Jan 2025 at https://t.co/caiETxTUfL
0 · 8 · 15
@Maureendss
Maureen de Seyssel
1 year
I'm looking for a PhD research intern focused on Speech/NLP to join Apple Machine Learning Research in Copenhagen, Denmark! You can apply via : https://t.co/M5Qs1fR6bK Please share with anyone who might be interested
7 · 136 · 499
@tomhosking
Tom Hosking
1 year
📣 New paper! 📣 Hierarchical Indexing for Retrieval-Augmented Opinion Summarization (to appear in TACL) Discrete latent variable models are scalable and attributable, but LLMs are wayyy more fluent/coherent. Can we get the best of both worlds? (yes) 👀 https://t.co/kK5OhqTwlw
arxiv.org
We propose a method for unsupervised abstractive opinion summarization, that combines the attributability and scalability of extractive approaches with the coherence and fluency of Large Language...
2 · 10 · 51
@uililo1
Oli Danyi Liu
1 year
Happy to share that our paper won the @cogsci_soc computational modeling award for perception and action! #CogSci2024 @Edin_CDT_NLP
@uililo1
Oli Danyi Liu
1 year
Paper with @larryniven4, Naomi Feldman, and Sharon Goldwater to appear in CogSci 2024: "A predictive learning model can simulate temporal dynamics and context effects found in neural representations of continuous speech" https://t.co/Nz4y1Rhrsh (1/7)
4 · 8 · 55
@tomhosking
Tom Hosking
2 years
HRQ-VAE is now available in Pythae! Hierarchical Residual Quantization learns a recursive clustering of a dense vector space, end-to-end. And it learns more meaningful clusters than doing quantization post-hoc. 📝 Paper: https://t.co/3tu9cLBY1V 🤖 Code:
github.com
Unifying Variational Autoencoder (VAE) implementations in Pytorch (NeurIPS 2022) - clementchadebec/benchmark_VAE
2 · 12 · 42
@tomsherborne
Tom Sherborne
2 years
🚨New TACL paper 🚨 Can we use explicit latent variable alignment for cross-lingual transfer? Minotaur uses optimal transport for explicit alignment across languages. w/ @tomhosking, Mirella Lapata. To be presented @emnlpmeeting + @mrl2023 in-person 🇸🇬 https://t.co/PdoDPEIP97
1 · 14 · 57
@docmilanfar
Peyman Milanfar
3 years
"InDI" is our new supervised image restoration method that's (a) a user-friendly (alternative) way to explain diffusion & improve it; and (b) an effective way to avoid a common pitfall of one-step inference used in non-diffusion restoration methods. https://t.co/h1smVnyGDV 1/8
4 · 40 · 284
@oisinmacaodha
Oisin Mac Aodha
2 years
Calling all Machine Learning researchers on the academic job market for 2024. 📢📢📢 Applications are invited for a full time academic position in Machine Learning in the School of Informatics at the University of Edinburgh.
7 · 45 · 113
@mits_ota
Mits Ota
2 years
@kennysmithed and I are looking for a postdoc for our ESRC project titled "Language learning, communication and the emergence of phonotactic constraints" (32 months; see link for details). Come work with us!
elxw.fa.em3.oraclecloud.com
The Linguistics and English Language Department in the School of Philosophy, Psychology and Language Sciences is seeking a Postdoctoral Researcher on a full time, fixed term basis for 32 months. This...
0 · 35 · 31
@vernadankers
Verna Dankers @ EMNLP25
2 years
Tomorrow at Language Lunch @InfAtEd and in December at @emnlpmeeting: my EMNLP findings short paper with Chris Lucas on non-compositionality in sentiment! Sentiment computations largely adhere to the principle of compositionality, but ... (1/3)
1 · 3 · 28
@sivareddyg
Siva Reddy
2 years
My student sent me this list saying they have to improve themselves in many areas. Such a list can do more harm than good. While I appreciate the author's intention to motivate one toward greatness, I don't think it can be planned. But you can plan to be a "good researcher."
@_jasonwei
Jason Wei
2 years
Enjoyed visiting UC Berkeley’s Machine Learning Club yesterday, where I gave a talk on doing AI research. Slides: https://t.co/KjyqkLn2hO In the past few years I’ve worked with and observed some extremely talented researchers, and these are the trends I’ve noticed: 1. When
13 · 57 · 548
@wtgowers
Timothy Gowers @wtgowers
2 years
My son has just started calculus, and I asked him what the relationship was between the gradients of the tangent and the normal to a curve at a given point. His first reply was, "They are perpendicular." I've noticed many times that something one gains with experience ... 1/7
43 · 183 · 2K
@tomhosking
Tom Hosking
2 years
🚨 New paper 🚨 I’m excited to share the findings from my internship at @cohere with @max_nlp tl;dr Human feedback under-represents the factuality of LLM output, and annotators are less likely to spot factual errors in more assertive outputs! https://t.co/vGETMkW4NU
arxiv.org
Human feedback has become the de facto standard for evaluating the performance of Large Language Models, and is increasingly being used as a training objective. However, it is not clear which...
8 · 58 · 306
@tomsherborne
Tom Sherborne
2 years
🚨 new paper 🚨 Can we train for flat minima with less catastrophic forgetting? We propose Trust Region Aware Minimization for smoothness in parameter+representations. TL;DR representations matter as much as parameters! https://t.co/SNCaglqioE w/@nsaphra @pdasigi @haopeng_nlp
2 · 16 · 90
@TTIC_Connect
TTIC
2 years
In honor of TTIC's 20th anniversary, we wanted to share our story and highlight the outstanding accomplishments of our faculty, staff, and students. View a 20-year timeline of the history of TTIC: https://t.co/sxCWBCF43o
0 · 3 · 11
@masonderegger
Morgan Sonderegger
2 years
Materials for my three-hour workshop "Quantitative analysis for corpus phonetics and phonology" given last week are online. Thanks to all who attended and to @LabPhon for sponsoring! #RStats #labphon https://t.co/QdsEyJxmKg
osf.io
Materials for online workshop "Quantitative analysis for corpus phonetics and phonology". Hosted on the Open Science Framework
0 · 33 · 59
@prafull7
Prafull Sharma
2 years
“Make yourself be lucky!”- Alyosha Efros
6 · 79 · 547
@SimonKirby
Simon Kirby
2 years
We’re hiring at Linguistics and English Language @SchoolofPPLS Extra time for research in the first three years. Dream job! Especially interested in corpus/computational linguists or expertise in endangered languages, but honestly open to any area. Pls RT!
elxw.fa.em3.oraclecloud.com
The College of Arts, Humanities and Social Sciences is appointing up to 10 Chancellor’s Fellows, who will have a demonstrable track record of innovative research, teaching and/or knowledge exchange,...
0 · 62 · 57