James Henderson
@JamieBHenderson
Followers
523
Following
24
Media
1
Statuses
25
Head of Natural Language Understanding group at Idiap Research Institute
Joined April 2019
I am excited to announce our @iclr_conf 2023 paper: "A VAE for Transformers with Nonparametric Variational Information Bottleneck" https://t.co/2LIMCaxMJF We propose to model Transformer embeddings as nonparametric mixture distributions using Dirichlet processes. @FabioFehr
0
6
28
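As background for the tweet above: the Dirichlet process at the heart of the paper's nonparametric mixture view is often illustrated with the stick-breaking construction. Below is a toy sketch of that construction only — `stick_breaking_weights` is a hypothetical helper, not code from the paper's NVIB implementation.

```python
import numpy as np

def stick_breaking_weights(alpha, num_components, rng):
    """Truncated stick-breaking construction of Dirichlet process mixture weights.

    alpha: concentration parameter; larger alpha spreads mass over more components.
    Returns num_components non-negative weights summing to (at most) 1.
    """
    # Break off a Beta(1, alpha) fraction of the remaining stick at each step.
    betas = rng.beta(1.0, alpha, size=num_components)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

rng = np.random.default_rng(0)
w = stick_breaking_weights(alpha=1.0, num_components=50, rng=rng)
```

With a truncation of 50 components the weights sum to just under 1; the leftover mass belongs to the infinitely many components the truncation drops.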
Multiple PhD positions in deep learning for NLP and neuroscience are available in my Natural Language Understanding group at Idiap, Switzerland @idiap_ch ( https://t.co/Jr76w2gkkZ). Good salaries, beautiful location, and a world-class AI research community. #NCCR_EvolvingLanguage
idiap.ch
The following positions are currently available at the Idiap research institute.
0
5
21
I'm really excited about our new work on modelling the latent space of Transformers as nonparametric mixture distributions, and defining a Nonparametric Variational AutoEncoder with a Transformer encoder-decoder. Paper to be presented at #ICLR: https://t.co/5VI5SzW7Ce
lnkd.in
1
16
104
Idiap Research Institute has several attractive open positions to head research groups in AI, with a preference for cross-disciplinary and industry-driven research. Consideration of applications will start soon: https://t.co/KWwT9rM9Wr
0
4
8
Our new paper (w/ @JamieBHenderson): A novel way of encoding syntactic structure for the SRL task by inputting graph relations directly into the attention mechanism of the Transformer: Syntax-Aware Graph-to-Graph Transformer for Semantic Role Labelling, https://t.co/Y2Elz6X9LD
@Idiap_ch
0
3
9
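The tweet above describes feeding graph relations directly into Transformer attention. A minimal sketch of that general idea, assuming learned per-relation embeddings added as a bias to the attention scores — the function name and shapes are illustrative, not the paper's actual architecture:

```python
import numpy as np

def graph_conditioned_attention(Q, K, rel_emb, rel_ids):
    """Dot-product attention whose scores are conditioned on pairwise graph relations.

    Q, K: (n, d) query/key matrices for n tokens.
    rel_ids: (n, n) integer label of the graph relation between tokens i and j.
    rel_emb: (num_relations, d) learned relation embeddings.
    """
    d = Q.shape[-1]
    scores = Q @ K.T  # standard query-key dot products
    # Bias each (i, j) score by the query's interaction with the i->j relation embedding.
    rel_bias = np.einsum('id,ijd->ij', Q, rel_emb[rel_ids])
    scores = (scores + rel_bias) / np.sqrt(d)
    # Softmax over keys (numerically stabilised).
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
A = graph_conditioned_attention(
    Q=rng.normal(size=(4, 8)),
    K=rng.normal(size=(4, 8)),
    rel_emb=rng.normal(size=(3, 8)),
    rel_ids=rng.integers(0, 3, size=(4, 4)),
)
```

The appeal of this design is that the syntactic graph influences every attention head without changing the input token sequence.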
Come and work with us! We have seven permanent jobs and one postdoc! Check us out!
#Jobs Lots of openings based in Zurich: Join our Transversal Task Force! 👉 https://t.co/QiP3nmoSEV Data Science positions (Machine Learning & Statistics) | Database Engineer | Database Aggregator/Manager 📲 Share it and apply if you want to join our #NCCR_EvolvingLanguage
0
6
10
Our Findings of @emnlp2020 paper “Graph-to-Graph Transformer for Transition-based Dependency Parsing” (w/ @JamieBHenderson) is now available: @Idiap_ch @EPFL Arxiv: https://t.co/97kRn1mi5G Presentation: https://t.co/OD01C4cr9Y Code:
github.com
Pytorch implementation of Graph-to-Graph Transformer for Transition-based Dependency Parsing accepted to EMNLP 2020 - alirezamshi/G2GTr
G2G Transformer: combines an attention-based mechanism for conditioning on graphs with an attention-like mechanism for predicting graphs. It can be easily applied to several NLP tasks. Older arxiv version: https://t.co/97kRn14He8 Updated preprint and code will be available very soon
0
6
17
2. "Multilevel text alignment with cross-document attention" (w/ @nlpxuhui, @nlpnoah). 3. "Plug and play autoencoders for controlled text generation" (w/ @fmai, @ivanspmontero, @nlpnoah, @JamieBHenderson). More details coming soon. (2/2)
1
3
12
Our paper (w/ @JamieBHenderson) titled 'Graph-to-Graph Transformer for Transition-based Dependency Parsing' has been accepted as a long paper in Findings of @emnlp2020 . #emnlp2020 updated preprint and code will be available soon.
1
3
21
The announcement and call for papers for EACL 2021 are now online. Plan to submit, plan to attend!! See you there.
The 16th conference of the European Chapter of the Association for Computational Linguistics (EACL 2021) will be held in Kyiv, Ukraine from 19 to 23 April, 2021. More information now available on the website https://t.co/fZ7Dtqk3n0
#NLProc #aclnlp #eacl2021nlp
0
2
10
Senior postdoctoral position open in my group (3+ years). Some French required. You'll be involved in a visible computational linguistics group and in the exciting new Evolving Language project! Check us out. For all information see the announcement at https://t.co/AKheyhlMc5
0
26
30
Grateful that my #acl2020nlp theme paper "The Unstoppable Rise of Computational Linguistics in Deep Learning" was accepted. Traces how the nature of language has impacted, and will continue to impact, neural network architectures, including variable binding in Transformers.
arxiv.org
In this paper, we trace the history of neural networks applied to natural language understanding tasks, and identify key contributions which the nature of language has made to the development of...
4
60
287
I am excited to share that our paper, "End-to-End Bias Mitigation by Modelling Biases in Corpora", is accepted to #acl2020 #acl2020nlp #nlproc Joint work with @boknilev and @JamieBHenderson
2
4
44
A Postdoc and multiple PhD positions are available in my Natural Language Understanding group at Idiap, Switzerland @idiap_ch ( https://t.co/P7BcO5b8hb). Good salaries, beautiful location, and a world-class AI research community. #NCCR_EvolvingLanguage
idiap.ch
The following positions are currently available at the Idiap research institute.
4
42
71
We have great new SoTA syntactic parsing results with the RNG Transformer: Recursive Non-Autoregressive Graph-to-Graph Transformer for Dependency Parsing with Iterative Refinement, https://t.co/qTBovee4Jv. Retrains BERT to correct syntactic structures, graph-to-graph.
1
15
57
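The iterative-refinement idea in the tweet above can be sketched as a simple fixed-point loop: re-encode the sentence conditioned on the previous graph prediction, and stop when the prediction no longer changes. This is a schematic outline under assumed interfaces (`encode`, `predict_graph` are hypothetical callables), not the RNG Transformer's actual training or decoding code.

```python
def refine_parse(encode, predict_graph, tokens, num_iters=3):
    """Iteratively refine a dependency graph, graph-to-graph.

    encode(tokens, graph) -> contextual embeddings conditioned on the graph
    (graph is None on the first pass); predict_graph(embs) -> predicted graph.
    Stops early once the prediction reaches a fixed point.
    """
    graph = None  # first pass: no graph to condition on
    for _ in range(num_iters):
        embs = encode(tokens, graph)
        new_graph = predict_graph(embs)
        if new_graph == graph:
            break  # prediction stable: refinement has converged
        graph = new_graph
    return graph

# Toy usage with stand-in components: the "parser" links each token to the next.
def toy_encode(tokens, graph):
    return (tokens, graph)

def toy_predict(embs):
    tokens, _ = embs
    return [(i, i + 1) for i in range(len(tokens) - 1)]

arcs = refine_parse(toy_encode, toy_predict, ["a", "b", "c"])
```

Because the predictor is non-autoregressive, each refinement pass predicts the whole graph at once, and the loop supplies the global consistency that autoregressive decoding would otherwise provide.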
We stand by Italy during these trying times. Share your support for our Italian friends. They are our colleagues, friends, and family. Dear friends, we are with you. #COVID19 #WeStandWithItaly
0
1
0
Great seeing the launch of Scotland's leading AI startup, Alana @alanathebot, spinning out from @HeriotWattUni, and the development of their conversational AI interfaces. Congrats to @oliverlemon and @verena_rieser and team on the launch.
2
8
16
The border between Geneva and France. I feel safer already.
0
0
6