Michael J. Proulx (@MichaelProulx)

2K Followers · 21K Following · 516 Media · 15K Statuses

Threads @mikeproulx2 + Research Scientist @RealityLabs + Professor @BathPsychology (Tweets are my own) Formerly @QMUL & @HHU_de + alum of @JohnsHopkins & @ASU

Redmond, WA · Joined January 2012
Michael J. Proulx (@MichaelProulx) · 2 years
I'm thrilled to share our new paper, just out in @PNASNews! "Motor “laziness” constrains fixation selection in real-world tasks" -- amazing work by former @RealityLabs intern @csburlingham with @olegkomo, @tsmurdison & Naveen Sendhilnathan. https://t.co/hP0yCZldED 1/n
Link preview (pnas.org): Humans coordinate their eye, head, and body movements to gather information from a dynamic environment while maximizing reward and minimizing biome...
7 · 13 · 54
Dr.Katrina Tavoulari, CPsychol, BA, MEd, PhD, FHEA (@KateInclusiveEd) · 2 months
#ImagesOfResearch2025 are back at #BathSpaTrainStation celebrating #RealImpact. Our poster from @UniofBath, @BathPsychology captures a part of a journey exploring how a #VR event can be #accessible for youth with #VisionImpairment. @newcollworc @MichaelProulx #KarinPetrini
0 · 2 · 2
Michael J. Proulx (@MichaelProulx) · 6 months
Such an exciting set of open-source models and data!
Quoting AI at Meta (@AIatMeta) · 6 months
🚀 New from Meta FAIR: today we’re introducing Seamless Interaction, a research project dedicated to modeling interpersonal dynamics. The project features a family of audiovisual behavioral models, developed in collaboration with Meta’s Codec Avatars lab + Core AI lab, that...
0 · 0 · 2
Project Aria @Meta (@meta_aria) · 6 months
Using the dataset, the team developed a lightweight and flexible model for reading recognition with high precision and recall by utilizing RGB, eye gaze, and head pose data. Detailed performance analysis and capabilities of the model are available in our technical report.
0 · 3 · 4
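As context for the fusion idea in the tweet above, here is a minimal, purely illustrative sketch of a late-fusion reading/not-reading classifier over per-frame RGB, eye-gaze, and head-pose features. Every dimension, name, and layer choice below is an assumption for exposition; the actual released model and its architecture are described in the team's technical report.

```python
# Illustrative sketch only -- NOT the released Reading Recognition model.
# Assumes precomputed per-frame features: an RGB embedding (512-d),
# a 2-d gaze direction, and a 6-d head pose; all sizes are made up.
import torch
import torch.nn as nn

class ReadingClassifierSketch(nn.Module):
    def __init__(self, rgb_dim=512, gaze_dim=2, pose_dim=6, hidden=128):
        super().__init__()
        # Project each modality into a shared space, then fuse by concatenation.
        self.rgb_proj = nn.Linear(rgb_dim, hidden)
        self.gaze_proj = nn.Linear(gaze_dim, hidden)
        self.pose_proj = nn.Linear(pose_dim, hidden)
        self.head = nn.Sequential(nn.ReLU(), nn.Linear(3 * hidden, 1))

    def forward(self, rgb, gaze, pose):
        fused = torch.cat(
            [self.rgb_proj(rgb), self.gaze_proj(gaze), self.pose_proj(pose)],
            dim=-1,
        )
        return self.head(fused)  # logit for "reading"

model = ReadingClassifierSketch()
logits = model(torch.randn(4, 512), torch.randn(4, 2), torch.randn(4, 6))
print(torch.sigmoid(logits).shape)  # torch.Size([4, 1]) reading probabilities
```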
Project Aria @Meta (@meta_aria) · 6 months
Reading Recognition in the Wild features video, eye gaze, and head pose sensor outputs, created to help solve the task of reading recognition from wearable devices. Notably, this is the first egocentric dataset to feature high-frequency eye-tracking data collected at 60 Hz.
1 · 2 · 4
Michael J. Proulx (@MichaelProulx) · 6 months
Reading Recognition in the Wild -- open source! What's great about @RealityLabs? Interdisciplinary collaboration, industry-academic partnerships, breakthrough findings, and advancing the field by sharing open-source data. It's all here with @meta_aria - check it out!
Quoting Project Aria @Meta (@meta_aria) · 6 months
Reading Recognition in the Wild is a large-scale multimodal dataset comprising 100 hours of reading and non-reading videos from 100+ participants, captured in diverse and realistic scenarios using Project Aria. Download the dataset and model at https://t.co/FJEuPtJ7w8.
0 · 1 · 6
Reality Labs at Meta (@RealityLabs) · 6 months
‘Marvel’s Deadpool VR’ Announced for Meta Quest 3 and 3S https://t.co/kP6vC6oywo
2 · 22 · 78
Yasmeen Abdrabou, PhD (@AbdrabouYasmeen) · 7 months
Starting session 3 of the #GenEAI workshop at @ETRA_conference with a paper presentation by Oleg Komogortsev titled "Device-Specific Style Transfer of Eye-Tracking Signals" #ETRA2025 🇯🇵
0 · 4 · 7
Yasmeen Abdrabou, PhD (@AbdrabouYasmeen) · 7 months
Stein Dolan giving a keynote at the #GenEAI workshop at @ETRA_conference, titled '"Attention" is All You Need' #ETRA2025 🇯🇵
0 · 4 · 8
Michael J. Proulx (@MichaelProulx) · 7 months
Great to have @RealityLabs sponsor an excellent meeting like @ETRA_conference!
Quoting Yasmeen Abdrabou, PhD (@AbdrabouYasmeen) · 7 months
Thanks to @ETRA_conference sponsors #ETRA2025 🇯🇵
0 · 0 · 3
Michael J. Proulx (@MichaelProulx) · 7 months
Such a cool conference shirt!
Quoting Ko Watanabe 🇩🇪 (@ko_watanabe_jp) · 7 months
The T-shirt I got in the ETRA conference swag is seriously cool! I want to wear it on heavy rotation! #ETRA2025
1 · 1 · 3
Mark Billinghurst (@marknb00) · 7 months
This is the Sensorama, the first multisensory personal immersive theatre, from 1962. There is only one left in the world... a priceless piece of VR history. An exclusive tour to see it is one of many items in the Virtual World Society auction. See https://t.co/DZqZjzI9Yb
1 · 7 · 22
ETRA (@ETRA_conference) · 8 months
🔬 8 specialized workshops at #ETRA2025:
• Eye Movements in Programming
• Generative AI meets Eye Tracking
• Pervasive Eye Tracking
• and more!
Dive deep into your area of interest with global experts in Tokyo, May 26-29. See full list: https://t.co/bSSmAm5aY9 #Research
Link preview (etra.acm.org): The 2025 ACM Symposium on Eye Tracking Research & Applications (ETRA) will be held in Tokyo, Japan from May 26 to May 29, 2025.
1 · 4 · 3
The vOICe vision BCI 🧠🇪🇺 (@seeingwithsound) · 8 months
Visual experience affects neural correlates of audio-haptic integration: A case study of non-sighted individuals https://t.co/BM2nKqOFQV by @meike_scheller @MichaelProulx @AnnegretNoor et al.
0 · 2 · 2
REVEAL (@RevealCentre) · 8 months
We're thrilled to announce that our paper "RetroSketch: A Retrospective Method for Measuring Emotions and Presence in VR" received an Honorable Mention Award at CHI 2025! 🎊 📖 Feel free to check out the lead author’s website (@d_potts2) to learn more:
0 · 1 · 4
The vOICe vision BCI 🧠🇪🇺 (@seeingwithsound) · 8 months
Workshop at University of Exeter (UK), September 8, 2025: Immersive VR to understand human cognition, perception & action https://t.co/FiPh8sdm7k via @DrGBuckingham with @MichaelProulx; #AR/#VR
Link preview (eventbrite.co.uk): Join us for a free, day-long workshop exploring the cutting-edge intersection of immersive virtual reality (VR) and experimental psychology.
1 · 3 · 3
REVEAL (@RevealCentre) · 8 months
🎉 We're delighted to introduce EmoSense, an innovative tool designed to enhance Extended Reality (XR) experiences through advanced emotion recognition technology. #Research #Innovation 📖 If you want to learn more, please check out our website below: https://t.co/wHEkYjnzrT
0 · 1 · 5
ETRA (@ETRA_conference) · 8 months
🔍 Explore Tokyo while attending #ETRA2025! From the futuristic attractions of Odaiba to historic Asakusa, Tokyo offers endless experiences. Conference venue Miraikan is located in vibrant Odaiba with easy access to city highlights! #TokyoTravel Photo by Moiz K. Malik on Unsplash
0 · 2 · 3
Reality Labs at Meta (@RealityLabs) · 9 months
Imagine walking through a city, asking your AI glasses, “What’s this building?”—and instantly getting accurate, real-time information. 📍 No hallucinations. No misinformation. Just facts. 🚀 That’s what the Meta CRAG-MM Challenge aims to build. 🔗
Link preview (aicrowd.com): Improve RAG with Real-World Benchmarks | KDD Cup 2025
7 · 26 · 100