(a)bram

@abramschonfeldt

75 Followers · 162 Following · 2 Media · 30 Statuses

PhD student exploring machine learning in population health @OxWearables

Oxford | Cape Town | Joburg
Joined July 2011
@FazlBarez
Fazl Barez 🔜 @NeurIPS
9 days
We’re hiring! Looking for Interns, Research Assistants, and Postdocs to work on Automated Interpretability: building systems that can analyse, explain, and intervene on large models to make them safe! Work with me @Oxford, or remotely. Apply by Nov 15: https://t.co/KEqXwpxgyb
20 replies · 115 reposts · 860 likes
@pixl_oxford
PIXL @ Oxford
5 months
Introducing the X account for the Perceptual Intelligence and Extended Reality Lab (PIXL) at @CompSciOxford! We’ll share discussions, interesting talks and research projects from our group. Check our work at: https://t.co/DVtKeMUgsX 📸From our spring outing @HawkConservancy
4 replies · 6 reposts · 20 likes
@abramschonfeldt
(a)bram
6 months
This is the first benchmark comparing VLMs & fine-tuned models for activity intensity from wearable cameras in free-living settings. We hope it contributes to scalable, ethical tools for population health research. 📄 https://t.co/08Klw511JV 🧵7/7 #AI #HealthData #Wearables
arxiv.org
Introduction: Data from wearable devices collected in free-living settings, and labelled with physical activity behaviours compatible with health research, are essential for both validating...
0 replies · 0 reposts · 0 likes
@abramschonfeldt
(a)bram
6 months
✅ Takeaway: Free, open-source VLMs can help label sedentary behaviour in wearable studies right now, reducing human workload. But we need better generalisation and sequence-aware models to replace full human annotation. 🧵6/7
1 reply · 0 reposts · 0 likes
@abramschonfeldt
(a)bram
6 months
🤖 Interesting finds: Rephrasing labels (e.g. “walking” vs “light intensity activity”) helped VLMs a lot. VLMs performed similarly to humans on single images — but annotators normally get full-day context. Sequential models show further promise. 🧵5/7
1 reply · 0 reposts · 0 likes
@abramschonfeldt
(a)bram
6 months
📉 But: All models struggled to generalise to the Sichuan dataset (different population, sparser data). Cohen’s κ dropped:
ViT: 0.67 → 0.19
LLaVA: 0.54 → 0.26
So: good within-study, poor across-study. 🧵4/7
1 reply · 0 reposts · 0 likes
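For context, Cohen’s κ measures agreement beyond chance between two label sequences, which is why it is the headline metric above. A minimal sketch with scikit-learn, using made-up labels:

```python
# Cohen's kappa: agreement between annotator and model labels,
# corrected for chance agreement (toy labels for illustration).
from sklearn.metrics import cohen_kappa_score

annotator = ["sedentary", "sedentary", "light", "mvpa", "light"]
model     = ["sedentary", "light",     "light", "mvpa", "mvpa"]

print(cohen_kappa_score(annotator, model))  # 1.0 = perfect, 0 = chance-level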
@abramschonfeldt
(a)bram
6 months
🧪 Key results (Oxfordshire, UK data): VLMs (e.g. LLaVA) predict sedentary behaviour nearly as well as fine-tuned models. F1-scores:
Sedentary: 0.89 (VLM) vs 0.91 (ViT)
Light: 0.60 (VLM) vs 0.70 (ViT)
MVPA: 0.66 (VLM) vs 0.72 (ViT)
🧵3/7
1 reply · 0 reposts · 0 likes
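Per-class F1 scores like those above can be computed directly; a sketch with scikit-learn and toy labels:

```python
# Per-class F1 for a three-class activity intensity problem (toy labels).
from sklearn.metrics import f1_score

y_true = ["sedentary", "light", "mvpa", "sedentary", "light", "mvpa"]
y_pred = ["sedentary", "light", "light", "sedentary", "mvpa", "mvpa"]

classes = ["sedentary", "light", "mvpa"]
scores = f1_score(y_true, y_pred, labels=classes, average=None)
for c, s in zip(classes, scores):
    print(f"{c}: F1 = {s:.2f}")
```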
@abramschonfeldt
(a)bram
6 months
💡 Why this matters: Labelled datasets are vital for training and testing wearable models of physical activity. However, annotating wearable camera data is expensive, slow, and privacy-sensitive. We test if generalist computer vision models (like CLIP, LLaVA) can help. 🧵2/7
1 reply · 0 reposts · 0 likes
@abramschonfeldt
(a)bram
6 months
🚨 New preprint on arXiv from @OxWearables @pixl_oxford! Can vision-language models (VLMs) help automatically annotate physical activity in large real-world wearable datasets (⌚️+📷, 🇬🇧 + 🇨🇳)? 📄 https://t.co/08Klw511JV 🧵1/7
1 reply · 4 reposts · 12 likes
@JerredChen
Jerred Chen
8 months
Motion blur typically breaks SLAM/SfM algorithms - but what if blur was actually the key to super-robust motion estimation? In our new work, Image as an IMU, @ronnieclark__ and I demonstrate exactly how a single motion-blurred image can be used to our advantage. 🧵1/9
9 replies · 58 reposts · 477 likes
@aiden1doherty
Aiden Doherty
1 year
Interested in learning more about wearables in large-scale biomedical studies? We're running a residential short course from 22-26 September at Oxford. Includes: inspirational speakers, great tutors, and hands-on data analysis. https://t.co/5tB4QKfHvo
1 reply · 15 reposts · 35 likes
@angerhang
Hang Yuan
2 years
Happy to announce our foundation model for wearables published at npj Digital Medicine today. This model sets a new standard in #Wearables, significantly outperforming human activity recognition benchmarks in diverse conditions. 🚀 https://t.co/RfIRY5UMTz
4 replies · 18 reposts · 74 likes
@angerhang
Hang Yuan
2 years
The much-awaited CAPTURE-24 description and benchmark paper has finally gone live. CAPTURE-24 contains activity tracker data in the wild from 151 participants: 3,883 hours of accelerometer data. The associated code and benchmark are also released! https://t.co/ttNpdZAzsy
1 reply · 4 reposts · 16 likes
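Benchmarks on data like CAPTURE-24 typically start by slicing the raw tri-axial signal into fixed-length windows before classification. A minimal sketch, assuming 100 Hz data and 30-second windows (common choices for this kind of dataset; check the paper for the exact protocol):

```python
# Slice a continuous tri-axial accelerometer signal into fixed windows.
import numpy as np

FS = 100      # sampling rate in Hz (assumed)
WIN_S = 30    # window length in seconds (assumed)

signal = np.random.randn(60_000, 3)        # stand-in for real (x, y, z) data
n = len(signal) // (FS * WIN_S)            # number of complete windows
windows = signal[: n * FS * WIN_S].reshape(n, FS * WIN_S, 3)
print(windows.shape)                       # (n_windows, 3000, 3)
```

Each window then gets one activity label, which is what the benchmark models are trained and evaluated on.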
@bwpapiez
Bartek Papiez
2 years
⏳One month to apply for a DPhil position in our group! ➡️Details here: https://t.co/5imSn4JjUX ➡️Please contact to discuss!
ndph.ox.ac.uk
@bwpapiez
Bartek Papiez
2 years
‼️PhD/DPhil opportunity‼️ A broadly defined project on "Multimodal learning for population health studies" advertised via: https://t.co/5imSn4JjUX ⏳Please contact me for details! @Oxford_NDPH @bdi_oxford @OxfordBioMedIA @MICCAI_Society @MiccaiStudents @miua2024
0 replies · 3 reposts · 9 likes
@bdi_oxford
Big Data Institute
2 years
Last month, three DPhil students from the BDI were selected to participate in the Data Science Ideathon organised by @wellcometrust. They used their skills and knowledge to map methane emissions and health outcomes.💻🌍 Read more on the BDI website 👉 https://t.co/GT192kQEIM
0 replies · 4 reposts · 13 likes
@aiden1doherty
Aiden Doherty
2 years
Job alert - researcher position for those interested in -omics discovery with respect to device-measured activity, sleep, and circadian rhythms @OxWearables @bdi_oxford @Oxford_NDPH @UniofOxford https://t.co/XRvRdn30fq Please feel free to email me to find out more.
0 replies · 4 reposts · 15 likes
@abramschonfeldt
(a)bram
2 years
How does methane impact health? Had fun exploring this question over the past few days @wellcometrust with my team @F_Reitzug and Eloise Ockenden for the #ideathon. Lovely getting to know everyone involved!
1 reply · 1 repost · 23 likes
@milankloewer
Milan Klöwer
2 years
Global sea ice extent currently 5 standard deviations below climatology. That’s the same significance level that was used to prove that the Higgs boson exists.
1 reply · 4 reposts · 15 likes
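The “5 standard deviations” claim is just a z-score against the climatological mean; a one-line check with made-up numbers:

```python
# z-score of today's sea ice extent against climatology (hypothetical values).
extent_today = 21.5   # million km^2, made up for illustration
clim_mean    = 23.0   # climatological mean for this date, made up
clim_std     = 0.3    # climatological standard deviation, made up

z = (extent_today - clim_mean) / clim_std
print(z)  # -5.0 -> five standard deviations below climatology
```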
@ScottinOxford
Scott Small
3 years
Pleased to share our preprint on an open-source method that offers the most robust way to measure steps from wrist-worn accelerometers in large-scale studies such as the @uk_biobank. https://t.co/L2B1LsoIXH 🧵
medrxiv.org
Background Step count is an intuitive measure of physical activity frequently quantified in a range of health-related studies; however, accurate quantification of step count can be difficult in the...
1 reply · 15 reposts · 41 likes
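The preprint's method is more robust than this, but the basic idea of step counting from a wrist accelerometer can be sketched as naive peak detection on the signal magnitude. Everything below (sampling rate, spacing, threshold) is an assumption for illustration, not the paper's algorithm:

```python
# Naive step counting: count peaks in the acceleration magnitude.
# Illustrative baseline only; not the preprint's method.
import numpy as np
from scipy.signal import find_peaks

FS = 100                                   # sampling rate in Hz (assumed)
acc = np.random.randn(60 * FS, 3)          # stand-in for 1 min of (x, y, z)

magnitude = np.linalg.norm(acc, axis=1)    # combine the three axes
# Require peaks at least 0.3 s apart (max ~3.3 steps/s) above a threshold.
peaks, _ = find_peaks(magnitude, distance=int(0.3 * FS), height=1.5)
print(f"estimated steps: {len(peaks)}")
```

Approaches like this fail under arm swing and non-walking wrist movement, which is exactly the gap a more robust, validated method targets.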
@angerhang
Hang Yuan
3 years
Our lab @OxWearables is offering an exciting fully funded UNIQ+ summer internship for students in the UK to work on machine learning heterogeneity in population health in collaboration with @ten_photos and Yifan Yu. Apply before Feb 17 and select the project "Population health 1".
2 replies · 6 reposts · 9 likes