
Nick Vincent
@nickmvincent
Followers: 934 · Following: 6K · Media: 58 · Statuses: 1K
Assistant professor @SFU_CompSci, HCI+ML for widely beneficial "AI". Especially focused on "data leverage". See more here: https://t.co/NEgvVV7WkC
West Coast of North America
Joined July 2017
Interesting post. Agree that "Full-time contributors, not contractors" would be good for AI progress. But. Will labs offer such jobs benevolently? I think unlikely without bargaining power. In fact, I think current trajectory is that the AI field will make work more precarious.
Previously, AI progress relied heavily on monotonous, low-skill labeling from third-party contractors producing basic text, visual, and audio data at scale. But models have outgrown simple tasks, demanding richer context and deeper expertise. The era of sweatshop data is over.
RT @TheDRC_: Here's a weekly roundup of three pieces of helpful content within the field of decentralization: ⬇️. 1. Launch: July 9th: How….
RT @m_t_prewitt: Delighted to share my paper with @nickmvincent and @hanlinliii . Once you see it, you can’t unsee it. Information markets….
RT @Miles_Brundage: New blog post! . While recognizing the limitations of analogies, I explore the idea that AI is more like a liquid than….
RT @SFU_CompSci: Congratulations to #SFU PhD student Haidan (Afra) Liu, who won the Best Poster Award in HCI at Graphics Interface 2025 for….
RT @sj_manning: There is still a lot of uncertainty and disagreement about when/if such large scale labor market disruption from AI might o….
Is there public napkin math on this claim? My impression is this is possible (and some actors will pursue it) but won't be easy or trivial, barring a social/regulatory environment that massively suppresses labour leverage (and turns most white-collar work surveilled/precarious).
"Even if AI progress totally stalls, it's sufficiently easy to collect data on all these different white collar job tasks that we should expect to see them automated within the next 5 years."
RT @kylelostat: we released OLMo 2 1B, showing again how well our OLMo 2 pretrain & post train recipe works!. Our small 1B model is compara….