Jasper Xian
@jsprxian
Followers: 160 · Following: 76 · Media: 1 · Statuses: 7
parsnip farming @uwaterloo; internships @GoogleDeepMind @cohere
a farm upstate
Joined August 2023
Excited to finally announce PATH! Curious what it does under the hood? Here's a visualization of the data-synthesis prompts before and after optimization with the COPRO optimizer in DSPy.
Quoted thread: 🧵If you have just 10 labels & want to train a model, what should you do? Announcing PATH: Prompts as Auto-optimized Training Hyperparameters. Training on synthetic data *whose quality improves over time* via DSPy. Training SoTA-level IR Models w/ 10 Labels, by @jsprxian & team
3 replies · 10 reposts · 63 likes
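The post above points at DSPy's COPRO optimizer rewriting the data-synthesis prompt. For readers unfamiliar with that API, here is a minimal sketch of what plugging a query-synthesis step into COPRO can look like; the signature, placeholder metric, model name, and toy passages are illustrative assumptions, not the released PATH code.

```python
import dspy
from dspy.teleprompt import COPRO

# Any DSPy-supported LM works here; the model name is an assumption.
dspy.configure(lm=dspy.LM("openai/gpt-4o-mini"))

class SynthesizeQuery(dspy.Signature):
    """Write a realistic search query that the given passage answers."""
    passage: str = dspy.InputField()
    query: str = dspy.OutputField()

# The data-synthesis program whose instructions COPRO will iteratively rewrite.
synthesize = dspy.ChainOfThought(SynthesizeQuery)

def synthesis_metric(example, prediction, trace=None):
    # Placeholder score. In PATH the prompt is judged by downstream reranker
    # quality on the handful of real labels; substitute that loop here.
    return float(len(prediction.query.split()) >= 3)

# Unlabeled passages to synthesize queries for (toy examples).
passages = [
    "Parsnips are a root vegetable closely related to carrots and parsley.",
    "Dense retrieval models encode queries and documents into vectors.",
]
trainset = [dspy.Example(passage=p).with_inputs("passage") for p in passages]

# COPRO proposes and evaluates rewritten instructions for the signature.
optimizer = COPRO(metric=synthesis_metric, breadth=10, depth=3, init_temperature=1.4)
optimized_synthesize = optimizer.compile(
    synthesize,
    trainset=trainset,
    eval_kwargs=dict(num_threads=4, display_progress=True),
)

# Inspect the optimized instructions versus the original docstring.
for name, predictor in optimized_synthesize.named_predictors():
    print(name, predictor.signature.instructions)
```

COPRO keeps whichever rewritten instruction scores best under the metric, which is the before/after contrast the prompt visualization in the post is illustrating.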
dspy can help in surprising places
Quoted thread: [same PATH announcement as above]
2 replies · 6 reposts · 51 likes
Today we are announcing PATH: Prompts as Auto-optimized Training Hyperparameters. Creating and training on synthetic data for IR is now so easy.
Quoted thread: [same PATH announcement as above]
1 reply · 6 reposts · 28 likes
An exciting @UWCheritonCS x @stanfordnlp x @IBM collab led by @jsprxian on building tiny yet effective IR rerankers with PATH. Using just 10 labels, we can iteratively improve synthetic data generation pipelines to train rerankers with DSPy!
Quoted thread: [same PATH announcement as above]
1 reply · 8 reposts · 18 likes
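These posts describe treating the synthesis prompt as a training hyperparameter: each candidate prompt is judged by the reranker trained on its synthetic data, validated against the ~10 real labels. Below is a hedged sketch of one way that inner scoring step could look, using a sentence-transformers cross-encoder as the small reranker; the model name, single training epoch, and accuracy criterion are assumptions rather than the paper's exact recipe.

```python
from sentence_transformers import CrossEncoder, InputExample
from torch.utils.data import DataLoader

def reranker_validation_score(synthetic_pairs, labeled_pairs):
    """Train a small reranker on synthetic (query, passage, 0/1) triples and
    return its accuracy on the handful of real labeled triples."""
    train_examples = [
        InputExample(texts=[query, passage], label=float(label))
        for query, passage, label in synthetic_pairs
    ]
    loader = DataLoader(train_examples, shuffle=True, batch_size=16)

    # A compact cross-encoder reranker; one epoch keeps each prompt evaluation cheap.
    reranker = CrossEncoder("cross-encoder/ms-marco-MiniLM-L-6-v2", num_labels=1)
    reranker.fit(train_dataloader=loader, epochs=1, warmup_steps=10)

    # Score the prompt by how often the trained reranker gets the real labels right.
    scores = reranker.predict([[q, p] for q, p, _ in labeled_pairs])
    correct = sum((s > 0.5) == bool(y) for s, (_, _, y) in zip(scores, labeled_pairs))
    return correct / len(labeled_pairs)
```

In a DSPy setup like the sketch earlier in this feed, a function along these lines would sit inside the metric handed to COPRO, so that better synthesis prompts score higher precisely because they yield better rerankers.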