
Calgary ML Lab (@UCalgaryML)
University of Calgary Machine Learning Lab
Calgary, AB · Joined June 2021
Followers: 44 · Following: 35 · Media: 0 · Statuses: 13
RT @yanii: @UCalgaryML will be at #ICML2025 in Vancouver 🇨🇦 next week: our lab has 6 different works being presented by 5 students across bo…
RT @adnan_ahmad1306: 1/10 🧵 Can weight symmetry provide insights into sparse training and the Lottery Ticket Hypothesis? We dive deep in…
RT @yanii: I'm proud that the @UCalgaryML lab will have 6 different works being presented by 6 students across #NeurIPS2024, in workshops (…
RT @JainRohan16: For more details, check out our full paper … 👇🏼 Joint work w/ @adnan_ahmad1306, @EkanshSh, an…
RT @yanii: Really excited to be presenting this work from my lab at the @NeurIPSConf 2024 @unireps workshop! If you are at NeurIPS and i…
RT @yanii: Congratulations @adnan_ahmad1306 on receiving the @RBCBorealis Fellowship! Adnan is a PhD student in the @UCalgaryML lab at @Sc…
RT @yanii: Happy to share our work on Knowledge Distillation temperature 🌡️ and fairness ⚖️. Must-read for AI practitioners given the prevale…
RT @mikelasby: "Dynamic Sparse Training with Structured Sparsity" ( was accepted at ICLR 2024! DST methods learn st…
RT @yanii: I'm excited to share that I've received an Amazon Research Award for my proposal "Addressing Catastrophic Forgetting with Dynami…
amazon.science
Awardees represent more than 30 universities in eight countries. Recipients have access to Amazon public datasets, along with AWS AI/ML services and tools.
RT @yanii: I was happy to recently be informed my 2022 DG proposal was also awarded a @NSERC_CRSNG and @NationalDefence DND Discovery Grant…
RT @utkuevci: Key takeaways: - Initialization is important for older NNs: use sparsity-aware init. - For modern NNs, init doesn't matter: i…
RT @lab_ai2: Work recently published by our close collaborators Drs. @marianapbento and @richardfrayneca! Deep Learning in Large and Multi…