
Hongkang Li
@LiHongkang_jntm
Followers: 38 · Following: 87 · Media: 5 · Statuses: 24
Ph.D. student at Rensselaer Polytechnic Institute
Troy, NY
Joined October 2019
This work has been accepted by #ICLR2025. Please see this link. We will update our final version soon.
openreview.net
Chain-of-Thought (CoT) is an efficient prompting method that enables the reasoning ability of large language models by augmenting the query using multiple examples with multiple intermediate steps....
🚀Excited to share our new preprint on the theoretical analysis of the training and generalization of chain-of-thought. The arXiv link can be found below. We have the following results. [1/n]
Our follow-up work on LLM theory, the learning and generalization mechanism of Chain-of-Thought (CoT), will be presented over the next two days at the @icmlconf workshops: 1. Fri 26 Jul., Straus 2, HiLD Workshop. 2. Sat 27 Jul., Straus 2, TF2M Workshop.
Thanks @IBMResearch for posting a blog about our work on in-context learning. Please see this link:
research.ibm.com
A team at IBM Research and RPI figured out why in-context learning improves foundation model predictions, adding transparency to machine learning.
🔥Excited to share our poster at #ICML2024. This work studies the training dynamics of nonlinear Transformers, together with the In-Context Learning generalization capability of the model. Time: Jul 23rd, Tuesday, 1:30-3:00 pm. Location: Hall C 4-9 #403.
RT @pinyuchenTW: Are you a big fan of in-context learning (ICL)? Check out our @IBMResearch blog post highlighting our @icmlconf paper demy….
RT @sijialiu17: The 3rd AdvML-Frontiers Workshop (@AdvMLFrontiers) is set for #NeurIPS 2024 (@NeurIPSConf)! This ye….