Guanghui Wang Profile
Guanghui Wang

@__waaagh__

Followers: 278 · Following: 765 · Media: 1 · Statuses: 35

Machine Learning PhD student @GeorgiaTech. Interested in online learning.

Atlanta, GA
Joined November 2017
@__waaagh__
Guanghui Wang
2 months
RT @bremen79: I have an opening for a post-doc position: I am looking for smart people with a strong CV in optimization and/or online learn….
0
34
0
@__waaagh__
Guanghui Wang
1 year
RT @mathOCb: Guy Kornowski, Ohad Shamir: Open Problem: Anytime Convergence Rate of Gradient Descent.
0
4
0
@__waaagh__
Guanghui Wang
1 year
RT @HazanPrinceton: Time to pass on my 500$ prize! With my student @XinyiChen2, we drafted this open problem in optimization: https://t.….
0
5
0
@__waaagh__
Guanghui Wang
2 years
RT @__waaagh__: @damekdavis @jasondeanlee Thanks for the informative discussion. I would like to mention that in online learning there also….
0
1
0
@__waaagh__
Guanghui Wang
2 years
RT @gtcomputing: It was 138 years ago today, on Oct. 13, 1885, that the Georgia Legislature passed a bill appropriating $65K to create the….
0
5
0
@__waaagh__
Guanghui Wang
2 years
RT @SametOymac: @gautamcgoel Thanks Gautam! Picture below is the TL;DR and shows basic connection to prior work. Theory builds on our earli….
0
1
0
@__waaagh__
Guanghui Wang
2 years
RT @bremen79: New blog post: Yet Another ICML Award Fiasco. The story of the @icmlconf 2023 Outstanding Paper Award to the D-Adaptation pap….
0
100
0
@__waaagh__
Guanghui Wang
2 years
RT @hayou_soufiane: There is an "Experiments Paradox" for theory papers in major conferences: A theory paper with toy experiments is more l….
0
4
0
@__waaagh__
Guanghui Wang
2 years
Cool.
@Stone_Tao
Stone Tao
2 years
next level advertising of an #ICML2023 oral presentation 😂
[image attached]
0
0
3
@__waaagh__
Guanghui Wang
2 years
RT @savvyRL: My timeline is 100% #ICML #Hawaii right now. Need a NotAtICML support group.
0
3
0
@__waaagh__
Guanghui Wang
2 years
RT @neu_rips: Happy to share the results of our latest project with @lugosi_gabor! TL;DR: Online-to-PAC conversions allow translating comp….
0
28
0
@__waaagh__
Guanghui Wang
2 years
RT @bremen79: I'm currently looking for 1-2 post-docs with a very strong CV (i.e., at least 4-5 papers at top ML conferences). I'll also hi….
0
14
0
@__waaagh__
Guanghui Wang
2 years
RT @zdhnarsil: Well, I guess my imagination is quite limited.
[image attached]
0
3
0
@__waaagh__
Guanghui Wang
3 years
RT @roydanroy: You may have received an email today, asking you to split your NeurIPS paper into two separate PDFs: one "main" paper (~9 pa….
0
27
0
@__waaagh__
Guanghui Wang
3 years
RT @jbhuang0604: How can I get my paper on the top? Understand that arXiv is a "stack" data structure (first in, last out: FILO)! Submit yo….
0
34
0
@__waaagh__
Guanghui Wang
3 years
RT @neu_rips: hot news from arxiv: a new computationally efficient algorithm with *optimal* regret for the online portfolio problem! Congr….
0
16
0
@__waaagh__
Guanghui Wang
3 years
Sounds interesting🤔.
@arXiv__ml
Machine Learning | arXiv
3 years
#arXiv #machinelearning [csLG] Adam Can Converge Without Any Modification on Update Rules. (arXiv:2208.09632v1 [cs.LG]) #mw. Ever since Reddi et al. 2018 pointed out the divergence issue of Adam, many new variants have been designed to obtain convergence….
0
0
1
@__waaagh__
Guanghui Wang
3 years
A small step🥳.
3
0
5
@__waaagh__
Guanghui Wang
3 years
RT @paulwgoldberg: always a bad sign when a research collaborator says "it depends what you mean by non-trivial".
0
1
0
@__waaagh__
Guanghui Wang
4 years
RT @thirdreviewer: This paper is more appropriate for a speciality journal. Like, one that specializes in terrible papers.
0
210
0