IDA
@IDA_CTU
Intelligent Data Analysis Lab | FEE CTU
Prague, Czech Republic
Joined January 2019
80 Followers · 70 Following · 11 Media · 112 Statuses
"Leveraging Gradient Noise for Detection and Filtering of Byzantine Clients" by Kungurtsev and colleagues, exploring the use of higher-order moment statistics for identifying Byzantine clients in distributed learning, has been accepted for publication in IEEE Access.
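For intuition, a toy sketch (not the paper's algorithm) of how heavy-tailed per-client statistics can flag a Byzantine update: the robust z-score and fourth-moment score below are illustrative choices, as are all thresholds.

```python
import numpy as np

def byzantine_scores(client_grads):
    """Score clients by how far their update lies from the coordinatewise
    median, measured in robust (MAD-based) standard deviations."""
    G = np.asarray(client_grads)          # shape (n_clients, dim)
    med = np.median(G, axis=0)
    mad = np.median(np.abs(G - med), axis=0) + 1e-12
    z = (G - med) / (1.4826 * mad)        # robust z-scores per coordinate
    return np.mean(z ** 4, axis=1)        # heavy tails -> large 4th moment

def filtered_mean(client_grads, keep_frac=0.9):
    """Average the updates of the lowest-scoring (most typical) clients."""
    scores = byzantine_scores(client_grads)
    k = max(1, int(len(scores) * keep_frac))
    keep = np.argsort(scores)[:k]
    return np.mean(np.asarray(client_grads)[keep], axis=0)

# 9 honest clients scattered around the true gradient, 1 Byzantine client
rng = np.random.default_rng(1)
true_grad = np.ones(16)
grads = [true_grad + 0.1 * rng.standard_normal(16) for _ in range(9)]
grads.append(-50.0 * true_grad)           # adversarial update
agg = filtered_mean(grads)
print(np.linalg.norm(agg - true_grad))    # small: the attacker was filtered out
```

The naive mean would be dragged far from `true_grad` by the single adversarial client; the score-and-filter step removes it before averaging.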
"Probabilistic Iterative Hard Thresholding for Sparse Learning", by Kungurtsev and co-authors, presenting a method for sparsity-promoting estimation with strong convergence guarantees, has been accepted for publication in Computational Optimization and Applications.
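As background, a minimal sketch of the classic (deterministic) iterative hard thresholding step this line of work builds on; the `keep_top_k` operator, step size, and least-squares objective here are illustrative, not the paper's probabilistic setting.

```python
import numpy as np

def keep_top_k(x, k):
    """Hard-thresholding operator: zero all but the k largest-magnitude entries."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def iht(A, b, k, eta=None, iters=500):
    """Iterative hard thresholding for min ||Ax - b||^2 s.t. ||x||_0 <= k."""
    n = A.shape[1]
    if eta is None:
        eta = 1.0 / np.linalg.norm(A, 2) ** 2   # conservative step size
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x = keep_top_k(x - eta * grad, k)       # gradient step, then project
    return x

# Recover a 3-sparse signal from noiseless Gaussian measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 40))
x_true = np.zeros(40)
x_true[[3, 17, 29]] = [2.0, -1.5, 1.0]
x_hat = iht(A, A @ x_true, k=3)
print(np.nonzero(x_hat)[0])
```

Each iteration alternates a plain gradient step with a projection onto the set of k-sparse vectors, which is what makes the scheme nonconvex and its convergence analysis delicate.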
"Optimal Control of Two-Phase Membrane Problem" by Bozorgnia and Kungurtsev, studying a challenging obstacle problem exhibiting operator nonsmoothness, has been accepted for publication in the journal Applied Mathematics and Optimization!
"Truss topology design under harmonic loads: Peak power minimization with semidefinite programming" by Shenyuan Ma et al., including IDA's Vyacheslav Kungurtsev @DrKungOptimizer, has been accepted for publication in Structural and Multidisciplinary Optimization!
"State Encodings for GNN-based Lifted Planners" by R. Horcik, our @GustavSheer, V. Šimek, and T. Pevný, providing a comparative study of diverse state encodings for graph neural networks in classical planning, was accepted for oral presentation (top 20%) at AAAI25! @RealAAAI.
"A Stochastic-Gradient-Based Interior-Point Algorithm for Solving Smooth Bound-Constrained Optimization Problems" by Qi Wang, Frank E. Curtis, Daniel P. Robinson, and our V. Kungurtsev @DrKungOptimizer, has been accepted for publication in the SIAM Journal on Optimization!
"Group Distributionally Robust Dataset Distillation with Risk Minimization", co-authored by our V. Kungurtsev, accepted for ICLR 2025! The paper uses Distributionally Robust Optimization to improve generalization and subgroup coverage in Dataset Distillation.
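For context, a minimal sketch of the standard exponentiated-gradient group-weight update at the heart of Group DRO methods; the per-group losses and step size `eta` below are made-up illustration values, not anything from the paper.

```python
import numpy as np

def group_dro_weights(group_losses, q, eta=0.1):
    """One exponentiated-gradient step on the group weights: groups with
    higher loss receive exponentially more weight in the next objective."""
    q = q * np.exp(eta * np.asarray(group_losses))
    return q / q.sum()

# Three subgroups; group 2 is persistently the hardest
q = np.ones(3) / 3
for _ in range(50):
    losses = np.array([0.2, 0.3, 1.0])   # per-group losses this round
    q = group_dro_weights(losses, q)
print(q.round(3))                        # weight concentrates on the hardest group
```

Training then minimizes the q-weighted loss, so the model is pushed hardest on whichever subgroup it currently serves worst.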
"Towards Diverse Device Heterogeneous Federated Learning via Task Arithmetic Knowledge Integration" by M Morafah, V Kungurtsev (IDA), H M Chang, C Chen, B Lin, presenting an algorithm for performing knowledge distillation across distinct computing devices, accepted for NeurIPS 2024!
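As background on the distillation side, a minimal sketch of the generic softened-logits knowledge-distillation loss (the classic Hinton-style formulation, not the paper's cross-device algorithm); the temperature `T=2.0` is an illustrative choice.

```python
import numpy as np

def softmax(z, T=1.0):
    z = np.asarray(z, dtype=float) / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher_T || student_T): the student is pushed toward the
    teacher's temperature-softened class distribution."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q)))) * T * T

t = np.array([4.0, 1.0, 0.5])
print(distillation_loss(t, t))               # identical logits -> zero loss
print(distillation_loss([0.0, 0.0, 0.0], t))  # uniform student -> positive loss
```

The `T * T` factor is the usual rescaling that keeps the gradient magnitude comparable across temperatures.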
"Unlocking the Potential of Federated Learning: The Symphony of Dataset Distillation via Deep Generative Latents" by Jia, Vahidian, Sun, Zhang, Vyacheslav Kungurtsev (of IDA), Gong, and Chen was accepted to ECCV.
ICAPS'24 Best Paper Runner-up Award goes to R. Horcik and our G. Sir (@GustavSheer) for their work on "Expressiveness of Graph Neural Networks in Planning Domains" - find the paper at the link below, or the authors at @ICAPSConference in Banff!
openreview.net
Graph Neural Networks (GNNs) have recently become the standard method of choice for learning with structured data, demonstrating particular promise in classical planning. Their inherent invariance...
"Federated SGD with Local Asynchrony" by B. Chatterjee, our V. Kungurtsev, and D. Alistarh, studying the use of both parallel shared-memory and distributed computation with asynchronous communication, was accepted for the IEEE International Conference on Distributed Computing Systems (ICDCS).
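For intuition about what asynchrony does to SGD, a toy single-threaded simulation (not the paper's algorithm) in which every applied gradient was computed on a parameter copy that is a fixed number of updates stale; the quadratic objective, `delay`, and step size are illustrative.

```python
import numpy as np

def stale_sgd(grad_fn, x0, delay=3, steps=200, eta=0.05):
    """Simulated asynchronous SGD: the gradient applied at step t was
    evaluated at the parameters from step t - delay."""
    x = np.array(x0, dtype=float)
    history = [x.copy()]                  # past iterates, to model staleness
    for _ in range(steps):
        stale = history[max(0, len(history) - 1 - delay)]
        x -= eta * grad_fn(stale)         # apply a delayed gradient
        history.append(x.copy())
    return x

# Quadratic objective f(x) = 0.5 * ||x||^2, so grad(x) = x, minimizer at 0
x_final = stale_sgd(lambda v: v, x0=np.ones(5) * 10.0)
print(np.linalg.norm(x_final))            # still converges despite staleness
```

With a small enough step size the delayed iteration still contracts toward the minimizer; pushing `eta` up while keeping `delay` fixed eventually destabilizes it, which is the basic tension asynchronous-SGD analyses quantify.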
"Efficient Dataset Distillation via Minimax Diffusion" with authors from Duke University and our V. Kungurtsev was accepted for CVPR. The paper harnesses generative diffusion models to synthesize a rich, representative synthetic dataset for retraining.
"A Deep Learning Blueprint for Relational Databases" by L. Zahradnik, J. Neumann, and @GustavSheer, presented last week at #NeurIPS2023's @TrlWorkshop, introduced a modular message-passing scheme for end-to-end deep learning from databases - see the paper!
openreview.net
We introduce a modular neural message-passing scheme that closely follows the formal model of relational databases, effectively enabling end-to-end deep learning directly from database storages. We...