Aleksandr V. Petrov (@asash)
Followers: 697 · Following: 1K · Media: 201 · Statuses: 1K
Senior Scientist (IR/RecSys/ML) @Tripadvisor | PhD @ University of Glasgow | Ex. Senior Software Engineer @Amazon | The opinions are mine
Glasgow · Joined January 2009
For anyone worried their LLM might be making stuff up, we made a budget‐friendly truth serum (semantic entropy + Bayesian). See for yourself: https://t.co/gq8oFP5Eqr Paper:
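For readers who only know the term from the tweet above, here is a minimal sketch of the semantic-entropy idea: sample several answers, cluster them by meaning, and measure the entropy of the cluster distribution — high entropy suggests the model is unsure and more likely to be making things up. This is only an illustration, not the authors' full method (the tweet also mentions a Bayesian component), and `are_equivalent` is a hypothetical stand-in for a bidirectional-entailment checker.

```python
import math

def semantic_entropy(answers, are_equivalent):
    """Estimate semantic entropy over answers sampled from an LLM for one prompt.

    answers: list of sampled answer strings.
    are_equivalent: callable(a, b) -> bool deciding whether two answers mean
        the same thing (e.g. via a bidirectional-entailment model).
    Returns entropy (in nats) over the semantic clusters.
    """
    clusters = []  # each cluster holds answers judged to have the same meaning
    for ans in answers:
        for cluster in clusters:
            if are_equivalent(cluster[0], ans):
                cluster.append(ans)
                break
        else:
            clusters.append([ans])

    n = len(answers)
    probs = [len(c) / n for c in clusters]
    return -sum(p * math.log(p) for p in probs)


# Toy usage: exact (case-insensitive) match as a stand-in for an entailment model.
samples = ["Paris", "paris", "Lyon", "Paris", "Marseille"]
print(semantic_entropy(samples, lambda a, b: a.lower() == b.lower()))
```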
LLMs for estimating positional bias in logged interaction data @asash et al. at Viator use LLMs to estimate position bias in logged user interaction data as an alternative to online experimentation. 📝 https://t.co/hoLhBe49KO
arxiv.org
Recommender and search systems commonly rely on Learning To Rank models trained on logged user interactions to order items by predicted relevance. However, such interaction data is often subject...
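The tweet only names the idea, so here is one rough way LLM relevance estimates could replace online randomisation when estimating position bias. This is an illustration under the examination hypothesis (P(click) = propensity(rank) × relevance), not the estimator from the paper; `llm_relevance` is a hypothetical callable.

```python
from collections import defaultdict

def estimate_position_bias(logged_impressions, llm_relevance):
    """Rough position-bias estimate from logged interaction data.

    logged_impressions: iterable of (query, item, rank, clicked) tuples.
    llm_relevance: callable(query, item) -> float in [0, 1], an LLM-based
        relevance estimate used instead of online experimentation.
    Returns propensities normalised so that rank 1 has propensity 1.0.
    """
    clicks, rel_sum, count = defaultdict(int), defaultdict(float), defaultdict(int)
    for query, item, rank, clicked in logged_impressions:
        clicks[rank] += int(clicked)
        rel_sum[rank] += llm_relevance(query, item)
        count[rank] += 1

    # propensity(rank) is proportional to CTR(rank) / mean LLM relevance at that rank.
    raw = {
        rank: (clicks[rank] / count[rank]) / max(rel_sum[rank] / count[rank], 1e-9)
        for rank in count
    }
    top = max(raw.get(1, max(raw.values())), 1e-9)
    return {rank: value / top for rank, value in sorted(raw.items())}
```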
Thrilled to join @asash on the Recsperts podcast! Thanks @MarcelKurovski for having us. We had a blast discussing our #RecSys research & transformer-based sequential recommendation. Tune in on your favorite podcast platform
recsperts.com
In episode 29 of Recsperts, I welcome Craig Macdonald, Professor of Information Retrieval at the University of Glasgow, and Aleksandr “Sasha” Petrov, PhD researcher and former applied scientist at ...
Huge congratulations to @macavaney on receiving the prestigious ACM SIGIR Early Career Researcher Award in the research category! This well-deserved recognition highlights the excellence & impact of his work in the IR community 👏🎉#sigir2025 Cc @GlasgowCS @UofGlasgow @ACMSIGIR
2nd day at #SIGIR2025 ☀️ After the first keynote, a great talk by @asash on combining joint product quantization and dynamic pruning to accelerate top-k computation. No need to compute all scores anymore! 😎 Work with @craig_macdonald and @ntonellotto .
Just presented our work at #SIGIR2025. @craig_macdonald gave a great overview. If you have any questions, feel free to catch me during the coffee break!
.@Asash is talking about our #sigir2025 📄 applying dynamic pruning ideas in sequential recommender systems, using sub-id representations w/ myself and @ntonellotto
If you're at #SIGIR2025 and interested in large-scale RecSys, pop by my talk on Monday! I'll be presenting our paper (w/ @craig_macdonald and @ntonellotto): 'Efficient Recommendation with Millions of Items by Dynamic Pruning of Sub-Item Embeddings'. 🔗
arxiv.org
A large item catalogue is a major challenge for deploying modern sequential recommender models, since it makes the memory footprint of the model large and increases inference latency. One...
📢 We're off to #SIGIR2025 in Padova! A large contingent of our students & staff will be at the main conference + tutorials, workshops & #ICTIR2025. Let’s connect if you’re around! 🤝 Also: we're hiring in #FinTech — DM us if you're interested! #IR #Hiring #Research #ACMSIGIR
Huge congratulations to our brilliant PhD graduates: @tjaenich, @mvlacho1, @asash - It’s been a joy having you @GlasgowCS. We’re so proud of all you’ve achieved and can’t wait to see what amazing things you’ll do next. Wishing you all the best! 🎉👏 #PhDGraduation #PhDSuccess
We’re excited to announce that WSDM 2026 will take place in Boise, Idaho, from February 22 to 26, 2026! Stay tuned and visit 🔗 https://t.co/vRdaZ9G7fV for updates! #WSDM2026
Happy to share that the OARS@KDD2025 workshop accepted our work. In this work, we turn sequential recommendation into semantic search using generative language models. Apparently, this works really well.
1/9 Happy to share that our paper GLoSS: Generative Language Models with Semantic Search for Sequential Recommendation is accepted at the KDD OARS workshop! 🎉 Paper, code: https://t.co/TrgHgCnuPC This is joint work with my wonderful collaborators @asash and Juba Ziani.
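As a rough illustration of the "generative language model + semantic search" recipe mentioned above (not the GLoSS implementation itself; `generate_query` and `embed` are hypothetical stand-ins for an LLM and a sentence-embedding model), the pipeline could look like this:

```python
import numpy as np

def recommend(history_titles, generate_query, embed, item_titles, item_embs, k=10):
    """Minimal sketch: turn a user's interaction history into a textual query
    with an LLM, then retrieve items by embedding similarity.

    history_titles: titles of items the user interacted with, in order.
    generate_query: callable(prompt) -> str (hypothetical LLM helper).
    embed: callable(text) -> np.ndarray (sentence-embedding model).
    item_titles / item_embs: catalogue titles and their pre-normalised embeddings.
    """
    prompt = (
        "The user recently interacted with: "
        + "; ".join(history_titles)
        + ". Write a short search query for the item they are most likely to want next."
    )
    query = generate_query(prompt)

    q = embed(query)
    q = q / np.linalg.norm(q)
    scores = item_embs @ q              # cosine similarity against the catalogue
    top = np.argsort(-scores)[:k]
    return [(item_titles[i], float(scores[i])) for i in top]
```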
Thanks to everyone involved! Extremely happy that no corrections were required!
Delighted that @asash passed his 🎓 PhD defense this morning, without corrections. Thanks to @pcastells and Nicolas Pugeault for their thorough examination of the thesis, and @mobilelearnfeed for convening the defense!
I'm delighted to announce that I'm joining @Amazon Edinburgh as a part-time Visiting Scholar, working on recommendations.
The pre-print of our #SIGIR2025 paper is now available at arXiv: https://t.co/gxJYGcJbTo! /w @craig_macdonald & @ntonellotto
arxiv.org
A large item catalogue is a major challenge for deploying modern sequential recommender models, since it makes the memory footprint of the model large and increases inference latency. One...
Efficient Recommendation with Millions of Items by Dynamic Pruning of Sub-Item Embeddings @asash et al. introduce a dynamic pruning algorithm that efficiently finds top-K items without computing scores for the entire catalogue 📝 https://t.co/T26DzlRx1U 👨🏽💻 https://t.co/YFPRBaBcc9
Now listening to David Wardrope, who is presenting our IR4Good paper (work done at Amazon; @kvachai is the lead author). Paper link: https://t.co/36VKVU0Aep
#ECIR2025
The paper continues our line of work on large-scale sequential recommendation. In this work, we show that it is possible to find the exact top-K highest-scored items without exhaustively scoring the full catalogue.
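To make that claim concrete: when each item's score is a sum of a few sub-ID (sub-item) embedding scores, an upper bound on the remaining contributions lets us stop early and still return the exact top-K. The sketch below is only an illustration of that idea with a simple first-split/tail bound; it is not the dynamic-pruning algorithm from the paper.

```python
import heapq
import numpy as np

def topk_pruned(query_emb, codebooks, item_codes, k):
    """Exact top-K over a sub-ID (product-quantised) catalogue with early termination.

    query_emb:  (m, d)    per-split query representation
    codebooks:  (m, b, d) sub-item embedding codebooks (m splits, b codes each)
    item_codes: (n, m)    sub-ID assignment of each of the n items
    """
    m = codebooks.shape[0]
    # Per-split score of every sub-ID against the query: shape (m, b).
    split_scores = np.einsum('md,mbd->mb', query_emb, codebooks)

    # Upper bound on the contribution of splits 2..m, shared by all items.
    tail_bound = split_scores[1:].max(axis=1).sum() if m > 1 else 0.0

    # Visit items in decreasing order of their first-split score.
    first = split_scores[0][item_codes[:, 0]]
    order = np.argsort(-first)

    heap = []  # min-heap of the best k (score, item) pairs found so far
    for idx in order:
        # Every remaining item scores at most first[idx] + tail_bound, so once
        # that bound drops below the current k-th best score we can stop.
        if len(heap) == k and first[idx] + tail_bound <= heap[0][0]:
            break
        score = split_scores[np.arange(m), item_codes[idx]].sum()
        if len(heap) < k:
            heapq.heappush(heap, (score, int(idx)))
        else:
            heapq.heappushpop(heap, (score, int(idx)))
    return sorted(heap, reverse=True)
```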