Andrii Semenov

ML Researcher

EPFL

Biography

I am a first-year Master’s student at École polytechnique fédérale de Lausanne (EPFL), School of Computer and Communication Sciences, working in the Machine Learning and Optimization Laboratory. I am also part of the MIPT-Yandex Fundamental Research team. I received my BSc from the Moscow Institute of Physics and Technology (MIPT) under the supervision of Aleksandr Beznosikov. My research interests include Federated Learning, Natural Language Processing, and applications of Stochastic Optimization in Deep Learning. During my school years, I participated in several International Olympiads in astronomy, physics, and astrophysics.

Name: Besides “Andrii Semenov”, I sometimes use the alternative spelling “Andrei Semenov”; both are fine.

Education

School of Computer and Communication Sciences (Data Science)
MSc at École polytechnique fédérale de Lausanne
September 2024 – Present · Lausanne, Switzerland

Boarding school with advanced study of Mathematics, Computer Science, and Physics
Ukrainian Physics and Mathematics Lyceum
September 2016 – August 2020 · Kyiv, Ukraine

Work Experience

Research Student
Machine Learning and Optimization Laboratory, EPFL
August 2024 – Present · Lausanne, Switzerland

Teaching Assistant
Department of Mathematical Fundamentals of Control, MIPT
January 2023 – August 2024 · Dolgoprudny, Russia

Deep Learning Engineer
Huawei-MIPT research group
November 2023 – October 2024 · Moscow, Russia

Research Student
MIPT-Yandex Fundamental Research Laboratory
July 2023 – October 2024 · Moscow, Russia

Research Student
Laboratory of Mathematical Methods of Optimization
July 2023 – October 2024 · Dolgoprudny, Russia

Laboratory Assistant
Laboratory of Fundamental and Applied Research of Relativistic Objects of the Universe
November 2022 – April 2024 · Dolgoprudny, Russia
Head: DSc Elena Nokhrina

Research Physicist
P. N. Lebedev Physical Institute
November 2022 – July 2023 · Moscow, Russia

Recent Papers

Just a Simple Transformation is Enough for Data Protection in Vertical Federated Learning
Mixed Newton Method for Optimization in Complex Spaces
Gradient Clipping Improves AdaGrad when the Noise Is Heavy-Tailed
Sparse Concept Bottleneck Models: Gumbel Tricks in Contrastive Learning
Bregman Proximal Method for Efficient Communications under Similarity

Projects

Llama-LoRa Project
My project on parameter-efficient fine-tuning of LLMs, with more than 10,000 downloads on HuggingFace.
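
For readers curious what parameter-efficient fine-tuning looks like in practice, here is a minimal sketch using the HuggingFace peft library. This is an illustration only, not the project’s actual code: the model id, adapter rank, and target modules below are placeholder choices.

```python
# Minimal LoRA sketch with HuggingFace transformers + peft.
# Illustration only -- not the project's actual code; the model id,
# rank, and target modules below are placeholder choices.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base_id = "meta-llama/Llama-2-7b-hf"  # placeholder model id
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)

# LoRA freezes the base weights and injects small trainable low-rank
# adapters (rank r) into the chosen projection matrices.
lora_config = LoraConfig(
    r=8,                                  # adapter rank
    lora_alpha=16,                        # adapter scaling factor
    target_modules=["q_proj", "v_proj"],  # Llama attention projections
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of all weights
```

Because only the small adapter weights need to be saved after training, such checkpoints are compact and easy to distribute on HuggingFace.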