Publications
Gradient optimization methods in machine learning for the identification of dynamic systems parameters Download
Mini-batch adaptive random search method for the parametric identification of dynamic systems Download
The mini-batch adaptive method of random search (MAMRS) for parameters optimization in the tracking control problem Download
Application of the mini-batch adaptive method of random search (MAMRS) in problems of optimal in mean control of the trajectory pencils Download
Application of Mini-Batch Metaheuristic algorithms in problems of optimization of deterministic systems with incomplete information about the state vector Download
- Andrei Panteleev, Aleksandr Lobanov
- 2021 Journal Paper Algorithms
Application of the Zero-Order Mini-Batch Optimization Method in the Tracking Control Problem Download
Application of Mini-Batch Adaptive Optimization Method in Stochastic Control Problems Download
Gradient-Free Federated Learning Methods with $l_1$ and $l_2$-Randomization for Non-Smooth Convex Stochastic Optimization Problems arXiv Download
Randomized gradient-free methods in convex optimization arXiv Download
- Alexander Gasnikov, Darina Dvinskikh, Pavel Dvurechensky, Eduard Gorbunov, Aleksander Beznosikov, Aleksandr Lobanov
- 2023 Journal Paper Encyclopedia of Optimization
Nonsmooth Distributed Min-Max Optimization Using the Smoothing Technique Download
Influence of the mantissa finiteness on the accuracy of gradient-free optimization methods Download
Zero-Order Stochastic Conditional Gradient Sliding Method for Non-smooth Convex Optimization arXiv Download
Stochastic Adversarial Noise in the "Black Box" Optimization Problem arXiv Download
Highly Smoothness Zero-Order Methods for Solving Optimization Problems under PL Condition arXiv Download
Upper bounds on maximum admissible noise in zeroth-order optimisation arXiv Download
- Dmitry Pasechnyuk, Aleksandr Lobanov, Alexander Gasnikov
- 2023 arXiv Preprint
Non-Smooth Setting of Stochastic Decentralized Convex Optimization Problem Over Time-Varying Graphs arXiv Download
Accelerated Zero-Order SGD Method for Solving the Black Box Optimization Problem under "Overparametrization" Condition arXiv Download
Gradient-Free Algorithms for Solving Stochastic Saddle Optimization Problems with the Polyak–Łojasiewicz Condition Download
Accelerated Zeroth-order Method for Non-Smooth Stochastic Convex Optimization Problem with Infinite Variance arXiv Download
Median Clipping for Zeroth-order Non-Smooth Convex Optimization and Multi Arm Bandit Problem with Heavy-tailed Symmetric Noise arXiv Download
- Nikita Kornilov, Yuriy Dorn, Aleksandr Lobanov, Nikolay Kutuzov, Innokentiy Shibaev, Eduard Gorbunov, Alexander Gasnikov, Alexander Nazin
- 2024 arXiv Preprint
Gradient-free algorithm for saddle point problems under overparametrization Download
- Ekaterina Statkevich, Sofiya Bondar, Darina Dvinskikh, Alexander Gasnikov, Aleksandr Lobanov
- 2024 Journal Paper Chaos, Solitons & Fractals
On Some Works of Boris Teodorovich Polyak on the Convergence of Gradient Methods and Their Development arXiv Download
Nesterov's method of dichotomy via Order Oracle: The problem of optimizing a two-variable function on a square arXiv Download
- Boris Chervonenkis, Andrei Krasnov, Alexander Gasnikov, Aleksandr Lobanov
- 2024 arXiv Preprint
Asymptotic Analysis of the Ruppert–Polyak Averaging for Stochastic Order Oracle arXiv Download
On Quasi-Convex Smooth Optimization Problems by a Comparison Oracle arXiv Download
Accelerated Zero-Order SGD under High-Order Smoothness and Overparameterized Regime arXiv Download
Improved Iteration Complexity in Black-Box Optimization Problems under Higher Order Smoothness Function Condition arXiv Download
- Aleksandr Lobanov
- 2024 arXiv Preprint
Improved Maximum Noise Level Estimation in Black-Box Optimization Problems Download
The Black-Box Optimization Problem: Zero-Order Accelerated Stochastic Method via Kernel Approximation arXiv Download
Acceleration Exists! Optimization Problems When Oracle Can Only Compare Objective Function Values arXiv Download
Linear Convergence Rate in Convex Setup is Possible! Gradient Descent Method Variants under $(L_0, L_1)$-Smoothness arXiv Download
- Aleksandr Lobanov, Alexander Gasnikov, Eduard Gorbunov, Martin Takáč
- 2024 arXiv Preprint
Power of Generalized Smoothness in Stochastic Convex Optimization: First- and Zero-Order Algorithms arXiv Download
- Aleksandr Lobanov, Alexander Gasnikov
- 2025 arXiv Preprint