Gradient optimization methods in machine learning for the identification of dynamic systems parameters

Mini-batch adaptive random search method for the parametric identification of dynamic systems

The mini-batch adaptive method of random search (MAMRS) for parameters optimization in the tracking control problem

Application of the mini-batch adaptive method of random search (MAMRS) in problems of optimal in mean control of the trajectory pencils

Application of Mini-Batch Metaheuristic algorithms in problems of optimization of deterministic systems with incomplete information about the state vector

Application of the Zero-Order Mini-Batch Optimization Method in the Tracking Control Problem

Application of Mini-Batch Adaptive Optimization Method in Stochastic Control Problems
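The entries above share one zero-order primitive: a mini-batch adaptive random search loop in which candidate parameters are sampled around the current point, compared on a random mini-batch, and the search radius is adapted to successes and failures. Below is a minimal generic sketch of that loop; it only illustrates the technique named in the titles, not the MAMRS algorithm from any of these papers, and every function and parameter name is invented for the example.

```python
import numpy as np

def minibatch_adaptive_random_search(loss, data, x0, iters=500, batch=32,
                                     radius=1.0, grow=1.5, shrink=0.7,
                                     seed=0):
    """Generic mini-batch adaptive random search (illustrative sketch only).

    loss(x, batch_data) -> float : objective evaluated on a mini-batch;
    data : array of samples; x0 : initial parameter vector.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # Compare the current point and the candidate on the SAME mini-batch
        # to reduce the variance of the accept/reject decision.
        idx = rng.choice(len(data), size=batch, replace=False)
        candidate = x + radius * rng.standard_normal(x.shape)
        if loss(candidate, data[idx]) < loss(x, data[idx]):
            x = candidate        # success: accept the move and widen the search
            radius *= grow
        else:
            radius *= shrink     # failure: keep the point and narrow the search
    return x
```

The same loop applies whether loss is an empirical risk over identification data or, as in the control-oriented entries, a cost averaged over a batch of simulated trajectories.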

Gradient-Free Federated Learning Methods with $l_1$ and $l_2$-Randomization for Non-Smooth Convex Stochastic Optimization Problems   arXiv

Randomized gradient-free methods in convex optimization   arXiv

  • Alexander Gasnikov, Darina Dvinskikh, Pavel Dvurechensky, Eduard Gorbunov, Aleksander Beznosikov, Aleksandr Lobanov
  • 2023 Journal Paper   Encyclopedia of Optimization
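A common building block behind the two gradient-free entries above is the two-point randomized finite-difference gradient estimator. The sketch below shows the standard $l_2$ (unit-sphere) randomization and a plain zeroth-order SGD loop around it; the $l_1$ variant replaces the sphere sampling with sampling from the $l_1$-sphere. This is the textbook construction, given for orientation only, not the exact estimator or step-size schedule analysed in these works, and all names are illustrative.

```python
import numpy as np

def two_point_l2_gradient_estimate(f, x, tau=1e-4, rng=np.random.default_rng(0)):
    """Standard two-point zero-order gradient estimator with l2-randomization.

    g(x) = d/(2*tau) * (f(x + tau*e) - f(x - tau*e)) * e,
    where e is uniform on the unit Euclidean sphere.  E[g] equals the gradient
    of the smoothed surrogate f_tau(x) = E_u[f(x + tau*u)], u uniform in the unit ball.
    """
    d = x.size
    e = rng.standard_normal(d)
    e /= np.linalg.norm(e)                      # uniform direction on the sphere
    return d * (f(x + tau * e) - f(x - tau * e)) / (2.0 * tau) * e

def zo_sgd(f, x0, steps=1000, lr=0.05, tau=1e-4, seed=0):
    """Plain zeroth-order SGD built on the estimator above (illustration only)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * two_point_l2_gradient_estimate(f, x, tau, rng)
    return x
```

Each step costs two function evaluations, and the dimension factor d in the estimator is what drives the dimension-dependent complexity of such methods.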

Nonsmooth Distributed Min-Max Optimization Using the Smoothing Technique

Influence of the mantissa finiteness on the accuracy of gradient-free optimization methods

Zero-Order Stochastic Conditional Gradient Sliding Method for Non-smooth Convex Optimization   arXiv

Stochastic Adversarial Noise in the "Black Box" Optimization Problem   arXiv

Highly Smoothness Zero-Order Methods for Solving Optimization Problems under PL Condition   arXiv

Upper bounds on maximum admissible noise in zeroth-order optimisation   arXiv

  • Dmitry Pasechnyuk, Aleksandr Lobanov, Alexander Gasnikov
  • 2023 arXiv Preprint  

Non-Smooth Setting of Stochastic Decentralized Convex Optimization Problem Over Time-Varying Graphs   arXiv

Accelerated Zero-Order SGD Method for Solving the Black Box Optimization Problem under "Overparametrization" Condition   arXiv

Gradient-Free Algorithms for Solving Stochastic Saddle Optimization Problems with the Polyak–Łojasiewicz Condition

Accelerated Zeroth-order Method for Non-Smooth Stochastic Convex Optimization Problem with Infinite Variance   arXiv

Median Clipping for Zeroth-order Non-Smooth Convex Optimization and Multi Arm Bandit Problem with Heavy-tailed Symmetric Noise   arXiv

  • Nikita Kornilov, Yuriy Dorn, Aleksandr Lobanov, Nikolay Kutuzov, Innokentiy Shibaev, Eduard Gorbunov, Alexander Gasnikov, Alexander Nazin
  • 2024 arXiv Preprint  

Gradient-free algorithm for saddle point problems under overparametrization

On Some Works of Boris Teodorovich Polyak on the Convergence of Gradient Methods and Their Development   arXiv

Nesterov's method of dichotomy via Order Oracle: The problem of optimizing a two-variable function on a square   arXiv

  • Boris Chervonenkis, Andrei Krasnov, Alexander Gasnikov, Aleksandr Lobanov
  • 2024 arXiv Preprint  

Acceleration Exists! Optimization Problems When Oracle Can Only Compare Objective Function Values   arXiv

  • Aleksandr Lobanov, Alexander Gasnikov, Andrei Krasnov
  • 2024 arXiv Preprint  

Improved Iteration Complexity in Black-Box Optimization Problems under Higher Order Smoothness Function Condition   arXiv

  • Aleksandr Lobanov
  • 2024 arXiv Preprint  

The Black-Box Optimization Problem: Zero-Order Accelerated Stochastic Method via Kernel Approximation   arXiv
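In the zeroth-order literature, "kernel approximation" usually refers to Polyak–Tsybakov-style estimators that exploit higher-order smoothness of the objective. A minimal sketch of that estimator is given below, assuming smoothness order beta = 2, for which the weighted kernel K(r) = 3r is admissible; it illustrates the general construction rather than the specific accelerated method of the paper above, and the names are illustrative.

```python
import numpy as np

def kernel_two_point_estimate(f, x, h=1e-2, rng=np.random.default_rng(0)):
    """Kernel-smoothed two-point gradient estimator (illustrative sketch).

    g = d/(2h) * (f(x + h*r*e) - f(x - h*r*e)) * K(r) * e,
    with e uniform on the unit sphere and r uniform on [-1, 1].
    K must satisfy E[K(r)] = 0 and E[r*K(r)] = 1; for smoothness order
    beta = 2 the weighted kernel K(r) = 3*r suffices, while higher beta
    uses higher-degree kernels to cancel additional bias terms.
    """
    d = x.size
    e = rng.standard_normal(d)
    e /= np.linalg.norm(e)                       # uniform direction on the sphere
    r = rng.uniform(-1.0, 1.0)
    K = 3.0 * r                                  # admissible kernel for beta = 2
    return d * (f(x + h * r * e) - f(x - h * r * e)) / (2.0 * h) * K * e
```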