A Derivative-free Two Level Random Search Method for Unconstrained Optimization

Book Details

  • Author : Neculai Andrei
  • Release Date : 2021-03-31
  • Publisher : Springer Nature
  • Genre : Mathematics
  • Pages : 126
  • ISBN-10 : 3030685179

Summary

The book is intended for graduate students and researchers in mathematics, computer science, and operational research. It presents a new derivative-free optimization method in which trial points are randomly generated in specified domains and the best ones are selected at each iteration according to a number of rules. The method differs from many other well-established methods in the literature and proves to be competitive for solving unconstrained optimization problems with different structures and complexities and a relatively large number of variables. Intensive numerical experiments with 140 unconstrained optimization problems, with up to 500 variables, have shown that this approach is efficient and robust.

The book is structured into four chapters. Chapter 1 is introductory. Chapter 2 presents a two-level derivative-free random search method for unconstrained optimization; it is assumed that the minimizing function is continuous, lower bounded, and that its minimum value is known. Chapter 3 proves the convergence of the algorithm. Chapter 4 reports the numerical performance of the algorithm on 140 unconstrained optimization problems, 16 of which are real applications, and shows that the optimization process has two phases: a reduction phase and a stalling phase. Finally, the performance of the algorithm on 30 large-scale unconstrained optimization problems with up to 500 variables is presented. These numerical results show that the two-level random search approach is able to solve a large diversity of problems with different structures and complexities. A number of open problems remain, concerning the selection of the number of trial points and of local trial points, the selection of the bounds of the domains in which the trial points and the local trial points are randomly generated, and a criterion for initiating the line search.
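
The book's specific selection rules are not reproduced on this page, but the general two-level idea described above can be sketched in a few lines of Python: at each iteration, trial points are drawn at random in a bounded domain, local trial points are drawn in a smaller neighbourhood of the current best point, and the best point found so far is kept. The sketch below is a generic illustration under these assumptions, not the author's exact algorithm; the function and parameter names (two_level_random_search, n_trial, n_local, local_radius, f_min) are hypothetical.

import numpy as np

def two_level_random_search(f, x0, lower, upper, n_trial=50, n_local=20,
                            local_radius=0.1, max_iter=1000,
                            f_min=None, tol=1e-6, seed=None):
    """Generic two-level random search sketch (illustrative, not the book's exact method)."""
    rng = np.random.default_rng(seed)
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    x_best = np.asarray(x0, dtype=float)
    f_best = f(x_best)
    for _ in range(max_iter):
        # Level 1: trial points drawn uniformly in the whole domain [lower, upper].
        trials = rng.uniform(lower, upper, size=(n_trial, x_best.size))
        # Level 2: local trial points drawn in a smaller box around the current best point.
        radius = local_radius * (upper - lower)
        local_trials = np.clip(
            x_best + rng.uniform(-radius, radius, size=(n_local, x_best.size)),
            lower, upper)
        # Keep the best point among the current best and all candidates.
        candidates = np.vstack([trials, local_trials])
        values = np.array([f(x) for x in candidates])
        i = int(np.argmin(values))
        if values[i] < f_best:
            x_best, f_best = candidates[i], values[i]
        # The book assumes the minimum value is known; stop when it is (nearly) reached.
        if f_min is not None and f_best - f_min <= tol:
            break
    return x_best, f_best

# Example: minimize the sphere function in two variables.
x_star, f_star = two_level_random_search(
    lambda x: float(np.sum(x**2)),
    x0=[3.0, -2.0], lower=[-5.0, -5.0], upper=[5.0, 5.0], f_min=0.0)

A full implementation would add the selection rules, the choice of the bounds of the trial and local-trial domains, and the line-search criterion listed among the open problems above; the sketch only fixes the overall two-level structure and the stopping test based on the known minimum value.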

Derivative-Free and Blackbox Optimization

This book is designed as a textbook, suitable for self-learning or for teaching an upper-year university course on derivative-free and blackbox optimization.

Modern Numerical Nonlinear Optimization

This book includes a thorough theoretical and computational analysis of unconstrained and constrained optimization algorithms and combines and integrates the mo…

Implicit Filtering

A description of the implicit filtering algorithm, its convergence theory, and a new MATLAB® implementation.