Lipschitz Global Optimization Algorithms

Dr. Sergeyev gave an interesting talk... see the announcement below.

The Lipschitz constant can be used for function optimization; curvature information can also help...
According to Dr. Sergeyev, his new method is much faster than the DIRECT algorithm. But I don't entirely buy it... the speedup probably depends on what the objective function looks like...
Anyway, it was a really interesting seminar.

BTW, one other thing: he is not going to make his code publicly available... instead, he wants to sell it... :)
And tomorrow morning he is talking about a new computing machine that can handle infinite numbers, which sounds weirdly great. :)

-----------------------------------------------------------------------------------------
Industrial & Systems Engineering Seminar

Yaroslav Sergeyev
Distinguished Professor
Dipartimento di Elettronica, Informatica e Sistemistica
Università della Calabria, Italy

3:00 p.m., Wednesday July 10, 2008
Room 203, Zachry

TITLE: "Lipschitz Global Optimization Algorithms"

ABSTRACT
Global optimization problems with multidimensional objective functions satisfying the Lipschitz condition over a hyperinterval with an unknown Lipschitz constant are considered. It is supposed that the objective function can be "black box", multiextremal, and non-differentiable. It is also assumed that evaluation of the objective function at a point is a time-consuming operation. Different techniques based on various adaptive partition strategies are analyzed. The main attention is dedicated to diagonal algorithms, since they have a number of attractive theoretical properties and have proved to be efficient in solving applied problems. In these algorithms, the search hyperinterval is adaptively partitioned into smaller hyperintervals and the objective function is evaluated only at two vertices corresponding to the main diagonal of the generated hyperintervals. It is demonstrated that the traditional diagonal partition strategies do not fulfill the requirements of computational efficiency because of executing many redundant evaluations of the objective function. A new adaptive diagonal partition strategy that allows one to avoid such computational redundancy is described. Some powerful multidimensional global optimization algorithms based on the new strategy are introduced. Results of extensive numerical experiments performed to test the methods proposed demonstrate their advantages with respect to diagonal algorithms in terms of both number of trials of the objective function and qualitative analysis of the search domain, which is characterized by the number of generated hyperintervals.
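To illustrate the basic idea the abstract builds on (not Dr. Sergeyev's diagonal method itself), here is a minimal sketch of classical 1D Lipschitz global minimization in the Piyavskii-Shubert style: the Lipschitz condition |f(x) - f(y)| <= L|x - y| gives each interval a guaranteed lower bound, and the search repeatedly splits the interval with the smallest bound. The function name and parameters are my own for illustration.

```python
import heapq

def lipschitz_minimize(f, a, b, L, n_evals=50):
    """Sketch of 1D Lipschitz global minimization (Piyavskii-Shubert style).

    If |f(x) - f(y)| <= L|x - y| on [a, b], then on any subinterval
    [u, v] the minimum of f is at least (f(u) + f(v)) / 2 - L*(v - u)/2.
    We keep a min-heap of subintervals keyed by this lower bound and
    always split the most promising one.
    """
    fa, fb = f(a), f(b)
    best_x, best_f = (a, fa) if fa <= fb else (b, fb)
    # Heap entries: (lower_bound, left, f(left), right, f(right))
    heap = [((fa + fb) / 2 - L * (b - a) / 2, a, fa, b, fb)]
    evals = 2
    while heap and evals < n_evals:
        lb, x0, f0, x1, f1 = heapq.heappop(heap)
        if lb >= best_f:
            continue  # this interval cannot contain a better minimum
        m = 0.5 * (x0 + x1)  # split at the midpoint
        fm = f(m)
        evals += 1
        if fm < best_f:
            best_x, best_f = m, fm
        for (u, fu, v, fv) in ((x0, f0, m, fm), (m, fm, x1, f1)):
            heapq.heappush(
                heap, ((fu + fv) / 2 - L * (v - u) / 2, u, fu, v, fv))
    return best_x, best_f

# Example: minimize (x - 0.3)^2 on [0, 1]; L = 2 is a valid Lipschitz
# constant since |f'(x)| = |2(x - 0.3)| <= 1.4 on this interval.
x_min, f_min = lipschitz_minimize(lambda x: (x - 0.3) ** 2,
                                  0.0, 1.0, L=2.0, n_evals=200)
```

Note how the counting of function evaluations mirrors the abstract's premise that each evaluation is expensive: the interval lower bounds let the search prune regions that provably cannot contain the global minimum, and the diagonal strategies discussed in the talk extend this pruning idea to hyperintervals in many dimensions.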

- H. Choi