[Joint work with A. Ismael F. Vaz (University of Minho)]
We have developed a new method for unconstrained and linearly constrained global optimization when derivatives of the objective function are unavailable or difficult to obtain. The goal is to derive a fully parallelizable method for the efficient and accurate solution of large-scale problems. It is not our aim to design an approach capable of determining all local minimizers of pathological problems of small dimension (say with fewer than 20 variables) with a high number of local minima. Instead, we target an approach that does reasonably well on such problems, but that is then capable of determining the main (or leading) local minimizers of higher-dimensional problems (say with hundreds or even thousands of variables).
The methodology is based on multistarting probabilistic direct search, which has been shown to be an efficient and robust method for optimizing without derivatives in the context of local optimization. The initial multistarted set of runs can be split according to criteria of clustering and variability in function values, or according to the solution of appropriate nonconvex model subproblems, in both cases by examining the data generated during a given run. Since multiple runs may converge to the same point, a merging procedure has also been developed. We provide numerical results on a large set of nonconvex unconstrained and bound-constrained problems.
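To make the two main ingredients concrete, the following is a minimal Python sketch of multistart over a probabilistic direct-search solver with a merging step. It is not the authors' implementation: the function names, the forcing function c·α², the number of random poll directions, and the distance-based merging tolerance are all illustrative assumptions.

```python
import numpy as np

def probabilistic_direct_search(f, x0, alpha0=1.0, tol=1e-6,
                                max_iter=500, n_dirs=2, rng=None):
    """Derivative-free local search polling a few random directions per
    iteration (probabilistic descent). A trial step is accepted only on
    sufficient decrease, measured by an (assumed) forcing function c*alpha^2;
    otherwise the step size alpha is halved."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    fx, alpha = f(x), alpha0
    for _ in range(max_iter):
        if alpha < tol:
            break
        improved = False
        for _ in range(n_dirs):
            d = rng.standard_normal(x.size)
            d /= np.linalg.norm(d)          # unit random poll direction
            for s in (d, -d):               # poll both orientations
                trial = x + alpha * s
                ft = f(trial)
                if ft < fx - 1e-4 * alpha**2:   # sufficient decrease
                    x, fx, improved = trial, ft, True
                    break
            if improved:
                break
        alpha = 2.0 * alpha if improved else 0.5 * alpha
    return x, fx

def multistart(f, starts, merge_tol=1e-2, rng=None):
    """Launch independent local searches (trivially parallelizable) and
    merge runs whose final iterates lie within merge_tol of each other,
    keeping the best function value of each merged group."""
    minimizers = []
    for x0 in starts:
        x, fx = probabilistic_direct_search(f, x0, rng=rng)
        for i, (y, fy) in enumerate(minimizers):
            if np.linalg.norm(x - y) < merge_tol:
                if fx < fy:
                    minimizers[i] = (x, fx)
                break
        else:
            minimizers.append((x, fx))
    return minimizers
```

For example, on the one-dimensional objective f(x) = (x² − 1)², four starts merge into the two local minimizers near x = ±1.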