Venue: Room 526, Academic Activities Room, Xingjian Building
Abstract: The Douglas-Rachford (DR) splitting algorithm is a classical first-order splitting method for solving maximal monotone inclusion problems. This work proposes two stepsize selection strategies that enhance the classical DR splitting algorithm in both non-convex and convex settings. The first method targets a class of non-convex, non-smooth optimization problems whose objective is the sum of a smooth convex function and a proper, closed function. We recast the original problem as minimizing the Douglas-Rachford envelope; at each iteration, the approximate gradient of the envelope serves as the search direction, and a Barzilai-Borwein stepsize determines the next iterate. The second algorithm addresses convex optimization problems. We present an adaptive stepsize that is set from local information about the objective function and requires only two extra function evaluations per iteration. We prove convergence of both algorithms, and numerical results verify their effectiveness and efficiency.
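To make the first strategy concrete, the following is a minimal illustrative sketch (not the speaker's actual algorithm): Douglas-Rachford splitting applied to a lasso problem min 0.5||Ax-b||² + λ||x||₁, where the DR residual y - x plays the role of an approximate (negative) gradient of the DR envelope and a safeguarded Barzilai-Borwein formula sets the stepsize. All function names, the safeguard bounds, and the test problem are assumptions made for illustration.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1, applied elementwise.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def dr_bb_lasso(A, b, lam, gamma=1.0, iters=200):
    """Illustrative DR splitting with a Barzilai-Borwein stepsize
    for min 0.5||Ax - b||^2 + lam * ||x||_1 (hypothetical sketch)."""
    m, n = A.shape
    # prox of gamma*f with f(x) = 0.5||Ax - b||^2, via a linear solve.
    M = np.eye(n) + gamma * (A.T @ A)
    Atb = A.T @ b
    prox_f = lambda s: np.linalg.solve(M, s + gamma * Atb)

    s = np.zeros(n)
    t = 1.0                      # initial stepsize (t = 1 is classical DR)
    s_prev, r_prev = None, None
    for _ in range(iters):
        x = prox_f(s)
        y = soft_threshold(2 * x - s, gamma * lam)
        r = y - x                # DR residual: surrogate for -grad of the envelope
        if r_prev is not None:
            ds = s - s_prev      # change in the iterate
            dr = -(r - r_prev)   # change in the gradient surrogate
            denom = dr @ dr
            if denom > 1e-12:
                # BB1 step, safeguarded to a bounded interval (assumed bounds).
                t = min(max(ds @ dr / denom, 1e-4), 10.0)
        s_prev, r_prev = s.copy(), r.copy()
        s = s + t * r
    return prox_f(s)
```

On a trivial instance with A = I, the BB step lands on the fixed point quickly and the method returns the soft-thresholded solution; the safeguard on t is a common practical device, not something stated in the abstract.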