Figure: iterates of the approximate Douglas–Rachford algorithm finding the intersection, with empty interior, of an ellipse and a half-plane, for ε = 0.245, ε = 0.120, and exact projections.
In this paper, we propose a new algorithm for solving the classical convex feasibility problem that combines the Douglas–Rachford (DR) algorithm with the Frank–Wolfe algorithm, also known as the conditional gradient (CondG) method. Within the algorithm, named the Approximate Douglas–Rachford (ApDR) algorithm, the CondG method is used as a subroutine to compute feasible inexact projections onto the sets under consideration, and the ApDR iteration is defined from the DR iteration. The ApDR algorithm generates two sequences: the main sequence, based on the DR iteration, and its corresponding shadow sequence. When the intersection of the feasible sets is nonempty, the main sequence converges to a fixed point of the usual DR operator, and the shadow sequence converges to a solution. We provide numerical experiments to illustrate the behaviour of the sequences produced by the proposed algorithm.
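The scheme the abstract describes can be sketched in a few lines. This is a minimal illustration under our own assumptions, not the authors' implementation: the two sets are Euclidean discs (whose linear minimization oracle has a closed form), the stopping rules are simple fixed tolerances, and the names `condg_project` and `ball_lmo` are ours. The key point is that the conditional gradient subroutine touches each set only through a linear minimization oracle, so the "projections" it returns are feasible but inexact, which is exactly the setting of the ApDR iteration.

```python
import numpy as np

def condg_project(y, lmo, x0, tol=1e-8, max_iter=500):
    """Feasible inexact projection of y: minimize 0.5*||x - y||^2 over a
    convex set accessed only through its linear minimization oracle `lmo`,
    using the conditional gradient (Frank-Wolfe) method with line search."""
    x = x0.copy()
    for _ in range(max_iter):
        g = x - y                       # gradient of 0.5*||x - y||^2
        s = lmo(g)                      # minimizes <g, .> over the set
        d = s - x
        gap = -(g @ d)                  # Frank-Wolfe duality gap
        if gap <= tol:
            break
        dd = d @ d
        t = min(1.0, gap / dd) if dd > 0 else 0.0  # exact line search, clipped
        x = x + t * d                   # convex combination: stays feasible
    return x

def ball_lmo(center, radius):
    """Linear minimization oracle for the Euclidean ball B(center, radius)."""
    def lmo(g):
        n = np.linalg.norm(g)
        return center - radius * g / n if n > 0 else center.copy()
    return lmo

# Two overlapping unit discs in R^2; their intersection is a lens.
lmo_A = ball_lmo(np.array([0.0, 0.0]), 1.0)
lmo_B = ball_lmo(np.array([1.5, 0.0]), 1.0)

z = np.array([2.0, 1.5])                            # main (DR) sequence
for _ in range(200):
    xa = condg_project(z, lmo_A, np.zeros(2))       # inexact P_A(z)
    xb = condg_project(2.0 * xa - z, lmo_B,
                       np.array([1.5, 0.0]))        # inexact P_B(2*P_A(z) - z)
    z = z + xb - xa                                 # Douglas-Rachford update

shadow = condg_project(z, lmo_A, np.zeros(2))       # shadow sequence iterate
print(shadow)
```

Since each CondG call starts from a feasible point and only ever moves toward oracle outputs, `shadow` lies in the first disc by construction, and after the DR loop it is also (approximately) in the second, i.e. near the intersection.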
Journal of Global Optimization – Springer Journals
Published: Jul 1, 2023
Keywords: Convex feasibility problem; Douglas–Rachford algorithm; Frank–Wolfe algorithm; Conditional gradient method; Inexact projections; 65K05; 90C30; 90C25