Abstract:
This paper focuses on efficient algorithms for computing the Dantzig selector, which was first proposed by Candès and Tao as an effective variable selection technique in linear regression. The paper first reformulates the Dantzig selector problem as an equivalent convex composite optimization problem and proposes a semismooth Newton augmented Lagrangian (Ssnal) algorithm to solve this equivalent form. It also applies a proximal point dual semismooth Newton (PpdSsn) algorithm to another equivalent form of the Dantzig selector problem. Comprehensive results on the global convergence and local asymptotically superlinear convergence of the Ssnal and PpdSsn algorithms are established under very mild conditions. The computational cost of the semismooth Newton algorithm for solving the subproblems in Ssnal and PpdSsn can be kept low by fully exploiting the second order sparsity and employing efficient numerical techniques. Numerical experiments on the Dantzig selector problem with synthetic and real data sets demonstrate that the Ssnal and PpdSsn algorithms substantially outperform state-of-the-art first order algorithms even at the low accuracy typically required, and that the proposed algorithms solve large-scale problems robustly and efficiently to relatively high accuracy. © 2021 Society for Industrial and Applied Mathematics.
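For context, the Dantzig selector estimates a sparse coefficient vector by minimizing the l1-norm subject to an l-infinity bound on the correlated residual, and this problem admits a well-known linear programming reformulation. The sketch below solves that LP with a generic solver (`scipy.optimize.linprog`); it is purely illustrative and is not the Ssnal or PpdSsn algorithm of the paper, which are designed to be far more scalable than a generic LP solver. The function name and the choice of solver backend are assumptions for the example.

```python
import numpy as np
from scipy.optimize import linprog

def dantzig_selector_lp(X, y, delta):
    """Illustrative sketch: solve the Dantzig selector
        min ||beta||_1  s.t.  ||X^T (y - X beta)||_inf <= delta
    via its standard LP reformulation with beta = u - v, u, v >= 0.
    (Not the paper's Ssnal/PpdSsn method; a generic LP solver only.)
    """
    n, p = X.shape
    G = X.T @ X          # p x p Gram matrix
    g = X.T @ y          # correlation of features with the response
    c = np.ones(2 * p)   # objective: sum(u) + sum(v) = ||beta||_1
    # Two-sided constraint |g - G(u - v)| <= delta, written as A_ub z <= b_ub
    A_ub = np.block([[-G, G], [G, -G]])
    b_ub = np.concatenate([delta - g, delta + g])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    u, v = res.x[:p], res.x[p:]
    return u - v

# Small synthetic check: recover a sparse vector from noisy observations.
rng = np.random.default_rng(0)
X = rng.standard_normal((30, 10))
beta_true = np.zeros(10)
beta_true[0] = 2.0
y = X @ beta_true + 0.01 * rng.standard_normal(30)
beta = dantzig_selector_lp(X, y, delta=1.0)
```

The returned `beta` satisfies the l-infinity constraint on the correlated residual, with the l1 penalty driving most coefficients toward zero.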
Source:
SIAM Journal on Scientific Computing
ISSN: 1064-8275
Year: 2021
Issue: 6
Volume: 43
Page: A4147-A4171
JCR@2021: 2.968
JCR@2023: 3.000
ESI HC Threshold: 36
JCR Journal Grade: 1
CAS Journal Grade: 2
Cited Count:
WoS CC Cited Count: 0
SCOPUS Cited Count: 1
ESI Highly Cited Papers on the List: 0