LEAST SQUARES OPTIMIZATION WITH L1-NORM REGULARIZATION

ABSTRACT

The non-differentiable L1-norm penalty in the L1-norm regularized least squares problem poses a major challenge to obtaining an analytic solution. This study therefore explores smoothing and non-smoothing approximations that yield a differentiable loss functional admitting a closed-form solution in over-determined systems. Three smoothing approximations to the L1-norm penalty are examined: the Quadratic, the Sigmoid and the Cubic Hermite. Tikhonov regularization is then applied to the resulting loss function. The approximations are modifications of Lee's approximation to the L1-norm term, and the regularized solution based on them is presented in various forms. Using the 12 x 12 Hilbert matrix, it is found that for all three methods a good approximation to the exact solution is obtained at a regularization parameter µ = 10^-30, with the solutions accurate to nine digits. In each approximation, a suitable value of the smoothing parameter is obtained for which the smoothed function approximates the absolute value in the L1-norm penalty. The Modified Newton's method based on Lee's approximation, however, achieves an accuracy of at most two digits. The solutions of the smoothing methods also compare favourably with those of the l1-ls method. An analytic solution of the L1-norm problem is also obtained by means of the sub-gradient method, after casting the constrained formulation as an unconstrained one. Sparsity of the Least Absolute Shrinkage and Selection Operator (LASSO) solution is pursued in two ways. First, the initial solution is expressed in terms of the singular value decomposition, so that by truncating the smaller singular values the desired sparsity is achieved, using a suitable regularization parameter obtained by K-fold cross-validation of the fit. Second, the LASSO solution itself is induced to ensure sparsity.
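The smoothing-plus-Tikhonov idea described above can be sketched in a few lines. The snippet below is a minimal illustration, not the thesis's own code: it assumes a sigmoid-type smoothing of |x| (namely x·tanh(αx/2), one common sigmoid variant) and the standard closed-form Tikhonov solution of the regularized normal equations; the function names and parameter values are hypothetical.

```python
import numpy as np

def smooth_abs_sigmoid(x, alpha=1e4):
    # Sigmoid-type smoothing of |x|: x*(2*sigmoid(alpha*x) - 1) = x*tanh(alpha*x/2).
    # Differentiable everywhere (including x = 0) and -> |x| as alpha -> infinity.
    return x * np.tanh(alpha * x / 2.0)

def tikhonov_solve(A, b, mu):
    # Closed-form minimizer of ||A x - b||^2 + mu * ||x||^2:
    #   x = (A^T A + mu I)^{-1} A^T b.
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + mu * np.eye(n), A.T @ b)

# Over-determined example: a smooth penalty keeps the loss differentiable,
# and the regularized normal equations give a closed-form solution.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 3.1])
x = tikhonov_solve(A, b, mu=1e-8)
```

As µ shrinks toward zero, the Tikhonov solution approaches the ordinary least squares solution, which is consistent with the small µ reported above for recovering the exact solution.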
The results show that the LASSO formulation and solution must be appropriately designed for certain types of datasets, particularly those that are severely ill-conditioned and those with monotone trends.
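The truncated-SVD route to a stable solution mentioned above can be sketched as follows. This is a minimal illustration under stated assumptions, not the thesis's implementation: it builds the 12 x 12 Hilbert matrix referenced in the abstract, and the truncation level k and helper names are hypothetical (in the study, the regularization/truncation level is chosen by K-fold cross-validation).

```python
import numpy as np

def hilbert(n):
    # Hilbert matrix H[i, j] = 1 / (i + j - 1), notoriously ill-conditioned.
    i = np.arange(1, n + 1)
    return 1.0 / (i[:, None] + i[None, :] - 1.0)

def tsvd_solve(A, b, k):
    # Least squares solution keeping only the k largest singular values;
    # the small singular values, which amplify noise, are truncated away.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])

n = 12
H = hilbert(n)
x_true = np.ones(n)
b = H @ x_true
x_trunc = tsvd_solve(H, b, k=8)  # discard the 4 smallest singular values
```

Because the discarded singular values of the 12 x 12 Hilbert matrix are tiny, the truncated solution still reproduces the right-hand side closely while avoiding the instability of inverting them.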

APA

NKANSAH, H. (2021). LEAST SQUARES OPTIMIZATION WITH L1-NORM REGULARIZATION. Afribary. Retrieved from https://track.afribary.com/works/least-squares-optimization-with-l-1-norm-regularization

MLA 8th

NKANSAH, HENRIETTA. "LEAST SQUARES OPTIMIZATION WITH L1-NORM REGULARIZATION." Afribary, 08 Mar. 2021, https://track.afribary.com/works/least-squares-optimization-with-l-1-norm-regularization. Accessed 23 Nov. 2024.
