Why LASSO, Ridge Regression, and EN: Explanation Based on Soft Computing

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

  • Woraphon Yamaka
  • Hamza Alkhatib
  • Ingo Neumann
  • Vladik Kreinovich

Research Organisations

External Research Organisations

  • Chiang Mai University
  • University of Texas at El Paso

Details

Original language: English
Title of host publication: Prediction and Causality in Econometrics and Related Topics
Editors: Nguyen Ngoc Thach, Doan Thanh Ha, Nguyen Duc Trung, Vladik Kreinovich
Place of Publication: Cham
Pages: 123-130
Number of pages: 8
ISBN (electronic): 978-3-030-77094-5
Publication status: Published - 27 Jul 2021
Event: Fourth International Econometric Conference of Vietnam - Ho Chi Minh City, Viet Nam
Duration: 11 Jan 2021 - 13 Jan 2021
Conference number: 4

Publication series

Name: Studies in Computational Intelligence
Volume: 983
ISSN (Print): 1860-949X
ISSN (electronic): 1860-9503

Abstract

In many practical situations, observations and measurement results are consistent with many different models, i.e., the corresponding problem is ill-posed. In such situations, a reasonable idea is to take into account that the values of the corresponding parameters should not be too large; this idea is known as regularization. Several different regularization techniques have been proposed; empirically, the most successful are the LASSO method, in which we bound the sum of the absolute values of the parameters; the ridge regression method, in which we bound the sum of the squares; and the EN (Elastic Net) method, in which these two approaches are combined. In this paper, we explain the empirical success of these methods by showing that they can be naturally derived from soft computing ideas.
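
The three penalties compared in the abstract can be sketched numerically. The following is an illustration only, not code from the paper: it implements ridge regression via its closed form and LASSO / Elastic Net via cyclic coordinate descent with soft thresholding; all function names and the solver choice are ours.

```python
import numpy as np

def ridge(X, y, lam):
    """Ridge: minimize ||y - X w||^2 + lam * sum(w_j^2).
    Closed form: w = (X^T X + lam I)^{-1} X^T y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def soft_threshold(z, t):
    """Shrink z toward 0 by t; the exact zeros this produces
    are what makes the LASSO solution sparse."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def elastic_net(X, y, lam1, lam2, n_iter=500):
    """Elastic Net: minimize ||y - X w||^2 + lam1 * sum(|w_j|) + lam2 * sum(w_j^2),
    by cyclic coordinate descent. lam2 = 0 gives LASSO, lam1 = 0 gives ridge."""
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # residual with feature j's current contribution added back
            r_j = y - X @ w + X[:, j] * w[j]
            z = X[:, j] @ r_j
            w[j] = soft_threshold(z, lam1 / 2) / (X[:, j] @ X[:, j] + lam2)
    return w
```

With lam1 = lam2 = 0 both solvers reduce to ordinary least squares; increasing lam1 drives some coefficients exactly to zero (variable selection), while lam2 only shrinks them toward zero.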

Cite this

Why LASSO, Ridge Regression, and EN: Explanation Based on Soft Computing. / Yamaka, Woraphon; Alkhatib, Hamza; Neumann, Ingo et al.
Prediction and Causality in Econometrics and Related Topics. ed. / Nguyen Ngoc Thach; Doan Thanh Ha; Nguyen Duc Trung; Vladik Kreinovich. Cham, 2021. p. 123-130 (Studies in Computational Intelligence; Vol. 983).


Yamaka, W, Alkhatib, H, Neumann, I & Kreinovich, V 2021, Why LASSO, Ridge Regression, and EN: Explanation Based on Soft Computing. in N Ngoc Thach, DT Ha, ND Trung & V Kreinovich (eds), Prediction and Causality in Econometrics and Related Topics. Studies in Computational Intelligence, vol. 983, Cham, pp. 123-130, Fourth International Econometric Conference of Vietnam, Ho Chi Minh City, Viet Nam, 11 Jan 2021. https://doi.org/10.1007/978-3-030-77094-5_12
Yamaka, W., Alkhatib, H., Neumann, I., & Kreinovich, V. (2021). Why LASSO, Ridge Regression, and EN: Explanation Based on Soft Computing. In N. Ngoc Thach, D. T. Ha, N. D. Trung, & V. Kreinovich (Eds.), Prediction and Causality in Econometrics and Related Topics (pp. 123-130). (Studies in Computational Intelligence; Vol. 983). https://doi.org/10.1007/978-3-030-77094-5_12
Yamaka W, Alkhatib H, Neumann I, Kreinovich V. Why LASSO, Ridge Regression, and EN: Explanation Based on Soft Computing. In Ngoc Thach N, Ha DT, Trung ND, Kreinovich V, editors, Prediction and Causality in Econometrics and Related Topics. Cham. 2021. p. 123-130. (Studies in Computational Intelligence). doi: 10.1007/978-3-030-77094-5_12
Yamaka, Woraphon; Alkhatib, Hamza; Neumann, Ingo et al. / Why LASSO, Ridge Regression, and EN: Explanation Based on Soft Computing. Prediction and Causality in Econometrics and Related Topics. editor / Nguyen Ngoc Thach; Doan Thanh Ha; Nguyen Duc Trung; Vladik Kreinovich. Cham, 2021. pp. 123-130 (Studies in Computational Intelligence).
BibTeX
@inproceedings{d53d5568bdf449feb5bbf33850b3e9a4,
title = "Why LASSO, Ridge Regression, and EN: Explanation Based on Soft Computing",
abstract = "In many practical situations, observations and measurement results are consistent with many different models–i.e., the corresponding problem is ill-posed. In such situations, a reasonable idea is to take into account that the values of the corresponding parameters should not be too large; this idea is known as regularization. Several different regularization techniques have been proposed; empirically the most successful are LASSO method, when we bound the sum of absolute values of the parameters, ridge regression method, when we bound the sum of the squares, and a EN method in which these two approaches are combined. In this paper, we explain the empirical success of these methods by showing that these methods can be naturally derived from soft computing ideas.",
author = "Woraphon Yamaka and Hamza Alkhatib and Ingo Neumann and Vladik Kreinovich",
note = "Funding Information: Acknowledgments. The first author is grateful for the financial support of the Center of Excellence in Econometrics, Chiang Mai University, Thailand. Funding Information: This work was also supported by the Institute of Geodesy, Leibniz University of Hannover, and by the US National Science Foundation grants 1623190 (A Model of Change for Preparing a New Generation for Professional Practice in Computer Science) and HRD-1242122 (Cyber-ShARE Center of Excellence). ; Fourth International Econometric Conference of Vietnam, ECONVN2021 ; Conference date: 11-01-2021 Through 13-01-2021",
year = "2021",
month = jul,
day = "27",
doi = "10.1007/978-3-030-77094-5_12",
language = "English",
isbn = "9783030770938",
series = "Studies in Computational Intelligence",
pages = "123--130",
editor = "{Ngoc Thach}, Nguyen and Ha, {Doan Thanh} and Trung, {Nguyen Duc} and Vladik Kreinovich",
booktitle = "Prediction and Causality in Econometrics and Related Topics",

}

RIS

TY - GEN

T1 - Why LASSO, Ridge Regression, and EN

T2 - Fourth International Econometric Conference of Vietnam

AU - Yamaka, Woraphon

AU - Alkhatib, Hamza

AU - Neumann, Ingo

AU - Kreinovich, Vladik

N1 - Conference code: 4

PY - 2021/7/27

Y1 - 2021/7/27

N2 - In many practical situations, observations and measurement results are consistent with many different models–i.e., the corresponding problem is ill-posed. In such situations, a reasonable idea is to take into account that the values of the corresponding parameters should not be too large; this idea is known as regularization. Several different regularization techniques have been proposed; empirically the most successful are LASSO method, when we bound the sum of absolute values of the parameters, ridge regression method, when we bound the sum of the squares, and a EN method in which these two approaches are combined. In this paper, we explain the empirical success of these methods by showing that these methods can be naturally derived from soft computing ideas.

AB - In many practical situations, observations and measurement results are consistent with many different models–i.e., the corresponding problem is ill-posed. In such situations, a reasonable idea is to take into account that the values of the corresponding parameters should not be too large; this idea is known as regularization. Several different regularization techniques have been proposed; empirically the most successful are LASSO method, when we bound the sum of absolute values of the parameters, ridge regression method, when we bound the sum of the squares, and a EN method in which these two approaches are combined. In this paper, we explain the empirical success of these methods by showing that these methods can be naturally derived from soft computing ideas.

UR - http://www.scopus.com/inward/record.url?scp=85113375847&partnerID=8YFLogxK

U2 - 10.1007/978-3-030-77094-5_12

DO - 10.1007/978-3-030-77094-5_12

M3 - Conference contribution

SN - 9783030770938

T3 - Studies in Computational Intelligence

SP - 123

EP - 130

BT - Prediction and Causality in Econometrics and Related Topics

A2 - Ngoc Thach, Nguyen

A2 - Ha, Doan Thanh

A2 - Trung, Nguyen Duc

A2 - Kreinovich, Vladik

CY - Cham

Y2 - 11 January 2021 through 13 January 2021

ER -
