Optimization of Sparsity-Constrained Neural Networks as a Mixed Integer Linear Program: NN2MILP

Research output: Contribution to journal › Article › Research › peer review

Authors

  • Bodo Rosenhahn


Details

Original language: English
Pages (from-to): 931-954
Number of pages: 24
Journal: Journal of Optimization Theory and Applications
Volume: 199
Early online date: 25 Oct 2023
Publication status: Published - Dec 2023

Abstract

The literature has shown how to optimize and analyze the parameters of different types of neural networks using mixed integer linear programs (MILP). Building on these developments, this work presents an approach to do so for McCulloch/Pitts and Rosenblatt neurons. As the original formulation involves a step function, it is not differentiable; it is nevertheless possible to optimize the parameters of such neurons, and of their concatenation as a shallow neural network, using a mixed integer linear program. The main contribution of this paper is to additionally enforce sparsity constraints on the weights and activations, as well as on the number of neurons used. Several experiments demonstrate that such constraints effectively prevent overfitting in neural networks and yield resource-optimized models.
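The approach sketched in the abstract can be made concrete with a small worked example. The following is a minimal, hypothetical sketch, not the paper's actual formulation: it trains a single McCulloch/Pitts-style neuron y = step(w·x − b) on toy binary data as a MILP, encoding the non-differentiable step function with margin constraints and an L0-style sparsity budget with binary indicator variables. The PuLP modelling library, the toy dataset, and the constants M, eps and k are all illustrative choices, not taken from the paper.

import pulp

# Toy binary dataset: the target is AND(x1, x2); x3 is an irrelevant input
# that the sparsity budget should switch off.
X = [(0, 0, 0), (0, 1, 1), (1, 0, 0), (1, 1, 1), (1, 1, 0), (0, 0, 1)]
y = [0, 0, 0, 1, 1, 0]
d = 3
M, eps, k = 10.0, 0.5, 2  # big-M weight bound, decision margin, weight budget (assumed values)

prob = pulp.LpProblem("sparse_step_neuron", pulp.LpMinimize)
w = [pulp.LpVariable(f"w{j}", lowBound=-M, upBound=M) for j in range(d)]
b = pulp.LpVariable("b", lowBound=-M, upBound=M)
s = [pulp.LpVariable(f"s{j}", cat="Binary") for j in range(d)]  # s_j = 1 iff w_j may be nonzero

# Objective: among all neurons that fit the data, prefer the sparsest one.
prob += pulp.lpSum(s)

# Step-function behaviour as linear constraints: the pre-activation w.x - b
# must clear the margin eps on the correct side for every training sample.
for xi, yi in zip(X, y):
    pre = pulp.lpSum(w[j] * xi[j] for j in range(d)) - b
    prob += (pre >= eps) if yi == 1 else (pre <= -eps)

# L0-style sparsity: w_j is forced to 0 unless its indicator is on,
# and at most k indicators may be active.
for j in range(d):
    prob += w[j] <= M * s[j]
    prob += w[j] >= -M * s[j]
prob += pulp.lpSum(s) <= k

prob.solve(pulp.PULP_CBC_CMD(msg=False))  # CBC is PuLP's bundled default solver
print("status :", pulp.LpStatus[prob.status])
print("weights:", [v.value() for v in w])
print("bias   :", b.value())

On this toy problem the only feasible two-weight support uses the first two inputs, so the solver should switch off the noise input and recover a two-weight neuron. The same indicator-variable pattern extends to budgets on activations and on the number of active neurons when several such neurons are concatenated into a shallow network, which is the setting the paper addresses.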

Keywords

    Feature selection, Mixed integer linear programming, Neural networks, Resource optimization, Sparse networks

Cite this

Optimization of Sparsity-Constrained Neural Networks as a Mixed Integer Linear Program: NN2MILP. / Rosenhahn, Bodo.
In: Journal of Optimization Theory and Applications, Vol. 199, 12.2023, p. 931-954.

Research output: Contribution to journal › Article › Research › peer review

BibTeX
@article{72db1424fda24f55a19bede96cb4e6bd,
title = "Optimization of Sparsity-Constrained Neural Networks as a Mixed Integer Linear Program: NN2MILP",
abstract = "The literature has shown how to optimize and analyze the parameters of different types of neural networks using mixed integer linear programs (MILP). Building on these developments, this work presents an approach to do so for a McCulloch/Pitts and Rosenblatt neurons. As the original formulation involves a step-function, it is not differentiable, but it is possible to optimize the parameters of neurons, and their concatenation as a shallow neural network, by using a mixed integer linear program. The main contribution of this paper is to additionally enforce sparsity constraints on the weights and activations as well as on the amount of used neurons. Several experiments demonstrate that such constraints effectively prevent overfitting in neural networks, and ensure resource optimized models.",
keywords = "Feature selection, Mixed integer linear programming, Neural networks, Resource optimization, Sparse networks",
author = "Bodo Rosenhahn",
note = "Funding Information: The author received support from Leibniz Universit{\"a}t Hannover, Germany, and thanks the colleagues who provided the datasets used in this manuscript. All datasets are publicly available. ",
year = "2023",
month = dec,
doi = "10.1007/s10957-023-02317-x",
language = "English",
volume = "199",
pages = "931--954",
journal = "Journal of Optimization Theory and Applications",
issn = "0022-3239",
publisher = "Springer New York",

}

RIS

TY - JOUR

T1 - Optimization of Sparsity-Constrained Neural Networks as a Mixed Integer Linear Program

T2 - NN2MILP

AU - Rosenhahn, Bodo

N1 - Funding Information: The author received support from Leibniz Universität Hannover, Germany, and thanks the colleagues who provided the datasets used in this manuscript. All datasets are publicly available.

PY - 2023/12

Y1 - 2023/12

N2 - The literature has shown how to optimize and analyze the parameters of different types of neural networks using mixed integer linear programs (MILP). Building on these developments, this work presents an approach to do so for McCulloch/Pitts and Rosenblatt neurons. As the original formulation involves a step function, it is not differentiable; it is nevertheless possible to optimize the parameters of such neurons, and of their concatenation as a shallow neural network, using a mixed integer linear program. The main contribution of this paper is to additionally enforce sparsity constraints on the weights and activations, as well as on the number of neurons used. Several experiments demonstrate that such constraints effectively prevent overfitting in neural networks and yield resource-optimized models.

AB - The literature has shown how to optimize and analyze the parameters of different types of neural networks using mixed integer linear programs (MILP). Building on these developments, this work presents an approach to do so for McCulloch/Pitts and Rosenblatt neurons. As the original formulation involves a step function, it is not differentiable; it is nevertheless possible to optimize the parameters of such neurons, and of their concatenation as a shallow neural network, using a mixed integer linear program. The main contribution of this paper is to additionally enforce sparsity constraints on the weights and activations, as well as on the number of neurons used. Several experiments demonstrate that such constraints effectively prevent overfitting in neural networks and yield resource-optimized models.

KW - Feature selection

KW - Mixed integer linear programming

KW - Neural networks

KW - Resource optimization

KW - Sparse networks

UR - http://www.scopus.com/inward/record.url?scp=85174828680&partnerID=8YFLogxK

U2 - 10.1007/s10957-023-02317-x

DO - 10.1007/s10957-023-02317-x

M3 - Article

AN - SCOPUS:85174828680

VL - 199

SP - 931

EP - 954

JO - Journal of Optimization Theory and Applications

JF - Journal of Optimization Theory and Applications

SN - 0022-3239

ER -