Nonsmooth optimization-based neural networks for regression
LMBNNR
LMBNNR is a nonsmooth optimization-based, hyperparameter-free algorithm for solving large-scale regression problems. The regression problem is modelled using fully-connected feedforward neural networks with one hidden layer, piecewise linear activations, and the L1-loss function. The resulting nonsmooth objective is minimized using the limited memory bundle method (LMBM). In addition, a novel incremental approach automatically determines a suitable number of hidden nodes.
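To illustrate the kind of oracle such a method needs, the L1-loss objective and one valid subgradient for a one-hidden-layer network can be sketched as below. This is a hedged Python sketch using ReLU as the piecewise linear activation and a flat parameter vector `w`; the actual Fortran routine in objfun.f03 may use a different activation and parameterization.

```python
import numpy as np

def nnr_objective(w, X, y, h):
    """L1-loss and a subgradient for a one-hidden-layer network with
    ReLU activation (one piecewise linear choice). Illustrative sketch
    only, not the LMBNNR implementation.
    w packs W1 (h x d), b1 (h), w2 (h), b2 (scalar), in that order."""
    n, d = X.shape
    W1 = w[:h * d].reshape(h, d)
    b1 = w[h * d:h * d + h]
    w2 = w[h * d + h:h * d + 2 * h]
    b2 = w[h * d + 2 * h]
    Z = np.maximum(X @ W1.T + b1, 0.0)          # hidden-layer outputs, n x h
    r = Z @ w2 + b2 - y                         # residuals
    f = np.abs(r).sum()                         # nonsmooth L1 loss
    s = np.sign(r)                              # subgradient of |.| (0 at kinks)
    A = (Z > 0.0) * (s[:, None] * w2[None, :])  # n x h chain-rule factor
    g = np.concatenate([(A.T @ X).ravel(),      # w.r.t. W1
                        A.sum(axis=0),          # w.r.t. b1
                        Z.T @ s,                # w.r.t. w2
                        [s.sum()]])             # w.r.t. b2
    return f, g
```

A bundle method such as LMBM repeatedly queries this (value, subgradient) oracle; the incremental scheme described in the references then adjusts the hidden-layer size h automatically.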
The software is free for academic teaching and research purposes, but please cite the reference given below if you use it.
Code
| File | Description |
|---|---|
| lmbnnr.f03 | Main program for LMBNNR. |
| initlmbnnr.f03 | Initialization of LMBNNR and LMBM. |
| parameters.f03 | Global parameters and constants. |
| objfun.f03 | Computation of the objective value and a subgradient for the NNR problem. |
| lmbm.f03 | The limited memory bundle method. |
| subpro.f03 | Subproblems for the LMBM. |
| Makefile | Makefile. |
| lmbnnr.zip | All of the above in compressed form. |
For usage instructions, compile the code (using make) and run ./lmbnnr without any arguments. Note that the code is easy to use even if you do not know Fortran.
References
- N. Karmitsa, S. Taheri, K. Joki, P. Paasivirta, A. Bagirov, and M.M. Mäkelä, "Nonsmooth Optimization-Based Hyperparameter-Free Neural Networks for Large-Scale Regression", Algorithms, 16(9), 444, 2023. Special Issue "Machine Learning Algorithms for Big Data Analysis". https://doi.org/10.3390/a16090444.
- N. Karmitsa, S. Taheri, K. Joki, P. Mäkinen, A. Bagirov, and M.M. Mäkelä, "Hyperparameter Free NN Algorithm for Large-Scale Regression Problems", TUCS Technical Report, No. 1213, Turku Centre for Computer Science, Turku, 2020.
Acknowledgements
The work was financially supported by the Academy of Finland (Projects No. 289500 and 319274).