Nonsmooth optimization based neural networks for regression

"If there is a problem you can't solve, then there is an easier problem you can solve: find it."
- George Polya


LMBNNR is a nonsmooth optimization based, hyperparameter-free algorithm for solving large-scale regression problems. The regression problem is modelled using a fully connected feedforward neural network with one hidden layer, piecewise linear activation functions, and the L1-loss function. The resulting nonsmooth objective is minimized using the limited memory bundle method (LMBM). In addition, a novel incremental approach automatically determines a suitable number of hidden nodes.
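
To make the structure of the problem concrete, below is a minimal sketch (in plain Fortran, with illustrative names, dimensions, and random data that are assumptions rather than part of LMBNNR) of how an L1-loss objective for a one-hidden-layer network with a piecewise linear, ReLU-type activation can be evaluated. The actual objective and subgradient computations of LMBNNR are in objfun.f03.

! Illustrative sketch only; not the actual objfun.f03 of LMBNNR.
! Evaluates the L1-loss of a one-hidden-layer network with a
! ReLU-type piecewise linear activation on random data. All names,
! dimensions, and the exact activation are assumptions made for
! illustration.
program nnr_sketch
  implicit none
  integer, parameter :: dp = kind(1.0d0)
  integer, parameter :: n  = 2          ! number of input features
  integer, parameter :: nh = 3          ! number of hidden nodes
  integer, parameter :: m  = 4          ! number of data points
  real(dp) :: x(n,m), y(m)              ! inputs and targets
  real(dp) :: w1(nh,n), b1(nh)          ! hidden-layer weights and biases
  real(dp) :: w2(nh), b2                ! output-layer weights and bias
  real(dp) :: h(nh), yhat, f
  integer  :: i, j

  call random_number(x);  call random_number(y)
  call random_number(w1); call random_number(b1)
  call random_number(w2); call random_number(b2)

  f = 0.0_dp
  do i = 1, m
     do j = 1, nh
        ! piecewise linear (ReLU-type) activation
        h(j) = max(0.0_dp, dot_product(w1(j,:), x(:,i)) + b1(j))
     end do
     yhat = dot_product(w2, h) + b2
     f = f + abs(y(i) - yhat)           ! nonsmooth L1-loss term
  end do
  write (*,*) 'L1 objective value:', f
end program nnr_sketch

Because both the activation and the L1-loss are nonsmooth, the objective is not differentiable everywhere; LMBM works with subgradients instead, which is why objfun.f03 returns both the objective value and a subgradient.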

The software is free for academic teaching and research purposes, but I ask you to cite the reference given below if you use it.


lmbnnr.f03 - Main program for LMBNNR.
initlmbnnr.f03 - Initialization of LMBNNR and LMBM.
parameters.f03 - Global parameters and constants.
objfun.f03 - Computation of the objective and the subgradient for the NNR problem.
lmbm.f03 - The limited memory bundle method.
subpro.f03 - Subproblems for the LMBM.
Makefile - Makefile.

lmbnnr.zip - All of the above files as a compressed archive.

For usage instructions, compile the code (using make) and run ./lmbnnr without any arguments. Note that the code is easy to use even if you do not know Fortran.
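
For example, in the directory containing the extracted files:

  make
  ./lmbnnr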



The work was financially supported by the Academy of Finland (Project Nos. 289500 and 319274).