This work is dedicated to my son “BERGHOUT Loukmane”.
Description: this code illustrates in detail how to train a single-hidden-layer feedforward network (SLFN) for both classification and regression by solving a linear problem with L1-norm optimization.
The following references contain the essential mathematics needed to develop the code.
 R. G. Baraniuk, “Compressive Sensing [Lecture Notes],” IEEE Signal Processing Magazine, vol. 24, no. 4, pp. 118–121, July 2007.
 M. W. Fakhr, E. N. S. Youssef, and M. S. El-Mahallawy, “L1-regularized least squares sparse extreme learning machine for classification,” 2015 Int. Conf. on Information and Communication Technology Research (ICTRC), pp. 222–225, 2015.
 G.-B. Huang, H. Zhou, X. Ding, and R. Zhang, “Extreme Learning Machine for Regression and Multiclass Classification,” IEEE Trans. Syst., Man, Cybern. B, vol. 42, no. 2, pp. 513–529, 2012.
 E. Candès and J. Romberg, “L1-Magic toolbox,” Caltech.
BERGHOUT Tarek (2019). Training of Sparse Neural Network (https://www.mathworks.com/matlabcentral/fileexchange/71991-training-of-sparse-neural-network), MATLAB Central File Exchange. Retrieved .
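The idea described above can be sketched briefly: hidden-layer weights are fixed at random (extreme-learning-machine style), so only the linear output weights remain, and these are obtained by L1-regularized least squares. The sketch below is a minimal Python illustration, not the MATLAB submission itself; the toy data, the hyperparameters (50 hidden units, lam = 0.1), and the ISTA solver standing in for the L1-Magic routines are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (hypothetical): learn y = sin(x) on [-3, 3]
X = np.linspace(-3, 3, 200).reshape(-1, 1)
T = np.sin(X)

# Single hidden layer with fixed random weights (ELM-style)
n_hidden = 50
W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights, never trained
b = rng.normal(size=n_hidden)                 # random hidden biases, never trained
H = np.tanh(X @ W + b)                        # hidden-layer output matrix

# L1-regularized least squares for the output weights beta:
#   min_beta 0.5*||H beta - T||^2 + lam*||beta||_1
# solved here with ISTA (proximal gradient + soft-thresholding),
# used as a simple stand-in for an L1-Magic-type solver.
lam = 0.1
L = np.linalg.norm(H, 2) ** 2                 # Lipschitz constant of the gradient
beta = np.zeros((n_hidden, 1))
for _ in range(2000):
    grad = H.T @ (H @ beta - T)               # gradient of the quadratic term
    z = beta - grad / L                       # gradient step
    beta = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold

pred = H @ beta
mse = float(np.mean((pred - T) ** 2))
sparsity = float(np.mean(beta == 0.0))
print(f"train MSE: {mse:.4f}, fraction of zero output weights: {sparsity:.2f}")
```

Because training reduces to one convex linear problem, there is no backpropagation; the L1 penalty drives many output weights exactly to zero, which is what makes the resulting network sparse.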