Boosting ridge for the extreme learning machine globally optimised for classification and regression problems
- Year:
- 2023
- Type of Publication:
- Article
- Keywords:
- boosting, regression, classification
- Authors:
- Perales-González, Carlos
- Pérez-Rodríguez, Javier
- Durán-Rosal, Antonio Manuel
- Journal:
- Scientific Reports
- Volume:
- 13
- Pages:
- 11809
- Month:
- July
- ISSN:
- 2045-2322
- Note:
- JCR(2023): 3.8 Position: 25/134 (Q1) Category: MULTIDISCIPLINARY SCIENCES
- Abstract:
- This paper explores the boosting ridge (BR) framework in the extreme learning machine (ELM) community and presents a novel model that trains the base learners as a global ensemble. In ELM single-hidden-layer networks, the nodes in the hidden layer are preconfigured before training, and optimisation is performed only on the output-layer weights. The previous implementation of the BR ensemble with ELMs as base learners (BRELM) fixes the hidden-layer nodes for all the ELMs. The ensemble learning method generates different output-layer coefficients by sequentially reducing the residual error of the ensemble as more base learners are added. As in other ensemble methodologies, base learners are added until ensemble criteria such as size or performance are fulfilled. This paper proposes a global learning method in the BR framework in which base learners are not added step by step; instead, all are computed in a single step that optimises ensemble performance. This method (i) allows the hidden-layer configuration to differ for each base learner, (ii) optimises the base learners all at once rather than sequentially, thus avoiding saturation, and (iii) avoids the disadvantage of working with strong classifiers. Various regression and classification benchmark datasets have been selected to compare this method with the original BRELM implementation and other state-of-the-art algorithms: 71 datasets for classification and 52 for regression, evaluated with different metrics and analysed across dataset characteristics such as size, number of classes and class imbalance. Statistical tests indicate the superiority of the proposed method in both regression and classification problems in all experimental scenarios.
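The abstract describes two building blocks: an ELM whose random hidden layer is fixed while the output-layer weights are obtained by ridge regression, and the sequential BRELM scheme in which each new base learner fits the residual of the ensemble so far. The following is a minimal NumPy sketch of those two ideas on a toy regression problem; it is an illustrative reconstruction, not the authors' implementation, and the data, node counts, and regularisation value are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem (assumption: synthetic data for illustration only)
X = rng.normal(size=(200, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

def random_hidden_layer(X, n_nodes, rng):
    """ELM hidden layer: input weights and biases are drawn at random
    and then kept fixed; only the output layer is ever trained."""
    W = rng.normal(size=(X.shape[1], n_nodes))
    b = rng.normal(size=n_nodes)
    return np.tanh(X @ W + b)

def ridge_output_weights(H, target, lam=1e-2):
    """Closed-form ridge solution for the output-layer weights:
    beta = (H^T H + lam I)^{-1} H^T target."""
    D = H.shape[1]
    return np.linalg.solve(H.T @ H + lam * np.eye(D), H.T @ target)

# Sequential BRELM-style boosting ridge: the hidden layer is shared by
# all base learners (as in the original BRELM the abstract refers to),
# and each learner fits the current residual of the ensemble.
n_learners, n_nodes = 5, 30
H = random_hidden_layer(X, n_nodes, rng)

residual = y.copy()
ensemble_pred = np.zeros_like(y)
for _ in range(n_learners):
    beta = ridge_output_weights(H, residual)
    pred = H @ beta
    ensemble_pred += pred   # ensemble output is the sum of base learners
    residual -= pred        # next learner targets what is still unexplained

mse = np.mean((y - ensemble_pred) ** 2)
```

Because every learner here reuses the same fixed hidden layer, later learners see residuals that the shared feature space can barely reduce further; this is the saturation effect the paper cites as motivation for letting each base learner have its own hidden-layer configuration and solving for all output layers jointly in a single global step.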