Author Allende, H.; Salas, R.; Moraga, C.
Title A robust and effective learning algorithm for feedforward neural networks based on the influence function Type
Year 2003 Publication Lecture Notes in Computer Science Abbreviated Journal Lect. Notes Comput. Sc.
Volume 2652 Issue Pages 28-36
Keywords feedforward artificial neural networks; robust learning; effective parameter estimate
Abstract The learning process of Feedforward Artificial Neural Networks relies on the data; however, a robustness analysis of the model's parameter estimates must be carried out because of the presence of outlying observations in the data. In this paper we seek robustness of the parameter estimates, in the sense that the influence of aberrant observations or outliers on the estimate is bounded, so that the neural network is able to model the bulk of the data. We also seek a trade-off between robustness and efficiency under a Gaussian model. An adaptive learning procedure that addresses both aspects is developed. Finally, we show some simulation results applied to the RESEX time series. (A brief illustrative code sketch follows this record.)
Address Univ Tecn Federico Santa Maria, Dept Informat, Valparaiso, Chile, Email: hallende@inf.utfsm.cl
Corporate Author Thesis
Publisher Springer-Verlag Berlin Place of Publication Editor
Language English Summary Language Original Title
Series Editor Series Title Abbreviated Series Title
Series Volume Series Issue Edition
ISSN 0302-9743 ISBN Medium
Area Expedition Conference Pattern Recognition And Image Analysis
Notes WOS:000184832300004 Approved
Call Number UAI @ eduardo.moreno @ Serial 35
Permanent link to this record
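The record above describes bounding the influence function of the estimator so that outliers cannot dominate training. Purely as an illustration, and not the authors' exact algorithm, the following Python/NumPy sketch trains a one-hidden-layer feedforward network by backpropagating a Huber-type error signal whose psi (influence) function is clipped at a tuning constant c; the network size, learning rate, and c = 1.345 (the usual 95%-efficiency choice under Gaussian noise) are illustrative assumptions.

    import numpy as np

    def huber_psi(r, c=1.345):
        # Influence (psi) function of the Huber loss: bounded by +/-c,
        # unlike the unbounded psi(r) = r of the squared-error loss.
        return np.clip(r, -c, c)

    def train_robust_mlp(X, y, hidden=8, lr=0.01, epochs=500, c=1.345, seed=0):
        # Illustrative one-hidden-layer regression network trained with a
        # bounded-influence error signal (names and sizes are placeholders).
        rng = np.random.default_rng(seed)
        W1 = rng.normal(scale=0.1, size=(X.shape[1], hidden))
        b1 = np.zeros(hidden)
        w2 = rng.normal(scale=0.1, size=hidden)
        b2 = 0.0
        for _ in range(epochs):
            h = np.tanh(X @ W1 + b1)             # hidden layer
            r = h @ w2 + b2 - y                  # residuals
            g = huber_psi(r, c) / len(y)         # bounded per-example influence
            w2 -= lr * (h.T @ g)
            b2 -= lr * g.sum()
            dh = np.outer(g, w2) * (1 - h ** 2)  # backprop through tanh
            W1 -= lr * (X.T @ dh)
            b1 -= lr * dh.sum(axis=0)
        return W1, b1, w2, b2

Replacing huber_psi with the identity recovers ordinary squared-error backpropagation, which is one concrete way to see the robustness/efficiency trade-off mentioned in the abstract.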
 

 
Author Fernandez, C.; Valle, C.; Saravia, F.; Allende, H.
Title Behavior analysis of neural network ensemble algorithm on a virtual machine cluster Type
Year 2012 Publication Neural Computing & Applications Abbreviated Journal Neural Comput. Appl.
Volume 21 Issue 3 Pages 535-542
Keywords Ensemble learning; Artificial neural networks; Virtualization; Multicore processor; Parallel algorithms
Abstract Ensemble learning has gained considerable attention in different learning tasks, including regression, classification, and clustering problems. One of the drawbacks of ensembles is the high computational cost of the training stage. Resampling Local Negative Correlation (RLNC) is a technique that combines two well-known methods for generating ensemble diversity, resampling and negative correlation of errors, with a fine-grain parallel approach, allowing a satisfactory balance between accuracy and efficiency. In this paper, we introduce a virtual machine structure aimed at testing diverse parameter selection strategies in neural ensemble designs such as RLNC. We assess the parallel performance of this approach on a virtual machine cluster based on the full virtualization paradigm, using speedup and efficiency as performance metrics, for different numbers of processors and training data sizes. (A brief illustrative code sketch follows this record.)
Address [Fernandez, Cesar; Valle, Carlos; Saravia, Francisco; Allende, Hector] Univ Tecn Federico Santa Maria, Dept Comp Sci, Valparaiso 110 V, Chile, Email: cesferna@inf.utfsm.cl;
Corporate Author Thesis
Publisher Springer Place of Publication Editor
Language English Summary Language Original Title
Series Editor Series Title Abbreviated Series Title
Series Volume Series Issue Edition
ISSN 0941-0643 ISBN Medium
Area Expedition Conference
Notes WOS:000301578900014 Approved
Call Number UAI @ eduardo.moreno @ Serial 251
Permanent link to this record
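As a hedged illustration of the ingredients named in the abstract above, the NumPy sketch below shows a standard negative-correlation error signal, bootstrap resampling for the ensemble members, and the speedup/efficiency metrics used to assess parallel performance; the penalty weight lam and all function names are placeholders, and nothing here reproduces the paper's RLNC implementation or its virtual machine cluster setup.

    import numpy as np

    def nc_error_signals(member_outputs, y, lam=0.5):
        # member_outputs: (M, n) predictions of the M ensemble members on n examples.
        # Negative-correlation error signal: each member is pulled toward the
        # target y and pushed away from the ensemble mean, which adds diversity.
        F_bar = member_outputs.mean(axis=0)
        return (member_outputs - y) - lam * (member_outputs - F_bar)

    def bootstrap_indices(n_samples, n_members, seed=0):
        # Resampling part: each member trains on its own bootstrap sample.
        rng = np.random.default_rng(seed)
        return [rng.integers(0, n_samples, size=n_samples) for _ in range(n_members)]

    def speedup_and_efficiency(t_serial, t_parallel, n_processors):
        # Parallel performance metrics: speedup S = T1 / Tp, efficiency E = S / p.
        s = t_serial / t_parallel
        return s, s / n_processors

In a coarse simulation of the paper's setting, each member would be assigned to one virtual machine, trained on its bootstrap sample with the signal above, and the measured wall-clock times fed to speedup_and_efficiency for varying processor counts and data sizes.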
 

 
Author Salas, R.; Allende, H.; Moreno, S.; Saavedra, C.
Title Flexible Architecture of Self Organizing Maps for changing environments Type
Year 2005 Publication Lecture Notes in Computer Science Abbreviated Journal Lect. Notes Comput. Sc.
Volume 3773 Issue Pages 642-653
Keywords catastrophic interference; Artificial Neural Networks; Self Organizing Maps; pattern recognition
Abstract Catastrophic interference is a well-known problem of Artificial Neural Network (ANN) learning algorithms, in which the ANN forgets useful knowledge while learning from new data. Furthermore, the structure of most neural models must be chosen in advance. In this paper we introduce a hybrid algorithm called Flexible Architecture of Self Organizing Maps (FASOM) that overcomes catastrophic interference and preserves the topology of clustered data in changing environments. The model consists of K receptive fields of self-organizing maps. Each receptive field projects high-dimensional data from the input space onto a neuron position in a low-dimensional output grid by dynamically adapting its structure to a specific region of the input space. Furthermore, the FASOM model automatically finds the number of maps and prototypes needed to adapt successfully to the data. The model has the capability of both growing its structure when novel clusters appear and gradually forgetting when the data volume in its receptive fields is reduced. Finally, we show the capabilities of our model with experimental results using synthetic sequential data sets and real-world data. (A brief illustrative code sketch follows this record.)
Address Univ Valparaiso, Dept Comp, Valparaiso, Chile, Email: rodrigo.salas@uv.cl
Corporate Author Thesis
Publisher Springer-Verlag Berlin Place of Publication Editor
Language English Summary Language Original Title
Series Editor Series Title Abbreviated Series Title
Series Volume Series Issue Edition
ISSN 0302-9743 ISBN Medium
Area Expedition Conference Progress In Pattern Recognition
Notes WOS:000234341500067 Approved
Call Number UAI @ eduardo.moreno @ Serial 44
Permanent link to this record
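FASOM builds on Kohonen self-organizing maps; purely as background, the NumPy sketch below shows a single online SOM update step (best-matching unit plus Gaussian neighbourhood pull). The growth and forgetting mechanisms that distinguish FASOM are specific to the paper and are only summarized in a comment; the learning rate and sigma values are illustrative.

    import numpy as np

    def som_update(weights, grid, x, lr=0.1, sigma=1.0):
        # weights: (units, dim) prototype vectors; grid: (units, 2) map coordinates.
        # One online Kohonen step: find the best-matching unit (BMU) and pull the
        # prototypes in its grid neighbourhood toward the input x.
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
        d2 = np.sum((grid - grid[bmu]) ** 2, axis=1)
        h = np.exp(-d2 / (2.0 * sigma ** 2))        # Gaussian neighbourhood
        return weights + lr * h[:, None] * (x - weights)

    # A FASOM-style model would maintain K such maps (receptive fields), route each
    # input to the map covering its region of the input space, add maps or
    # prototypes when novel clusters appear, and gradually forget maps whose data
    # volume shrinks; those mechanisms are not reproduced in this sketch.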