
Allende, H., Salas, R., & Moraga, C. (2003). A robust and effective learning algorithm for feedforward neural networks based on the influence function. Lecture Notes in Computer Science, 2652, 28–36.
Abstract: The learning process of feedforward artificial neural networks relies on the data, so a robustness analysis of the model's parameter estimates is needed when the data contain outlying observations. In this paper we seek robustness of the parameter estimates, in the sense that the influence of aberrant observations, or outliers, on the estimate is bounded, so that the neural network can model the bulk of the data. We also seek a trade-off between robustness and efficiency under a Gaussian model. An adaptive learning procedure that addresses both aspects is developed. Finally, we show some simulation results applied to the RESEX time series.
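The bounded-influence idea in this abstract can be illustrated with a generic Huber-type update (a minimal sketch, not the authors' exact algorithm; the loss, learning rate, and linear model here are illustrative assumptions):

```python
import numpy as np

def psi_huber(r, c=1.345):
    """Bounded influence function: identity for small residuals,
    clipped at +/-c for large ones, so outliers have bounded effect."""
    return np.clip(r, -c, c)

def robust_gd_step(w, X, y, lr=0.1, c=1.345):
    """One gradient step for a linear model y ~ X @ w where each
    residual enters through psi_huber, so no single aberrant
    observation can dominate the parameter update."""
    r = y - X @ w
    return w + lr * X.T @ psi_huber(r, c) / len(y)

# Fit a line to data containing one gross outlier.
rng = np.random.default_rng(0)
X = np.c_[np.ones(50), np.linspace(0, 1, 50)]
y = X @ np.array([1.0, 2.0]) + 0.05 * rng.standard_normal(50)
y[10] = 100.0  # aberrant observation
w = np.zeros(2)
for _ in range(3000):
    w = robust_gd_step(w, X, y)
# w stays close to the true parameters (1, 2) despite the outlier.
```

With an unbounded (least-squares) influence function the same outlier would pull the fit far from the bulk of the data.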



Carrasco, R. A., Pruhs, K., Stein, C., & Verschae, J. (2018). The Online Set Aggregation Problem. Lecture Notes in Computer Science, 10807, 245–259.



de Figueiredo, C. M. H., de Mello, C. P., & Ortiz, C. (2000). Edge colouring reduced indifference graphs. Lecture Notes in Computer Science, 1776, 145–153.
Abstract: The chromatic index problem – finding the minimum number of colours required for colouring the edges of a graph – is still unsolved for indifference graphs, whose vertices can be linearly ordered so that the vertices contained in the same maximal clique are consecutive in this order. Two adjacent vertices are twins if they belong to the same maximal cliques. A graph is reduced if it contains no pair of twin vertices. A graph is overfull if its total number of edges is greater than the product of the maximum degree and ⌊n/2⌋, where n is the number of vertices. We give a structural characterization of neighbourhood-overfull indifference graphs, proving that a reduced indifference graph cannot be neighbourhood-overfull. We show that the chromatic index of every reduced indifference graph equals the maximum degree.
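The overfull condition from the abstract reduces to a one-line arithmetic check (a sketch of the definition only, not of the paper's characterization):

```python
def is_overfull(n, m, max_degree):
    """A graph on n vertices with m edges is overfull when m exceeds
    max_degree * floor(n/2): each colour class in a proper edge
    colouring is a matching of at most floor(n/2) edges, so an
    overfull graph cannot be edge-coloured with max_degree colours."""
    return m > max_degree * (n // 2)

is_overfull(3, 3, 2)  # triangle: 3 > 2 * 1, so 3 colours are needed
is_overfull(4, 4, 2)  # 4-cycle: 4 = 2 * 2, not overfull
```

The check returns True for the triangle and False for the 4-cycle, matching the fact that the triangle needs Δ + 1 colours while the 4-cycle needs only Δ.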



Fraigniaud, P., Montealegre-Barba, P., Oshman, R., Rapaport, I., & Todinca, I. (2019). On Distributed Merlin-Arthur Decision Protocols. Lecture Notes in Computer Science, 11639.



Goles, E., Maldonado, D., Montealegre-Barba, P., & Ollinger, N. (2018). Fast-Parallel Algorithms for Freezing Totalistic Asynchronous Cellular Automata. Lecture Notes in Computer Science, 11115, 406–415.



Goles, E., Montealegre-Barba, P., & Rios-Wilson, M. (2019). On the Effects of Firing Memory in the Dynamics of Conjunctive Networks. Lecture Notes in Computer Science, 11525.



Montealegre-Barba, P., Perez-Salazar, S., Rapaport, I., & Todinca, I. (2018). Two Rounds Are Enough for Reconstructing Any Graph (Class) in the Congested Clique Model. Lecture Notes in Computer Science, 11085.



Reyes, D., & Atkinson, J. (2018). Person Re-identification Using Masked Keypoints. Lecture Notes in Computer Science, 10868, 45–56.



Salas, R., Allende, H., Moreno, S., & Saavedra, C. (2005). Flexible Architecture of Self-Organizing Maps for changing environments. Lecture Notes in Computer Science, 3773, 642–653.
Abstract: Catastrophic interference is a well-known problem of artificial neural network (ANN) learning algorithms, in which the ANN forgets useful knowledge while learning from new data. Furthermore, the structure of most neural models must be chosen in advance. In this paper we introduce a hybrid algorithm called Flexible Architecture of Self-Organizing Maps (FASOM) that overcomes catastrophic interference and preserves the topology of clustered data in changing environments. The model consists of K receptive fields of self-organizing maps. Each receptive field projects high-dimensional data from the input space onto a neuron position in a low-dimensional output-space grid by dynamically adapting its structure to a specific region of the input space. Furthermore, the FASOM model automatically finds the number of maps and prototypes needed to adapt successfully to the data. The model can both grow its structure when novel clusters appear and gradually forget when the data volume in its receptive fields is reduced. Finally, we show the capabilities of our model with experimental results on synthetic sequential data sets and real-world data.
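The building block FASOM extends – a self-organizing map that projects high-dimensional inputs onto a grid of prototypes – can be sketched as a single online update (a standard SOM step under assumed parameters, not the FASOM algorithm itself, which additionally grows and prunes maps):

```python
import numpy as np

def som_step(W, x, lr=0.1, sigma=1.0):
    """One online SOM update on a 1-D grid of prototypes W:
    find the best-matching unit (BMU) for input x, then pull every
    prototype toward x, weighted by a Gaussian neighbourhood kernel
    over grid distance to the BMU. This is what preserves topology:
    grid neighbours are dragged toward similar inputs."""
    bmu = np.argmin(np.linalg.norm(W - x, axis=1))
    grid = np.arange(len(W))
    h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
    return W + lr * h[:, None] * (x - W)

rng = np.random.default_rng(1)
W = rng.standard_normal((10, 2))           # 10 prototypes in a 2-D input space
data = rng.standard_normal((500, 2)) + 3.0 # one cluster centred near (3, 3)
for x in data:
    W = som_step(W, x)
# After training, at least the BMU region of the map sits on the cluster.
```

FASOM's contribution, per the abstract, is wrapping K such maps in receptive fields that grow when novel clusters appear and shrink when data disappears, which this fixed-size sketch deliberately omits.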



Valle, M. A., & Ruz, G. A. (2019). Market Basket Analysis Using Boltzmann Machines. Lecture Notes in Computer Science, 11730.

