
Canessa, E., & Chaigneau, S. (2017). Response surface methodology for estimating missing values in a Pareto genetic algorithm used in parameter design. Ing. Invest., 37(2), 89–98.
Abstract: We present an improved Pareto Genetic Algorithm (PGA), which finds solutions to problems of robust design in multiresponse systems with 4 responses and as many as 10 control and 5 noise factors. Because some response values might not have been obtained in the robust design experiment and are needed in the search process, the PGA uses Response Surface Methodology (RSM) to estimate them. Not only did the PGA deliver solutions that adequately adjusted the response means to their target values with low variability, but it also found more Pareto-efficient solutions than a previous version of the PGA. This improvement makes it easier to find solutions that meet the tradeoff among variance reduction, mean adjustment and economic considerations. Furthermore, RSM allows estimating outputs' means and variances in highly nonlinear systems, making the new PGA appropriate for such systems.



Canessa, E., & Chaigneau, S. E. (2020). Mathematical regularities of data from the property listing task. J. Math. Psychol., 97, 19 pp.
Abstract: To study linguistically coded concepts, researchers often resort to the Property Listing Task (PLT). In a PLT, participants are asked to list properties that describe a concept (e.g., for DOG, subjects may list “is a pet”, “has four legs”, etc.), which are then coded into property types (i.e., superficially dissimilar properties such as “has four legs” and “is a quadruped” may be coded as “four legs”). When the PLT is done for many concepts, researchers obtain Conceptual Properties Norms (CPNs), which are used to study semantic content and as a source of control variables. Though the PLT and CPNs are widely used across psychology, there is a lack of a formal model of the PLT, which would provide better analysis tools. In particular, no one has attempted to analyze the PLT's listing process. Thus, in the current work we develop a mathematical description of the PLT. Our analyses indicate that several regularities should be found in the observable data obtained from a PLT. Using data from three different CPNs (from 3 countries and 2 different languages), we show that these regularities do in fact exist and generalize well across different CPNs. Overall, our results suggest that the description of the regularities found in PLT data may be fruitfully used in the study of concepts.



Canessa, E., & Riolo, R. L. (2006). An agent-based model of the impact of computer-mediated communication on organizational culture and performance: an example of the application of complex systems analysis tools to the study of CIS. J. Inf. Technol., 21(4), 272–283.
Abstract: Organizations that make use of computer information systems (CIS) are prototypical complex adaptive systems (CAS). This paper shows how an approach from Complexity Science, exploratory agent-based modeling (ABM), can be used to study the impact of two different modes of use of computer-mediated communication (CMC) on organizational culture (OC) and performance. The ABM includes stylized representations of (a) agents communicating with other agents to complete tasks; (b) an OC consisting of the distribution of agent traits, changing as agents communicate; (c) the effect of OC on communication effectiveness (CE), and (d) the effect of CE on task completion times, that is, performance. If CMC is used in a broad mode, that is, to contact and collaborate with many new agents, the development of a strong OC is slowed, leading to decreased CE and poorer performance early on. If CMC is used in a local mode, repeatedly contacting the same agents, a strong OC develops rapidly, leading to increased CE and high performance early on. However, if CMC is used in a broad mode over longer time periods, a strong OC can develop over a wider set of agents, leading to an OC that is stronger than an OC which develops with local CMC use. Thus broad use of CMC results in overall CE and performance that is higher than is generated by local use of CMC. We also discuss how the dynamics generated by an ABM can lead to a deeper understanding of the behavior of a CAS, for example, allowing us to better design empirical longitudinal studies.



Canessa, E., Chaigneau, S. E., Lagos, R., & Medina, F. A. (2020). How to carry out conceptual properties norming studies as parameter estimation studies: Lessons from ecology. Behav. Res. Methods, to appear.
Abstract: Conceptual properties norming studies (CPNs) ask participants to produce properties that describe concepts. From that data, different metrics may be computed (e.g., semantic richness, similarity measures), which are then used in studying concepts and as a source of carefully controlled stimuli for experimentation. Notwithstanding those metrics' demonstrated usefulness, researchers have customarily overlooked that they are only point estimates of the true unknown population values, and therefore, only rough approximations. Thus, though research based on CPN data may produce reliable results, those results are likely to be general and coarse-grained. In contrast, we suggest viewing CPNs as parameter estimation procedures, where researchers obtain only estimates of the unknown population parameters. Thus, more specific and fine-grained analyses must consider those parameters' variability. To this end, we introduce a probabilistic model from the field of ecology. Its related statistical expressions can be applied to compute estimates of CPNs' parameters and their corresponding variances. Furthermore, those expressions can be used to guide the sampling process. The traditional practice in CPN studies is to use the same number of participants across concepts, intuitively believing that this practice will render the computed metrics comparable across concepts and CPNs. In contrast, the current work shows why an equal number of participants per concept is generally not desirable. Using CPN data, we show how to use the equations and discuss how they may allow more reasonable analyses and comparisons of parameter values among different concepts in a CPN, and across different CPNs.



Canessa, E., Chaigneau, S. E., Moreno, S., & Lagos, R. (2020). Informational content of cosine and other similarities calculated from high-dimensional Conceptual Property Norm data. Cogn. Process., to appear, 14 pp.
Abstract: To study concepts that are coded in language, researchers often collect lists of conceptual properties produced by human subjects. From these data, different measures can be computed. In particular, interconcept similarity is an important variable used in experimental studies. Among possible similarity measures, the cosine of conceptual property frequency vectors seems to be a de facto standard. However, there is a lack of comparative studies that test the merit of different similarity measures when computed from property frequency data. The current work compares four different similarity measures (cosine, correlation, Euclidean and Chebyshev) and five different types of data structures. To that end, we compared the informational content (i.e., entropy) delivered by each of those 4 x 5 = 20 combinations, and used a clustering procedure as a concrete example of how informational content affects statistical analyses. Our results lead us to conclude that similarity measures computed from lower-dimensional data fare better than those calculated from higher-dimensional data, and suggest that researchers should be more aware of data sparseness and dimensionality, and their consequences for statistical analyses.
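The four similarity measures this abstract compares are all standard; as a minimal illustrative sketch (not the authors' implementation, and with made-up property-frequency vectors for two hypothetical concepts), they can be computed from frequency vectors as follows:

```python
import numpy as np

def cosine(u, v):
    # cosine of the angle between two frequency vectors
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def correlation(u, v):
    # Pearson correlation between the two vectors
    return float(np.corrcoef(u, v)[0, 1])

def euclidean_sim(u, v):
    # Euclidean distance converted to a similarity in (0, 1]
    return 1.0 / (1.0 + float(np.linalg.norm(u - v)))

def chebyshev_sim(u, v):
    # Chebyshev (max-coordinate) distance converted to a similarity
    return 1.0 / (1.0 + float(np.max(np.abs(u - v))))

# hypothetical property-frequency vectors (counts of listed properties)
dog = np.array([10.0, 8.0, 0.0, 2.0])  # e.g. "is a pet", "has four legs", ...
cat = np.array([9.0, 7.0, 1.0, 0.0])

for name, fn in [("cosine", cosine), ("correlation", correlation),
                 ("euclidean", euclidean_sim), ("chebyshev", chebyshev_sim)]:
    print(f"{name}: {fn(dog, cat):.3f}")
```

Note that the distance-based measures (Euclidean, Chebyshev) are sensitive to raw frequency magnitudes, while cosine and correlation are scale-invariant, which is one reason the choice of measure interacts with data sparseness and dimensionality.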



Canessa, E., Droop, C., & Allende, H. (2012). An improved genetic algorithm for robust design in multivariate systems. Qual. Quant., 46(2), 665–678.
Abstract: In a previous article, we presented a genetic algorithm (GA), which finds solutions to problems of robust design in multivariate systems. Based on that GA, we developed a new GA that uses a new desirability function, based on the aggregation of the observed variance of the responses and the squared deviation between the mean of each response and its corresponding target value. Additionally, we also changed the crossover operator from a one-point to a uniform one. We used three different case studies to evaluate the performance of the new GA and also to compare it with the original one. The first case study involved using data from a univariate real system, and the other two employed data obtained from multivariate process simulators. In each of the case studies, the new GA delivered good solutions, which simultaneously adjusted the mean of each response to its corresponding target value. This performance was similar to that of the original GA. Regarding variability reduction, the new GA worked much better than the original one. In all the case studies, the new GA delivered solutions that simultaneously decreased the standard deviation of each response to almost the minimum possible value. Thus, we conclude that the new GA performs better than the original one, especially regarding variance reduction, which was the main problem exhibited by the original GA.



Canessa, E., Vera, S., & Allende, H. (2012). A new method for estimating missing values for a genetic algorithm used in robust design. Eng. Optimiz., 44(7), 787–800.
Abstract: This article presents an improved genetic algorithm (GA), which finds solutions to problems of robust design in multivariate systems with many control and noise factors. Since some values of responses of the system might not have been obtained from the robust design experiment, but may be needed in the search process, the GA uses response surface methodology (RSM) to estimate those values. In all test cases, the GA delivered solutions that adequately adjusted the mean of the responses to their corresponding target values and with low variability. The GA found more solutions than the previous versions of the GA, which makes it easier to find a solution that may meet the tradeoff among variance reduction, mean adjustment and economic considerations. Moreover, RSM is a good method for estimating the mean and variance of the outputs of highly nonlinear systems, which makes the new GA appropriate for optimizing such systems.



Canessa, E. C., & Chaigneau, S. E. (2016). When are concepts comparable across minds? Qual. Quant., 50(3), 1367–1384.
Abstract: In communication, people cannot resort to direct reference (e.g., pointing) when using diffuse concepts like democracy. Given that concepts reside in individuals' minds, how can people share those concepts? We argue that concepts are comparable across a social group if they afford agreement for those who use them; and that agreement occurs whenever individuals receive evidence that others conceptualize a given situation similarly to them. Based on Conceptual Agreement Theory, we show how to compute an agreement probability based on the sets of properties belonging to concepts. If that probability is sufficiently high, this shows that concepts afford an adequate level of agreement, and one may say that concepts are comparable across individuals' minds. In contrast to other approaches, our method considers that inter-individual variability in naturally occurring conceptual content exists and is a fact that must be taken into account, whereas other theories treat variability as error that should be cancelled out. Given that conceptual variability will exist, our approach may establish whether concepts are comparable across individuals' minds more soundly than previous methods.



Canessa, G., Gallego, J. A., Ntaimo, L., & Pagnoncelli, B. K. (2019). An algorithm for binary linear chance-constrained problems using IIS. Comput. Optim. Appl., 72(3), 589–608.
Abstract: We propose an algorithm based on infeasible irreducible subsystems (IIS) to solve binary linear chance-constrained problems with a random technology matrix. By leveraging the problem structure, we are able to generate good-quality upper bounds on the optimal value early in the algorithm, and the discrete domain is used to guide us efficiently in the search for solutions. We apply our methodology to individual and joint binary linear chance-constrained problems, demonstrating the ability of our approach to solve those problems. Extensive numerical experiments show that, in some cases, the number of nodes explored by our algorithm is drastically reduced when compared to a commercial solver.



Canessa, G., Moreno, E., & Pagnoncelli, B. K. (2020). The risk-averse ultimate pit problem. Optim. Eng., to appear.
Abstract: In this work, we consider a risk-averse ultimate pit problem where the grade of the mineral is uncertain. We derive conditions under which we can generate a set of nested pits by varying the risk level instead of using revenue factors. We propose two properties that we believe are desirable for the problem: risk nestedness, which means the pits generated for different risk aversion levels should be contained in one another, and additive consistency, which states that preferences in terms of order of extraction should not change if independent sectors of the mine are added as precedences. We show that only an entropic risk measure satisfies these properties and propose a two-stage stochastic programming formulation of the problem, including an efficient approximation scheme to solve it. We illustrate our approach in a small self-constructed example, and apply our approximation scheme to a real-world section of the Andina mine, in Chile.



Canfora, F., Gomberoff, A., Oh, S. H., Rojas, F., & Salgado-Rebolledo, P. (2019). Meronic Einstein-Yang-Mills black hole in 5D and gravitational spin from isospin effect. J. High Energy Phys., (6), 32 pp.
Abstract: We construct an analytic black hole solution in SU(2) Einstein-Yang-Mills theory in five dimensions supporting a Meron field. The gauge field is proportional to a pure gauge and has a nontrivial topological charge. The would-be singularity at the Meron core gets shielded from the exterior by the black hole horizon. The metric has only one integration constant, namely, its ADM mass, which is shown to be finite once an appropriate boundary term is added to the action. The thermodynamics is also worked out, and a first-order phase transition, similar to the one occurring in the Reissner-Nordström case, is identified. We also show that the solution produces a spin from isospin effect, i.e., even though the theory is constructed out of bosons only, the combined system of a scalar field and this background may become fermionic. More specifically, we study scalar excitations in this purely bosonic background and find that the system describes fermionic degrees of freedom at spatial infinity. Finally, for the asymptotically AdS(5) case, we study its consequences in the context of the AdS/CFT correspondence.



Canfora, F., Gomez, A., Sorella, S. P., & Vercauteren, D. (2014). Study of Yang-Mills-Chern-Simons theory in presence of the Gribov horizon. Ann. Phys., 345, 166–177.
Abstract: The two-point gauge correlation function in Yang-Mills-Chern-Simons theory in three-dimensional Euclidean space is analysed by taking into account the nonperturbative effects of the Gribov horizon. In this way, we are able to describe the confinement and deconfinement regimes, which naturally depend on the topological mass and on the gauge coupling constant of the theory.



Canfora, F., Oh, S. H., & Salgado-Rebolledo, P. (2017). Gravitational catalysis of merons in Einstein-Yang-Mills theory. Phys. Rev. D, 96(8), 10 pp.
Abstract: We construct regular configurations of the Einstein-Yang-Mills theory in various dimensions. The gauge field is of meron-type: it is proportional to a pure gauge (with a suitable parameter lambda determined by the field equations). The corresponding smooth gauge transformation cannot be deformed continuously to the identity. In the three-dimensional case we consider the inclusion of a Chern-Simons term into the analysis, allowing lambda to be different from its usual value of 1/2. In four dimensions, the gravitating meron is a smooth Euclidean wormhole interpolating between different vacua of the theory. In five and higher dimensions smooth meron-like configurations can also be constructed by considering warped products of the three-sphere and lower-dimensional Einstein manifolds. In all cases merons (which on flat spaces would be singular) become regular due to the coupling with general relativity. This effect is named “gravitational catalysis of merons”.



Canfora, F. E., Dudal, D., Justo, I. F., Pais, P., Salgado-Rebolledo, P., Rosa, L., et al. (2017). Double nonperturbative gluon exchange: An update on the soft-Pomeron contribution to pp scattering. Phys. Rev. C, 96(2), 8 pp.
Abstract: We employ a set of recent, theoretically motivated fits to nonperturbative unquenched gluon propagators to check on how far double gluon exchange can be used to describe the soft sector of pp scattering data (total and differential cross section). In particular, we use the refined Gribov-Zwanziger gluon propagator (as arising from dealing with the Gribov gauge fixing ambiguity) and the massive Cornwall-type gluon propagator (as motivated from Dyson-Schwinger equations) in conjunction with a perturbative quark-gluon vertex, next to a model based on the nonperturbative quark-gluon Maris-Tandy vertex, popular from Bethe-Salpeter descriptions of hadronic bound states. We compare the cross sections arising from these models with older ISR and more recent TOTEM and ATLAS data. The lower the value of total energy √s, the better the results appear to be.



Caniupan, M., Bravo, L., & Hurtado, C. A. (2012). Repairing inconsistent dimensions in data warehouses. Data Knowl. Eng., 79–80, 17–39.
Abstract: A dimension in a data warehouse (DW) is a set of elements connected by a hierarchical relationship. The elements are used to view summaries of data at different levels of abstraction. In order to support an efficient processing of such summaries, a dimension is usually required to satisfy different classes of integrity constraints. In scenarios where the constraints properly capture the semantics of the DW data, but they are not satisfied by the dimension, the problem of repairing (correcting) the dimension arises. In this paper, we study the problem of repairing a dimension in the context of two main classes of integrity constraints: strictness and covering constraints. We introduce the notion of minimal repair of a dimension: a new dimension that is consistent with respect to the set of integrity constraints, which is obtained by applying a minimal number of updates to the original dimension. We study the complexity of obtaining minimal repairs, and show how they can be characterized using Datalog programs with weak constraints under the stable model semantics.



Cardenas, C., Guzman, F., Carmona, M., Munoz, C., Nilo, L., Labra, A., et al. (2020). Synthetic Peptides as a Promising Alternative to Control Viral Infections in Atlantic Salmon. Pathogens, 9(8), 600.
Abstract: Viral infections in salmonids represent an ongoing challenge for the aquaculture industry. Two RNA viruses, the infectious pancreatic necrosis virus (IPNV) and the infectious salmon anemia virus (ISAV), have become a latent risk without healing therapies available for either. In this context, antiviral peptides emerge as effective and relatively safe therapeutic molecules. Based on in silico analysis of the VP2 protein from IPNV and the RNA-dependent RNA polymerase from ISAV, a set of peptides was designed and chemically synthesized to block selected key events in their corresponding infectivity processes. The peptides were tested in fish cell lines in vitro, and four were selected for decreasing the viral load: peptide GIM182 for IPNV, and peptides GIM535, GIM538 and GIM539 for ISAV. In vivo tests with the IPNV GIM182 peptide were carried out using Salmo salar fish, showing a significant decrease of viral load, and proving the safety of the peptide for fish. The results indicate that the use of peptides as antiviral agents in disease control might be a viable alternative to explore in aquaculture.



Cardu, M., & Seccatore, J. (2016). Quantifying the difficulty of tunnelling by drilling and blasting. Tunn. Undergr. Space Technol., 60, 178–182.
Abstract: This study deals with industrial trends in tunnelling by drill and blast (D&B). We perform a statistical analysis of accumulated experience from the 1950s to the modern day to provide advice for proper project management in tunnel driving. The basis of the study is a wide database of tunnel blast schemes. This database is made of excavation parameters, and considers two main families of blasts: with parallel hole cuts and with inclined hole cuts. Such parameters are analysed by means of statistical regression. Correlations are shown. We present a general curve of correlation between tunnel sections and specific drilling and specific explosive consumption. We show how pull efficiency cannot be correlated to a single parameter, and how tunnelling by D&B needs to be treated as a complex system. Finally, we propose a method for quantifying and classifying the difficulty of tunnelling. The deviation of specific drilling (SD) from the industrial average trend is used as an indicator of difficulty: easier when SD is lower than average, and more difficult when SD is higher than average. We show how such deviation can be preliminarily associated with lithotypes. This provides designers and cost estimators with a first-approximation tool for D&B cost prediction at the pre-feasibility and feasibility stages of a tunnelling project.



Carmichael, T. W., Quinn, S. N., Mustill, A. J., Huang, C., Zhou, G., Persson, C. M., et al. (2020). Two Intermediate-mass Transiting Brown Dwarfs from the TESS Mission. Astron. J., 160(1), 15 pp.
Abstract: We report the discovery of two intermediate-mass transiting brown dwarfs (BDs), TOI-569b and TOI-1406b, from NASA's Transiting Exoplanet Survey Satellite mission. TOI-569b has an orbital period of P = 6.55604 ± 0.00016 days, a mass of Mb = 64.1 ± 1.9 M_Jup, and a radius of Rb = 0.75 ± 0.02 R_Jup. Its host star, TOI-569, has a mass of Mstar = 1.21 ± 0.05 M_Sun, a radius of Rstar = 1.47 ± 0.03 R_Sun, and an effective temperature of Teff = 5768 ± 110 K. TOI-1406b has an orbital period of P = 10.57415 ± 0.00063 days, a mass of Mb = 46.0 ± 2.7 M_Jup, and a radius of Rb = 0.86 ± 0.03 R_Jup. The host star for this BD has a mass of Mstar = 1.18 ± 0.09 M_Sun, a radius of Rstar = 1.35 ± 0.03 R_Sun, and an effective temperature of Teff = 6290 ± 100 K. Both BDs are in circular orbits around their host stars and are older than 3 Gyr based on stellar isochrone models of the stars. TOI-569 is one of two slightly evolved stars known to host a transiting BD (the other being KOI-415). TOI-1406b is one of three known transiting BDs to occupy the mass range of 40–50 M_Jup and one of two to have a circular orbit at a period near 10 days (with the first being KOI-205b). Both BDs have reliable ages from stellar isochrones, in addition to their well-constrained masses and radii, making them particularly valuable as tests for substellar isochrones in the BD mass-radius diagram.



Caroca, P., Cartes, C., Davies, T. B., Olivari, J., Rica, S., & Vogt-Geisse, K. (2020). The anatomy of the 2019 Chilean social unrest. Chaos, to appear.
Abstract: We analyze the 2019 Chilean social unrest episode, consisting of a sequence of events, through the lens of an epidemic-like model that considers global contagious dynamics. We adjust the parameters to the Chilean social unrest aggregated public data available from the Undersecretary of Human Rights, and observe that the number of violent events follows a well-defined pattern already observed in various public disorder episodes in other countries since the sixties. Although the epidemic-like models display a single event that reaches a peak followed by an exponential decay, we add standard perturbation schemes that may produce a rich temporal behavior as observed in the 2019 Chilean social turmoil. Although we only have access to aggregated data, we are still able to fit it to our model quite well, providing interesting insights on social unrest dynamics.
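The qualitative pattern this abstract describes, a single wave of events rising to a peak and then decaying, is characteristic of compartmental contagion models. As a minimal sketch (a generic SIR-type Euler integration with made-up parameters, not the paper's actual model or fitted values):

```python
# Generic SIR-style contagion curve: the "infected" fraction i plays the role
# of the event intensity; it rises to a single peak and then decays.
def simulate_sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, steps=200, dt=1.0):
    s, i = s0, i0
    series = []
    for _ in range(steps):
        new_infections = beta * s * i * dt   # contagion term
        recoveries = gamma * i * dt          # decay term
        s -= new_infections
        i += new_infections - recoveries
        series.append(i)
    return series

curve = simulate_sir()
peak = max(curve)
peak_time = curve.index(peak)
```

With contagion rate beta above the decay rate gamma, the curve grows at first; once the susceptible pool s is depleted, the decay term dominates and the curve falls off exponentially, which is the single-peak shape the perturbation schemes in the paper are added on top of.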



Caroca, R., Concha, P., Fierro, O., Rodriguez, E., & Salgado-Rebolledo, P. (2018). Generalized Chern-Simons higher-spin gravity theories in three dimensions. Nucl. Phys. B, 934, 240–264.
Abstract: The coupling of spin-3 gauge fields to three-dimensional Maxwell and AdS-Lorentz gravity theories is presented. After showing how the usual spin-3 extensions of the AdS and the Poincare algebras in three dimensions can be obtained as expansions of the sl(3, R) algebra, the procedure is generalized so as to define new higher-spin symmetries. Remarkably, the spin-3 extension of the Maxwell symmetry allows one to introduce a novel gravity model coupled to higher-spin topological matter with vanishing cosmological constant, which in turn corresponds to a flat limit of the AdS-Lorentz case. We extend our results to define two different families of higher-spin extensions of three-dimensional Einstein gravity.

