Canessa, E., Chaigneau, S. E., Moreno, S., & Lagos, R. (2020). Informational content of cosine and other similarities calculated from high-dimensional Conceptual Property Norm data. Cogn. Process., 21, 601–614.
Abstract: To study concepts that are coded in language, researchers often collect lists of conceptual properties produced by human subjects. From these data, different measures can be computed. In particular, inter-concept similarity is an important variable used in experimental studies. Among possible similarity measures, the cosine of conceptual property frequency vectors seems to be a de facto standard. However, there is a lack of comparative studies that test the merit of different similarity measures when computed from property frequency data. The current work compares four different similarity measures (cosine, correlation, Euclidean and Chebyshev) and five different types of data structures. To that end, we compared the informational content (i.e., entropy) delivered by each of those 4 × 5 = 20 combinations, and used a clustering procedure as a concrete example of how informational content affects statistical analyses. Our results lead us to conclude that similarity measures computed from lower-dimensional data fare better than those calculated from higher-dimensional data, and suggest that researchers should be more aware of data sparseness and dimensionality, and their consequences for statistical analyses.
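As a minimal sketch of the quantities this paper compares (not the authors' code; the vectors and function names below are illustrative), the four similarity/distance measures can be computed from two property-frequency vectors, and the informational content of a set of similarity values can be estimated as the Shannon entropy of a binned histogram:

```python
import math

def cosine(a, b):
    # Cosine of the angle between two frequency vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def correlation(a, b):
    # Pearson r is the cosine of the mean-centered vectors
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return cosine([x - ma for x in a], [y - mb for y in b])

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def chebyshev(a, b):
    # Largest single-coordinate difference
    return max(abs(x - y) for x, y in zip(a, b))

def entropy(samples, bins=10):
    # Shannon entropy (bits) of a histogram of the sample values
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for s in samples:
        counts[min(int((s - lo) / width), bins - 1)] += 1
    n = len(samples)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

# Toy property-frequency vectors for two concepts (hypothetical counts)
dog = [10, 7, 0, 3]
cat = [9, 8, 1, 0]
print(round(cosine(dog, cat), 3))  # high overlap yields a cosine near 1
```

A distribution of pairwise similarities that collapses into few histogram bins has low entropy, which is the sense in which a measure can carry little informational content for subsequent analyses such as clustering.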
Chaigneau, S. E., Canessa, E., Lenci, A., & Devereux, B. (2020). Eliciting semantic properties: methods and applications. Cogn. Process., 21(4), 583–586.
Abstract: Asking subjects to list semantic properties for concepts is essential for predicting performance in several linguistic and non-linguistic tasks and for creating carefully controlled stimuli for experiments. The property elicitation task and the ensuing norms are widely used across the field, to investigate the organization of semantic memory and design computational models thereof. The contributions of the current Special Topic discuss several core issues concerning how semantic property norms are constructed and how they may be used for research aiming at understanding cognitive processing.
Marchant, N., Canessa, E., & Chaigneau, S. E. (2022). An adaptive linear filter model of procedural category learning. Cogn. Process., 23(3), 393–405.
Abstract: We use a feature-based association model to fit grouped- and individual-level category learning and transfer data. The model assumes that people use corrective feedback to learn individual feature-to-categorization-criterion correlations and combine those correlations additively to produce classifications. The model is an Adaptive Linear Filter (ALF) with a logistic output function and the Least Mean Squares learning algorithm; categorization probabilities are computed by the logistic function. Our data span 31 published data sets. At both the grouped and individual levels of analysis, the model performs remarkably well, accounting for a large proportion of the available variance. When fitted to grouped data, it outperforms alternative models. When fitted to individual-level data, it captures learning and transfer performance with high explained variance. Notably, the model achieves these fits with a very small number of free parameters. We discuss the ALF's advantages as a model of procedural categorization in terms of its simplicity, its ability to capture empirical trends, and its ability to resolve challenges faced by other associative models. In particular, we discuss why the model is not equivalent to a prototype model, as previously thought.
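The abstract's description of the ALF (linear feature weights, logistic output, LMS updates from corrective feedback) can be sketched as follows. This is an assumption-laden illustration, not the paper's implementation: the class name, the bias term, and the learning-rate value are ours, and the paper's exact parameterization may differ.

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

class ALF:
    """Adaptive Linear Filter sketch: one weight per feature (a learned
    feature-to-criterion correlation), combined additively and passed
    through a logistic function; weights adapt by the LMS (delta) rule."""

    def __init__(self, n_features, lr=0.5):
        self.w = [0.0] * n_features  # per-feature weights
        self.b = 0.0                 # bias term (assumption; may be absent in the paper)
        self.lr = lr                 # learning rate, the model's main free parameter

    def prob_category_a(self, features):
        # Additive combination of feature weights, squashed to a probability
        net = self.b + sum(w * f for w, f in zip(self.w, features))
        return logistic(net)

    def learn(self, features, label):
        # LMS / delta rule: adjust weights in proportion to the prediction error
        err = label - self.prob_category_a(features)
        self.b += self.lr * err
        for i, f in enumerate(features):
            self.w[i] += self.lr * err * f

# Toy training run: category A (label 1) iff the first feature is present
model = ALF(n_features=2)
data = [([1, 0], 1), ([1, 1], 1), ([0, 1], 0), ([0, 0], 0)]
for _ in range(200):
    for x, y in data:
        model.learn(x, y)
```

After training, the filter assigns a high category-A probability to stimuli containing the first feature and a low one to stimuli lacking it, using only the learning rate (and here a bias) as free parameters.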