
Avaliação Psicológica

Print version ISSN 1677-0471; On-line version ISSN 2175-3431

Aval. psicol. vol.14 no.2 Itatiba Aug. 2015

 

 

Factor retention in the intra-individual approach: Proposition of a triangulation strategy

 

A retenção fatorial na abordagem intraindividual: proposição de uma estratégia de triangulação

 

La retención factorial en el enfoque intraindividual: propuesta de una estrategia de triangulación

 

 

Cristiano Mauro Assis Gomes1, I; Hudson Fernandes GolinoII

IUniversidade Federal de Minas Gerais
IIUniversidade Estadual de Feira de Santana

 

 


ABSTRACT

The intra-individual approach has been proposed by researchers as an alternative to traditional techniques focused on the population level. The central argument is that inter-individual procedures are appropriate for estimating population variance but cannot be extrapolated to the individual level. Evidence shows that the factorial structure of intelligence and personality psychometric models usually verified in the population (e.g., Carroll's three-stratum factor structure and the Big Five personality model) is not necessarily found at the individual level. However, factor analysis in the intra-individual approach faces an additional challenge in comparison to traditional factor analysis: to date, no fit indices are available for intra-individual factor techniques. The current paper proposes the use of a set of approaches, as a data triangulation, to find the adequate number of factors to retain. The proposition is applied in the field of psychometric intelligence.

Keywords: psychometrics; factor analysis; methodology; intelligence.


RESUMO

A abordagem intraindividual foi proposta por autores como uma alternativa para medir o nível do indivíduo em vez de medir o nível da população. O argumento central é que as evidências interindividuais são apropriadas para estimar a variância da população, mas não podem ser extrapoladas para o nível do indivíduo. Evidências mostram que a estrutura fatorial de modelos psicométricos da inteligência e da personalidade não está necessariamente presente no nível individual. Entretanto, a análise fatorial na abordagem intraindividual tem um desafio adicional em relação à análise fatorial tradicional: até o presente momento, a análise fatorial intraindividual não possui índices de ajuste aos dados. Este artigo propõe o uso de um conjunto de abordagens para obter a retenção adequada de fatores, enquanto triangulação de dados. A proposição será aplicada no campo psicométrico da inteligência.

Palavras-chave: psicometria; análise fatorial; metodologia; inteligência.


RESUMEN

El enfoque intraindividual fue propuesto por los científicos como una alternativa para medir el nivel de la persona, en lugar de medir el nivel de la población. El argumento central es que la evidencia interindividual es apropiada para la estimación de la varianza de la población, pero no se puede extrapolar al nivel del individuo. Las evidencias muestran que la estructura factorial de modelos psicométricos de inteligencia y personalidad no está presente necesariamente en el plano individual. Sin embargo, el análisis de factores en el enfoque intraindividual tiene un desafío adicional en comparación con el análisis factorial tradicional: hasta la fecha, el análisis intraindividual no presenta índices de ajuste a los datos. En este artículo se propone el uso de un conjunto de enfoques para la retención de factores, como una triangulación de datos. La propuesta se aplicará en el ámbito de investigación de la inteligencia.

Palabras-clave: psicometría; análisis factorial; metodología; inteligencia.


 

 

Psychology is usually built upon between-individual differences, also called the interindividual approach (Molenaar, 2007a). The structure of a psychological construct (e.g., intelligence) that appears when someone conducts an exploratory or confirmatory factor analysis on a data set formed by the answers of several people to a particular test is thought of as being isomorphic to the structure of the individuals. Let us clarify this by considering two examples. The state-of-the-art intelligence model, the Cattell-Horn-Carroll model, has been proposed based on solid evidence that intelligence, at the population level, is composed of a hierarchy with three ability levels. The usual interpretation of the CHC model is based on the idea that each individual has the same intelligence structure postulated by the model. In the same line, the state-of-the-art personality model (the Big Five, or Five Factor Model) presents a considerable amount of evidence supporting the existence of five broad personality factors in the population. As happens in the intelligence field, it is usual to think of the Big Five model as being present in individuals as well. Thus, statements such as "John has high neuroticism, average extraversion, moderate openness to experience, high conscientiousness and low agreeableness" are not uncommon among psychologists, especially in the field of psychological assessment. People, including practitioners and researchers, believe that evidence from a model constructed at the population level is sufficient and necessary to generate evidence at the individual level. We call this interpretation the level transposition fallacy.

Why is this level transposition (from population to individuals) a fallacy? Although it provides important information for understanding psychological functioning in the population, the interindividual approach does not usually provide information about how individuals function (Molenaar, Sinclair, Rovine, Ram, & Corneal, 2009). The inappropriateness of directly transposing evidence from the population to individuals was mathematically proven through the ergodic theorems. These theorems come from the mathematical theory of ergodicity, first developed in the 1930s to study dynamic systems by Henri Poincaré, George Birkhoff and John von Neumann (Ugalde, 2007). The ergodic theorems define two necessary and sufficient conditions that allow the generalization of knowledge from the interindividual structure observed in the population to the individuals: homogeneity and stationarity. The homogeneity principle requires that each individual from the population follow the same statistical model as the population (Molenaar, 2007a, 2007b). If this principle is met, studying the population means studying each individual, since each person in the ensemble of persons (the population) obeys the same statistical model. The stationarity principle, on the other hand, requires that each individual does not have time-varying parameters across the observation interval (Molenaar, 2007b). Both criteria need to be met for generalization from the population level to single subjects, and vice versa, to be possible. If homogeneity or stationarity fails to hold, the process is non-ergodic and the inference is inadequate (Molenaar, 2008). Thus, interpreting psychological models constructed on the basis of population statistics as being equivalent to how individuals function is what we call a level transposition fallacy.
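Although it is not part of the original argument, a brief simulation may help visualize why homogeneity can fail. In the sketch below (with invented parameters and variable names), every simulated person shows a negative within-person correlation between two variables, while the correlation computed across persons' means is strongly positive, so the population-level structure misrepresents every individual.

# Illustrative sketch (not from the original study): within-person structure
# can differ from the between-person structure obtained at the population level.
set.seed(123)

n_persons   <- 50    # hypothetical number of individuals
n_occasions <- 100   # hypothetical number of measurement occasions per person

# Person-specific means are positively related across people...
person_mean_x <- rnorm(n_persons, mean = 50, sd = 10)
person_mean_y <- person_mean_x + rnorm(n_persons, sd = 3)

within_r <- numeric(n_persons)
mean_x   <- numeric(n_persons)
mean_y   <- numeric(n_persons)

for (i in 1:n_persons) {
  # ...but, within each person, x and y fluctuate in opposite directions over time
  dev <- rnorm(n_occasions)
  x_t <- person_mean_x[i] + dev
  y_t <- person_mean_y[i] - dev + rnorm(n_occasions, sd = 0.5)
  within_r[i] <- cor(x_t, y_t)          # within-person (intraindividual) correlation
  mean_x[i] <- mean(x_t)
  mean_y[i] <- mean(y_t)
}

mean(within_r)        # strongly negative: the individual-level structure
cor(mean_x, mean_y)   # strongly positive: the population-level structure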

Transposing evidence from population models to individuals is so tricky that researchers trying to validate population-based psychological models, such as the Big Five, at the individual level have consistently failed. Borkenau and Ostendorf (1998) gave participants a Big Five personality questionnaire composed of 30 self-report items, to be answered on 90 consecutive days. The results reported are straightforward: the Big Five model was not able to explain the variance of any one of the 22 first-year psychology students in their study. Hamaker, Dolan, and Molenaar (2005) reanalyzed Borkenau and Ostendorf's (1998) data and found two, three and four predominant factors explaining the within-individual variance of the questionnaire's answers. In the same line, but investigating the intelligence structure of a 23-year-old male, Gomes, Araújo, Ferreira, and Golino (2014) applied nine intelligence tests from the Higher-Order Cognitive Factor Battery on 90 occasions. The authors expected to find evidence supporting the presence of fluid intelligence, crystallized intelligence and short-term memory, beyond the general intelligence factor, since the tests are markers of these latent variables of the Cattell-Horn-Carroll model. Contrary to expectations, they found only a general intelligence factor. Trying to validate the Cattell-Horn-Carroll model at the individual level led the researchers to conclude, based on the dynamic factor analysis conducted, that the adequate model to explain the participant's performance was Spearman's (1904) original intelligence model (Gomes et al., 2014).

Despite its importance, the intraindividual factor analysis approach does not present data fit indices such as those of traditional factor model techniques. Therefore, choosing the number of factors to be retained is a challenge for the field. However, this scenario can be improved through the application of a set of factor retention strategies. The present paper proposes a data triangulation strategy in which different techniques can be used to enrich the decision about which factors should be retained and which should be eliminated. At the same time, the current paper reinforces the relevant role of the intraindividual approach for the social sciences and for psychology, as proposed by a body of researchers since the 1980s and 1990s.

Intraindividual factor analysis.

Raymond Cattell (1952) proposed a model to study psychological variables between individuals, called the R-technique (interindividual), as well as within a single subject through time, called the P-technique (intraindividual). He pointed out that experimental designs in psychology have three main components: individuals, time and variables. Depending on how they are combined, a different technique is used. The R-technique measures one or more variables in several individuals on a single occasion or a few occasions, allowing the identification of common factors in the population. Ram, Brose and Molenaar (2013) point out that the goal of the P-technique is to describe relations among multiple responses in P-data, i.e., data collected on multiple occasions for one or more variables, in order to discover the structure underlying the responses, or to test hypotheses regarding the day-to-day variation observed. However, since repeated measurements obtained from the same person are generally related, a key assumption required by traditional factor analysis will probably be violated: the independence of the observations (Ram et al., 2013). The authors argued that, in the years following the development of the P-technique factor model, a number of alternatives emerged to account for the relationship between the variables, for example the autoregressive and moving-average time series models. In 1985, Peter Molenaar introduced dynamic factor analysis as an alternative to the P-technique factor model and to the time series models, since it enables one both to "deal with the independence violations and provide a framework for modeling the dynamic nature of ongoing processes" (Ram et al., 2013, p. 3). In the dynamic factor model, the multivariate state of an individual at any time is given by concurrent influences and past states (Ram et al., 2013).

Two basic equations represent the P-technique factor model and the dynamic factor analysis (Ram et al., 2013):

y_t = Λ η_t + ε_t                                                    (1)

η_t = B_1 η_{t-1} + B_2 η_{t-2} + … + B_s η_{t-s} + ζ_t              (2)

where y_t is a vector of the observable variables indexed by time (t = 1, 2, …, T), Λ is the p × q factor loading matrix, η_t is a q-variate time series of latent factor scores, and ε_t is the specific-error plus measurement-error time series. In Equation 2, η_t is modeled as a function of prior weighted (B_1 to B_s) latent states, from η_{t-1} to η_{t-s}. As pointed out by Ram et al. (2013), "present time 'disturbances' are then introduced as a q-variate set of latent 'innovations,' ζ_t, and residual (measurement + specific) errors, ε_t, the latter of which may be correlated across occasions."

Intraindividual factor analysis has two fundamental aims: 1. modeling the relationship between the latent variables and the observed variables, and 2. modeling the time-dependent structure that occurs between the observed variables and between the latent variables. It is important to note that the time-dependent structure and its modeling are not present in traditional factor analysis, which makes intraindividual factor analysis unique.
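To make these two aims concrete, the sketch below simulates data from a one-factor dynamic factor model under assumed values (a single latent factor, an autoregressive weight B_1 = 0.6, and arbitrary loadings); it is an illustration of Equations 1 and 2 above, not the model estimated in the cited studies.

# Minimal simulation sketch of a one-factor dynamic factor model (assumed values).
set.seed(456)

T_occ    <- 90                         # number of measurement occasions
loadings <- c(.8, .7, .6, .7, .8, .6)  # hypothetical Lambda (p x 1)
B1       <- 0.6                        # hypothetical autoregressive weight (Equation 2)

# Equation 2: eta_t = B1 * eta_{t-1} + zeta_t (latent innovations zeta_t)
eta <- numeric(T_occ)
eta[1] <- rnorm(1)
for (t in 2:T_occ) eta[t] <- B1 * eta[t - 1] + rnorm(1, sd = sqrt(1 - B1^2))

# Equation 1: y_t = Lambda * eta_t + epsilon_t (measurement + specific errors)
Y <- sapply(loadings, function(l) l * eta + rnorm(T_occ, sd = sqrt(1 - l^2)))

dim(Y)            # 90 occasions x 6 observed variables
round(cor(Y), 2)  # lag-0 correlations induced by the single latent factor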

There are different estimation procedures for intraindividual factor analysis techniques. Zhang, Hamaker, and Nesselroade (2008) compared four of them (maximum likelihood estimation using the Kalman filter algorithm, maximum likelihood estimation based on the block-Toeplitz covariance matrix, Bayesian estimation using Gibbs sampling, and the least squares estimation method) and concluded that all of them were able to generate adequate parameter estimates, with similar accuracy.

The studies of Gomes et al. (2014) and Borkenau and Ostendorf (1998), which pointed to the inadequacy of the state-of-the-art models of intelligence and personality to explain intraindividual variance, used the least squares estimation method and P-factor analysis, respectively. The latter used the repeated-measurement correlation matrix but did not include the lagged correlations present in the block-Toeplitz covariance matrix.
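For readers unfamiliar with the block-Toeplitz idea, the sketch below illustrates, with simulated data and hypothetical test names, how a lag-augmented covariance matrix can be assembled by pairing each occasion with the previous one; it is only a didactic sketch, not the procedure used by the cited studies.

# Sketch: building a lag-augmented (block-Toeplitz-style) covariance matrix.
# Y is any T x p matrix of repeated measurements; simulated data are used here.
set.seed(789)
Y <- matrix(rnorm(90 * 3), ncol = 3,
            dimnames = list(NULL, c("test1", "test2", "test3")))

lag <- 1
# embed() places y_t and y_{t-1} side by side, dropping the first `lag` rows
Y_lagged <- embed(Y, lag + 1)   # columns: [y_t (p cols), y_{t-1} (p cols)]
colnames(Y_lagged) <- c(paste0(colnames(Y), "_t"),
                        paste0(colnames(Y), "_t-1"))

block_toeplitz_cov <- cov(Y_lagged)   # contains lag-0 and lag-1 (cross-)covariances
round(block_toeplitz_cov, 2)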

Objective of the study.

The current study proposes the application of a set of factor retention strategies in the intraindividual approach. The strategies are available in the R package nFactors (Raiche, 2010), which implements the following criteria: the Kaiser-Guttman criterion (eigenvalues greater than one), the scree test with the acceleration factor, the scree test with optimal coordinates, and parallel analysis. These four criteria are applied to a time series representing the performance of one person on nine different intelligence tests on 90 different occasions. The eigenvalues are extracted through the R package tseries (Trapletti & Hornik, 2013). The second goal of the paper is to serve as a tutorial for those interested in intraindividual factor analysis.

 

Method

Participant

The participant was a 60-year-old, middle-class woman with a university degree in physical education, currently retired.

Measures

Nine tests of the Higher-Order Cognitive Factor Battery were used. Evidence has shown that, at the population level, the Higher-Order Cognitive Factor Battery is able to measure g and six broad cognitive abilities of the Cattell-Horn-Carroll model, with Cronbach's alphas above .70 (Gomes & Borges, 2009a, 2009b; Gomes, 2009; Gomes, 2010; Gomes, 2011). The first three tests measure fluid intelligence (Gf), the following three measure crystallized intelligence (Gc), and the last three measure processing speed (Gs).

Inductive Reasoning (I). Composed of 12 items, with a time limit of 14 minutes for completion. Each item consists of five groups of four letters. Among the five groups, four have the same organization rule. The respondent must identify the group that follows a different rule and mark it with an (x).

Logical Reasoning Test (RL). The test has 30 items and a time limit of 24 minutes. Each item consists of a conclusion based on two abstract logical premises, with no relationship to the real world. The respondent has to indicate whether the logical conclusion is appropriate or inappropriate.

General Reasoning (RG). Composed of 15 items, with completion time limited to 18 minutes. Each item consists of a logical-mathematical problem. The respondent must interpret the item statement, solve the problem and choose one of the five possible answers.

Verbal Comprehension Test 1 (V1). Composed of 24 items, with the time limit set to a maximum of six minutes. Each item consists of a reference word and five multiple-choice options. Each option has one word, and the goal is to identify the word whose meaning is closest to that of the reference word. Verbal Comprehension Test 2 (V2). This test has the same structure as V1, but with 18 items. Verbal Comprehension Test 3 (V3). This test has the same structure as V1, but with 18 items.

Perceptive Speed 1 (P1). Ten columns with 410 words each are presented to the participant. The task consists of marking the five words that contain the letter "A" in each column. The test has 50 words with the letter "A" and a time limit of two minutes. Perceptive Speed 2 (P2). The test consists of 48 pairs of numbers with at least three digits. The task consists of marking all the pairs in which the numbers are different. The test has a time limit of two minutes. Perceptive Speed 3 (P3). The test consists of 48 items, with a time limit of two and a half minutes. Each item has a target figure and five answer options. Each option has a figure, and the participant is asked to identify which one is exactly the same as the target figure.

Procedures

The same person answered the nine intelligence tests for approximately three months, on 90 different occasions. She was informed about the ethical aspects of the research, consented to answer the tests and signed an informed consent form. The research followed the Brazilian ethical guidelines for research with human beings and was approved by the research ethics committee of the Federal University of Minas Gerais (number 10531613.7.0000.5149). The participant had contact with the tests only at the moment of their administration; she did not have any contact with the tests at other moments and had never seen them before the first day of the research. The tests were administered at the participant's home, once in the middle of the morning and once in the middle of the afternoon, from Monday to Saturday, over a period of 90 days. This schedule followed the participant's preference. One of the researchers read the test instructions and was present during the first few days. A timer controlled the total amount of time spent on each test. The participant did not complain of fatigue or boredom, showing engagement throughout the research.

After the last administration (t = 90), the tests were scored. The score on each test corresponded to the number of correct answers. The total score on each test, for every assessment occasion, was recorded in an Excel spreadsheet and then plotted using the ggplot2 package (Wickham, 2009).
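A minimal sketch of this plotting step is given below; the object and column names (scores, occasion, and the nine test labels) are hypothetical stand-ins for the spreadsheet, and random values replace the real scores.

# Sketch of the plotting step; 'scores' stands in for the Excel data (90 x 9).
library(ggplot2)

scores <- as.data.frame(matrix(sample(0:30, 90 * 9, replace = TRUE), ncol = 9))
names(scores) <- c("I", "RL", "RG", "V1", "V2", "V3", "P1", "P2", "P3")
scores$occasion <- 1:90

# Reshape to long format: one row per (occasion, test) pair
long <- reshape(scores, direction = "long",
                varying = names(scores)[1:9], v.names = "score",
                timevar = "test", times = names(scores)[1:9],
                idvar = "occasion")

ggplot(long, aes(x = occasion, y = score)) +
  geom_line() +
  facet_wrap(~ test, scales = "free_y") +
  labs(x = "Measurement occasion", y = "Number of correct answers")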

To facilitate the visualization of the tests' correlational structure over the 90 assessment occasions, the correlation matrix was plotted (Figure 1) as a weighted graph using the qgraph package (Epskamp, Cramer, Waldorp, Schmittmann, & Borsboom, 2012). The layout of the weighted graph was computed using a modified version of the Fruchterman-Reingold algorithm (Fruchterman & Reingold, 1991). This algorithm computes a graph layout in which the edges depend on their absolute weights (Epskamp et al., 2012): the stronger the correlation between two vertices (representing the variables), the closer they are placed in the network (shorter edges for stronger weights); conversely, the weaker the correlation between two vertices, the farther apart they are placed. The qgraph package also draws the width of each edge and its color intensity according to its weight, so the higher the weight of an edge, the greater its width and the more intense its color.
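A minimal sketch of this visualization step, assuming the 9 × 9 correlation matrix is available in an object named cor_matrix (a hypothetical name, here filled with random data for illustration):

# Sketch: plotting a correlation matrix as a weighted network with qgraph.
library(qgraph)

# cor_matrix: stand-in for the 9 x 9 correlation matrix of the nine tests
cor_matrix <- cor(matrix(rnorm(90 * 9), ncol = 9))

qgraph(cor_matrix,
       graph  = "cor",      # treat input as a correlation matrix
       layout = "spring",   # Fruchterman-Reingold force-directed layout
       labels = c("I", "RL", "RG", "V1", "V2", "V3", "P1", "P2", "P3"))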

 

Results and Discussion

The correlation matrix is presented in Table 1, along with the mean, median and standard deviation of each test. Figure 1 presents the correlation matrix as a weighted network. The Verbal Comprehension Test 3 presented the smallest correlation coefficients and was therefore the farthest vertex in the weighted graph. In general, the tests presented moderate to high correlations, represented by the width of the edges in Figure 1.

 

 

 

 

Figure 2 presents the scores' variability over the 90 measurement occasions. The participant achieved 100% of the maximum score on tests P1 and P3, and a high score (above 90%) on tests V1 and RG, indicating that the trajectories and slopes of these tests may show some noise due to a ceiling effect. This is a complicated aspect of the intraindividual approach, because its data are strongly influenced by learning effects, given the very nature of intraindividual differences.

Although the factor retention syntax and strategies are methodological aspects of the study (and would normally be presented and described in the Method section), they are central to our objective and will therefore be presented here, in the Results and Discussion section.

Initially, the data were saved in a .csv file. An object named data was created to read the .csv file (data<-read.csv2(file="./x.csv", header=TRUE)) and was then transformed into a matrix named dado (dado<-as.matrix(data)). The correlation matrix of the object dado was calculated and recorded in the object named z (z<-cor(dado)). Next, the eigenvalues were extracted and recorded in the object y using the eigen function (y<-eigen(z)). The eigenvalues were retrieved from the object y and recorded in an object named b (b<-y$values). Then, an object named dados containing the eigenvalues and the number of measurement occasions was created (dados<-list(eigenvalues=b, noccasions=90)). The eigenvalues (previously called b) were retrieved from the object dados under the name eigenvalues (eigenvalues<-dados$eigenvalues) and the measurement occasions (previously represented by the name noccasions) were retrieved as occasions (occasions<-dados$noccasions). The object variables was created to represent the number of eigenvalues (variables<-length(eigenvalues)), the object rep was created to define the number of replications for the parallel analysis (rep<-100) and the object cent (cent<-0.95) to define the centile value of the parallel analysis (aparallel<-parallel(subject=occasions, var=variables, rep=rep, cent=cent)$eigen$qevpea). The object results contains all four factor retention criteria used: the Kaiser-Guttman criterion, the scree test (optimal coordinates and acceleration factor), and parallel analysis (results<-nScree(eig=eigenvalues, aparallel=aparallel)). Finally, the function plotnScree was used to plot the four factor retention criteria (plotnScree(results)).
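Gathering the commands described above into a single script may help readers reproduce the workflow. The sketch below follows the described steps with two small adjustments (the eigenvalue decomposition uses the base R eigen function, and the number of occasions is passed to the subject argument of parallel); the file name x.csv is a placeholder.

# Consolidated sketch of the factor retention workflow described above.
# "x.csv" is a placeholder for the 90 x 9 data file.
library(nFactors)

data <- read.csv2(file = "./x.csv", header = TRUE)
dado <- as.matrix(data)

z <- cor(dado)            # 9 x 9 correlation matrix of the nine tests
y <- eigen(z)             # eigenvalue decomposition (base R)
eigenvalues <- y$values
noccasions  <- 90         # number of measurement occasions

# Parallel analysis: random-data eigenvalues (95th centile, 100 replications)
aparallel <- parallel(subject = noccasions,
                      var     = length(eigenvalues),
                      rep     = 100,
                      cent    = 0.95)$eigen$qevpea

# Kaiser-Guttman, optimal coordinates, acceleration factor, parallel analysis
results <- nScree(eig = eigenvalues, aparallel = aparallel)
summary(results)
plotnScree(results)       # plots all four retention criteria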

All criteria pointed to a one-factor solution. This is encouraging, because when different criteria reach a clear consensus it probably indicates that the one-factor solution is the correct one. However, this remains a conjecture rather than evidence, since no confirmatory fit indices are currently available for intraindividual factor analysis. Figure 3 presents the results of all four criteria.

The first eigenvalue was very prominent in comparison with the others, with a value of 6.78 against 0.83 for the second eigenvalue. The one-factor solution explained 75.33% of the variance, a considerable amount that indicates the strength of the general factor identified.
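As a quick check of that percentage, with nine standardized tests the total variance equals 9, so the proportion of variance explained by the first factor is 6.78 / 9 ≈ 0.753:

# Check: proportion of variance explained by the first factor
eigenvalues <- c(6.78, 0.83)   # first two eigenvalues reported above
eigenvalues[1] / 9             # nine standardized tests -> 0.7533 (75.33%)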

The intraindividual approach is a necessary but forgotten, or at least underutilized, field in psychology. People usually think about psychological phenomena in a way that automatically transposes evidence from a population-based model to the individual level, a reasoning process we call the level transposition fallacy. The ergodic theorems and a set of empirical studies (Molenaar, 2007a) show that evidence from the interindividual approach is adequate to support conclusions about the population, but not about individuals. However, in spite of all its strengths and applications, the intraindividual approach needs to advance by developing techniques that handle its own challenges. At this moment, intraindividual factor analysis techniques do not possess confirmatory fit indices like traditional factor analysis. The current paper proposed the application of four factor retention criteria, as a set of strategies functioning as a data triangulation approach to retain factors in the intraindividual approach.

 

 

 

 

The data analyzed belong to the intelligence field and corroborate the evidence found by Gomes et al. (2014): a general factor was sufficient to explain the participant's performance. In addition, the use of the four factor retention criteria strengthens the evidence that Spearman's (1904) model is more adequate than the Cattell-Horn-Carroll model to explain the intelligence structure of a specific person.

 

References

Borkenau, P., & Ostendorf, F. (1998). The Big Five as states: How useful is the five-factor model to describe intra-individual variations over time? Journal of Research in Personality, 32, 202-221.

Epskamp, S., Cramer, A. O. J., Waldorp, L. J., Schmittmann, V. D., & Borsboom, D. (2012). qgraph: Network visualizations of relationships in psychometric data. Journal of Statistical Software, 48(4), 1-18. Retrieved from http://www.jstatsoft.org/v48/i04/

Fruchterman, T., & Reingold, E. (1991). Graph drawing by force-directed placement. Software - Practice & Experience, 21(11), 1129-1164.

Gomes, C. M. A. (2011). Validade do Conjunto de Testes da Habilidade de Memória de Curto-Termo (CTMC). Estudos de Psicologia (Natal), 16, 235-242.

Gomes, C. M. A., & Borges, O. N. (2009a). Qualidades psicométricas do Conjunto de Testes de Inteligência Fluida. Avaliação Psicológica, 8(1), 17-32.

Gomes, C. M. A., & Borges, O. N. (2009b). Propriedades psicométricas do Conjunto de Testes da Habilidade Visuo-Espacial. Psico-USF, 14(1), 19-34.

Gomes, C. M. A. (2010). Estrutura fatorial da Bateria de Fatores Cognitivos de Alta-Ordem (BaFaCAlO). Avaliação Psicológica, 9(3), 449-459.

Gomes, C. M. A., Araújo, J., Ferreira, M. G., & Golino, H. F. (2014). The validity of the Cattel-Horn-Carroll model on the intraindividual approach. Behavioral Development Bulletin, 19, 22-30.

Hamaker, E. L., Dolan, C. V., & Molenaar, P. C. M. (2005). Statistical modeling of the individual: Rationale and application of multivariate stationary time series analysis. Multivariate Behavioral Research, 40, 207-233.

Molenaar, P. C. M. (2007a). On the implications of the classical ergodic theorems: Analysis of developmental processes has to focus on intraindividual variation. Developmental Psychobiology, 50(1), 60-69.

Molenaar, P. C. M. (2007b). Psychological methodology will change profoundly due to the necessity to focus on intra-individual variation. Integrative Psychological & Behavioral Science, 41(1), 35-40.

Molenaar, P. C. M. (2008). On the implications of the classical ergodic theorems: Analysis of developmental processes has to focus on intraindividual variation. Developmental Psychobiology, 50(1), 60-69.

Molenaar, P. C. M., Sinclair, K. O., Rovine, M., Ram, N., & Corneal, S. E. (2009). Analyzing developmental processes on an individual level using non-stationary time series modeling. Developmental Psychology, 45(1), 260-271.

Raiche, G. (2010). nFactors: An R package for parallel analysis and non graphical solutions to the Cattell scree test [Computer software manual]. R package version 2.3.3.

Spearman, C. (1904). General intelligence objectively determined and measured. American Journal of Psychology, 15(2), 201-293.

Trapletti, A., & Hornik, K. (2013). tseries: Time series analysis and computational finance [Computer software manual]. R package version 0.10-32.

Ugalde, E. (2007). De la mecánica estadística a la teoría ergódica. Revista Mexicana de Física, 53(2), 191-194.

Wickham, H. (2009). ggplot2: Elegant graphics for data analysis. New York: Springer.

Zhang, Z., Hamaker, E. L., & Nesselroade, J. R. (2008). Comparisons of four methods for estimating a dynamic factor model. Structural Equation Modeling: A Multidisciplinary Journal, 15(3), 377-402.

 

 

Received in July 2014
Revised in April 2015
Approved in May 2015

 

 

About the authors

Cristiano Mauro Assis Gomes: is a psychologist and holds a PhD in Education from the Universidade Federal de Minas Gerais. He is currently a professor in the Department of Psychology, the Graduate Program in Psychology and the Graduate Program in Neuroscience at UFMG.
Hudson Fernandes Golino: is a psychologist and holds a PhD in Neuroscience from the Universidade Federal de Minas Gerais. He is currently a professor in the Graduate Program in Applied Computing and in the Psychology program at the Universidade Estadual de Feira de Santana (BA).


1 Address for correspondence: Departamento de Psicologia, Laboratório de Investigação da Arquitetura Cognitiva, Universidade Federal de Minas Gerais, Av. Antônio Carlos, 6627, sala 4010, Pampulha, 31270-901, Belo Horizonte-MG. E-mail: cristianogomes@ufmg.br

