
Anales de Psicología

On-line version ISSN 1695-2294 / Print version ISSN 0212-9728

Anal. Psicol. vol. 33, no. 1, Murcia, Jan. 2017



Divergent thinking and its dimensions: what we talk about and what we evaluate?




Carmen Ferrandiz, Mercedes Ferrando, Gloria Soto, Marta Sainz and María Dolores Prieto

Universidad de Murcia (Spain)

This work was supported by a research project funded by the Spanish Ministry of Science and Technology (EDU2010-16370) and by a project funded by the Agency for Science and Technology of the Region of Murcia (11896/PHCS/09).





This paper examines the role of latent cognitive processes and of task content (verbal and figural) in divergent thinking. The sample was composed of 260 students attending different public and semi-public schools in the Region of Murcia (Spain), with ages ranging from 8 to 15 years. Creativity was assessed with the Torrance Tests of Creative Thinking (TTCT) and the Creative Imagination Test (PIC). Results suggest that, even though both tests are based on the psychometric approach and on Guilford's theory of creativity, their scores are not significantly correlated. Results from confirmatory factor analysis point to two independent factors (one for each test), related more to task demands and content than to the cognitive processes traditionally considered in the definition and measurement of creativity.

Key words: creativity; divergent thinking; assessment of creativity; test of creativity.





Creativity is one of the psychological constructs most highly valued in social terms, as it is considered to be the basis of technological and social innovation, as well as human advancement (Craft, 2005; Hennessey & Amabile, 2010).

From the point of view of psychology, creativity is understood as the ability to produce something which is both new (original, unexpected) and appropriate (suited in its content to the demands of the task) (Sternberg & Lubart, 1995). For Sternberg and Lubart (1995), a model explaining creativity must include variables internal to the individual (intelligence, knowledge, thinking styles, personality features) as well as contextual variables which facilitate or hinder the expression of creativity. Other authors, such as Amabile (1998) and Csikszentmihalyi (1997), also consider creativity from a contextual point of view in which the individual, the field (the experts) and the domain interact.

Reflection on the study of creativity and its expression has given rise to various debates about its very nature. One currently under discussion concerns the 'general vs. specific' nature of creativity. This debate sprang up partly as a consequence of the impact of the theory of multiple intelligences: if intelligence is 'multiple', so should creativity be (Gardner, 1995).

In recent years, empirical research (e.g. Baer, 1996; Diakidoy & Spanoudis, 2002; Garaigordobil & Pérez, 2004; Han & Marvin, 2002; Runco, Dow, & Smith, 2006; Silvia, Kaufman, & Pretz, 2009) has tended to support the idea that creativity is domain-specific, although some contradictory results point to the existence of a general creativity as well (e.g. Kaufman & Baer, 2004; Mohamed, Maker, & Lubart, 2012). This has raised the question of whether the abilities required in each domain are themselves specific, or whether what marks the difference in creative performance across domains is the knowledge and the abilities acquired through experience. Several authors seem to take an intermediate position: thus, the 'amusement park' model proposed by Kaufman and Baer (2005) argues that creativity is a general ability which becomes more and more specific as a result of the demands of the concrete task domain. The same idea is supported by Plucker and Beghetto (2004).

According to these approaches, the specificity of creativity may be given not so much by the mental processes needed to create new ideas as by the prior knowledge required, or by a combination of both (Plucker & Beghetto, 2004). What has actually been shown is that an individual may be very creative in a certain domain, such as literature, but far less so in another, for instance music (Baer, 1999).

Bearing in mind that creativity encompasses multiple factors (internal and external to the individual: the field and the domain) (Csikszentmihalyi, 1997; Sternberg & Lubart, 1995), if we really intend to know whether creative thinking is specific to a certain domain (as the leading theories assume) or a general ability, we should concentrate on the thinking abilities involved in creativity, i.e., divergent thinking.

The scientific approach to the assessment of creative thinking is based on the operational definition of the construct proposed by Guilford (1950). Although the author acknowledged the importance of various cognitive processes, such as memory, comprehension, knowledge and evaluation, he argued that the main feature of creative thinking is the ability to think in a different, original way, i.e., divergent thinking. For this author, divergent thinking implies ideational fluency (the number of ideas an individual proposes for the solution of a problem); mental flexibility (the number of angles from which the problem is tackled); originality of the ideas (how infrequent they are); and elaborateness (the number of additional details used to convey the idea).

In spite of the large variety of theoretical perspectives on and measurement instruments for the creative thinking construct, most of them take as their theoretical framework of reference the divergent thinking model designed by Guilford (1950). Thus, among the attempts to assess creativity from the psychometric perspective, or the objective measurement of creativity, it is important to mention the Torrance Tests of Creative Thinking (TTCT; Torrance, 1974), a classical instrument, universally used and with great impact in school environments (Ferrando et al., 2007; Prieto, López, Ferrándiz, & Bermejo, 2003; Zalcateco et al., 2013), which has also served as the basis for the construction of new creativity measures (Sánchez, García, & Valdés, 2009; Wechsler, 2004).

The research on the generality vs. specificity of divergent thinking carried out by Diakidoy and Spanoudis (2002) used two different divergent thinking tests, the verbal TTCT and a parallel one designed by the authors and named the 'History Creativity Test'; it showed that creativity was not only specific to each domain but also specific to each task. In their conclusions, the authors mentioned the possible effect that the test scoring method may have had on those results.

Previous studies had already shown the problem of the scope of the TTCT results: most research fails to find the four main factors of divergent thinking (fluency, flexibility, originality and elaborateness). For some authors, fluency, flexibility and originality are overlapping dimensions and there is no need for three separate scores; above all, originality appears to be strongly affected by fluency (Chase, 1985; Dixon, 1979; Heausler & Thompson, 1988; Hocevar, 1979; Hocevar & Michael, 1979; Kim, 2006; Kim, Cramond, & Bandalos, 2006; Runco & Mraz, 1992; Treffinger, 1985). Other works (Almeida et al., 2008; Ferrando et al., 2007; Oliveira et al., 2009) have shown that, when a number of TTCT subtests are used, the factorial structure is not arranged according to the dimensions, except for elaborateness. The fluency, flexibility and originality dimensions do not emerge as independent factors but are linked to the demands or the content specificity of each subtest.

Considering that the dimensions mentioned above (fluency, flexibility, originality and elaborateness) are the most representative of divergent thinking, another debate arises about the best way to measure and assess them. Runco (Runco & Mraz, 1992; Runco, Okuda, & Thurston, 1987; Runco & Acar, 2012) and Mouchiroud and Lubart (2001) have summarised some of the main 'problems' associated with scoring divergent thinking tasks. Fluency appears as a confounding factor: the scores for flexibility, originality and elaborateness depend, to a large extent, on the number of responses (fluency). It is almost inevitable for flexibility, or the ability to change focus, to be related to a greater number of responses, but this is not necessarily true for originality or elaborateness.

Different scoring procedures have been proposed to deal with these findings. The alternatives are manifold: a) calculating a mean originality score (adding the originality scores for each response and dividing by the number of responses given) (Hocevar & Michael, 1979; Runco & Mraz, 1992; Runco, Okuda, & Thurston, 1991); b) considering only the most original response (Zarnegar, Hocevar, & Michael, 1988); c) analysing only the first three responses (Clark & Mirels, 1970); d) letting the assessed individual choose their most original response and rating only that one (Michael & Wright, 1989); e) rating only unique responses, i.e., counting as original those which appear only once within the group of participants, which implies assessing responses on the basis of the given sample (Runco, Okuda, & Thurston, 1987); f) considering the originality score as the count of uncommon responses [originality (uncommon responses) = total number of responses - fluency (common responses)] (Hong & Milgram, 1991; Moran, Milgram, Sawyer, & Fu, 1983; Wallach & Kogan, 1965); g) even a 'snapshot scoring method' has been proposed, whereby the whole set of responses given by one person receives a single holistic score (Silvia et al., 2008; Silvia, Martin, & Nusbaum, 2009). This last method seems too restrictive, as it leaves aside important information about the processes involved.
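Alternatives (a) and (e) can be made concrete with a minimal Python sketch. The function names and toy responses below are ours, invented for illustration; they are not taken from any test handbook:

```python
from collections import Counter

def mean_originality(response_scores):
    """Variant (a): sum of per-response originality ratings divided by
    the number of responses, removing the fluency confound."""
    return sum(response_scores) / len(response_scores) if response_scores else 0.0

def unique_response_originality(responses, all_responses):
    """Variant (e): count as original only responses that appear exactly
    once in the whole sample (Wallach & Kogan-style uniqueness)."""
    freq = Counter(all_responses)
    return sum(1 for r in responses if freq[r] == 1)

# Hypothetical toy data: two children's responses to one item.
child_a = ["ladder", "rail", "zebra"]
child_b = ["ladder", "road"]
pool = child_a + child_b

print(round(mean_originality([0, 2, 3]), 3))       # three ratings on a 0-3 scale → 1.667
print(unique_response_originality(child_a, pool))  # 'rail' and 'zebra' are unique → 2
```

Note how variant (a) deliberately divides by the response count, so a child who gives many mediocre ideas no longer outscores a child who gives one highly original idea.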

There have also been attempts to remove the influence of fluency from the flexibility score. Authors have arrived at formulae which seek to produce a creativity index taking into account both dimensions and the relation between them (Nakano & Primi, 2012; Snyder, Mitchell, Bossomaier, & Pallier, 2004). More recently, Primi et al. (2013) have proposed using as a fluency measure the residual scores obtained by regression, predicting fluency from flexibility.
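One plausible reading of this residualisation can be sketched in Python with NumPy. The numeric values are invented for illustration, and this is our sketch of the idea, not the authors' code:

```python
import numpy as np

def residual_fluency(fluency, flexibility):
    """Regress fluency on flexibility (ordinary least squares with an
    intercept) and return the residuals: the part of each fluency score
    not predicted by flexibility (one reading of Primi et al., 2013)."""
    flu = np.asarray(fluency, dtype=float)
    flex = np.asarray(flexibility, dtype=float)
    X = np.column_stack([np.ones_like(flex), flex])   # intercept + predictor
    beta, *_ = np.linalg.lstsq(X, flu, rcond=None)
    return flu - X @ beta

# Hypothetical raw scores for five children:
res = residual_fluency([10, 14, 8, 20, 12], [4, 6, 3, 8, 5])
# By construction the residuals sum to ~0 and are uncorrelated with flexibility.
```

The residual keeps only the variance in fluency that flexibility cannot explain, which is what makes it usable as an adjusted score in the factor analyses described later.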

The aim of our work is to study the specific vs. general nature of the cognitive processes of divergent thinking (fluency, flexibility and originality) when these are linked to a certain domain (verbal or figural) but require little specific knowledge. In the present study we use confirmatory factor analysis to verify the structure underlying divergent thinking, seeking to use this information to discuss the role of cognitive processes and task content in the creativity dimensions assessed.

In addition, we apply different scoring formulae to the divergent thinking tests in order to test whether the factorial structure depends on biases introduced by the scoring method.




Method

Participants

The participants in this research were students from state and semi-private schools in the Region of Murcia (Spain), aged between 8 and 15 (M = 10.12; SD = 1.57). The total number of participants was 260 (121 boys, 46.5%). The sample covered different school levels: second cycle of Primary School (37.7%); third cycle of Primary School (50.4%); first year of compulsory secondary education (ESO) (8.8%); and second year of ESO (3.1%). An attempt was made to use a sample of students heterogeneous from the socio-cultural point of view, selecting urban and rural, state and semi-private schools.


Instruments

Torrance Tests of Creative Thinking (TTCT; Torrance, 1974). The object of this test is to assess the four fundamental dimensions of creativity: fluency, flexibility, originality and elaborateness. It contains a verbal and a figural part (Torrance, 1974). In the present study, the third subtest of the figural form, the parallel lines task, has been used, following the adaptation guide designed by Prieto, Ferrándiz and Bermejo (2003). Previous studies have shown that this third subtest explains the highest percentage of variance (Almeida et al., 2008; Ferrando et al., 2007; Oliveira et al., 2009; Prieto et al., 2006). The task asks the student to compose different drawings from parallel lines (30 pairs of parallel lines are shown to the child, who must draw with them as many different pictures as they can). Through this subtest, the four dimensions of divergent thinking are scored: fluency, measured by the number of responses given by the child (to a maximum of 30); flexibility, the variety of responses given, i.e. the number of different categories used (to a maximum of 30); originality, assessed in view of the novelty and uniqueness of the responses on a 0-to-3 scale, from 'not original at all' to 'very original' (to a maximum score of 90); and finally, elaborateness, the amount of detail which embellishes and improves the creative production. With our Spanish sample, the reliability found for the parallel lines task, using the two-halves procedure (Spearman-Brown), ranges from .84 for originality to .93 for flexibility (Prieto et al., 2006).
In another study, the mean intraclass correlation coefficient between raters (two-way mixed effects, absolute agreement) was .96 for fluency, .92 for flexibility, .93 for originality and .60 for elaborateness (Sáinz, 2010). In our study, a Cronbach alpha coefficient of .82 was obtained for the whole set of dimensions assessed with the parallel lines task.
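The two reliability statistics used here, the Spearman-Brown split-half correction and Cronbach's alpha, can be computed with a minimal sketch (the score matrix below is made up for illustration, not the study's data):

```python
import numpy as np

def spearman_brown(r_half):
    """Two-halves (split-half) reliability corrected to full test length."""
    return 2 * r_half / (1 + r_half)

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_people x n_items) score matrix:
    k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical scores of four children on three dimensions:
data = [[10, 9, 11], [14, 13, 15], [8, 7, 9], [20, 19, 21]]
print(round(cronbach_alpha(data), 3))   # perfectly consistent toy items → 1.0
print(round(spearman_brown(0.75), 3))   # half-test r of .75 → 0.857
```

Alpha rises toward 1 as the dimensions covary; perfectly parallel toy columns, as above, give exactly 1.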

Creative Imagination Test (PIC; Artola, Ancillo, Mosteiro, & Barraca, 2004). The aim of this test is to assess the classical dimensions of creativity. In the present study, games 1, 2 and 3 from the verbal part of the test have been used.

In game 1, looking at the situation shown in a given picture, the child must write down everything that might be taking place in it. The game allows children to express their curiosity and speculative attitude, and their capacity to go beyond the information given by the prompt when offering different options about what may be happening in the scene. This task measures fluency (number of responses given) and flexibility (number of categories the ideas fit into; each idea may fit into more than one category).

Game 2 requires the child to think of different uses for an object. It is an adaptation of Guilford's 'Uses for a Brick' test. The task is meant to assess the capacity to 'redefine' problems, i.e., the capacity to find uses, functions and applications different from the usual ones, to speed up mental processes and to offer new interpretations or meanings of familiar objects, giving them a new use or function. This task measures fluency, flexibility and originality (here measured by the infrequency of the category the response belongs to).
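An infrequency-based originality score of this kind can be sketched as follows. The category labels and the 5% cut-off are invented for illustration; the PIC handbook defines its own categories and norms:

```python
from collections import Counter

def category_originality(child_categories, sample_categories, cutoff=0.05):
    """Score one point for each response whose category occurs in less
    than `cutoff` of the sample's responses (infrequency-based originality)."""
    freq = Counter(sample_categories)
    total = len(sample_categories)
    return sum(1 for c in child_categories if freq[c] / total < cutoff)

# Hypothetical category labels gathered for 'uses for a brick' in a sample:
sample = ["build"] * 60 + ["weight"] * 30 + ["doorstop"] * 8 + ["art"] * 2
print(category_originality(["build", "art"], sample))  # only 'art' (2%) is rare → 1
```

Because the rarity of a category is computed within the reference sample, the same response can count as original in one group and as common in another, which is exactly the sample-dependence discussed for scoring variant (e) above.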

Game 3 presents students with improbable situations. These vary according to the PIC version; for instance, in the PIC-N (for children aged 8 to 12), the situation given is 'Imagine what would happen if all of a sudden every squirrel turned into a dinosaur'. This game assesses the capacity for fantasising and the ability to handle unconventional ideas which the individual would probably not express in more formal situations, as well as openness and receptivity to new situations. It measures fluency, flexibility and originality.

The authors report an alpha coefficient of .83 for the PIC-N and .85 for the PIC-J. In our study, games 1, 2 and 3 from the PIC-N or PIC-J versions were used, depending on the age of the participants; these make up the narrative creativity score and assess fluency, flexibility and originality. For our sample, the Cronbach alpha coefficient for the set of games 1, 2 and 3 is .80.


Procedure

Teachers and parents gave their authorisation to carry out this study. Students were informed about its objectives and its confidentiality. The tests were applied to class groups during school time. The original instructions from the test handbooks (Artola, Ancillo, Mosteiro, & Barraca, 2004; Torrance, 1974) were used, which enabled us to determine the categories and specific scores for our sample. Special emphasis was placed on ensuring that the activity had the nature of a game, to prevent the restraint and anxiety inherent in cognitive assessment. The tests were scored in two ways: a) following the criteria given by the handbooks; and b) applying scoring formulae which adjust for collinearity; in particular, the residual score proposed by Primi et al. (2013) was used as the fluency score, and for originality and elaborateness the mean score (score divided by the number of responses given) was used.

Data Analysis

For the processing of the results, the statistical program SPSS (version 20 for Windows) was used. The factorial structure was analysed by means of confirmatory factor analysis with maximum likelihood estimation, using the AMOS 21 program (Arbuckle, 2012). There were no missing data. The fit measures used to verify the adequacy of the models to the data were: the chi-squared statistic (χ2), PGFI (Parsimony Goodness-of-Fit Index), CFI (Comparative Fit Index), PCFI (Parsimony Comparative Fit Index), RMSEA (Root Mean Square Error of Approximation), and ECVI (Expected Cross-Validation Index), taking as cut-off values those referred to in the literature (Brown, 2006; Jackson, Gillaspy, & Purc-Stephenson, 2009; Macmann & Barnett, 1994; Schreiber, Nora, Stage, Barlow, & King, 2006).
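As an illustration of how one of these indexes relates χ2, degrees of freedom and sample size, RMSEA can be computed by hand. The numeric values below are hypothetical, not those reported in this study:

```python
import math

def rmsea(chi2, df, n):
    """Root Mean Square Error of Approximation:
    sqrt(max(chi2 - df, 0) / (df * (n - 1))).
    Values around .06-.08 or below are conventionally read as acceptable fit."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

print(round(rmsea(chi2=54.3, df=26, n=260), 3))  # → 0.065
print(rmsea(chi2=20.0, df=26, n=260))            # chi2 <= df → 0.0
```

The formula penalises misfit (χ2 in excess of df) but rewards parsimony and larger samples, which is why it is reported alongside the comparative indexes (CFI, PCFI) rather than instead of them.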



Results

Table 1 shows the descriptive statistics of our sample for the scores of the creativity dimensions, assessed with both the TTCT (fluency, flexibility, originality and elaborateness) and the PIC (fluency, flexibility and originality).



For the entire sample, as can be seen in the table, and considering the lowest and highest dispersion measures of the seven creativity scores, we can accept good variability in the participants' results. The mean values lie slightly below the midpoint of the distribution (especially for the PIC originality dimension). The asymmetry and kurtosis values indicate a normal distribution of the results.

Table 2 shows the correlation matrix for the scores obtained in both creativity tests. The correlations range from r = .002 (between PIC residual fluency and TTCT elaborateness) to r = .89 (between TTCT originality and TTCT fluency). In general, all correlations are positive, except those of TTCT mean elaborateness with the rest of the TTCT variables, and of PIC mean originality with the rest of the PIC variables.

Additionally, the correlations between the dimensions of the same test are strong: the TTCT variables (fluency, flexibility and originality) correlate highly with one another, except for elaborateness; and the PIC dimensions (fluency, flexibility and originality) correlate with one another with medium-to-high intensity. However, correlations between dimensions across the two tests (TTCT vs. PIC) are low to medium-low, even when the same dimensions are compared.

With a view to checking how the data are organised according to the theoretical models proposed in the literature (general vs. specific creativity), two models were tested. The first was based on the theories of Guilford (1950) and Torrance (1974), in which content domain matters less than cognitive function; accordingly, the six variables (the processes common to both tests, excluding the TTCT elaborateness variable) were grouped by the dimension they measure according to the underlying cognitive processes (fluency, flexibility and originality). A second model was also tried, which took task domain (verbal and figural) into account. Figure 1 shows the proposed models. In model 1, a general latent variable, three second-order latent variables and the observable variables were considered to explain the students' results in both tests. In model 2, the TTCT and PIC scores were linked to a specific factor (figural or verbal, respectively).

To confirm that this factorial structure was not an artifact of the test scoring method, the same two models were tested adjusting for collinearity with fluency (i.e., the direct fluency scores were replaced by the residual fluency scores obtained through linear regression, predicting fluency from flexibility; and for the originality and elaborateness variables, mean scores were calculated).

The fit indexes for the four models are shown in Table 3. Their analysis indicates that for model 1 there is a considerable difference in fit between using the direct scores (according to the handbook) and using scores adjusted to prevent the confounding effect of fluency, which is expected given the collinearity of the variables. This model did not fit the data adequately.

Model 2, taking both the direct and the adjusted scores, showed a better fit, which increased when collinearity was not adjusted.

Having adjusted for the collinearity of the variables, model 1 was expected to be the best-fitting one. The fact that, even when the influence of fluency was adjusted, the model in which domains are the main feature (model 2) still fitted better (although the fit was not perfect) shows us how important content is in divergent thinking.

Table 2 shows the association indexes between latent and observable variables, also assuming, as required by the analysis, a correlation between the two latent factors.


Discussion and Conclusions

Our study aimed to analyse the structure of the relationships between the cognitive dimensions of creativity by means of the TTCT and PIC tests. Both assess the fluency, flexibility and originality dimensions (the TTCT also includes elaborateness) using tasks of different content (figural and verbal, respectively). Regarding the correlation analysis, the coefficients were higher when dimensions of the same test were related, but low between the scores for the same cognitive processes obtained with the two tests. This result takes on even more meaning considering that both tests have the same theoretical basis and set out to assess creativity in a similar way.

In this connection, the results suggest that students' production is determined more by the task content than by the cognitive operations, in agreement with other studies assessing the domain specificity of divergent thinking (Diakidoy & Spanoudis, 2002; Garaigordobil & Pérez, 2004; Han & Marvin, 2002; Runco, Dow, & Smith, 2006). Some studies have even shown the existence of task specificity within the same domain: for instance, there are individuals who are very creative in poetry but not so much when it comes to narrating stories (Baer, 1999). Another fact worth mentioning is the strong relationship within the TTCT between fluency, flexibility and originality, but not with elaborateness, which agrees with results from other studies and raises the concern that such a relationship may be an artifact of the scoring method (Chase, 1985; Mouchiroud & Lubart, 2001; Runco & Acar, 2012; Silvia et al., 2008).

In this sense, fluency appears to be a confounding factor: flexibility, originality and elaborateness scores depend to a large extent on the number of responses (fluency). It is almost inevitable for flexibility, or the ability to change focus, to be linked to a larger number of responses, but this is not necessarily true for elaborateness. This pattern of results was found in our confirmatory factor analysis. The three scores (fluency, flexibility and originality) are arranged according to the test used and not according to the nature of the cognitive processes assessed, both in the model with direct scores and in the model with adjusted scores (which prevent the collinearity effect with fluency). The results show that the creative performance of the students is largely conditioned by the figural or verbal content of the tasks (TTCT and PIC, respectively). This seems very relevant because, in the area of intelligence as well, some factorial models point towards a structure of cognitive abilities organised by the verbal, numerical or spatial-figural content of the tasks (Beauducel, Brocke, & Liepmann, 2001; Lemos, Abad, Almeida, & Colom, 2013).

Following the same line of thought, it could be said that creativity assessment carried out through psychological tests is also affected by the type of content of the tasks (Almeida et al., 2008; Ferrándiz, Prieto, Ballester, & Bermejo, 2004; Ferrando et al., 2007). It should also be noted that, in the light of our results, the TTCT elaborateness dimension appears relatively distinct from fluency, flexibility and originality. This is also pointed out by other TTCT studies indicating that elaborateness has a lower weight within the creative configuration assessed by divergent thinking tests (Clapham, 1998; Prieto et al., 2002). These results are in line with previous work using the TTCT in which elaborateness emerged as a factor separate from fluency, flexibility and originality (Ferrando et al., 2007; Oliveira et al., 2009; Prieto et al., 2006). At the same time, for the PIC, all three dimensions (fluency, flexibility and originality) grouped into a single factor, replicating the results of Artola and Barraca (2004).



The factorial structure of the TTCT and PIC results suggests that differences in students' performance are driven more by task content than by the cognitive processes normally considered when assessing creativity (fluency, flexibility and originality). In this light, it can be stated that, while sharing the same theoretical grounds, the TTCT and the PIC complement each other when assessing creativity in terms of the figural and verbal content of the stimuli. The importance of content in creativity assessment demands further study, for instance with a larger set of creativity tests and with samples of students from different school levels.

One of the main implications of the results obtained concerns the training of divergent thinking abilities. It is commonly accepted that certain techniques and activities improve the general creativity of individuals; however, if creativity is specific, activities and materials specific to each domain should be proposed, as transfer from one domain to another is not direct.

Among the main limitations of our study we must mention those related to the choice of variables included in the confirmatory factor analysis. Total scores of task sets (tests) have been used as measured variables; in future studies, it would be necessary to include the variables measured by each specific PIC task to verify the consistency of the results obtained.

Our work has verified the independence of the verbal and figural domains of divergent thinking, although the results open a debate on what determines such specificity: whether what differs is the mental processes involved or the expert knowledge necessary to solve the domain tasks. From our point of view, the tasks require a low level of specific knowledge, and therefore we could say that what explains the difference is the way of thinking. It would be necessary to continue this line of work to corroborate the importance of domain in cognition, extending it to other domains, such as the numerical, musical or social ones.






Carmen Ferrándiz.
Departamento de Psicología Evolutiva y de la Educación.
Facultad de Educación.
Universidad de Murcia.
Campus Espinardo s/n.
30100. Murcia (Spain).

Article received: 27-03-2015
Revised: 25-05-2015
Accepted: 28-05-2015

All the contents of this journal, except where otherwise noted, are licensed under a Creative Commons License.