Original Research Article
Developing and Validating a Conceptual Change Cognitive Engagement Instrument
- 1Department of Educational Psychology, University of Oklahoma, Norman, OK, United States
- 2Department of Statistics and Analytical Sciences, Kennesaw State University, Kennesaw, GA, United States
- 3Department of Educational Leadership, Sport Studies, and Educational/Counseling Psychology, Washington State University, Pullman, WA, United States
Conceptual change (CC) occurs when learners move from a misconception to a scientifically accepted conception (Heddy et al., 2017). Many researchers agree that deep cognitive engagement is integral to facilitating conceptual change (Sinatra, 2005). Although conceptual change has been explored in great depth, the field lacks a valid and reliable instrument to assess the type of engagement that occurs during the change process. In the present study, we designed an instrument meant to assess cognitive engagement during conceptual change. Our measure is modeled after Dole and Sinatra's (1998) model, which theorizes that learners consider message and personal factors when learning new concepts. We used exploratory factor analysis to assess the structure of the Conceptual Change Cognitive Engagement Scale (CCCES) with participants recruited through the M-Turk survey recruitment tool. The CCCES will be beneficial for theoretical understanding related to conceptual change and engagement.
The Conceptual Change Cognitive Engagement Scale
Conceptual change occurs when students move from a misconception to a scientifically accepted conception (Chi, 2008; Heddy et al., 2017). Many factors have been implicated in facilitating conceptual change including motivation (Johnson and Sinatra, 2013), message characteristics (Dole and Sinatra, 1998), personal relevance (Heddy and Sinatra, 2013), and culture (Costa, 1995; Abd-El-Khalick and Akerson, 2004). One construct that has been shown to be of particular importance to the facilitation of conceptual change is learner engagement (Dole and Sinatra, 1998; Heddy and Sinatra, 2013). While engagement has been shown to predict conceptual change in previous research (Pugh et al., 2010; Heddy and Sinatra, 2013), measuring engagement as it relates to conceptual change has proven to be an arduous task.
While we recognize that engagement is a complex process, educational psychologists have generally defined this process as having three components: cognitive, affective, and behavioral (Fredricks et al., 2004). Dole and Sinatra (1998) suggest that cognitive engagement is especially important for generating conceptual change, and almost all of the research on conceptual change and engagement has focused on the cognitive component of engagement (Sinatra et al., 2015). In conceptual change research, cognitive engagement is most often assessed by exploring study strategies (Greene et al., 2004; Taasoobshirazi et al., 2016). This is problematic because study strategies are only one small component of engagement as it relates to conceptual change. There is a need for a valid, reliable, objective, and convenient tool that researchers can use to assess learners' cognitive engagement while undergoing conceptual change. To solve this problem, we designed an instrument that assesses the extent to which learners are mentally wrestling with the message being presented (message characteristics), and how it may relate to their current knowledge and experiences (personal characteristics). The purpose of the present study was to develop and assess the initial construct validity of the Conceptual Change Cognitive Engagement Scale (CCCES) through an exploratory factor analysis (EFA). To assess cognitive engagement in the process of conceptual change, we reviewed the research and identified seven variables that have been linked to conceptual change. These variables are discussed in the section below.
Conceptual change is a special type of learning that takes place when individuals change their knowledge from naïve, conflicting, and misconceived (referred to as misconceptions) to more scientifically-accepted knowledge (Chi et al., 1994; Vosniadou, 2004). Generally speaking, a misconception is defined as knowledge that misaligns with scientific knowledge (Chi and Roscoe, 2002; Heddy et al., 2017). More specifically, Chi (2013) theorizes that misconceptions can be either inaccurate or incommensurate with the scientific knowledge. Inaccurate misconceptions are those where learners have details of the conception, but inaccuracies exist within their framework of knowledge, whereas incommensurate misconceptions arise when learners place ideas into incorrect categories or lack the relevant schemas. For example, a learner may believe that a salamander is the same size as a Komodo dragon, which represents inaccurate knowledge—a Komodo dragon is significantly larger than a salamander. That same learner may think that a salamander is a reptile like a Komodo dragon, when actually a salamander is an amphibian. This represents an incommensurate misconception because the learner is placing the knowledge (salamander) into the wrong category (reptile). Thus, the learner may not have a schema for an amphibian and therefore places amphibians into the reptile category. This can make conceptual change much more difficult because facilitating change would require building a new schema or category. Inaccurate misconceptions include false beliefs and flawed mental models. Incommensurate misconceptions include category mistakes and missing schemas (Chi, 2013). Based on the type of misconception learners maintain, conceptual change can be more or less difficult to achieve (e.g., incommensurate misconceptions are more difficult given the need to help the student develop a new category or schema).
Thus, conceptual change is incredibly complex with the process being influenced by type of misconception and many other cognitive and affective factors (Gadgil et al., 2012).
Given the complex nature of conceptual change, it is not surprising that there have been decades of research on models of how conceptual change occurs (Strike and Posner, 1992; Smith et al., 1994; Vosniadou, 1994; Dole and Sinatra, 1998; Carey, 2000; Gregoire, 2003; Murphy and Mason, 2006; Ohlsson, 2009; Shtulman, 2009; Chi, 2013). We designed the CCCES based on the Cognitive Reconstruction of Knowledge Model (CRKM; Dole and Sinatra, 1998). Dole and Sinatra (1998) theorized that conceptual change occurs through two main factors—interpreting the incoming message and individual differences. Factors related to interpreting a message include coherency, plausibility, credibility, and comprehensibility. Individual difference variables that contribute to the engagement that promotes conceptual change include existing conceptions, motivation, dissatisfaction, social (or cultural) context, need for cognition, and personal relevance. We used Dole and Sinatra's (1998) CRKM to identify the conceptual change variables linked to the extent of engagement that supports knowledge reconstruction (for the full model, see Dole and Sinatra, 1998).
Although we used Dole and Sinatra's (1998) CRKM as the guiding framework for the CCCES, there are some important distinctions between the variables included on the CCCES and those included in the CRKM. First, we included two individual characteristic variables that are not included in the CRKM: culture and attention. Second, we excluded several individual characteristic variables: motivation, dissatisfaction, social context, and need for cognition. Given that the CRKM was published nearly two decades ago, and the authors never intended the CRKM to be an exhaustive list of variables, we feel these updates are warranted. We discuss the rationale for these changes below.
We rationalize the inclusion of attention based on current research illustrating that attention is particularly important to experiencing conceptual change (Ariasi and Mason, 2014; Kendeou et al., 2014; Jones et al., 2015). Similarly, researchers have demonstrated that culture, as it relates to the topic of change, is integral to experiencing conceptual change (Abd-El-Khalick and Akerson, 2004). We rationalize the exclusion of several CRKM variables based on our operationalization of cognitive engagement as explicitly considering incoming information (Fredricks et al., 2004), rather than having an implicit impact on the conceptual change process. Our goal in creating the scale was to design an instrument that specifically assessed the factors that a learner explicitly cognitively engages with while undergoing conceptual change. That is, when reading a refutation text, what types of factors is a learner mentally wrestling with while reading the text? We posit that although all of these variables influence conceptual change, the learner may not be cognitively aware that they are influencing their engagement with the text. For example, learners are likely not questioning their own need for cognition while reading a text. Thus, we selected variables that learners consciously relate to the material in the text; we refer to this as cognitive engagement with conceptual change variables, hence the name CCCES.
The variables included in the CCCES inventory are measured as engagement with that variable while reading a refutation text. For example, in the Science Motivation Questionnaire (Koballa and Glynn, 2007), personal relevance is assessed with the item, “The science I learn relates to my personal goals.” In the CCCES, personal relevance is assessed with the item, “While reading the text, I thought about how the reading would be helpful to my personal goals.” Therefore, we are measuring cognitive engagement with each of the variables during the learning process (e.g., engagement with personal relevance as it relates to the learning content).
We would like to point out that in measuring students' engagement with the variables linked to conceptual change, we assess “thinking about” those variables while completing a task. Metacognition, which is the knowledge and regulation needed for understanding and controlling one's cognition, and self-regulation more generally, which involves regulating one's own cognition, metacognition, and motivation (e.g., Winne and Perry, 2000), are in some ways parallel to our items measuring cognitive engagement. This makes sense given that we are measuring cognitive engagement. However, our items are specific to the conceptual change process (e.g., focused on credibility, coherency, and other characteristics of the message), whereas metacognition inventory items tend to focus on students' knowledge and regulation of their cognitive problem solving and learning processes in general (Schraw and Dennison, 1994). The CRKM distinguishes between two factors that influence conceptual change—characteristics of the message and characteristics of the individual. Below we briefly describe each of the message and individual characteristics that we included on the CCCES.
The coherency of a message is an important factor to consider when addressing engagement and conceptual change (Hewson and Hewson, 1983; Vosniadou, 2004). Social psychological research shows that when a message is perceived as coherent, it is more likely to be processed through the central route of persuasion and thus better encoded (Petty and Briñol, 2012). When learners perceive a message as coherent, they are more likely to engage with the content of the message, which, in turn, is expected to support conceptual change (Dole and Sinatra, 1998).
Plausibility is defined as an individual's subjective judgment about the potential truthfulness of a message (Lombardi and Sinatra, 2012) and predicts engagement in conceptual change. According to the CRKM (Dole and Sinatra, 1998), a message must be considered plausible in order for the learner to engage in the higher levels of cognitive processing and engagement that are associated with conceptual change. However, judgments of plausibility should not be confused with actual belief in the message. As Lombardi and Sinatra (2012) point out, a person can perceive a message to be plausible without actually believing the message. A message can be considered coherent, credible, and comprehensible by the learner, yet still be considered implausible, thus providing evidence that plausibility perceptions are a unique variable. Research by Lombardi and colleagues has shown that plausibility predicts engagement in conceptual change (Lombardi et al., 2016).
Looking through the lens of the CRKM (Dole and Sinatra, 1998), a message must be considered credible by the learner to support cognitive engagement. Credibility refers to the expertise or trustworthiness of the source presenting the message. Wegener et al. (2010) describe credibility as an important variable in the process of attitude change because it influences the motivation and ability of the learner to engage in higher elaboration (cognition) of the message. Research by Lombardi et al. (2014) suggests that source credibility is linked to engagement in conceptual change.
Comprehension of a message is a key component in conceptual change (Vosniadou, 1994). If learners do not understand a message, they may immediately disengage from the conceptual change process and maintain their misconception (Pintrich et al., 1993). For example, if the message of a refutation text meant to facilitate conceptual change is difficult to understand, then learners are unlikely to fully understand the message, making conceptual change unlikely. Therefore, a crucial aspect of engagement during conceptual change is the extent to which the learner perceives the message as comprehensible. Comprehensibility has been shown to have a mediating role in the relationship between persuasion and conceptual change (Eagly, 1974; Ratneshwar and Chaiken, 1991). Thus, when comprehension is low, persuasion is unlikely to impact change (Chaiken and Eagly, 1976). We posit that when engaging in conceptual change, learners metacognitively assess their levels of comprehension, which then predicts conceptual change.
An integral aspect of the conceptual change process is the amount of attention that students give to the task (Broughton et al., 2010; Jones et al., 2015). Increased attention is likely to aid in the facilitation of conceptual change as long as learners are paying attention to the content of the message. Attention and engagement are often perceived as similar constructs (Broughton et al., 2010). However, research suggests that attention is predictive of cognitive engagement and the two constructs are distinct (Jones et al., 2015). The mediator between attention and conceptual change is likely cognitive engagement (Dole and Sinatra, 1998). Kendeou and colleagues have shown that when learners read a refutation text they allocate their attention to their own misconceptions and the accepted conception, and this attention allocation leads to the engagement that supports conceptual change (van den Broek et al., 1999; Kendeou and van den Broek, 2005; Kendeou and van den Broek, 2007; van den Broek and Kendeou, 2008; Ariasi and Mason, 2014; Kendeou et al., 2014). Given the aforementioned relationship between attention and conceptual change, we included attention as an integral aspect of the CCCES.
We chose to measure attention instead of motivation for three main reasons. First, there has already been extensive research in educational psychology measuring motivation and the self-regulation of motivation (beliefs and attitudes that influence the use and development of one's cognition and metacognition). This research has repeatedly shown the importance of motivation, and of self-regulating one's motivation, in impacting conceptual change and learning. Second, enhancing motivation tends to be more of a long-term goal of instructors and scholars, whereas a more short-term, task-specific goal is to grab and keep students' attention. For this reason, we chose to measure students' engagement of their attention. Third, as part of the CCCES, we measure students' engagement of personal relevance (described further below). In many inventories measuring motivation, personal relevance has been studied as a component of motivation (e.g., Koballa and Glynn, 2007), along with others such as self-efficacy and task-value. To avoid redundancy, we focus on personal relevance as our motivational variable.
Dole and Sinatra (1998) predict that personal relevance is an essential motivational component in the conceptual change process. Personal relevance is operationalized as a determination of the relatedness of content being learned to one's personal everyday life experiences, interests, and goals (Petty et al., 1981; Pintrich et al., 1993). Experiencing personal relevance has been shown to impact learning (Heddy and Sinatra, 2013) and, more specifically, conceptual change (Sinatra et al., 2015). For example, Heddy and Sinatra (2013) conducted a study that encouraged students to apply concepts related to the theory of biological evolution to their everyday life experiences. In doing so, the students recognized the personal relevance and value of the content and engaged in conceptual change as an outcome.
Dole and Sinatra (1998) suggest that personal relevance contributes to conceptual change through increased levels of engagement. That is, when someone learns about a concept and relates it to his or her everyday life, personal goals, and interests, their engagement increases, improving the likelihood of conceptual change. For example, when learning about climate change, an individual may relate it to their passion for recycling. Therefore, noticing the relationship between the concept being learned and one's perceived personal relevance can increase engagement with the content, which should predict conceptual change.
Although Dole and Sinatra (1998) did not explicitly include culture in the CRKM, other researchers have found that cultural characteristics are highly important in the conceptual change process (Pintrich et al., 1993; Costa, 1995; Diakidoy et al., 1997; Abd-El-Khalick and Akerson, 2004). For instance, research suggests that learning is mediated by the congruency between school content and cultural knowledge (Costa, 1995; Abd-El-Khalick and Akerson, 2004), and thus conceptual change would be influenced by knowledge and understanding in learners' cultural spaces. Aikenhead and Jegede (1999) state that many academic concepts are different from and often conflict with knowledge learned in individuals' cultural spaces and experiences; navigating this conflict is known as cultural border crossing. If academic content exceedingly conflicts with cultural knowledge, engagement and learning can be difficult for students (Phelan et al., 1991). Therefore, learners' culture can influence the level of engagement with a task and, in turn, have an impact on the extent of conceptual change.
Our goal was to develop an instrument to measure the cognitive component of engagement when undergoing conceptual change. This is the first study to develop and assess the construct validity of such a much-needed instrument. Given the importance of engagement for conceptual change and the lack of an inventory that measures engagement specific to the conceptual change process, this study fills an important gap in the conceptual change research for both theoretical and practical reasons.
We recruited 513 participants (213 men, 186 women, and 114 who chose not to respond to this question) using Amazon Mechanical Turk. Mechanical Turk (M-Turk) is a survey implementation system that administers surveys to participants who are paid to complete the questionnaires. Regarding ethnicity, 273 participants were White/Caucasian, 29 African-American, 22 Hispanic, 62 Asian, 7 Native-American, 7 Other, and 113 chose not to respond to the item.
We used a 27-item climate change knowledge assessment to measure participants' knowledge pre- and post-reading (the human-induced climate change knowledge instrument, or HICCK; Lombardi and Sinatra, 2013). We used this instrument to ensure that the CCCES predicted conceptual change and thus has criterion validity. Participants rated each item on a 5-point Likert scale gauging their level of agreement about what climate scientists would indicate for each statement. Overall reliability of the HICCK was very good, α = 0.98 for the pre-assessment and 0.99 for the post-assessment.
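The α values above are Cronbach's alpha coefficients. As a minimal illustrative sketch (not the authors' analysis code), alpha can be computed directly from an item-response matrix:

```python
import numpy as np

def cronbach_alpha(item_scores) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    """
    scores = np.asarray(item_scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)          # per-item variance
    total_variance = scores.sum(axis=1).var(ddof=1)      # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)
```

Alpha rises as items covary more strongly; values near 0.98-0.99, as reported for the HICCK, indicate highly consistent responding across items.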
We used a refutation text developed by Lombardi (2016) and validated by Danielson et al. (2016). This 999-word text was created by adding a number of refutation statements to an expository piece discussing the natural process through which the Earth maintains and regulates temperature. For example, one refutation statement read, “Although it is true that climate changes can and do happen naturally, the rapid warming that the earth is currently experiencing cannot be explained by natural factors alone.” The text's Flesch-Kincaid grade-level score is 10.8. We implemented this text to facilitate conceptual change in order to explore the ecological validity of the CCCES, investigating whether the CCCES predicted conceptual change.
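The readability figure follows the standard Flesch-Kincaid grade-level formula, which depends only on word, sentence, and syllable counts. A minimal sketch, using hypothetical counts rather than the counts from the actual study text:

```python
def flesch_kincaid_grade(total_words: int, total_sentences: int,
                         total_syllables: int) -> float:
    """Flesch-Kincaid grade level:
    0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    """
    return (0.39 * (total_words / total_sentences)
            + 11.8 * (total_syllables / total_words)
            - 15.59)

# Hypothetical counts for illustration: 100 words, 5 sentences, 150 syllables.
grade = flesch_kincaid_grade(100, 5, 150)  # -> 9.91
```

The reported score of 10.8 corresponds roughly to a tenth- to eleventh-grade reading level.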
Deep Strategy Use
We used four items developed by Greene and Miller (1996) designed to measure students' deep strategy use. For example, one item read, “While reading the text, I put together ideas and concepts and drew conclusions that were not directly stated in the text.” The items were on a five-point Likert scale ranging from strongly disagree to strongly agree. Our goal for using the items was to assess criterion validity between this scale and the CCCES. The reliability of this instrument was good, α = 0.94.
After obtaining approval from the University of Oklahoma Office of Human Research Participation Protection and Institutional Review Board, we posted a recruitment statement to Amazon Mechanical Turk explaining the goals of the study. When participants clicked on the study they were asked to complete a written informed consent within Qualtrics. After consenting, the participants took the pre-knowledge assessment, read a refutation text, responded to the CCCES and deep strategy use scale, and then completed the post-knowledge assessment. After completing these tasks, the participants were thanked for their time and sent payment for completing the study.
Conceptual Change Cognitive Engagement Scale Development
We developed the CCCES (see Table 1 for the 27 items) following guidelines by Pett et al. (2003). These guidelines, described in detail in that text, include reviewing the relevant research, identifying latent variables, and developing quality empirical indicators of the latent variables.
The components and items included message characteristics: coherency of message (items 1, 2, 3, and 4), plausibility of message (items 5, 6, 7, and 8), credibility of message (items 9, 10, and 11), and comprehensibility of message (12, 13, and 14); and individual difference characteristics: attention (items 15 and 16), culture (items 17, 18, 19, 20, and 21), and personal relevance (items 22, 23, 24, 25, 26, and 27).
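The item-to-subscale mapping above can be expressed directly in code. The sketch below is a hypothetical scoring helper (not part of the published instrument) that computes per-subscale means from one respondent's 27 Likert ratings:

```python
# Subscale-to-item mapping taken from the text (items are 1-indexed).
CCCES_SUBSCALES = {
    "coherency": [1, 2, 3, 4],
    "plausibility": [5, 6, 7, 8],
    "credibility": [9, 10, 11],
    "comprehensibility": [12, 13, 14],
    "attention": [15, 16],
    "culture": [17, 18, 19, 20, 21],
    "personal_relevance": [22, 23, 24, 25, 26, 27],
}

def subscale_means(responses):
    """Mean 1-5 Likert rating per subscale for one respondent's 27 answers.

    `responses` is a length-27 sequence ordered by item number.
    """
    return {name: sum(responses[i - 1] for i in items) / len(items)
            for name, items in CCCES_SUBSCALES.items()}
```

For example, `subscale_means([3] * 27)` returns a mean of 3.0 for every subscale.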
Participants responded to 27 items, which were randomly ordered by Qualtrics. The participants responded to each prompt on a five-point Likert scale ranging from 1 (never) to 5 (always) with the instructions: Think about the text that you read. Please choose the response that best represents your level of agreement with each statement.
PASW (Predictive Analytics SoftWare), version 18.0, was used to analyze the data. The inter-item correlation matrix for the 27 items was judged appropriate for factor analysis based on Bartlett's test of sphericity, chi-square = 21852, df = 351, p < 0.001, and the Kaiser-Meyer-Olkin measure of sampling adequacy, KMO = 0.97. To extract the factors, principal axis factoring was used, and the Kaiser-Guttman rule indicated that there were three factors with eigenvalues larger than one. The three factors were rotated using a promax rotation to aid interpretation of the factor loadings. The factor loadings are presented in Table 1. All of the items met the criterion of loading at least 0.35 on their respective factor (Tabachnick and Fidell, 2000), and all items clearly loaded on just a single factor.
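The Kaiser-Guttman extraction rule described above retains factors whose eigenvalues of the inter-item correlation matrix exceed one; each eigenvalue's share of the trace gives the proportion of total variance that factor accounts for. A minimal NumPy reimplementation of that rule (an illustration, not the PASW procedure used in the study):

```python
import numpy as np

def kaiser_guttman(data):
    """Apply the Kaiser-Guttman rule to an (n_respondents, n_items) matrix.

    Returns the number of factors with eigenvalues > 1 and the proportion
    of total variance associated with each eigenvalue (largest first).
    """
    corr = np.corrcoef(data, rowvar=False)               # inter-item correlations
    eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]    # descending eigenvalues
    n_factors = int((eigvals > 1.0).sum())
    variance_explained = eigvals / eigvals.sum()         # trace equals n_items
    return n_factors, variance_explained
```

On data with a clear factor structure, the leading eigenvalues dominate the trace, mirroring the 70.96%, 7.66%, and 4.13% shares reported below.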
Factor 1 comprised 14 items, including all of the message components and items: coherency of message, plausibility of message, credibility of message, and comprehensibility of message. This first factor explained 70.96% of the total variance in the items. Factor 2 contained seven items, including the individual difference components and items of attention and culture. This second factor explained 7.66% of the total variance. The last factor, factor 3, contained the six personal relevance items. This third factor explained 4.13% of the total variance. We interpreted these findings to mean that our participants perceived the engagement inventory to comprise three parts: the first including all of the message-related items, the second including the attention and culture items, and the third being how personally relevant the text was viewed to be.
The CCCES inventory was correlated with conceptual change, r = 0.28, p < 0.01. Furthermore, as additional evidence of criterion related validity, the CCCES was highly correlated r = 0.84, p < 0.01 with deep strategy use, which is often used to measure cognitive engagement (Greene et al., 2004). Reliability, as measured by Cronbach's alpha, for the 27 items was 0.98. Cronbach's alpha for factor 1 was 0.99; alpha was 0.94 for factor 2 and 0.96 for factor 3.
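The criterion-validity evidence above rests on Pearson correlations between CCCES scores and other measures. A minimal sketch, with hypothetical score vectors (not the study data):

```python
import numpy as np

def pearson_r(x, y) -> float:
    """Pearson product-moment correlation between two score vectors."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xz = (x - x.mean()) / x.std()   # standardize each vector
    yz = (y - y.mean()) / y.std()
    return float((xz * yz).mean())  # mean of products of z-scores

# Hypothetical example: CCCES totals vs. pre-to-post knowledge gains.
ccces_totals = [88, 102, 75, 110, 96]
knowledge_gain = [4, 9, 2, 11, 6]
r = pearson_r(ccces_totals, knowledge_gain)
```

In the study, this kind of correlation links CCCES scores both to conceptual change (r = 0.28) and to deep strategy use (r = 0.84).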
We found the CCCES to be a valid and reliable instrument for measuring cognitive engagement. The three factors identified by the exploratory factor analysis were interpreted as dimensions by which participants perceived their cognitive engagement. These factors included message characteristics, the individual difference characteristics of attention and culture, and the individual difference motivational variable personal relevance. Interestingly, the message characteristics described in the CRKM all fit within one factor. This supports the CRKM's theoretical representation of message characteristics.
The second factor included culture and attention, which we are labeling as individual characteristics of conceptual change. We define individual characteristics as attributes that are unique to the individual that predict conceptual change. For example, when learning about GMOs, an individual may consider whether the topic conflicts with their cultural beliefs, and if so, this conflict may hinder engagement in conceptual change (Heddy et al., 2017). Culture being a predictor of conceptual change has important implications. In future development and revisions of conceptual change models, culture should be an integral component.
The final factor comprised only the personal relevance items, and we label this factor personal relevance. Our findings validate Dole and Sinatra's (1998) decision to include personal relevance as an important and distinct variable in the CRKM. We recognize that there are likely additional motivational variables that learners consciously engage with when undergoing conceptual change; however, for the present study, we focused primarily on the variables explicitly described in the CRKM. Goal orientation is a motivational variable that is likely involved with engagement when reading a refutation text (Taasoobshirazi and Sinatra, 2011). Goal orientation and other motivational components such as self-efficacy, as well as items that assess the affective component of engagement, including emotions, interest, and task-value, could be included and assessed as part of the inventory. Revising items and developing new ones are common steps in the process of construct validation, which typically occurs over a series of studies (Pett et al., 2003; Glynn et al., 2009). When items have been revised and new ones developed, the next step is to cross-validate the instrument on new samples of participants using confirmatory factor analysis (Kline, 2005; Glynn et al., 2009; Lomax and Schumacker, 2012). In addition to the inclusion of additional motivational constructs, we believe that developing and adding more attention items to the inventory would improve the validity and reliability of the CCCES.
The CCCES will have important theoretical implications for understanding how cognitive engagement predicts conceptual change. Existing instruments focus on learning strategies (Greene and Miller, 1996) and thus are not sufficient empirical or theoretical predictors of conceptual change. Due to this gap, researchers have had difficulty pinpointing the role of cognitive engagement in conceptual change; the CCCES will allow for a more thorough mapping of the variables that generate change. This will afford researchers the opportunity to design a new, more accurate model of conceptual change that encompasses all relevant variables.
The CCCES has practical implications, including using it to assess the value of conceptual change interventions such as the use of a refutation text. That is, this tool will allow us to investigate the influence of interventions, and components of interventions, on cognitive engagement with conceptual change. Teachers can use the instrument when deciding which intervention to use to maximize cognitive engagement when teaching concepts such as evolution (Heddy and Sinatra, 2013), heat transfer (Pugh et al., 2017), and the causes of the seasons on Earth (Chi et al., 1994; Mason et al., 2017), which are typically associated with many misconceptions.
In addition to assessing cognitive engagement with conceptual change instructional activities, teachers could investigate each of the subcomponents of the CCCES, explore what issues with cognitive engagement students may be having, and pinpoint interventions. For instance, if learners score high on cognitive engagement with culture, but low on cognitive engagement with personal relevance, teachers can modify instruction to facilitate more personal relevance. Doing so would likely increase cognitive engagement and lead to more in-depth learning and conceptual change. Therefore, investigating each subcomponent of this scale will allow teachers to modify and improve instruction relevant to depth of cognitive engagement.
Limitations and Future Directions
As with all research, there were limitations with the current study. First, there are several variables that may be modified or added to the CCCES. For example, religion, political affiliation, and family were included as examples of culture when developing the culture-related items. Although we posit that these three cultural factors are relevant for learning about human-induced climate change, there are many other cultural factors that may influence conceptual change, such as ethnicity, socio-economic status, and tightness vs. looseness (acceptance of deviance), among others. Thus, we recommend that future researchers explore the many facets of culture that may differentially impact conceptual change and modify the instrument accordingly.
We assessed prior knowledge or pre-existing conceptions with our knowledge assessment. A second limitation of our study is that we made assumptions, when designing the study, about the level of knowledge that participants of this study held related to climate change. That is, we assumed that participants had misconceptions to be changed. However, some researchers such as diSessa (1983, 1996) theorize that learners' scientific knowledge is fragmented and that over time their conceptions become more complex and complete. Thus, learners with fragmented views of climate change may have difficulty thinking about the comprehensibility, plausibility, credibility, and coherency of a message related to that concept because their knowledge related to that concept is incomplete (diSessa, 2014). Although learners with fragmented knowledge of a concept may have difficulty considering a message related to that concept, we posit that they will still be able to consider the comprehensibility, plausibility, credibility, and coherency of a message in a text, and in doing so, they may in fact develop a more complete view of the concept via cognitive engagement. That is, learners with more fragmented knowledge may experience lower levels of cognitive engagement and, in turn, may be less likely to undergo conceptual change, whereas learners with less fragmented knowledge may be more likely to experience cognitive engagement and conceptual change due to their ability to consider the message in greater depth. This hypothesis is speculative, and we suggest that researchers explore the impact of the extent of fragmented knowledge on differential levels of cognitive engagement and conceptual change.
Finally, we suggest that future research explore the impact of misconception type on cognitive engagement, as measured by the CCCES, and its resulting impact on conceptual change. Chi (2013) posits four types of misconceptions: false beliefs, flawed mental models, category mistakes, and missing schemas. We hypothesize that learners with inaccurate misconceptions, such as false beliefs and flawed mental models, would be less distant from the scientific knowledge and thus more likely to experience high cognitive engagement and conceptual change. By contrast, learners with incommensurate misconceptions, such as category mistakes and missing schemas, would experience lower levels of cognitive engagement because their misconceptions render the scientific knowledge incoherent. That is, they would lack the schema necessary to comprehend the scientific knowledge, which would reduce cognitive engagement and make conceptual change less likely to occur. Therefore, a fruitful avenue for future research could be to explore the misconception types described by Chi (2013) and their differential impact on cognitive engagement and conceptual change.
Our goal was to design an instrument that measures the cognitive engagement that occurs during the conceptual change process. To design our instrument, we used Dole and Sinatra's (1998) Cognitive Reconstruction of Knowledge Model (CRKM) as a theoretical framework, focusing specifically on the variables of conceptual change related to cognitive engagement. Results of this research suggest that our instrument, the Conceptual Change Cognitive Engagement Scale (CCCES), validly measures many of the aspects of cognitive engagement that influence conceptual change. There are important implications to take away from these findings. From a theoretical standpoint, researchers can use this instrument to assess the relationship between cognitive engagement and conceptual change. Additionally, this instrument could be used by researchers and educators alike to assess the efficacy of conceptual change interventions common to science courses.
This study was approved by the internal review board (IRB) of the University of Oklahoma. All participants completed an approved digital consent form via Qualtrics.
All authors listed have made a substantial, direct and intellectual contribution to the work, and approved it for publication.
Conflict of Interest Statement
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Abd-El-Khalick, F., and Akerson, V. L. (2004). Learning as conceptual change: Factors mediating the development of preservice elementary teachers' views of nature of science. Sci. Educ. 88, 785–810. doi: 10.1002/sce.10143
Aikenhead, G. S., and Jegede, O. J. (1999). Cross-cultural science education: a cognitive explanation of a cultural phenomenon. J. Res. Sci. Teach. 36, 269–287. doi: 10.1002/(SICI)1098-2736(199903)36:3<269::AID-TEA3>3.0.CO;2-T
Ariasi, N., and Mason, L. (2014). From covert processes to overt outcomes of refutation text reading: the interplay of science text structure and working memory capacity through eye fixations. Int. J. Sci. Math. Educ. 12, 493–523. doi: 10.1007/s10763-013-9494-9
Broughton, S. H., Sinatra, G. M., and Reynolds, R. E. (2010). The nature of the refutation text effect: an investigation of attention allocation. J. Educ. Res. 103, 407–423. doi: 10.1080/00220670903383101
Chi, M. T. H. (2008). “Three types of conceptual change: Belief revision, mental model transformation, and categorical shift,” in Handbook of Research on Conceptual Change, ed S. Vosniadou (Hillsdale, NJ: Erlbaum), 61–82.
Chi, M. T. H. (2013). “Two kinds and four sub-types of misconceived knowledge, ways to change it and the learning outcomes,” in International Handbook of Research on Conceptual Change, 2nd Edn., ed S. Vosniadou (New York, NY: Routledge), 49–70.
Chi, M. T. H., and Roscoe, R. D. (2002). “The processes and challenges of conceptual change,” in Reconsidering Conceptual Change: Issues in Theory And Practice, eds M. Limon and L. Mason (Kluwer Academic Publishers), 3–27.
Diakidoy, I. A., Vosniadou, S., and Hawks, J. D. (1997). Conceptual change in astronomy: models of the earth and of the day/night cycle in American-Indian children. Eur. J. Psychol. Educ. 12, 159–184.
diSessa, A. A. (1996). “What do 'just plain folk' know about physics?” in The Handbook of Education and Human Development: New Models of Learning, Teaching, and Schooling, eds D. R. Olson and N. Torrance (Oxford: Blackwell Publishers, Ltd.), 709–730.
Gadgil, S., Nokes-Malach, T. J., and Chi, M. T. (2012). Effectiveness of holistic mental model confrontation in driving conceptual change. Learn. Instruct. 22, 47–61. doi: 10.1016/j.learninstruc.2011.06.002
Greene, B. A., Miller, R. B., Crowson, H. M., Duke, B. L., and Akey, K. L. (2004). Predicting high school students' cognitive engagement and achievement: contributions of classroom perceptions and motivation. Contemp. Educ. Psychol. 29, 462–482. doi: 10.1016/j.cedpsych.2004.01.006
Gregoire, M. (2003). Is it a challenge or a threat? A dual-process model of teachers' cognition and appraisal processes during conceptual change. Educ. Psychol. Rev. 15, 147–179. doi: 10.1023/A:1023477131081
Heddy, B. C., Danielson, R. W., Sinatra, G. M., and Graham, J. (2017). Modifying knowledge, emotions, and attitudes regarding genetically modified foods. J. Exp. Educ. 85, 513–533. doi: 10.1080/00220973.2016.1260523
Heddy, B. C., and Sinatra, G. M. (2013). Transforming misconceptions: Using transformative experience to promote positive affect and conceptual change in students learning about biological evolution. Sci. Educ. 97, 723–744. doi: 10.1002/sce.21072
Hewson, M. G., and Hewson, P. W. (1983). Effect of instruction using students' prior knowledge and conceptual change strategies on science learning. J. Res. Sci. Teach. 20, 731–743. doi: 10.1002/tea.3660200804
Johnson, M. L., and Sinatra, G. M. (2013). Use of task-value instructional inductions for facilitating engagement and conceptual change. Contemp. Educ. Psychol. 38, 51–63. doi: 10.1016/j.cedpsych.2012.09.003
Jones, S. H., Johnson, M. L., and Campbell, B. D. (2015). Hot factors for a cold topic: examining the role of task-value, attention allocation, and engagement on conceptual change. Contemp. Educ. Psychol. 42, 62–70. doi: 10.1016/j.cedpsych.2015.04.004
Kendeou, P., and van den Broek, P. (2007). The effects of prior knowledge and text structure on comprehension processes during reading of scientific texts. Mem. Cogn. 35, 1567–1577. doi: 10.3758/BF03193491
Koballa, T., and Glynn, S. (2007). “Attitudinal and motivational constructs in science learning,” in Handbook of Research on Science Education eds S. Abell and N. Lederman (Mahwah, NJ: Lawrence Erlbaum Associates), 75–102.
Lombardi, D., Danielson, R. W., and Young, N. (2016). A plausible connection: models examining the relations between evaluation, plausibility, and the refutation text effect. Learn. Instruct. 44, 74–86. doi: 10.1016/j.learninstruc.2016.03.003
Mason, L., Baldi, R., Di Ronco, S., Scrimin, S., Danielson, R. W., and Sinatra, G. M. (2017). Textual and graphical refutations: effects on conceptual change learning. Contemp. Educ. Psychol. 49, 275–288. doi: 10.1016/j.cedpsych.2017.03.007
Petty, R. E., and Briñol, P. (2012). “The elaboration likelihood model,” in Handbook of Theories of Social Psychology, eds P. A. M. Van Lange, A. W. Kruglanski, and E. T. Higgins (London: Sage), 224–245.
Phelan, P., Davidson, A. L., and Cao, H. T. (1991). Students' multiple worlds: negotiating the boundaries of family, peer, and school cultures. Anthropol. Educ. Qu. 22, 224–250. doi: 10.1525/aeq.1991.22.3.05x1051k
Pintrich, P. R., Marx, R. W., and Boyle, R. A. (1993). Beyond cold conceptual change: the role of motivational beliefs and classroom contextual factors in the process of conceptual change. Rev. Educ. Res. 63, 167–199.
Pugh, K. J., Linnenbrink-Garcia, L., Koskey, K. L. K., Stewart, V. C., and Manzey, C. (2010). Motivation, learning, and transformative experience: a study of deep engagement in science. Sci. Educ. 94, 1–28. doi: 10.1002/sce.20344
Pugh, K. J., Bergstrom, C. M., Heddy, B. C., and Krob, K. E. (2017). Teaching for transformative experiences in science: developing and evaluating an instructional model. J. Exp. Educ. 85, 629–657. doi: 10.1080/00220973.2016.1277333
Strike, K. A., and Posner, G. J. (1992). “A revisionist theory of conceptual change,” in Philosophy of Science, Cognitive Psychology, and Educational Theory and Practice, eds R. A. Duschl and R. J. Hamilton (Albany: State University of New York Press), 147–176.
Van Den Broek, P., and Kendeou, P. (2008). Cognitive processes in comprehension of science texts: the role of co-activation in confronting misconceptions. Appl. Cogn. Psychol. 22, 335–351. doi: 10.1002/acp.1418
van den Broek, P., Young, M., Yuhtsuen, T., and Linderholm, T. (1999). “The Landscape Model of reading: Inferences and the online construction of memory representation,” in The Construction of Mental Representations During Reading, eds H. van Oostendorp and S. R. Goldman (Mahwah, NJ: LEA), 71–98.
Wegener, D. T., Petty, R. E., Blankenship, K. L., and Detweiler-Bedell, B. (2010). Elaboration and numerical anchoring: breadth, depth, and the role of (non-) thoughtful processes in anchoring theories. J. Consum. Psychol. 20, 28–32. doi: 10.1016/j.jcps.2009.12.007
Keywords: engagement, conceptual change, cognitive engagement, instrument development, personal relevance
Citation: Heddy BC, Taasoobshirazi G, Chancey JB and Danielson RW (2018) Developing and Validating a Conceptual Change Cognitive Engagement Instrument. Front. Educ. 3:43. doi: 10.3389/feduc.2018.00043
Received: 01 March 2018; Accepted: 24 May 2018;
Published: 12 June 2018.
Edited by: Barbara McCombs, University of Denver, United States
Reviewed by: Calvin S. Kalman, Concordia University, Canada; Ning Gu, University of South Australia, Australia
Copyright © 2018 Heddy, Taasoobshirazi, Chancey and Danielson. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Benjamin C. Heddy, firstname.lastname@example.org