The tip of the iceberg in organic chemistry – revisited

Nicole Graulich
Justus-Liebig-University Gießen, Institute of Chemistry Education, Heinrich-Buff Ring 17, 35392 Gießen, Germany. E-mail: Nicole.Graulich@dc.jlug.de

Received 26th November 2024, Accepted 13th February 2025

First published on 14th February 2025


Abstract

Students often perceive learning organic chemistry as a tremendous struggle, as it requires linking the invisible molecular level to the visible symbolic representations. Memorising reactions and not knowing how to approach or propose a reaction mechanism differs from what we want students to experience in an organic chemistry classroom. How do we shift this focus from rote memorisation to developing representational competence, enabling students to meaningfully engage with organic mechanisms and connect underlying molecular behaviour with observable chemical phenomena? In 2015, I looked back at the early work in organic chemistry education research to understand the state of the art and identify research gaps worth exploring. Various research strands have developed since then, looking into students' mechanistic reasoning, their representational competence, and how variables in the classroom impact their learning. Ten years later, the question arises of how far we have come in understanding the complex interplay of learning organic chemistry. Have we better understood how to help students link the visible to the invisible? What happened to the iceberg of organic chemistry? How has our perspective on learning organic chemistry grown to acknowledge the interplay of multiple variables shaping the learning experience? In this perspective, the current state of the art in organic chemistry education research is revisited by looking back on the achievements and advancements of the last decade and opening the discussion for potential future research endeavours.


Introduction

“Only one-tenth of an iceberg's volume is above the water; the rest is beneath the surface” (Graulich, 2015, p. 9). This was the opening of a 2015 review of research studies in organic chemistry. The iceberg analogy illustrated the idea that learning organic chemistry bears the challenge of connecting the visible surface, e.g., the structural representations used, with the implicit conceptual level, often hidden from the learner. Since then, many research studies and reviews focusing on organic chemistry education research have devoted their work to bridging these levels to help students grasp the underlying driving forces of organic chemical processes (see reviews: Dood and Watts, 2022; Dood and Watts, 2023) (Fig. 1).
Fig. 1 Adapted iceberg from Graulich (2015).

General chemistry, as one of the main gatekeeper courses, was, early on, at the centre of various research studies, focusing most often on students' alternative conceptions and conceptual change (see reviews: Teo et al., 2014; Stone, 2021). In the early 1990s, organic chemistry learning gained increasing attention within the chemistry education research community. Initial studies examined how students approached typical synthesis problems, revealing that graduate students often relied on memorising reaction names rather than reasoning through structural properties and mechanistic processes (Bowen, 1990; Bowen and Bodner, 1991). The research community realised that students might be able to solve problems successfully without knowing what we intend them to know (Bhattacharyya and Bodner, 2005; Anderson and Bodner, 2008; Ferguson and Bodner, 2008).

These early insights initiated numerous research efforts to explore discipline-specific affordances, such as defining mechanistic reasoning and clarifying what representational competence comprises and how it can be supported, how we meet the individual needs of a heterogeneous student population and what is happening in an organic chemistry classroom. The question now is: how much progress has been made in the last decade, what changes have occurred, and what future developments might we need to transform how organic chemistry is taught and learned?

This perspective focuses on some selected aspects of organic chemistry education research based on the empirical research from the last decade to reflect on what might be necessary research questions to further support student learning in organic chemistry. Advances in understanding discipline-specific affordances such as mechanistic reasoning and representational competence will be discussed. Furthermore, a changed model of cognition emphasising individual learning and adapting the learning process, as well as how this impacts research approaches and changes in the classroom environment, is highlighted.

How far have we come to understand discipline-specific affordances in organic chemistry?

On symbolism and models

A core learning objective in organic chemistry is the ability to rationalise structure–property–reactivity relationships to predict, control, and explain reactions. This involves three key aspects: (1) understanding electron distribution in molecules, (2) elucidating the step-by-step pathways of reactions, including transition states and intermediates, and (3) considering how energy and reaction rates change based on reaction conditions and structural features (Goodwin, 2003, 2007). Considering these aspects in reasoning requires more than simply tracking atoms from starting materials to products or, as Grove et al. (2012b) entitled their publication, “Decorating with electron arrows”. Focusing on the use and interpretation of the electron-pushing formalism (EPF)—a disciplinary tool for visualising electron movement—became the topic of many studies between 2005 and 2017 (e.g., Bhattacharyya and Bodner, 2005; Penn and Al-Shammari, 2008; Grove et al., 2012a, 2012b; Bhattacharyya, 2013, 2014; Galloway et al., 2017).

It became clear that students did not seem to use this tool as an aid to purposefully rationalise one structure from another by actually considering the electron movement from source to sink (Bhattacharyya, 2014). Unexpectedly, students even added the arrows at the end of the drawing process (Bhattacharyya and Bodner, 2005; Grove et al., 2012a, 2012b) to make the reaction look more “complete”.

Whether students can make purposeful use of the EPF in a problem-solving context seems to depend largely on the meaning they assign to these arrows. This finding is emphasised by work from Weinrich and Britt (2022), who used eye-tracking to document students’ visual decoding of the EPF in organic reaction mechanisms. The findings confirmed earlier qualitative studies by stating that students typically focused more on the reactants than on the electron-pushing arrows when trying to rationalise a reaction mechanism. However, students who mentioned more implicit features during discussions spent significantly more time on the arrows. This suggests that explicitly encouraging students during instruction to connect implicit properties of molecules to electron movement—such as verbalising the cause for an electron movement—could improve their ability to reason through mechanisms and make meaningful connections.

When students are explicitly taught to follow the pattern of the EPF to generate reaction products, they get better at this skill (Flynn and Ogilvie, 2015; Galloway et al., 2017; Webber and Flynn, 2018). However, even in the curriculum designed by Flynn and coworkers, students still struggled to interpret the symbolism, especially when implicit atoms or intramolecular reactions were presented (Flynn and Featherstone, 2017). Students are learning a new language when dealing with these representations. When students are not yet equipped with the necessary chemical semantics of representations and symbols, drawing conclusions about reactions or properties may still be constrained to surface features and heuristics (Talanquer, 2014; Graulich and Bhattacharyya, 2017).

Although early studies in organic chemistry education research focused heavily on the EPF and questioned to what extent this symbolism is used in class as a meaningful tool, this topic has received less attention in recent years. To improve instruction in organic chemistry, it is now well accepted that we have to move beyond the symbolism and focus on what the EPF represents and how its purpose can be emphasised in teaching. An electron-pushing arrow is more than just a way to track electrons from source to sink; it expresses an assumption about the cause of potential electrostatic interactions based on foundational models of chemical reactivity – a model for a chemical interaction between entities. This makes it a powerful tool, as the arrow represents a hypothesis for a potential transformation that can be rationalised by mechanistic reasoning or tested experimentally to confirm or refute the proposed pathway. However, current instruction may tend to present reaction mechanisms and the EPF not as a model but as a somewhat fixed drawing aid to reach the final product of inquiry, with mechanisms typically displayed in a linear sequence to be memorised. Often, this does not require explicit consideration of molecular properties or interactions to hypothesise potential electron movement.

In many organic chemistry classrooms, modelling and inquiry, thus, still play a small role. It may not be surprising that students do not invest cognitive effort in deeper reasoning when learning lacks meaningful engagement with mechanisms as models for phenomena or the necessary experimental or computational data to test claims (Williams et al., 2021). Without the real-world value of inquiry and modelling in instruction and assessment, students may naturally default to memorising syntheses rather than engaging in more elaborate reasoning. Thus, an electron-pushing arrow may just remain what it is on paper: a drawn-out arrow. When modelling becomes a tool for generating new insights and is valued in assessment, rather than simply reciting known facts, the learning experience can potentially shift from rote explanation to scientific inquiry.

The necessity to bring back modelling aligns with even earlier discussions by Justi and Gilbert (2003), who emphasised the importance of students’ perspective on the nature, scope, and limitations of models in learning chemistry. When students recognise the role of models in scientific inquiry and understand the purpose of generating mechanisms, they learn about science itself (Justi and Gilbert, 2003). Moreover, by constructing their own models—such as proposing different reaction pathways and making and testing claims about intermediates and transition states—students learn how to do science. Reaction mechanisms should, thus, be seen as epistemic tools in the inquiry process that are testable and revisable as well as explanatory and capable of generating new insights (Rost and Knuuttila, 2022; Silva and Sasseron, 2025).

Additionally, understanding the process of generating scientific knowledge with all its nuances becomes crucial given the demand to acknowledge more global and complex relationships within our current challenges. There is an increasing call for preparing students to handle and approach uncertainty and probability in their reasoning as part of broader constructs, such as the nature of science and scientific literacy (see, e.g., Haskel-Ittah, 2023; Talanquer, 2024). Although uncertainty is inherent to science (Duschl, 2008; Miller et al., 2018; Chen, 2022), knowledge in organic chemistry is often presented as certain or established, by presenting the product of the inquiry process but not how the knowledge is gained. A lack of authenticity of the scientific process may impact students’ perception of the nature of organic chemistry and of how knowledge is acquired, i.e., impacting student epistemologies (Grove and Bretz, 2010; Cooper et al., 2019; Schwarz et al., 2024b). In the long run, it may influence students’ approach to science and their becoming science-literate and reflective citizens (Rosenberg et al., 2022).

Future research could explore how to meaningfully focus on the inquiry process in traditional organic chemistry classes and support students in modelling activities with mechanistic representations, exploring and arguing for alternative pathways (Lieber and Graulich, 2020), and using the EPF to express ideas. And on the meta-level, how does, for instance, an emphasis on modelling impact students’ understanding of the nature of the discipline and their epistemic beliefs? How can we cultivate a positive stance towards uncertainty and probability in the organic chemistry classroom?

It might be illusory to think that an emphasis on those reasoning processes can simply be added to a traditional curriculum. This requires restructuring the curriculum, as has already been done in some courses but not yet widely implemented in organic chemistry (Flynn and Ogilvie, 2015; Cooper et al., 2019). It may also necessitate a shift in our epistemic perspective on the discipline: a critical reflection about what we do in an organic chemistry classroom and how this reflects the disciplinary practice.

Affordances of representations

Representations, such as molecular structures, models, or diagrams, are undoubtedly the most important means to express invisible chemical and physical properties (Hoffmann and Laszlo, 1991; Gilbert and Treagust, 2009). This topic has been at the centre of many studies seeking to tap into the realm of representational competence and to understand the challenges students face with it (Kozma et al., 2000; Kozma and Russell, 2005; Kozma, 2020). When it comes to reasoning with representations, we step onto ambiguous ground: on the one side, students perceive representations as symbolic, pictorial “snapshots” of entities; on the other, representations serve as proposed models for entities or processes. A learner thus has to navigate a space of symbolic conventions (with a sometimes arbitrary connection between symbols and semantics) and “pictures” that express physical properties, reactivity, and shape. For a learner, this is a lot to digest, often in a short amount of time. This twofold perspective on the function of representations, i.e., as a model or as a picture, may cause confusion. When should I use a representation as a model that is tested and revised, and when should I use structural representations or mechanisms to express conclusions from an inquiry process? When is it necessary to infer implicit properties, and when is drawing a “picture” enough? This discussion draws attention to the fact that drawing a structure or a mechanism might be a simple drawing and not direct evidence of semantically understanding the representation or the causal relationships within a mechanism (Cooper et al., 2010).

In organic chemistry at the university level, drawing to express assumptions about reaction processes or to express one's own mental model of how entities could interact is rarely a central part of the inquiry process. A strong focus on presenting normative ideas and describing known mechanisms does not generate the necessity for more modelling activities in organic chemistry (Cooper et al., 2017). Instruction might be too busy helping students draw and practise the accuracy of representations and navigate the landscape of synthetic routes, i.e., emphasising the “Rolodex” of reactions (e.g., Luque-Corredera et al., 2024), instead of using them purposefully to make claims based on data and to challenge ideas in the inquiry process (Stowe and Esselman, 2022).

Studies on the usage of domain-specific representations such as reaction coordinate diagrams suggest, for instance, that even after organic chemistry 2, instruction may have neglected to help students successfully decode the kinetic and thermodynamic concepts encoded within reaction coordinate diagrams (Popova and Bretz, 2018b, 2018c; Atkinson et al., 2020; Atkinson and Bretz, 2021). These findings indicate that we have to define what students should be able to do and carefully reflect on when it is meaningful for students to engage with representations to advance chemical understanding. To further understand how students use models or domain-specific representations, moving beyond a single snapshot in time might help clarify how students’ (epistemic) beliefs towards the role, purpose, and usage of these representations develop over time and across the curriculum (Stieff et al., 2016).

In the classroom, however, emphasis is often on skills, such as generating, interpreting, and translating, and less on modelling activities or discussing the affordances and limitations of representations (Xue and Stains, 2020; Jones et al., 2022). It is thus not surprising that the way students reason with representations in research studies may mirror what is taught in class. With regard to critical features in resonance, Barakat and Orgill (2024) have shown that instructors in both general and organic chemistry tend to focus more on what students are supposed to do with the representation (e.g., identifying delocalisable lone pairs) than on what they need to know about it.

Focusing research efforts on how representations influence student comprehension processes productively or unproductively might be valuable to understanding students’ developing representational competence. This can, in turn, help to inform meaningful practices in the organic chemistry classroom. What can happen in the classroom to make the decoding process easier (e.g., using embodied actions to support spatial reasoning (Stieff et al., 2022; Kiernan et al., 2024))?

When looking at the formulated and well-known representational competence skills (Kozma et al., 2000; Kozma and Russell, 2005; Kozma, 2020), not much is known about how different representational skills are interrelated; do they develop simultaneously? What is the empirical evidence for a hierarchy in developing those skills? To what extent do they depend on other types of problem-solving skills or on prior knowledge and task context, as documented for the construct of spatial abilities (Hegarty and Waller, 2005)? There is evidence that students’ reasoning with representations in organic chemistry may largely depend on the task, the type of representation, and students’ overall approach to decoding explicit and implicit structural features (Schönborn and Anderson, 2008; Ward et al., 2022). Farheen et al. (2024), for instance, have shown that students activated different concepts or interpreted different aspects when determining electrophilicity and nucleophilicity from ball-and-stick models or electrostatic potential maps.

In addition to these findings, there seems to be no clear predictive relationship between single representational competence skills, students’ learning outcomes, and their content knowledge (Sim et al., 2014; Stieff et al., 2016; Stieff and DeSutter, 2021; Steinbach et al., 2024). Representational competence and content knowledge might not be as tightly coupled as previously assumed (Kozma and Russell, 1997). The questions of how strongly representational competence fosters content knowledge and how the two constructs are separable or develop over time thus remain empirically open. Recent work indicates that the representational skills interpret, translate, and use might form a unified representational skill (Ward et al., 2025). How are these skills related to more complex skills, such as identifying affordances and limitations? What types of representational skills depend more strongly on students’ epistemic beliefs or modelling competence?

Surface feature focus and perception

In recent years, it has slowly been acknowledged that students learn from and use representations they do not yet understand, known as the representational dilemma (Rau, 2017). Very little has been done to address this dilemma in teaching. How can we support learners in using and learning with representations that they may not fully understand? There is ample evidence in the chemistry education literature that students can define a nucleophile (conceptual knowledge) without actually “seeing” it in a different structural context (representational competence) (Cruz-Ramírez de Arellano and Towns, 2014; Anzovino and Bretz, 2016). How can students’ understanding of representations be strengthened to help them become more fluent in linking representations to properties? As early as 2010, Strickland et al. (2010) noted that when a representation fails to represent a referent—an entity with inherent chemical properties—it becomes little more than a picture reflecting a mental image rather than a functional mental model. Although there may be many reasons for this failure, learners entering a discipline are primarily influenced by the salient features of visual stimuli (Chi and VanLehn, 2012). Various studies in organic chemistry have consistently confirmed that students often focus heavily on surface features when solving or describing organic chemistry problems (e.g., Cruz-Ramírez de Arellano and Towns, 2014; Sandi-Urena et al., 2019; Crowder et al., 2024). When these surface features are misleading or irrelevant, the likelihood of successful problem-solving decreases (Graulich et al., 2019). There might be even more distinctive dimensions of chemical representations, such as the degree of iconicity and dimensionality, that we have not yet considered as a lens to understand how they affect student reasoning (Talanquer, 2022; Nelsen et al., 2024).

Students might anchor their understanding to the respective representational context in which they learnt relationships, especially when instruction provides many qualitative statements on which structure is “better than” or “more stable than” another without further explaining how this transfers to other structures. Knowing simply that halogens are good leaving groups in substitution reactions does not prepare students to look for the electron distribution of a negative charge in other functional groups. The context for applying such rule-based reasoning is often highly constrained. As students progress in their studies, abstraction from the structural context becomes the key to navigating the structural landscape of organic chemistry, looking for differences in electron density and energy irrespective of the concrete structural context. Currently, instruction may not adequately prepare students for this abstraction (Sevian et al., 2015; Weinrich and Sevian, 2017).

A recent study also showed that students are prone to making incorrect claims about whether resonance stabilisation occurs when electron lone pairs in molecular structures are explicitly drawn out; students might be used to seeing lone pairs when “something” is happening (Braun and Graulich, 2024b). Hence, in this case, the indication of electron lone pairs may induce an affirmation bias toward the presence of resonance. Students with high prior knowledge might be able to compensate and abstract; students with low prior knowledge may rely on this structural indicator. This explicitness of structural cues significantly shapes students’ perception and interpretation of representations, as recently shown for nucleophiles and electrophiles (Frost et al., 2023; Farheen et al., 2024). These observations raise questions about how structural representations are displayed and how this affects students’ pattern recognition or their assumptions about reactivity, e.g., implicit hydrogens are typically drawn out when abstracted by a base. Future research needs to explore when and how such structural displays should be used in teaching and whether applying a fading effect, i.e., slowly taking away support, to train students’ fluency is beneficial. In instruction, more time is needed to support students in becoming fluent with structural variations, e.g., considering electron lone pairs even if they are not drawn out or linked to an electron arrow. As students advance in their courses, the molecules become more and more functionalised, and multiple reactive sites must be considered. To what extent is current practice potentially impeding students from achieving the necessary skills?

Students’ visual perception of organic representations

Besides this aspect of making sense of representations, as expertise in a domain increases, individuals become more fluent in inferring implicit information and quickly recognising it within representations (Kellman and Massey, 2013). This effortless recognition of conceptual information in visual representations is crucial for solving more complex tasks, as it frees up cognitive capacity for sense-making processes (Rau, 2017, 2018). Much research in the last decade has focused on students’ sense-making processes and less on fluency-building activities in organic chemistry classes.

Recently, with the use of eye-tracking technology, the community has gained the ability to tap into sense-making and fluency processes and to understand the subtle differences in how students perceive representations when interpreting or generating structures. Besides eye-tracking studies focusing on integration processes between multiple representations, such as diagrams and structures (Cullipher and Sevian, 2015; Topczewski et al., 2017; Connor et al., 2021), a handful of studies have used eye-tracking to look specifically at how students perceive or draw Lewis structures in problem-solving contexts (Rodemer et al., 2020; Braun et al., 2022; Weinrich and Britt, 2022), for instance, to determine the presence of resonance in a molecule. Students differ in whether they focus locally on specific features or globally on broader areas of structural features (Braun and Graulich, 2022; Braun et al., 2022). In this context, students’ problem-solving processes are more productive if they perceive a representation more globally (i.e., looking beyond a single atom or functional group) than if their focus is more constrained. A narrowed focus hinders the perception of potential electron delocalisation, especially when it comes to detecting emergent properties, e.g., resonance in unsaturated carbonyls, a difficulty that has been known for years and has been described again recently (DeFever et al., 2015; Cox et al., 2024).

One approach to addressing this challenge could be to determine the effect of promoting visual chunking on students’ perception of conjugated or emergent systems. Visual chunking enables students to process complex, interrelated features more effectively (Chi and VanLehn, 2012). By perceiving, for instance, conjugated systems as a perceptual chunk, students could improve their problem-solving abilities. How do varying degrees of explicitness in highlighting techniques affect students' ability to chunk interrelated structural features?

Future research could further clarify how students’ perception and productive decoding processes of representations evolve over time. What are the differentiating characteristics of (un)productive perception (i.e., measured by eye movements) in problem-solving scenarios? How are students' visual and conceptual decoding processes of representations influenced by other variables, for instance, by their understanding of models and modelling or their mechanistic reasoning?

Defining mechanistic reasoning – when causes are linked to effects

Early studies often found that students relied on general strategies, such as one-reason decision-making, memorisation, or heuristic reasoning, which hindered their ability to transfer knowledge to new contexts (DeFever et al., 2015; Graulich and Bhattacharyya, 2017; Bhattacharyya and Harris, 2018). At that time, these studies concluded that students assigned different meanings to symbols and struggled to explain why organic reactions occurred. Influenced heavily by Russ et al.'s (2008) work in physics education, which examined children's early mechanistic reasoning, the focus shifted toward understanding what constitutes a mechanistic explanation in organic chemistry. How do students work through mechanistic steps, rationalising the interactions of entities, their properties, and their activities? Once we have defined what we expect from students, supporting and assessing mechanistic reasoning becomes much easier.

Consequently, some chemistry education research groups have focused on characterising mechanistic reasoning and helping students build the conceptual foundation necessary to understand, rather than merely memorise, statements like “strong bases are good nucleophiles.” Some conceptualisations of mechanistic reasoning have adopted a more component-based approach, identifying students’ references to entities, properties, and activities as indicators of students’ mechanistic reasoning (Watts et al., 2020). In more advanced courses, accounting for these components may not fully capture the cause for a mechanism. In these cases, aspects like backward and forward chaining of causal links—providing a detailed account of underlying causes and effects—become crucial (Caspari et al., 2018b).

Explaining molecular interactions by unpacking causes and effects aligns well with broader approaches, such as Krist et al.'s (2019) essential epistemic heuristics. In essence, explaining scientific phenomena involves three steps: describing the mechanism at a scalar level below the target phenomenon (i.e., the granularity), unpacking entities, properties, and activities (i.e., thinking about charged species, nucleophiles, and electrophiles), and linking them to provide a causal explanation (i.e., the causality). This focus on causal interactions also resonates with the work of Russ et al. (2008), who argued that simply identifying causes is not enough for mechanistic explanations—the interactions between these causes are equally important. These essential epistemic heuristics are incorporated to varying degrees in the frameworks describing mechanistic reasoning in organic chemistry. All frameworks agree on causality and granularity as core dimensions but differ in the degree to which students should document those dimensions in their reasoning (Sevian and Talanquer, 2014; Becker et al., 2016; Cooper et al., 2016; Caspari et al., 2018a; Bodé et al., 2019; Deng and Flynn, 2021).

Causality, defined as the process by which a cause produces an effect (Koslowski, 1996), is thus a central element in all these frameworks. In the dimension of causality, these frameworks emphasise how information is integrated to construct causal explanations rather than focusing on students’ conceptual sophistication. Various studies have since captured students’ causality when engaged in organic chemistry problems using different rubrics. For example, Sevian and Talanquer (2014) discussed four modes of reasoning with causal links emerging in higher modes, while Cooper et al. (2016) differentiated between mechanistic, causal, and causal-mechanistic reasoning, emphasising that a mechanistic account (describing the mechanism) and a causal account (explaining cause and effect) can be expressed separately.

A second core dimension of mechanistic reasoning is the granularity, or the scalar level below the phenomenon. Different frameworks conceptualise the level of granularity necessary for an explanation to go beyond the phenomenon itself in various ways, ranging from phenomenological to structural, electronic, or energetic levels (Bodé et al., 2019; Deng and Flynn, 2021; Talanquer, 2022). What constitutes a scalar level below the phenomenon in organic chemistry, and what is necessary to unpack the respective level, has been conceptualised differently (Krist et al., 2019; Talanquer, 2022). In organic chemistry, reasoning often begins with a structural representation, linking it to its properties and reactivity. These can be further unpacked by inferring the electronic properties of the involved molecules and the energetic considerations of the kinetic or thermodynamic driving forces of a reaction. But does this imply a hierarchy of scalar levels in explanatory power? Is accounting for the electron flow a first step, or is it linking reactivity to the energy of entities and processes? Or could it be a combination of these? Talanquer (2022) also argues that we have not yet explored the obstacles as well as the potential of purposefully “packing” and “unpacking” representations, for instance, along the granularity dimension, and especially how students’ understanding progresses over time.

Cooper and her group have started to look into how mechanistic reasoning translates across science disciplines (Franovic et al., 2023; Shiroda et al., 2024) and varies across courses depending on instructional decisions and task prompts (Crandell et al., 2019; Noyes et al., 2022). They documented that students’ mechanistic reasoning is sensitive to the task context, as well as to the assessment practices in the respective course.

Future work has to explore how mechanistic reasoning should develop over time, what should be emphasised along study programmes, whether a learning progression can be defined, and especially if and how mechanistic reasoning, as a competence, transfers beyond chemistry to support students in becoming scientifically literate citizens.

Supporting mechanistic reasoning

Besides reflecting on what counts as mechanistic reasoning, one has to rethink the way we engage students in this way of thinking. Research in recent years has indicated that students’ mechanistic reasoning is highly influenced by task context, purpose, and framing (Bodé et al., 2019; DeCocq and Bhattacharyya, 2019; Deng and Flynn, 2021; Noyes et al., 2022; Crowder et al., 2024). Students can hardly be expected to think mechanistically and to predict potential reaction steps when the emphasis is on recalling reaction products in traditional assessments such as predict-the-product tasks (Bhattacharyya, 2022). Tasks and prompts need to be adapted to require the use of cause-effect relationships (Graulich and Schween, 2018).

Building on the insights from earlier studies in organic chemistry, many research groups have started to make various changes, either by changing the curriculum (Cooper et al., 2019), introducing writing-to-learn activities (Finkenstaedt-Quinn et al., 2021), designing tutorial videos with an emphasis on mechanistic explanations (Eckhard et al., 2022a; Bernholt et al., 2023), or changing task design to engage students in more reflection about reaction pathways in contrasting case formats (e.g., Caspari and Graulich, 2019; Lieber and Graulich, 2020; Lieber et al., 2022b; Noyes et al., 2022; Franovic et al., 2023; Zaimi et al., 2024a). These instructional changes documented what students are able to activate when asked differently, and it became evident that students’ mechanistic reasoning improves when guided through specific prompts.

However, these studies also revealed that engaging in mechanistic reasoning, argumentation, or explanation is complex, and students need practice and feedback. While current scaffolding approaches have shown promise, particularly for low-achieving students (Kranz et al., 2023), scaffolding is currently often used as a one-size-fits-all approach. Future instruction would benefit from more personalised scaffolding that supports students across different skill levels, especially in the dimensions of causality and granularity when rationalising mechanistic steps (Dood et al., 2020a; Lieber et al., 2022a). Since scaffolding is designed as temporary support, future research should further explore how independent problem-solving in new instructional contexts can be achieved through fading the scaffolds and how this impacts students’ developing reasoning over time.

When designing scaffolding for mechanistic reasoning, one needs to reflect on what students should consider when judging reactions. Bhattacharyya (2006) and Kraft et al. (2010) claimed quite some time ago that problem-solving in organic chemistry often involves multivariate problems, i.e., problems in which considering multiple variables and weighing them against each other becomes crucial for making claims about reaction outcomes, whereas heuristics, such as one-reason decision-making (Talanquer, 2014), naturally drive students’ reasoning. Since then, there has been relatively little research in organic chemistry education to elicit the difficulties students have with considering multiple variables. Multivariate reasoning requires knowing how to compare the strength of properties (i.e., comparing the strength of effects, for instance, inductive and mesomeric effects) and how to interpret differences in energy values to make a claim about a preferred pathway. With regard to the first aspect, Zaimi et al. (2025) have recently documented in a qualitative study the variety of ways in which students develop and weigh (multiple) lines of reasoning in case comparisons. They illustrated which conceptual resources, e.g., explicit or implicit features, students were able to activate to build their lines of reasoning. Other studies have furthermore identified a lack of energy considerations as a recurring theme in students’ reasoning, often related to various perceptions of stability (Asmussen et al., 2023b; Demirdöğen et al., 2023; Pölloth et al., 2023; Haas et al., 2024; Liang-Lin et al., 2024). Additionally, students may more often be prompted to build a claim for a certain product or mechanism (Caspari et al., 2018b) than against it, which is reflected in more advanced reasoning when they argue for correct products (e.g., Deng and Flynn, 2021; Watts et al., 2022b) rather than against a proposed reaction outcome.
This reinforces the tendency to concentrate only on the most plausible products, neglecting potential side products (DeFever et al., 2015; Popova and Bretz, 2018b; Lieber and Graulich, 2020, 2022). Popova and Bretz (2018a) documented in this regard that students focused only on the major species when drawing energy profiles, stating that “it was not important to consider by-products” (Popova and Bretz, 2018a, p. 1091). Instruction should ideally support students in shifting from the perspective that a mechanism leads to only one defined product toward acknowledging that various products and by-products with differing probabilities can occur. This could contribute to a more sophisticated perception of a mechanism not as a pre-defined outcome but as the outcome of interrelated interactions constrained by energy and entropy (Maeyer and Talanquer, 2013; Talanquer, 2013).

Future research should further clarify what students should know and do to unpack a mechanism and establish causal links, particularly which elements—such as orbital interactions or energy considerations—should be unpacked and at which stages of learning. In instructional design, we still need a deeper understanding of the affordances of activating conceptual resources in mechanistic tasks, in order to tailor support, and of how these resources impact students’ construction of detailed causal explanations (Asmussen et al., 2023a). What tasks can make mechanistic reasoning meaningful enough to prompt students to invest cognitive effort? How can we better align classroom practices and assessments to support productive learning progressions for mechanistic reasoning?

A changed perspective on learning and teaching

Instructors at the centre of attention

Besides focusing on students’ learning difficulties, organic chemistry education research has, in recent years, shifted the perspective toward the instructors’ enacted practice in chemistry and organic chemistry (see, e.g., Zotos et al., 2021; Eckhard et al., 2022b; Jones et al., 2022; Kraft et al., 2023; Moreira and Talanquer, 2024; Wang et al., 2024). In the realm of organic chemistry teaching, most ongoing studies have focused on how instructors teach certain topics and have often documented a mismatch between instructors’ perceptions of their intentions in teaching, their goals for student learning, and their actual framing in the classroom. With regard to teaching resonance, for instance, many studies described that instructors largely differ in what they believe students should know about this topic. In their teaching practice, they emphasised an operational rather than a conceptual stance towards resonance (Carle and Flynn, 2020; Xue and Stains, 2020; Barakat and Orgill, 2024). This impacts how students frame the knowledge elements and may explain why students focus on using resonance instead of understanding it, as seen in several studies (among others: Brandfonbrener et al., 2021; Braun and Graulich, 2024a; Braun et al., 2025). Popova and Jones (2021) showed comparable results with regard to representational competence, highlighting that instructors might teach various representational skills “accidentally”, without knowing how they relate.

Other studies on chemistry college instructors determined that instructors hold strong pedagogical intentions, aiming at engaging students in meaningful learning processes. Still, those intentions were coupled with a more authoritative stance in teaching (Farré and Lorenzo, 2009; Lorenzo et al., 2010; Moreira and Talanquer, 2024). This may arise from multiple factors: their pedagogical content knowledge, their personal experiences, or their beliefs about appropriate conduct in a chemistry lecture. Academic teaching often follows certain conventions or traditions regarding how science content is communicated. During lectures, instructors express their understanding, their underlying epistemic ideas, and their agency as members of a scientific community (Rozumko, 2017). An individual's personal experience as a learner in organic chemistry may also shape their teaching approaches, leading them to “emulate” their own organic chemistry instruction (Kurdziel et al., 2003).

Recent work focusing on graduate teaching assistants in organic chemistry, who typically act as learning assistants in lectures and labs, revealed that many relied on knowledge-telling approaches or failed to provide strategies for students, thus showing limited knowledge of effective teaching methods (Zotos et al., 2021). However, a finer analysis of the teaching experience of graduate teaching assistants (Zaimi et al., 2024b) revealed the complex interplay of identity, framing, and expectations. These findings encourage a focus on pedagogical communities of practice to initiate change and underline the necessity for further research into organic chemistry instructors’ teaching strategies, both in theory and practice, and how they address students’ difficulties. How teaching practices and learning are aligned, or unfold in the actual moment of classroom discourse, is currently a black box. Walsh et al. (2022) established a fine-grained analysis to uncover how, for instance, undergraduate learning assistants facilitate students’ learning processes in small-group discussions in organic chemistry. Eliciting what is truly communicated during teaching moments, whether conveyed by instructors (such as epistemic messages or the depth of explanations) or signalled by curriculum artefacts, could offer insights into why we observe certain phenomena in organic chemistry learning.

A resources model of cognition to describe learning processes

Over the last years, the perspective on students' competencies and how learning is framed has shifted, moving away from a deficit-oriented approach toward a more asset-based perspective (Krutkowski, 2017; Patton Davis and Museus, 2019). It has become clear that focusing solely on what students cannot do fails to acknowledge their abilities and ideas. Such an approach overlooks the value of students’ contributions and, in some cases, places a high responsibility on students for their own success, with minimal implication for instructional improvement beyond the broad claim that “students should learn more.” A deficit-based classification of student performance—often reduced to simplistic categories like “not present,” “partial,” or “fully present”—may give a good sense of whether an instructional intervention was effective on this measure compared to a control group in a pre-post design. Yet it offers little guidance on what happened during learning and on how to adapt teaching strategies or build on student ideas.

Fortunately, a more asset-based perspective has gradually gained traction in chemistry education research, heavily influenced by the work of Hammer and colleagues (Hammer, 1996; Hammer et al., 2005) on cognitive resources and the work on phenomenological primitives (diSessa, 1983). This has reshaped how learning and problem-solving are framed under a resource perspective on learning. Increasingly, chemistry education studies are moving away from viewing misconceptions as stable, unchangeable constructs or non-normative ideas as inherently wrong (e.g., Rodriguez et al., 2020; Crandell and Pazicni, 2023; Braun and Graulich, 2024a; Gao et al., 2024; Pölloth et al., 2024). This model of cognition acknowledges the importance of how students interpret a learning situation and how the context-dependent activation of resources influences learning, offering a more nuanced understanding of both student learning and teaching experiences.

Recent studies on students’ use of the resonance concept demonstrated that students activate a wide variety of cognitive resources across different tasks (Braun and Graulich, 2024a) and focus on specific structural features and heuristics (Rotich et al., 2024). These studies revealed that there is no single, linear way of considering resonance forms in problem-solving. Instead, the variation in how students approach these problems is influenced by task affordances and by complex individual factors in how they perceive structures and make inferences. We are just at the beginning of adopting a fine-grained resource perspective on student cognition to understand the subtle differences in the reasoning processes students engage in during problem-solving.

Viewing learning as an individualised process—where students actively and deliberately activate productive or unproductive knowledge resources—challenges our view of learning and of researching knowledge acquisition. Cognition may be more variable and flexible than previously assumed. While this theoretical shift offers a valuable perspective on the diversity of students’ ideas, it also makes it harder to derive practical implications and design effective instructional strategies. Key challenges for future research under this paradigm are not only to acknowledge and characterise the diversity and detours of students’ thinking in the moment of learning (Karch et al., 2024) but also to better characterise when certain ideas or resources are productive but not normative or, conversely, unproductive but normatively correct. The terms “productive” and “unproductive” are often used to label cognitive resources, but their definitions are sometimes only vaguely related to correctness, frequently equating “productive” with normatively correct. Acknowledging students’ productive but not normatively correct ideas, and how these change across multiple task contexts (see: Crandell and Pazicni, 2023; García Ramos and Towns, 2024; Rodriguez, 2024), could lead to a deeper understanding of where students’ ideas come from and of how to support students in advancing their cognitive resources so that they can solve problems across various contexts.

There is a clear demand for more qualitative research that carefully examines students’ approaches by characterising their resource activation, in order to develop an understanding of students’ coordination classes, i.e., the complex networks of fine-grained, related resources. In particular, research must explore when the rather fragmented activation of resources in student reasoning becomes more consistent as students’ coordination classes grow more sophisticated (DiSessa and Wagner, 2005). Such an approach could improve our ability to guide students more individually through problem-solving. Potentially, this lens can bring us closer to deriving in-depth learning progressions for central concepts in chemistry, such as energy, stability, and reactivity, across multiple task contexts, and to understanding how these concepts become interconnected.

Additionally, acknowledging that learning and problem-solving do not follow a linear, stable path—where the same cognitive resources are automatically activated across different contexts (often referred to as “transfer”)—demands new approaches to diagnose and monitor students’ learning in both qualitative and quantitative studies. This highlights the need to move beyond single, moment-in-time data collection to draw more valid conclusions about the development of competencies, such as by tracking causal reasoning through multiple classroom activities (Haas et al., 2024) and by gradually designing and adapting task prompts to foster students’ reasoning (Cooper et al., 2016; Yik et al., 2023; Zaimi et al., 2024a).

Adaptive learning

As the model of cognition evolves, shifting away from viewing alternative conceptions as stable constructs and recognising that individual framing and resource activation better describe the learning process (Hammer et al., 2005), it becomes clear that there might not be a one-size-fits-all approach to learning. This growing recognition of student individuality in all its facets calls for more individualised learning opportunities to maximise the potential at the individual student level. Diagnosing a range of learner characteristics and tailoring learning experiences to accommodate specific needs enhances student learning outcomes compared to non-adaptive scenarios (Aleven et al., 2017; Dockterman, 2018; Plass and Pawar, 2020). In organic chemistry, for instance, adaptive support, such as guidance in argumentation and/or conceptual knowledge, has been shown to help mitigate performance gaps between different student groups (Lieber et al., 2022a, 2022b).

The idea of personalised learning is not new, given Vygotsky's work of 1978 claiming a zone of proximal development for individual learners (Vygotsky, 1978). However, it did not gain traction, owing to the challenge of addressing the diverse needs of students in large classes with traditional teaching and formative assessment methods. Adaptivity requires large amounts of data and a continuous collection of personal variables (e.g., prior knowledge, emotions, interest, motivation) and real-time student work (e.g., explanations, arguments, essays, drawings) to analyse, monitor, and adjust learning trajectories (Novak et al., 1999; Aleven et al., 2017).

In science education, the debate around using artificial intelligence (AI) applications—such as machine learning (ML) and natural language processing (NLP)—for adaptive learning has grown significantly in recent years (e.g., Zhai et al., 2020). Digital learning environments, or intelligent tutoring systems enhanced with AI, offer a promising solution for providing these personalised instructions, guidance, and feedback on a large scale (Plass and Pawar, 2020; Shemshack et al., 2021). These technologies allow for the automatic analysis of open-ended answers, providing educators with valuable insights into student performance and offering opportunities to implement targeted feedback and personalised exercise distribution. Personalised learning can go beyond simply monitoring cognitive development; it also involves tracking metacognitive growth and understanding how students make choices about their learning (Molenaar, 2022). This enables educators to offer recommendations tailored not only to what students are learning, but also to how they can learn more effectively.

Early on, some groups in organic chemistry education began using automated analysis and machine learning algorithms to offer automated formative assessment of constructed responses (Dood et al., 2018, 2020a, 2024; Raker et al., 2022; Watts et al., 2022a; Yik et al., 2023). Dood and colleagues (Dood et al., 2018, 2020b) developed logistic regression models to automatically classify students’ answers to constructed-response items on the SN1 reaction mechanism and acid–base reactions. Additionally, they showed how this classification can be linked to an automated distribution of instructional material according to students’ needs (Dood et al., 2020a). A recently developed ML algorithm allows the categorisation of students’ mechanistic reasoning in multiple contexts and the monitoring of the elaborateness of their reasoning over multiple time points (Martin and Graulich, 2024; Martin et al., 2024). Such ML models can, for example, analyse students’ performance on assessments over time, identify areas of difficulty, and deliver targeted practice problems and tutorials to address specific knowledge gaps (Dood et al., 2020a; Ariely et al., 2024). This continuous, real-time formative assessment enables students to receive direct, previously unattainable feedback and support (Swiecki et al., 2022).
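To make the idea of automatically classifying constructed responses concrete, the sketch below shows a deliberately minimal bag-of-words logistic regression, trained by gradient descent, that scores whether a short answer reads as merely descriptive (0) or causal (1). This is an illustration only, not the published models: the four training responses and their labels are invented here, whereas models such as those of Dood and colleagues are built on large, human-coded corpora with far richer features.

```python
# Toy sketch: bag-of-words logistic regression for scoring short student
# responses as "descriptive" (0) vs "causal" (1). Training texts and labels
# are hypothetical, invented purely for illustration.
import math
from collections import Counter

train = [
    ("the leaving group leaves and the nucleophile attacks", 0),
    ("the product forms after the proton is removed", 0),
    ("the carbocation is stabilised because the charge is delocalised", 1),
    ("the nucleophile attacks because the carbon is electron poor", 1),
]

# Build a fixed vocabulary from the training responses.
vocab = sorted({word for text, _ in train for word in text.split()})

def featurise(text):
    """Map a response to a word-count vector over the training vocabulary."""
    counts = Counter(text.split())
    return [counts.get(word, 0) for word in vocab]

# Train by stochastic gradient descent on the logistic loss.
weights = [0.0] * len(vocab)
bias = 0.0
lr = 0.5
for _ in range(500):
    for text, label in train:
        x = featurise(text)
        z = bias + sum(w * xi for w, xi in zip(weights, x))
        p = 1.0 / (1.0 + math.exp(-z))     # predicted probability of "causal"
        err = p - label                    # gradient of the logistic loss
        bias -= lr * err
        weights = [w - lr * err * xi for w, xi in zip(weights, x)]

def predict(text):
    """Probability that a response is 'causal' rather than 'descriptive'."""
    x = featurise(text)
    z = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1.0 / (1.0 + math.exp(-z))

# Responses containing causal markers such as "because" receive higher scores.
print(predict("the bond breaks because the charge is stabilised"))
print(predict("the product forms after the attack"))
```

The returned probability could then act as the trigger for the kind of automated distribution of instructional material described above, e.g., routing low-scoring responses to additional scaffolded practice.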

A key challenge in providing personalised learning lies in designing learning environments that can automatically assess students’ learning trajectories towards defined domain-specific learning goals and generate personalised learning supports, such as scaffolds that respond adaptively to students’ individual needs (Kubsch et al., 2022; Martin and Graulich, 2023). Developing AI-based algorithms to track student learning prompts fundamental questions: What should students learn? What counts as evidence of their learning gains? And how should these gains be appropriately assessed?

How flexible should the adjustment of the learning environment—such as training tasks and feedback—be to adapt to students’ evolving needs? A recent qualitative, non-AI-based study by Asmussen et al. (2024) demonstrated that adaptive conceptual support must be highly flexible to accommodate students’ in-the-moment needs during problem-solving. Simply relying on prior knowledge assessments or following a predefined support path may not be sufficient to effectively tailor support to the challenges that arise during problem-solving.

Learning is an inherently individual journey and is thus shaped by the unique characteristics of each student. Beyond prior knowledge, factors such as epistemological beliefs, emotions, cultural or social capital, and the environment affect how students learn (Deng et al., 2022; Collini et al., 2023, 2024; Martinez et al., 2024). Tailoring instruction to student-specific needs should theoretically promote learning progress for all students and help them reach their full potential. Reflecting on the systemic nature of the learning experience for all kinds of students in an organic chemistry course might be a starting point for more sustainable change, especially when implementing adaptive learning.

While it may be tempting to focus on individualising the learning experience, maintaining constructive alignment between assessment, classroom instruction, and practice remains crucial to shape students’ overall learning experience (Ralph et al., 2022). Research on active learning in organic chemistry has shown that changing instructional methods without aligning assessments does not necessarily lead to improved performance (Crimmins and Midkiff, 2017; Rau et al., 2017; Stowe et al., 2021; DeGlopper et al., 2022; Schwarz et al., 2024a). Hence, it might be a waste of resources if, for instance, machine learning models are in place to support students’ mechanistic reasoning when the curriculum and the way it is taught do not adapt.

In addition to the potential of sophisticated back-end technologies (such as machine learning models) for assessing and tailoring student feedback, recent research has also focused on the front-end of AI applications (Watts et al., 2023a). Furthermore, access to chatbots might heavily influence how students learn in the future. Studies have explored how chatbots answer chemistry problems (Watts et al., 2023b; Yik and Dood, 2024), aiming to understand how effectively these tools can correctly explain or predict chemical reactions. The provided answers are improving, but they may still lack a concise description of the electron movement, an aspect valued in students’ answers (Watts et al., 2023b). As AI models become more advanced, new opportunities for teaching and instruction can be designed. These may include helping students reflect on domain-specific cause-effect relationships or on domain-general skills such as self-regulated learning. Customising chatbots to provide tailored feedback on specific problems could help alleviate the time-consuming task of generating feedback (Cooper and Klymkowsky, 2024; Crowder and Raker, 2024; Yik and Dood, 2024). Future research in this area should explore the nature and process of gradually fading AI support to ensure knowledge transfer. How can successful adaptive learning attend not only to students’ cognitive skills but also to other variables, such as emotional-affective or motivational ones, when designing individualised learning or feedback? How much support is beneficial, and for which specific skills should students receive feedback?

Conclusions

Research in organic chemistry education has advanced enormously in the last decade and opened multiple avenues to better understand learning and teaching in this discipline. It has also opened up new questions for advancing organic chemistry education research.

First, there has been substantial progress in understanding discipline-specific affordances, such as incorporating strategies that emphasise mechanistic reasoning and cause-effect relationships beyond mere memorisation of chemical processes. Yet we still need to understand which reasoning fragments develop early and how to support them longitudinally, not just in one context. While frameworks for teaching and assessing mechanistic reasoning have been developed, instruction needs to evolve to convey why it is relevant to think about deeper causal relationships and to encourage students to rationalise molecular interactions at a more granular level. More emphasis on relevance and authenticity in the way students engage with mechanistic problems could further enhance students’ understanding of reaction mechanisms, thus moving from rote learning toward a more profound, inquiry-driven approach. This may not be successful without focusing on instructors’ professional development and on aspects of the whole classroom system that shape the learning experience (Talanquer and Kelly, 2024; Talanquer et al., 2024). Emphasising scientific practices, such as arguing from evidence or deriving mechanistic explanations for reaction processes, requires time in the classroom and will not be successful without decluttering the traditional organic chemistry curriculum.

Second, the role of representational competence in organic chemistry remains complex. Representations such as molecular models and diagrams are central to understanding the subject, yet students often misinterpret them or focus too narrowly on surface features. Research indicates that students require support in navigating between symbolic conventions and the underlying conceptual knowledge these representations convey. To address this, educational strategies could, on the one hand, emphasise deeper engagement with representations as models and tools for scientific inquiry and, on the other hand, borrow strategies from educational psychology to help students automatise pattern recognition and build fluency.

Lastly, adaptive learning and personalised instruction offer promising avenues for improving organic chemistry education. By leveraging AI and other digital learning environments, instructors can provide individualised feedback and tailored learning experiences that respond to students’ evolving needs. However, such approaches must be carefully aligned with classroom practices and assessments to counteract prevalent inequalities. Only then can the focus on adaptive learning and individual support deliver a genuinely student-centred approach to chemistry education.

This perspective was meant to shed some light on selected research areas that have governed the last 10 years of organic chemistry education research. One could say that there is still room for improvement in organic chemistry education research, but we have already come a long way. Nearly 10 years ago, Gautam Bhattacharyya shared a thought with me that still sticks with me. He emphasised that the community needed to gather evidence showing that the way organic chemistry is commonly taught does not lead to students truly understanding the subject. Over the past 20 years, many studies have provided such evidence. We know what students do not know in traditional classes. We know what they do instead, and we know why they do what they do. Now, the question is: what do we do with the current insights and evidence? There is a growing recognition that a broader perspective on learning organic chemistry as a system of multiple factors is required, shifting the responsibility for meaningful learning from the students to the instructor and the culture of the classroom environment. This realisation should remind the organic chemistry research community to expand its view and work toward a deeper understanding of the entire system of learning organic chemistry. While this may seem like a large-scale undertaking, even small qualitative studies that examine multiple task contexts can offer far more insight into the learning process than data from a single moment in time. Describing data from a single student learning experience in a traditional classroom setting will likely give us the same results we have seen so far. In the coming years, it may become increasingly important to revisit the conditions the iceberg of organic chemistry education is facing.

Limitation

This perspective could only emphasise selected topics and does not claim to be all-encompassing. This does not mean that other topics, such as curriculum design, active-learning pedagogies, or lab education, are less important or less worth exploring. The focus of this perspective lies primarily on outlining future research endeavours and the remaining black boxes of learning organic chemistry in selected areas. Hence, practical teaching implications are not explicitly discussed.

Data availability

There is no data availability statement to be declared for this perspective.

Conflicts of interest

There are no conflicts to declare.

Acknowledgements

I want to wholeheartedly thank Sascha Bernholt, Leonie Lieber, Axel Langner, Sarah Saupe, Paul Martin, Irina Braun and Elias Heinrich for providing feedback during the writing process. Further, I would like to thank the German Research Foundation for funding this work under the umbrella of the ROChET-network initiative (Research in organic chemistry education and teaching) (Grant 528961647).

References

  1. Aleven V., McLaughlin E. A., Glenn R. A. and Koedinger K. R., (2017), Instruction based on adaptive learning technologies, in Mayer R. E. and Alexander P. (ed.) Handbook of research on learning and instruction, Routledge, pp. 522–560.
  2. Anderson T. L. and Bodner G. M., (2008), What can we do about ‘Parker’? A case study of a good student who didn't ‘get’ organic chemistry, Chem. Educ. Res. Pract., 9, 93–101 DOI:10.1039/B806223B.
  3. Anzovino M. E. and Bretz S. L., (2016), Organic chemistry students' fragmented ideas about the structure and function of nucleophiles and electrophiles: a concept map analysis, Chem. Educ. Res. Pract., 17, 1019–1029 DOI:10.1039/c6rp00111d.
  4. Ariely M., Nazaretsky T. and Alexandron G., (2024), Causal-mechanical explanations in biology: applying automated assessment for personalized learning in the science classroom, J. Res. Sci. Teach., 61, 1858–1889 DOI:10.1002/tea.21929.
  5. Asmussen G., Rodemer M. and Bernholt S., (2023a), Blooming student difficulties in dealing with organic reaction mechanisms – an attempt at systemization, Chem. Educ. Res. Pract., 24, 1035–1054 DOI:10.1039/D2RP00204C.
  6. Asmussen G., Rodemer M. and Bernholt S., (2024), Stepping stones to success: a qualitative investigation of the effectiveness of adaptive stepped supporting tools for problem-solving in organic chemistry to design an intelligent tutoring system, Int. J. Sci. Educ., 1–23 DOI:10.1080/09500693.2024.2361933.
  7. Asmussen G., Rodemer M., Eckhard J. and Bernholt S., (2023b), From Free Association to Goal-directed Problem-solving—Network Analysis of Students’ Use of Chemical Concepts in Mechanistic Reasoning, in Graulich N. and Shultz G. V. (ed.) Student Reasoning in Organic Chemistry, The Royal Society of Chemistry, pp. 90–109.
  8. Atkinson M. B. and Bretz S. L., (2021), Measuring Changes in Undergraduate Chemistry Students’ Reasoning with Reaction Coordinate Diagrams: A Longitudinal, Multi-institution Study, J. Chem. Educ., 98, 1064–1076 DOI:10.1021/acs.jchemed.0c01419.
  9. Atkinson M. B., Popova M., Croisant M., Reed D. J. and Bretz S. L., (2020), Development of the Reaction Coordinate Diagram Inventory: Measuring Student Thinking and Confidence, J. Chem. Educ., 97, 1841–1851 DOI:10.1021/acs.jchemed.9b01186.
  10. Barakat S. and Orgill M., (2024), Identifying the critical features of resonance: instructors’ intentions for the teaching and learning of resonance in General Chemistry I and Organic Chemistry I, Chem. Educ. Res. Pract., 25, 491–505 10.1039/d3rp00289f.
  11. Becker N., Noyes K. and Cooper M., (2016), Characterizing Students’ Mechanistic Reasoning about London Dispersion Forces, J. Chem. Educ., 93, 1713–1724 DOI:10.1021/acs.jchemed.6b00298.
  12. Bernholt S., Eckhard J., Rodemer M., Langner A., Asmussen G. and Graulich N., (2023), Designing Tutorial Videos to Support Students’ Learning of Reaction Mechanisms in Organic Chemistry, in Dori Y., Ngai C. and Szteinberg G. (ed.) Digital Learning and Teaching in Chemistry, The Royal Society of Chemistry, pp. 234–248.
  13. Bhattacharyya G., (2006), Practitioner development in organic chemistry: how graduate students conceptualize organic acids, Chem. Educ. Res. Pract., 7, 240–247.
  14. Bhattacharyya G., (2013), From Source to Sink: Mechanistic Reasoning Using the Electron-Pushing Formalism, J. Chem. Educ., 90, 1282–1289 DOI:10.1021/ed300765k.
  15. Bhattacharyya G., (2014), Trials and tribulations: student approaches and difficulties with proposing mechanisms using the electron-pushing formalism, Chem. Educ. Res. Pract., 15, 594–609 10.1039/C3RP00127J.
  16. Bhattacharyya G., (2022), Assessment of Assessment in Organic Chemistry—Review and Analysis of Predominant Problem Types Related to Reactions and Mechanisms, in Graulich N. and Shultz G. V. (ed.) Student Reasoning in Organic Chemistry, The Royal Society of Chemistry, pp. 267–284.
  17. Bhattacharyya G. and Bodner G. M., (2005), “It Gets Me to the Product”: How Students Propose Organic Mechanisms, J. Chem. Educ., 82, 1402–1407 DOI:10.1021/ed082p1402.
  18. Bhattacharyya G. and Harris M. S., (2018), Compromised Structures: Verbal Descriptions of Mechanism Diagrams, J. Chem. Educ., 95, 366–375 DOI:10.1021/acs.jchemed.7b00157.
  19. Bodé N. E., Deng J. M. and Flynn A. B., (2019), Getting Past the Rules and to the WHY: Causal Mechanistic Arguments When Judging the Plausibility of Organic Reaction Mechanisms, J. Chem. Educ., 96, 1068–1082 DOI:10.1021/acs.jchemed.8b00719.
  20. Bowen C. W., (1990), Representational Systems Used by Graduate-Students While Problem-Solving in Organic-Synthesis, J. Res. Sci. Teach., 27, 351–370 DOI:10.1002/tea.3660270406.
  21. Bowen C. W. and Bodner G. M., (1991), Problem-solving processes used by students in organic synthesis, Int. J. Sci. Educ., 13, 143–158 DOI:10.1080/0950069910130202.
  22. Brandfonbrener P. B., Watts F. M. and Shultz G. V., (2021), Organic Chemistry Students’ Written Descriptions and Explanations of Resonance and Its Influence on Reactivity, J. Chem. Educ., 98, 3431–3441 DOI:10.1021/acs.jchemed.1c00660.
  23. Braun I. and Graulich N., (2022), Die Zeichnung im Blick – Nutzung von Eye-Tracking zur Analyse der zeichnerischen Erschließung von Mesomerie-Aufgaben [Keeping an eye on the drawing – Using eye-tracking to analyze the drawing process of resonance tasks], CHEMKON, 29, 261–266 DOI:10.1002/ckon.202200007.
  24. Braun I. and Graulich N., (2024a), Exploring diversity: student's (un-)productive use of resonance in organic chemistry tasks through the lens of the coordination class theory, Chem. Educ. Res. Pract., 25, 643–671 10.1039/d3rp00298e.
  25. Braun I. and Graulich N., (2024b), The More Explicit, the Better? How the Indication of Electron Lone Pairs Affects Students’ Consideration of Resonance in Organic Chemistry, J. Chem. Educ., 101, 4830–4836 DOI:10.1021/acs.jchemed.4c00762.
  26. Braun I., Langner A. and Graulich N., (2022), Let's draw molecules: Students’ sequential drawing processes of resonance structures in organic chemistry, Front. Educ., 7 1055280 DOI:10.3389/feduc.2022.1055280.
  27. Braun I., Lewis S. E. and Graulich N., (2025), A question of pattern recognition: investigating the impact of structure variation on students’ proficiency in deciding about resonance stabilization, Chem. Educ. Res. Pract., 26(1), 158–182 10.1039/d4rp00155a.
  28. Carle M. S. and Flynn A. B., (2020), Essential learning outcomes for delocalization (resonance) concepts: How are they taught, practiced, and assessed in organic chemistry? Chem. Educ. Res. Pract., 21, 622–637 10.1039/c9rp00203k.
  29. Caspari I. and Graulich N., (2019), Scaffolding the Structure of Organic Chemistry Students’ multivariate Mechanistic Reasoning, Int. J. Phys. Chem. Educ., 11, 31–43.
  30. Caspari I., Kranz D. and Graulich N., (2018a), Resolving the complexity of organic chemistry students' reasoning through the lens of a mechanistic framework, Chem. Educ. Res. Pract., 19, 1117–1141 10.1039/c8rp00131f.
  31. Caspari I., Weinrich M., Sevian H. and Graulich N., (2018b), This mechanistic step is “productive”: organic chemistry students' backward-oriented reasoning, Chem. Educ. Res. Pract., 19, 42–59 10.1039/C7RP00124J.
  32. Chen Y.-C., (2022), Is Uncertainty a Barrier or Resource to Advance Science? The Role of Uncertainty in Science and Its Implications for Science Teaching and Learning, Sci. Educ., 31, 543–549 DOI:10.1007/s11191-021-00244-9.
  33. Chi M. T. H. and VanLehn K. A., (2012), Seeing deep structure from the interactions of surface features, Educ. Psychol., 47, 177–188 DOI:10.1080/00461520.2012.695709.
  34. Collini M. A., Miguel K., Weber R. and Atkinson M. B., (2024), Investigating changes in students’ attitudes towards organic chemistry: a longitudinal study, Chem. Educ. Res. Pract., 25, 613–624 10.1039/D3RP00228D.
  35. Collini M. A., Rocha L. A., Ford J. E., Weber R. and Atkinson M. B., (2023), Characterizing and identifying influences on undergraduates’ attitudes towards organic chemistry, Chem. Educ. Res. Pract., 24, 723–739 10.1039/D2RP00256F.
  36. Connor M. C., Glass B. H., Finkenstaedt-Quinn S. A. and Shultz G. V., (2021), Developing Expertise in (1)H NMR Spectral Interpretation, J. Org. Chem., 86, 1385–1395 DOI:10.1021/acs.joc.0c01398.
  37. Cooper M. M., Grove N., Underwood S. M. and Klymkowsky M. W., (2010), Lost in Lewis Structures: An Investigation of Student Difficulties in Developing Representational Competence, J. Chem. Educ., 87, 869–874 DOI:10.1021/ed900004y.
  38. Cooper M. M. and Klymkowsky M. W., (2024), Let Us Not Squander the Affordances of LLMs for the Sake of Expedience: Using Retrieval Augmented Generative AI Chatbots to Support and Evaluate Student Reasoning, J. Chem. Educ., 101, 4847–4856 DOI:10.1021/acs.jchemed.4c00765.
  39. Cooper M. M., Kouyoumdjian H. and Underwood S. M., (2016), Investigating Students’ Reasoning about Acid–Base Reactions, J. Chem. Educ., 93, 1703–1712 DOI:10.1021/acs.jchemed.6b00417.
  40. Cooper M. M., Stieff M. and DeSutter D., (2017), Sketching the Invisible to Predict the Visible: From Drawing to Modeling in Chemistry, Top. Cogn. Sci., 9, 902–920 DOI:10.1111/tops.12285.
  41. Cooper M. M., Stowe R. L., Crandell O. M. and Klymkowsky M. W., (2019), Organic Chemistry, Life, the Universe and Everything (OCLUE): A Transformed Organic Chemistry Curriculum, J. Chem. Educ., 96, 1858–1872 DOI:10.1021/acs.jchemed.9b00401.
  42. Cox, Jr. C. T., Shimomura R., Hall J. T. V. E. and Gao S., (2024), Conjugation Amplifies Complexity: A Comparison of Problem-Solving Strategies Across the Organic Chemistry Sequence, J. Chem. Educ., 101, 2608–2617 DOI:10.1021/acs.jchemed.3c00847.
  43. Crandell O. M., Kouyoumdjian H., Underwood S. M. and Cooper M. M., (2019), Reasoning about Reactions in Organic Chemistry: Starting It in General Chemistry, J. Chem. Educ., 96, 213–226 DOI:10.1021/acs.jchemed.8b00784.
  44. Crandell O. M. and Pazicni S., (2023), Leveraging cognitive resources to investigate the impact of molecular orientation on students’ activation of symmetry resources, Chem. Educ. Res. Pract., 24, 353–368 10.1039/D2RP00164K.
  45. Crimmins M. T. and Midkiff B., (2017), High Structure Active Learning Pedagogy for the Teaching of Organic Chemistry: Assessing the Impact on Academic Outcomes, J. Chem. Educ., 94, 429–438 DOI:10.1021/acs.jchemed.6b00663.
  46. Crowder C. J. and Raker J. R., (2024), Patterns in Explanations of Organic Chemistry Reaction Mechanisms: A Text Analysis by Level of Explanation Sophistication, J. Chem. Educ., 101(12), 5203–5220 DOI:10.1021/acs.jchemed.4c01042.
  47. Crowder C. J., Yik B. J., Frost S. J. H., Cruz-Ramírez de Arellano D. and Raker J. R., (2024), Impact of Prompt Cueing on Level of Explanation Sophistication for Organic Reaction Mechanisms, J. Chem. Educ., 101(2), 398–410 DOI:10.1021/acs.jchemed.3c00710.
  48. Cruz-Ramírez de Arellano D. and Towns M., (2014), Students' understanding of alkyl halide reactions in undergraduate organic chemistry, Chem. Educ. Res. Pract., 15, 501–515 10.1039/C3RP00089C.
  49. Cullipher S. and Sevian H., (2015), Atoms versus Bonds: How Students Look at Spectra, J. Chem. Educ., 92, 1996–2005 DOI:10.1021/acs.jchemed.5b00529.
  50. DeCocq V. and Bhattacharyya G., (2019), TMI (Too much information)! Effects of given information on organic chemistry students’ approaches to solving mechanism tasks, Chem. Educ. Res. Pract., 20, 213–228 10.1039/c8rp00214b.
  51. DeFever R. S., Bruce H. and Bhattacharyya G., (2015), Mental Rolodexing: Senior Chemistry Majors’ Understanding of Chemical and Physical Properties, J. Chem. Educ., 92, 415–426 DOI:10.1021/ed500360g.
  52. DeGlopper K. S., Schwarz C. E., Ellias N. J. and Stowe R. L., (2022), Impact of Assessment Emphasis on Organic Chemistry Students’ Explanations for an Alkene Addition Reaction, J. Chem. Educ., 99, 1368–1382 DOI:10.1021/acs.jchemed.1c01080.
  53. Demirdöğen B., Nelsen I. and Lewis S. E., (2023), Organic chemistry students’ use of stability in mental models on acid and base strength, Chem. Educ. Res. Pract., 24, 1127–1141 10.1039/D3RP00049D.
  54. Deng J. M. and Flynn A. B., (2021), Reasoning, granularity, and comparisons in students’ arguments on two organic chemistry items, Chem. Educ. Res. Pract., 22, 749–771 10.1039/D0RP00320D.
  55. Deng J. M., Rahmani M. and Flynn A. B., (2022), The role of language in students’ justifications of chemical phenomena, Int. J. Sci. Educ., 1–21.
  56. Disessa A. A., (1983), Phenomenology and the evolution of intuition, in Gentner D. and Stevens A. L. (ed.) Mental Models, Psychology Press.
  57. DiSessa A. A. and Wagner J. F., (2005), What coordination has to say about transfer, in Mestre J. P. (ed.) Transfer of Learning from a Modern Multidisciplinary Perspective, Information Age Publishing, vol. 89, pp. 121–154.
  58. Dockterman D., (2018), Insights from 200+ years of personalized learning, NPJ Sci. Learn., 3, 15 DOI:10.1038/s41539-018-0033-x.
  59. Dood A. J., Dood J. C., Cruz-Ramírez de Arellano D., Fields K. B. and Raker J. R., (2020a), Using the Research Literature to Develop an Adaptive Intervention to Improve Student Explanations of an SN1 Reaction Mechanism, J. Chem. Educ., 97, 3551–3562 DOI:10.1021/acs.jchemed.0c00569.
  60. Dood A. J., Dood J. C., de Arellano D. C. R., Fields K. B. and Raker J. R., (2020b), Analyzing explanations of substitution reactions using lexical analysis and logistic regression techniques, Chem. Educ. Res. Pract., 21, 267–286 10.1039/c9rp00148d.
  61. Dood A. J., Fields K. B. and Raker J. R., (2018), Using Lexical Analysis To Predict Lewis Acid-Base Model Use in Responses to an Acid-Base Proton-Transfer Reaction, J. Chem. Educ., 95, 1267–1275 DOI:10.1021/acs.jchemed.8b00177.
  62. Dood A. J. and Watts F. M., (2022), Mechanistic Reasoning in Organic Chemistry: A Scoping Review of How Students Describe and Explain Mechanisms in the Chemistry Education Research Literature, J. Chem. Educ., 99, 2864–2876 DOI:10.1021/acs.jchemed.2c00313.
  63. Dood A. J. and Watts F. M., (2023), Students’ Strategies, Struggles, and Successes with Mechanism Problem Solving in Organic Chemistry: A Scoping Review of the Research Literature, J. Chem. Educ., 100, 53–68 DOI:10.1021/acs.jchemed.2c00572.
  64. Dood A. J., Watts F. M., Connor M. C. and Shultz G. V., (2024), Automated Text Analysis of Organic Chemistry Students’ Written Hypotheses, J. Chem. Educ., 101, 807–818 DOI:10.1021/acs.jchemed.3c00757.
  65. Duschl R., (2008), Science Education in Three-Part Harmony: Balancing Conceptual, Epistemic, and Social Learning Goals, Rev. Res. Educ., 32, 268–291 DOI:10.3102/0091732x07309371.
  66. Eckhard J., Rodemer M., Bernholt S. and Graulich N., (2022a), What Do University Students Truly Learn When Watching Tutorial Videos in Organic Chemistry? An Exploratory Study Focusing on Mechanistic Reasoning, J. Chem. Educ., 99, 2231–2244 DOI:10.1021/acs.jchemed.2c00076.
  67. Eckhard J., Rodemer M., Langner A., Bernholt S. and Graulich N., (2022b), Let's frame it differently – analysis of instructors’ mechanistic explanations, Chem. Educ. Res. Pract., 23, 78–99 10.1039/d1rp00064k.
  68. Farheen A., Martin N. and Lewis S. E., (2024), Student perceptions of partial charges and nucleophilicity/electrophilicity when provided with either a bond-line, ball-and-stick, or electrostatic potential map for molecular representation, Chem. Educ. Res. Pract., 25, 343–359 10.1039/D3RP00173C.
  69. Farré A. S. and Lorenzo M. G., (2009), Another piece of the puzzle: the relationship between beliefs and practice in higher education organic chemistry, Chem. Educ. Res. Pract., 10, 176–184 10.1039/B908256P.
  70. Ferguson R. and Bodner G. M., (2008), Making sense of the arrow-pushing formalism among chemistry majors enrolled in organic chemistry, Chem. Educ. Res. Pract., 9, 102–113 10.1039/b806225k.
  71. Finkenstaedt-Quinn S. A., Petterson M., Gere A. and Shultz G., (2021), Praxis of Writing-to-Learn: A Model for the Design and Propagation of Writing-to-Learn in STEM, J. Chem. Educ., 98, 1548–1555 DOI:10.1021/acs.jchemed.0c01482.
  72. Flynn A. B. and Featherstone R. B., (2017), Language of mechanisms: exam analysis reveals students' strengths, strategies, and errors when using the electron-pushing formalism (curved arrows) in new reactions, Chem. Educ. Res. Pract., 18, 64–77 10.1039/c6rp00126b.
  73. Flynn A. B. and Ogilvie W. W., (2015), Mechanisms before Reactions: A Mechanistic Approach to the Organic Chemistry Curriculum Based on Patterns of Electron Flow, J. Chem. Educ., 92, 803–808 DOI:10.1021/ed500284d.
  74. Franovic C. G. C., Noyes K., Stoltzfus J. R., Schwarz C. V., Long T. M. and Cooper M. M., (2023), Undergraduate Chemistry and Biology Students’ Use of Causal Mechanistic Reasoning to Explain and Predict Preferential Protein–Ligand Binding Activity, J. Chem. Educ., 100, 1716–1727 DOI:10.1021/acs.jchemed.2c00737.
  75. Frost S. J. H., Yik B. J., Dood A. J., de Arellano D. C. R., Fields K. B. and Raker J. R., (2023), Evaluating electrophile and nucleophile understanding: a large-scale study of learners' explanations of reaction mechanisms, Chem. Educ. Res. Pract., 24, 706–722 10.1039/d2rp00327a.
  76. Galloway K. R., Stoyanovich C. and Flynn A. B., (2017), Students' interpretations of mechanistic language in organic chemistry before learning reactions, Chem. Educ. Res. Pract., 18, 353–374 10.1039/c6rp00231e.
  77. Gao S., Outlaw T. C., Liang-Lin J. G., Feng A., Shimomura R., Roizen J. L. and Cox C. T., (2024), Analysis of resources applied to rationalize elimination mechanisms, Chem. Educ. Res. Pract., 25, 62–78 10.1039/D3RP00031A.
  78. García Ramos J. and Towns M. H., (2024), Projecting Is Such Sweet Sorrow: Undergraduate Students’ Interpretation of Fischer and Haworth Carbohydrate Projections, J. Chem. Educ., 101, 2730–2739 DOI:10.1021/acs.jchemed.4c00155.
  79. Gilbert J. K. and Treagust D., (2009), Multiple Representations in Chemical Education – Models and Modeling in Science Education, Springer.
  80. Goodwin W., (2003), Explanation in Organic Chemistry, Ann. N. Y. Acad. Sci., 988, 141–153 DOI:10.1111/j.1749-6632.2003.tb06093.x.
  81. Goodwin W. M., (2007), Structural formulas and explanation in organic chemistry, Found. Chem., 10, 117–127 DOI:10.1007/s10698-007-9033-2.
  82. Graulich N., (2015), The tip of the iceberg in organic chemistry classes: how do students deal with the invisible? Chem. Educ. Res. Pract., 16, 9–21 10.1039/c4rp00165f.
  83. Graulich N. and Bhattacharyya G., (2017), Investigating students' similarity judgments in organic chemistry, Chem. Educ. Res. Pract., 18, 774–784 10.1039/c7rp00055c.
  84. Graulich N., Hedtrich S. and Harzenetter R., (2019), Explicit versus implicit similarity – exploring relational conceptual understanding in organic chemistry, Chem. Educ. Res. Pract., 20, 924–936 10.1039/C9RP00054B.
  85. Graulich N. and Schween M., (2018), Concept-Oriented Task Design: Making Purposeful Case Comparisons in Organic Chemistry, J. Chem. Educ., 95, 376–383 DOI:10.1021/acs.jchemed.7b00672.
  86. Grove N. P. and Bretz S. L., (2010), Perry's Scheme of Intellectual and Epistemological Development as a framework for describing student difficulties in learning organic chemistry, Chem. Educ. Res. Pract., 11, 207–211.
  87. Grove N. P., Cooper M. M. and Cox E. L., (2012a), Does Mechanistic Thinking Improve Student Success in Organic Chemistry? J. Chem. Educ., 89, 850–853.
  88. Grove N. P., Cooper M. M. and Rush K. M., (2012b), Decorating with Arrows: Toward the Development of Representational Competence in Organic Chemistry, J. Chem. Educ., 89, 844–849 DOI:10.1021/ed2003934.
  89. Haas D. B., Watts F. M., Dood A. J. and Shultz G. V., (2024), Analysis of organic chemistry students’ developing reasoning elicited by a scaffolded case comparison activity, Chem. Educ. Res. Pract., 25, 742–759 10.1039/D4RP00021H.
  90. Hammer D., (1996), Misconceptions or p-prims: How may alternative perspectives of cognitive structure influence instructional perceptions and intentions, J. Learn. Sci., 5, 97–127.
  91. Hammer D., Elby A., Scherr R. E. and Redish E. F., (2005), Resources, framing, and transfer, in Mestre J. P. (ed.) Transfer of learning from a modern multidisciplinary perspective, Information Age Publishing (IAP), pp. 89–119.
  92. Haskel-Ittah M., (2023), Explanatory black boxes and mechanistic reasoning, J. Res. Sci. Teach., 60, 915–933 DOI:10.1002/tea.21817.
  93. Hegarty M. and Waller D. A., (2005), Individual Differences in Spatial Abilities, in Shah P. and Miyake A. (ed.) The Cambridge Handbook of Visuospatial Thinking, Cambridge University Press, ch. 4, pp. 121–169.
  94. Hoffmann R. and Laszlo P., (1991), Representation in Chemistry, Angew. Chem., Int. Ed., 30, 1–16.
  95. Jones T., Romanov A., Pratt J. M. and Popova M., (2022), Multi-framework case study characterizing organic chemistry instructors’ approaches toward teaching about representations, Chem. Educ. Res. Pract., 23, 930–947 10.1039/d2rp00173j.
  96. Justi R. and Gilbert J., (2003), Models and Modelling in Chemical Education, in Gilbert J. K., De Jong O., Justi R., Treagust D. F. and Van Driel J. H. (ed.) Chemical Education: Towards Research-based Practice, Netherlands: Springer, ch. 3, pp. 47–68.
  97. Karch J. M., Maggiore N. M., Pierre-Louis J. R., Strange D., Dini V. and Caspari-Gnann I., (2024), Making in-the-moment learning visible: a framework to identify and compare various ways of learning through continuity and discourse change, Sci. Educ., 108, 1292–1328 DOI:10.1002/sce.21874.
  98. Kellman P. J. and Massey C. M., (2013), Chapter Four – Perceptual Learning, Cognition, and Expertise, in Ross B. H. (ed.) Psychology of Learning and Motivation, Academic Press, vol. 58, pp. 117–165.
  99. Kiernan N. A., Manches A. and Seery M. K., (2024), Resources for reasoning of chemistry concepts: multimodal molecular geometry, Chem. Educ. Res. Pract., 25, 524–543 10.1039/D3RP00186E.
  100. Koslowski B., (1996), Theory and evidence: The development of scientific reasoning, MIT Press.
  101. Kozma R., (2020), Use of multiple representations by experts and novice, in Meter P. V., List A., Lombardi D. and Kendeou P. (ed.) Handbook of learning from multiple representations and perspectives, Routledge/Taylor and Francis Group, pp. 33–47.
  102. Kozma R., Chin E., Russell J. and Marx N., (2000), The roles of representations and tools in the chemistry laboratory and their implications for chemistry learning, J. Learn. Sci., 9, 105–143 DOI:10.1207/s15327809jls0902_1.
  103. Kozma R. B. and Russell J., (1997), Multimedia and understanding: expert and novice responses to different representations of chemical phenomena, J. Res. Sci. Teach., 34, 949–968 DOI:10.1002/(sici)1098-2736(199711)34:9&lt;949::aid-tea7&gt;3.0.co;2-u.
  104. Kozma R. and Russell J., (2005), Students Becoming Chemists: Developing Representational Competence, Visualization in Science Education, Springer, pp. 121–145.
  105. Kraft A., Popova M., Erdmann R. M., Harshman J. and Stains M., (2023), Tensions between depth and breadth: an exploratory investigation of chemistry assistant professors’ perspectives on content coverage, Chem. Educ. Res. Pract., 24, 567–576 10.1039/d2rp00299j.
  106. Kraft A., Strickland A. M. and Bhattacharyya G., (2010), Reasonable reasoning: multi-variate problem-solving in organic chemistry, Chem. Educ. Res. Pract., 11, 281–292 10.1039/C0RP90003F.
  107. Kranz D., Schween M. and Graulich N., (2023), Patterns of reasoning – exploring the interplay of students’ work with a scaffold and their conceptual knowledge in organic chemistry, Chem. Educ. Res. Pract., 24, 453–477 10.1039/D2RP00132B.
  108. Krist C., Schwarz C. V. and Reiser B. J., (2019), Identifying Essential Epistemic Heuristics for Guiding Mechanistic Reasoning in Science Learning, J. Learn Sci., 28, 160–205 DOI:10.1080/10508406.2018.1510404.
  109. Krutkowski S., (2017), A strengths-based approach to widening participation students in higher education, Ref. Services Rev., 45, 227–241 DOI:10.1108/RSR-10-2016-0070.
  110. Kubsch M., Czinczel B., Lossjew J., Wyrwich T., Bednorz D., Bernholt S., Fiedler D., Strauß S., Cress U., Drachsler H., Neumann K. and Rummel N., (2022), Toward learning progression analytics—Developing learning environments for the automated analysis of learning using evidence centered design, Front. Educ., 7, 981910 DOI:10.3389/feduc.2022.981910.
  111. Kurdziel J. P., Turner J. A., Luft J. A. and Roehrig G. H., (2003), Graduate Teaching Assistants and Inquiry-Based Instruction: Implications for Graduate Teaching Assistant Training, J. Chem. Educ., 80, 1206 DOI:10.1021/ed080p1206.
  112. Liang-Lin J. G., Locascio T. R., Witherspoon A., Gao S., Laster J., Tripp M., Shorb J. M. and Cox, Jr. C. T., (2024), How Much? How Fast? A Study of Kinetics and Thermodynamics through the Lens of Reaction Coordinate Diagrams in Organic Chemistry, J. Chem. Educ., 101, 2308–2320 DOI:10.1021/acs.jchemed.3c00437.
  113. Lieber L. and Graulich N., (2020), Thinking in Alternatives—A Task Design for Challenging Students’ Problem-Solving Approaches in Organic Chemistry, J. Chem. Educ., 97, 3731–3738 DOI:10.1021/acs.jchemed.0c00248.
  114. Lieber L. and Graulich N., (2022), Investigating students’ argumentation when judging the plausibility of alternative reaction pathways in organic chemistry, Chem. Educ. Res. Pract., 23, 38–54 10.1039/d1rp00145k.
  115. Lieber L. S., Ibraj K., Caspari-Gnann I. and Graulich N., (2022a), Closing the gap of organic chemistry students’ performance with an adaptive scaffold for argumentation patterns, Chem. Educ. Res. Pract., 23, 811–828 10.1039/D2RP00016D.
  116. Lieber L. S., Ibraj K., Caspari-Gnann I. and Graulich N., (2022b), Students’ Individual Needs Matter: A Training to Adaptively Address Students’ Argumentation Skills in Organic Chemistry, J. Chem. Educ., 99, 2754–2761 DOI:10.1021/acs.jchemed.2c00213.
  117. Lorenzo M. G., Farré A. S. and Rossi A. M., (2010), Teachers' discursive practices in a first organic chemistry course, Contemp. Sci. Educ. Res., 13–22.
  118. Luque-Corredera C., Bartolome E. and Bradshaw B., (2024), “Synthetic Map”: A Graphic Organizer Inspired by Artificial Neural Network Paradigms for Learning Organic Synthesis, J. Chem. Educ., 101, 4256–4267 DOI:10.1021/acs.jchemed.4c00592.
  119. Maeyer J. and Talanquer V., (2013), Making predictions about chemical reactivity: assumptions and heuristics, J. Res. Sci. Teach., 50, 748–767 DOI:10.1002/tea.21092.
  120. Martin P. P. and Graulich N., (2023), When a machine detects student reasoning: a review of machine learning-based formative assessment of mechanistic reasoning, Chem. Educ. Res. Pract., 24, 407–427 10.1039/D2RP00287F.
  121. Martin P. P. and Graulich N., (2024), Navigating the data frontier in science assessment: advancing data augmentation strategies for machine learning applications with generative artificial intelligence, Comput. Educ.: Artif. Intell., 7, 100265 DOI:10.1016/j.caeai.2024.100265.
  122. Martin P. P., Kranz D., Wulff P. and Graulich N., (2024), Exploring new depths: applying machine learning for the analysis of student argumentation in chemistry, J. Res. Sci. Teach., 61, 1757–1792 DOI:10.1002/tea.21903.
  123. Martinez B. L., Roche Allred Z. D. and Underwood S. M., (2024), A Qualitative Investigation of Higher Education Chemistry Students’ Perceptions of What Scientists Do, J. Chem. Educ., 101, 4071–4082 DOI:10.1021/acs.jchemed.4c00334.
  124. Miller E., Manz E., Russ R., Stroupe D. and Berland L., (2018), Addressing the epistemic elephant in the room: epistemic agency and the next generation science standards, J. Res. Sci. Teach., 55, 1053–1075 DOI:10.1002/tea.21459.
  125. Molenaar I., (2022), The concept of hybrid human-AI regulation: exemplifying how to support young learners’ self-regulated learning, Comput. Educ.: Artif. Intell., 3, 100070 DOI:10.1016/j.caeai.2022.100070.
  126. Moreira P. and Talanquer V., (2024), Exploring relationships that college instructors seek to build with intention in chemistry classrooms, Chem. Educ. Res. Pract., 25, 225–241 10.1039/d3rp00198a.
  127. Nelsen I., Farheen A. and Lewis S. E., (2024), How ordering concrete and abstract representations in intermolecular force chemistry tasks influences students’ thought processes on the location of dipole–dipole interactions, Chem. Educ. Res. Pract., 25, 815–832 10.1039/D4RP00025K.
  128. Novak G. M., Gavrin A., Patterson E. and Christian W., (1999), Just-In-Time Teaching: Blending Active Learning with Web Technology, Upper Saddle River NJ: Prentice Hall.
  129. Noyes K., Carlson C. G., Stoltzfus J. R., Schwarz C. V., Long T. M. and Cooper M. M., (2022), A Deep Look into Designing a Task and Coding Scheme through the Lens of Causal Mechanistic Reasoning, J. Chem. Educ., 99, 874–885 DOI:10.1021/acs.jchemed.1c00959.
  130. Patton Davis L. and Museus S., (2019), What Is Deficit Thinking? An Analysis of Conceptualizations of Deficit Thinking and Implications for Scholarly Research, Currents, 1, 117–130 DOI:10.3998/currents.17387731.0001.110.
  131. Penn J. H. and Al-Shammari A. G., (2008), Teaching Reaction Mechanisms Using the Curved Arrow Neglect (CAN) Method, J. Chem. Educ., 85, 1291 DOI:10.1021/ed085p1291.
  132. Plass J. L. and Pawar S., (2020), Toward a taxonomy of adaptivity for learning, J. Res. Technol. Educ., 52, 275–300 DOI:10.1080/15391523.2020.1719943.
  133. Pölloth B., Diekemper D., Bosch C. and Schwarzer S., (2024), Reply to the ‘Comment on “What resources do high school students activate to link energetic and structural changes in chemical reactions? – A qualitative study”’ by K. S. Taber, Chem. Educ. Res. Pract., 25, 958–965 10.1039/D4RP00031E.
  134. Pölloth B., Diekemper D. and Schwarzer S., (2023), What resources do high school students activate to link energetic and structural changes in chemical reactions? – A qualitative study, Chem. Educ. Res. Pract., 24, 1153–1173 10.1039/D3RP00068K.
  135. Popova M. and Bretz S. L., (2018a), “It's Only the Major Product That We Care About in Organic Chemistry”: An Analysis of Students’ Annotations of Reaction Coordinate Diagrams, J. Chem. Educ., 95, 1086–1093 DOI:10.1021/acs.jchemed.8b00153.
  136. Popova M. and Bretz S. L., (2018b), Organic chemistry students' challenges with coherence formation between reactions and reaction coordinate diagrams, Chem. Educ. Res. Pract., 19, 732–745 10.1039/C8RP00064F.
  137. Popova M. and Bretz S. L., (2018c), Organic chemistry students' interpretations of the surface features of reaction coordinate diagrams, Chem. Educ. Res. Pract., 19, 919–931 10.1039/C8RP00063H.
  138. Popova M. and Jones T., (2021), Chemistry instructors’ intentions toward developing, teaching, and assessing student representational competence skills, Chem. Educ. Res. Pract., 22, 733–748 10.1039/d0rp00329h.
  139. Raker J. R., Yik B. J. and Dood A. J., (2022), Development of a Generalizable Framework for Machine Learning-Based Evaluation of Written Explanations of Reaction Mechanisms from the Postsecondary Organic Chemistry Curriculum, in Graulich N. and Shultz G. V. (ed.) Student Reasoning in Organic Chemistry, The Royal Society of Chemistry, pp. 304–319.
  140. Ralph V. R., Scharlott L. J., Schwarz C. E., Becker N. M. and Stowe R. L., (2022), Beyond instructional practices: characterizing learning environments that support students in explaining chemical phenomena, J. Res. Sci. Teach., 59, 841–875 DOI:10.1002/tea.21746.
  141. Rau M. A., (2017), Conditions for the Effectiveness of Multiple Visual Representations in Enhancing STEM Learning, Educ. Psychol. Rev., 29, 717–761 DOI:10.1007/s10648-016-9365-3.
  142. Rau M. A., (2018), Making connections among multiple visual representations: how do sense-making skills and perceptual fluency relate to learning of chemistry knowledge? Instr. Sci., 46, 209–243 DOI:10.1007/s11251-017-9431-3.
  143. Rau M. A., Kennedy K., Oxtoby L., Bollom M. and Moore J. W., (2017), Unpacking “Active Learning”: A Combination of Flipped Classroom and Collaboration Support Is More Effective but Collaboration Support Alone Is Not, J. Chem. Educ., 94, 1406–1414 DOI:10.1021/acs.jchemed.7b00240.
  144. Rodemer M., Eckhard J., Graulich N. and Bernholt S., (2020), Decoding Case Comparisons in Organic Chemistry: Eye-Tracking Students’ Visual Behavior, J. Chem. Educ., 97, 3530–3539 DOI:10.1021/acs.jchemed.0c00418.
  145. Rodriguez J.-M. G., (2024), Using Analytic Autoethnography to Characterize the Variation in the Application of the Resources Framework: What Is a Resource? J. Chem. Educ., 101, 3676–3690 DOI:10.1021/acs.jchemed.4c00309.
  146. Rodriguez J.-M. G., Stricker A. R. and Becker N. M., (2020), Students’ interpretation and use of graphical representations: insights afforded by modeling the varied population schema as a coordination class, Chem. Educ. Res. Pract., 21, 536–560 10.1039/C9RP00249A.
  147. Rosenberg J. M., Kubsch M., Wagenmakers E.-J. and Dogucu M., (2022), Making Sense of Uncertainty in the Science Classroom, Sci. Educ., 31, 1239–1262 DOI:10.1007/s11191-022-00341-3.
  148. Rost M. and Knuuttila T., (2022), Models as Epistemic Artifacts for Scientific Reasoning in Science Education Research, Educ. Sci., 12, 276.
  149. Rotich F., Ward L., Beck C. and Popova M., (2024), Attention is currency: how surface features of Lewis structures influence organic chemistry student reasoning about stability, Chem. Educ. Res. Pract., 25, 1071–1089 10.1039/D4RP00030G.
  150. Rozumko A., (2017), Adverbial Markers of Epistemic Modality Across Disciplinary Discourses: A Contrastive Study of Research Articles in Six Academic Disciplines, Studia Anglica Posnaniensia, 52, 73–101 DOI:10.1515/stap-2017-0004.
  151. Russ R. S., Scherr R. E., Hammer D. and Mikeska J., (2008), Recognizing mechanistic reasoning in student scientific inquiry: a framework for discourse analysis developed from philosophy of science, Sci. Educ., 92, 499–525 DOI:10.1002/sce.20264.
  152. Sandi-Urena S., Loría Cambronero G. and Jinesta Chaves D., (2019), Conceptualisation of Lewis structures by chemistry majors, Chem. Teach. Int., 2, 1–9 DOI:10.1515/cti-2018-0019.
  153. Schönborn K. J. and Anderson T. R., (2008), A Model of Factors Determining Students’ Ability to Interpret External Representations in Biochemistry, Int. J. Sci. Educ., 31, 193–232 DOI:10.1080/09500690701670535.
  154. Schwarz C. E., DeGlopper K. S., Esselman B. J. and Stowe R. L., (2024a), Tweaking Instructional Practices Was Not the Answer: How Increasing the Interactivity of a Model-Centered Organic Chemistry Course Affected Student Outcomes, J. Chem. Educ., 101, 2215–2230 DOI:10.1021/acs.jchemed.3c01127.
  155. Schwarz C. E., DeGlopper K. S., Greco N. C., Russ R. S. and Stowe R. L., (2024b), Modeling Student Negotiation of Assessment-Related Epistemological Messages in a College Science Course, Sci. Educ., 1–19 DOI:10.1002/sce.21914.
  156. Sevian H., Bernholt S., Szteinberg G. A., Auguste S. and Perez L. C., (2015), Use of representation mapping to capture abstraction in problem solving in different courses in chemistry, Chem. Educ. Res. Pract., 16, 429–446 DOI:10.1039/c5rp00030k.
  157. Sevian H. and Talanquer V., (2014), Rethinking chemistry: a learning progression on chemical thinking, Chem. Educ. Res. Pract., 15, 10–23 DOI:10.1039/c3rp00111c.
  158. Shemshack A., Kinshuk and Spector J. M., (2021), A comprehensive analysis of personalized learning components, J. Comput. Educ., 8, 485–503 DOI:10.1007/s40692-021-00188-7.
  159. Shiroda M., Franovic C. G.-C., de Lima J., Noyes K., Babi D., Beltran-Flores E., Kesh J., McKay R. L., Persson-Gordon E., Cooper M. M., Long T. M., Schwarz C. V. and Stoltzfus J. R., (2024), Examining and Supporting Mechanistic Explanations Across Chemistry and Biology Courses, CBE—Life Sci. Educ., 23, 1–21 DOI:10.1187/cbe.23-08-0157.
  160. Silva F. C. and Sasseron L. H., (2025), The Positioning of Visual Representations As Epistemic Objects for the Teaching of Organic Chemistry, J. Chem. Educ., DOI:10.1021/acs.jchemed.4c01018.
  161. Sim J. H., Daniel E. G. S. and Elwood J., (2014), Representational competence in chemistry: a comparison between students with different levels of understanding of basic chemical concepts and chemical representations, Cogent Educ., 1, 991180 DOI:10.1080/2331186X.2014.991180.
  162. Steinbach M., Eitemüller C., Rodemer M. and Walpuski M., (2024), The influence of representations on task difficulty in organic chemistry: an exploration using a novel paired-items test instrument, Int. J. Sci. Educ., 1–23 DOI:10.1080/09500693.2024.2378218.
  163. Stieff M. and DeSutter D., (2021), Sketching, not representational competence, predicts improved science learning, J. Res. Sci. Teach., 58, 128–156 DOI:10.1002/tea.21650.
  164. Stieff M., Scopelitis S. and Lira M., (2022), Disciplining Perception: Spatial Thinking in Organic Chemistry Through Embodied Actions, in Graulich N. and Shultz G. V. (ed.) Student Reasoning in Organic Chemistry, The Royal Society of Chemistry, pp. 232–247.
  165. Stieff M., Scopelitis S., Lira M. E. and DeSutter D., (2016), Improving Representational Competence with Concrete Models, Sci. Educ., 100, 344–363 DOI:10.1002/sce.21203.
  166. Stone D. C., (2021), Student success and the high school-university transition: 100 years of chemistry education research, Chem. Educ. Res. Pract., 22, 579–601 DOI:10.1039/D1RP00085C.
  167. Stowe R. L. and Esselman B. J., (2022), The Picture Is Not the Point: Toward Using Representations as Models for Making Sense of Phenomena, J. Chem. Educ., 100, 15–21 DOI:10.1021/acs.jchemed.2c00464.
  168. Stowe R. L., Scharlott L. J., Ralph V. R., Becker N. M. and Cooper M. M., (2021), You Are What You Assess: The Case for Emphasizing Chemistry on Chemistry Assessments, J. Chem. Educ., 98, 2490–2495 DOI:10.1021/acs.jchemed.1c00532.
  169. Strickland A. M., Kraft A. and Bhattacharyya G., (2010), What happens when representations fail to represent? Graduate students' mental models of organic chemistry diagrams, Chem. Educ. Res. Pract., 11, 293–301.
  170. Swiecki Z., Khosravi H., Chen G., Martinez-Maldonado R., Lodge J. M., Milligan S., Selwyn N. and Gašević D., (2022), Assessment in the age of artificial intelligence, Comput. Educ.: Artif. Intell., 3, 100075 DOI:10.1016/j.caeai.2022.100075.
  171. Talanquer V., (2013), When Atoms Want, J. Chem. Educ., 90, 1419–1424 DOI:10.1021/ed400311x.
  172. Talanquer V., (2014), Chemistry Education: Ten Heuristics To Tame, J. Chem. Educ., 91, 1091–1097 DOI:10.1021/ed4008765.
  173. Talanquer V., (2022), The Complexity of Reasoning about and with Chemical Representations, JACS Au, 2, 2658–2669 DOI:10.1021/jacsau.2c00498.
  174. Talanquer V., (2024), How We Simplify, Circumvent, or Distort Causal Mechanistic Reasoning in Chemistry, J. Chem. Educ., 101, 1793–1797 DOI:10.1021/acs.jchemed.4c00281.
  175. Talanquer V., Cole R. and Rushton G. T., (2024), Thinking and Learning in Nested Systems: The Classroom Level, J. Chem. Educ., 101, 295–306 DOI:10.1021/acs.jchemed.3c00839.
  176. Talanquer V. and Kelly R., (2024), Thinking and Learning in Nested Systems: The Individual Level, J. Chem. Educ., 101, 283–294 DOI:10.1021/acs.jchemed.3c00838.
  177. Teo T. W., Goh M. T. and Yeo L. W., (2014), Chemistry education research trends: 2004–2013, Chem. Educ. Res. Pract., 15, 470–487 DOI:10.1039/C4RP00104D.
  178. Topczewski J. J., Topczewski A. M., Tang H., Kendhammer L. K. and Pienta N. J., (2017), NMR Spectra through the Eyes of a Student: Eye Tracking Applied to NMR Items, J. Chem. Educ., 94, 29–37 DOI:10.1021/acs.jchemed.6b00528.
  179. Vygotsky L. S., (1978), Interaction between learning and development, in Cole M., John-Steiner V., Scribner S. and Souberman E. (ed.) Mind in society the development of higher psychological processes, Harvard University Press, pp. 79–91.
  180. Walsh K. H., Karch J. M. and Caspari-Gnann I., (2022), In-the-moment learning of organic chemistry during interactive lectures through the lens of practical epistemology analysis, in Graulich N. and Shultz G. V. (ed.) Student reasoning in organic chemistry, Royal Society of Chemistry, pp. 151–158.
  181. Wang Y., Apkarian N., Dancy M. H., Henderson C., Johnson E., Raker J. R. and Stains M., (2024), A National Snapshot of Introductory Chemistry Instructors and Their Instructional Practices, J. Chem. Educ., 101, 1457–1468 DOI:10.1021/acs.jchemed.4c00040.
  182. Ward L. W., Rotich F., Hoang J. and Popova M., (2022), Representational Competence Under the Magnifying Glass—The Interplay Between Student Reasoning Skills, Conceptual Understanding, and the Nature of Representations, in Graulich N. and Shultz G. V. (ed.) Student Reasoning in Organic Chemistry, The Royal Society of Chemistry, pp. 36–56.
  183. Ward L., Rotich F., Raker J. R., Komperda R., Nedungadi S. and Popova M., (2025), Design, development, and evaluation of the organic chemistry representational competence assessment (ORCA), Chem. Educ. Res. Pract., 26(1), 244–258 DOI:10.1039/D3RP00188A.
  184. Watts F. M., Dood A. J. and Shultz G. V., (2022a), Developing machine learning models for automated analysis of organic chemistry students' written descriptions of organic reaction mechanisms, in Graulich N. and Shultz G. V. (ed.) Student Reasoning in Organic Chemistry, The Royal Society of Chemistry, pp. 285–303.
  185. Watts F. M., Dood A. J. and Shultz G. V., (2023a), presented in part at the LAK23: 13th International Learning Analytics and Knowledge Conference, Arlington, TX, USA.
  186. Watts F. M., Dood A. J., Shultz G. V. and Rodriguez J. M. G., (2023b), Comparing Student and Generative Artificial Intelligence Chatbot Responses to Organic Chemistry Writing-to-Learn Assignments, J. Chem. Educ., 100, 3806–3817 DOI:10.1021/acs.jchemed.3c00664.
  187. Watts F. M., Park G. Y., Petterson M. N. and Shultz G. V., (2022b), Considering alternative reaction mechanisms: students’ use of multiple representations to reason about mechanisms for a writing-to-learn assignment, Chem. Educ. Res. Pract., 23, 486–507 DOI:10.1039/D1RP00301A.
  188. Watts F. M., Schmidt-McCormack J. A., Wilhelm C. A., Karlin A., Sattar A., Thompson B. C., Gere A. R. and Shultz G. V., (2020), What students write about when students write about mechanisms: analysis of features present in students’ written descriptions of an organic reaction mechanism, Chem. Educ. Res. Pract., 21, 1148–1172 DOI:10.1039/C9RP00185A.
  189. Webber D. M. and Flynn A. B., (2018), How Are Students Solving Familiar and Unfamiliar Organic Chemistry Mechanism Questions in a New Curriculum? J. Chem. Educ., 95(9), 1451–1467 DOI:10.1021/acs.jchemed.8b00158.
  190. Weinrich M. and Britt R., (2022), Students’ Attention on Curved Arrows While Evaluating the Plausibility of an Organic Mechanistic Step, in Graulich N. and Shultz G. V. (ed.) Student Reasoning in Organic Chemistry, The Royal Society of Chemistry, pp. 1–18.
  191. Weinrich M. L. and Sevian H., (2017), Capturing students' abstraction while solving organic reaction mechanism problems across a semester, Chem. Educ. Res. Pract., 18, 169–190 DOI:10.1039/C6RP00120C.
  192. Williams W. L., Zeng L., Gensch T., Sigman M. S., Doyle A. G. and Anslyn E. V., (2021), The Evolution of Data-Driven Modeling in Organic Chemistry, ACS Cent. Sci., 7, 1622–1637 DOI:10.1021/acscentsci.1c00535.
  193. Xue D. and Stains M., (2020), Exploring Students’ Understanding of Resonance and Its Relationship to Instruction, J. Chem. Educ., 97, 894–902 DOI:10.1021/acs.jchemed.0c00066.
  194. Yik B. J. and Dood A. J., (2024), ChatGPT Convincingly Explains Organic Chemistry Reaction Mechanisms Slightly Inaccurately with High Levels of Explanation Sophistication, J. Chem. Educ., 101, 1836–1846 DOI:10.1021/acs.jchemed.4c00235.
  195. Yik B. J., Dood A. J., Frost S. J. H., Cruz-Ramírez de Arellano D., Fields K. B. and Raker J. R., (2023), Generalized rubric for level of explanation sophistication for nucleophiles in organic chemistry reaction mechanisms, Chem. Educ. Res. Pract., 24, 263–282 DOI:10.1039/d2rp00184e.
  196. Zaimi I., Dood A. J. and Shultz G. V., (2024a), The evolution of an assignment: how a Writing-to-Learn assignment's design shapes organic chemistry students’ elaborations on reaction mechanisms, Chem. Educ. Res. Pract., 25, 327–342 DOI:10.1039/D3RP00197K.
  197. Zaimi I., Haas D. B., Silverstein M. J. and Shultz G. V., (2024b), A case study on graduate teaching assistants’ teacher noticing when enacting a case-comparison activity in organic chemistry, Chem. Educ. Res. Pract., 25, 1268–1288 DOI:10.1039/D4RP00093E.
  198. Zaimi I., Watts F. M., Kranz D., Graulich N. and Shultz G. V., (2025), “That's not a super important point”: second-semester organic chemistry students’ lines of reasoning when comparing substitution reactions, Chem. Educ. Res. Pract., 26(1), 112–125 DOI:10.1039/d4rp00086b.
  199. Zhai X., Yin Y., Pellegrino J. W., Haudek K. C. and Shi L., (2020), Applying machine learning in science assessment: a systematic review, Stud. Sci. Educ., 56, 111–151.
  200. Zotos E. K., Tyo J. J. and Shultz G. V., (2021), University instructors’ knowledge for teaching organic chemistry mechanisms, Chem. Educ. Res. Pract., 22, 715–732 DOI:10.1039/D0RP00300J.

This journal is © The Royal Society of Chemistry 2025