Utility of creative exercises as an assessment tool for revealing student conceptions in organic chemistry

Krystal Grieger and Alexey Leontyev*
Department of Chemistry and Biochemistry, North Dakota State University, Fargo, North Dakota 58108, USA. E-mail: alexey.leontyev@ndsu.edu

Received 31st October 2024, Accepted 25th February 2025

First published on 4th March 2025


Abstract

Creative exercises (CEs) consist of open-ended prompts to which students provide a series of relevant, distinct, and accurate statements, thus requiring that students make connections between concepts. In this study, CEs were incorporated into a one-semester Survey of Organic Chemistry course to identify what connections between chemistry concepts students made and what incorrect conceptions or misconceptions about chemistry students held. Students (N = 79) enrolled in the course first completed a practice CE as an in-class group activity followed by individually responding to a CE bonus problem on each of their four course exams. The number of different concepts students addressed for each CE increased over the semester, indicating that students made increasing content connections about course material; however, misconceptions about early concepts, such as nomenclature and assigning configurations, remained consistent throughout the semester. Furthermore, the CEs were found to be instrumental in shedding light on misconceptions and knowledge structures of students across varying performance levels. Overall, students reported that they viewed the CEs favorably and would like to see CEs incorporated in future courses.


Introduction

Organic chemistry is often considered a “weed-out” course, with some universities reporting attrition rates as high as 30–50% (Grove et al., 2008; Grove and Bretz, 2012). Unfortunately, many students struggle to understand the material, which results in their failing or withdrawing from the course. This difficulty often arises because students rely on rote memorization of individual reactions rather than learning how the concepts connect and work together as part of a bigger whole (Grove and Bretz, 2012; Anzovino and Bretz, 2016).

Instead of rote memorization, organic chemistry requires that students forge connections among diverse chemistry concepts to perform tasks such as interpreting chemical representations, proposing mechanisms, and making structure–reactivity judgments (Graulich, 2015). This process of actively interlinking concepts and developing schemas fosters meaningful learning, which is crucial for students to be able to apply their knowledge to novel situations. Given the superiority of meaningful learning over rote memorization, a variety of instructional approaches for cultivating meaningful learning in organic chemistry have been reported, including writing-to-learn activities (Gupte et al., 2021), using a flipped organic chemistry classroom (Fautch, 2015; Flynn, 2015), incorporating “Bridge activities” in flipped classrooms (Stewart and Dake, 2019), and completely transforming the organic chemistry curriculum (Grove and Bretz, 2012; Cooper et al., 2019). In turn, reliable assessments capable of measuring meaningful student learning and connection-making within organic chemistry must be developed and evaluated. Two types of assessments already reported in the literature for measuring student connection-making include concept maps (CMs) and creative exercises (CEs).

CMs have garnered widespread use in chemistry courses due to their capacity to facilitate the integration of ideas. CMs, originally developed by Novak in 1972 (Novak and Cañas, 2006), are visual tools that use a series of nodes and links to summarize, organize, and represent meaningful relationships between concepts (Wong et al., 2021). CMs have been found to enhance student learning, and thus instructors have been encouraged to incorporate CMs into the curriculum (Nesbit and Adesope, 2016; Schroeder et al., 2018). Within organic chemistry, CMs have found utility as both a teaching tool (Šket et al., 2015) and an assessment tool (Lopez et al., 2011; Vachliotis et al., 2014; Burrows and Mooring, 2015; Anzovino and Bretz, 2016).

However, due to their open-ended and individualistic nature, a key limitation of CMs arises from the difficulty of ensuring consistent scoring. In fact, a study by Johnstone and Otis (2006) revealed that student-generated CMs exhibited only a weak correlation with proficient students’ course content understanding. This discrepancy arose because the students with a proficient understanding utilized CMs as keys to access their broader knowledge base rather than as accurate representations of their content comprehension. Consequently, CMs have been recognized as valuable primarily for formative assessments in college-level chemistry courses (Schwendimann, 2015; Gilewski et al., 2019).

In contrast to CMs, CEs, which were originally developed by Trigwell and Sleet (1990), offer distinct advantages because they are simpler to grade, rendering them suitable for both formative and summative assessments in college-level chemistry courses (Ye and Lewis, 2014; Gilewski et al., 2019). CEs consist of a prompt to which students provide a list of relevant, distinct, and accurate statements. The instructor sets a minimum number of correct statements necessary to receive full credit, typically ranging from one-third to one-half of those identified by the instructor (Ye and Lewis, 2014). A key advantage of CEs is their capacity to enable instructors to discern not only what students comprehend, but also to identify any misconceptions or inappropriate conceptions students may hold (Lewis et al., 2010). In addition, like CMs, CEs have also been shown to promote connection-making among concepts in chemistry (Ye and Lewis, 2014; Gilewski et al., 2019; Ye et al., 2020).

CEs have been utilized across various chemistry courses, showcasing their versatility and effectiveness. For instance, in the realm of general chemistry, there is a wide literature base addressing CEs’ validity (Lewis et al., 2010, 2011; Gilewski et al., 2019), utility as a learning tool (Gilewski et al., 2019; Ye et al., 2020), and use as an assessment tool in instructional settings (Trigwell and Sleet, 1990; Ye and Lewis, 2014; Ye et al., 2020). Furthermore, student responses from CEs in general chemistry have been used to generate Measure of Linked Concepts true/false assessments that, when coupled with metacognitive activities, were found to enhance student final exam scores (Gilewski et al., 2022). Recently, CEs also proved instrumental in revealing the stability of student conceptions of covalent and ionic bonding in a longitudinal study by Bowe et al. (2022), in which students completed a CE about SCl2 and CaCl2 at the end of general chemistry and again six months and one year later; students consistently and incorrectly applied the covalent bonding model to CaCl2.

In addition to general chemistry, CEs have also been incorporated into biochemistry courses to analyze how students connect chemical and biochemical concepts (Warfa and Odowa, 2015). In their study, Warfa and Odowa (2015) found that students were able to successfully connect foundational chemistry concepts to the biochemical problems when completing the CEs. Similarly, Ngai and Sevian (2018) used CEs in a second-semester biochemistry course to evaluate how biochemistry students engaged in chemical identity (CI) thinking and to determine how CI is applied in biochemical contexts. They found that the student responses included seven different types of CI thinking. However, while a range of CI thinking was observed, 41% of the responses focused on the CI precursor “composition and structure,” in which students provided general observations about the provided substance or molecule, such as degrees of unsaturation or the identity of functional groups (Ngai and Sevian, 2018). Finally, Nix et al. (2023) explored how biochemistry students approach solving CEs through think-aloud interviews and found that approaches varied from student to student and that students who completed the CEs in an online course during the pandemic approached them differently from students who took the course in person in later years.

While used most extensively within general chemistry and biochemistry courses, CEs have also been incorporated into first-year organic chemistry and upper-level analytical and inorganic chemistry courses. For example, within a first-year organic chemistry course, CEs were employed in ungraded group activities to identify which concepts students chose to address and what connections students made between concepts (Mai et al., 2021). In their study, Mai et al. (2021) found that students tended to struggle with linking the various chemistry concepts from across the semester since they often did not address previously learned concepts as the semester progressed. In an analytical chemistry course, Wang and Lewis (2020) employed CEs to examine the alignment between student responses and instructor explanations, as well as to assess student ability to explain macroscopic phenomena using submicroscopic statements. Finally, within an inorganic chemistry course, Shaw (2023) found that student CE responses made connections to content from general, organic, and physical chemistry, thus providing evidence of meaningful learning. However, Shaw found that student responses typically fell into either the knowledge or comprehension levels of Bloom's taxonomy, with only a few responses reaching the application level (Shaw, 2023), indicating that CEs should be used in conjunction with other assessment types.

Currently, the use of CEs in the organic chemistry domain remains largely unexplored, with only one documented instance in a first-year organic chemistry course (Mai et al., 2021). Because upper-division chemistry courses require students to apply knowledge acquired from both the current class and preceding coursework, there is a crucial need to teach students to establish connections not only within a specific course but also across lower- and upper-division courses. More specifically, because the concepts in organic chemistry are intertwined and inherently hierarchical, to be successful in organic chemistry, it is essential that students create links between the key organic chemistry concepts. Therefore, we explored the use of CEs as a viable assessment tool for identifying the connections that students make in organic chemistry.

Research questions

The overall goal of this study was to evaluate the use of creative exercises (CEs) in a Survey of Organic Chemistry course through answering the following four research questions:

RQ1. What evidence exists for the validity of the data collected from CEs in a Survey of Organic Chemistry course?

RQ2. What connections are organic chemistry students making between concepts in their CE responses? Is there a difference between connections made for high and low-scoring student responses?

RQ3. What student misconceptions about interpreting structures and analyzing organic reactions do CEs reveal?

RQ4. What are students’ perceptions of CEs and their contribution to the connections made throughout the semester?

Theoretical framework

Within prior literature reports, the use of CEs has been grounded in the cognitive constructivist learning theory (Lewis et al., 2010, 2011; Ye and Lewis, 2014; Warfa and Odowa, 2015; Nix et al., 2023). This is because the constructivist perspective specifies that students are not blank slates to be given information, but rather they construct new knowledge by actively integrating that knowledge into their existing schemas. Within the constructivist perspective, Piaget described learning as the interplay of assimilation and accommodation (Kalpana, 2014) where assimilation refers to fitting new information into existing knowledge schemes and accommodation refers to adjusting existing schemes based on new information. Both assimilation and accommodation require students to evaluate how the new information connects to their prior knowledge and schemes (Cakir, 2008). For example, learning about resonance structures, which is the focus of the third CE (CE3) in this study, requires that students connect electron movement and resonance-related structures with prior knowledge about hybridization, electronegativity, the octet rule, charge separation, molecular geometry, and molecular models (Betancourt-Pérez et al., 2010).

To this end, when used as a learning tool, CEs prompt students to identify and build connections between new material and previously learned concepts. This propensity for eliciting connections to prior knowledge is due to both (1) the CE requirement that each response address a distinct topic, which promotes the generation of multiple connections, and (2) the CE scoring criteria, in which incorrect responses are not counted against the score, allowing students the freedom to be creative in identifying all the connections they make between new and prior material. This promotion of recalling and applying prior knowledge when learning new things in a different context leads to knowledge transfer (Torres et al., 2023), which is an essential step for meaningful learning to occur. Similarly, when used consistently as an assessment tool on exams, CEs can prompt students to think about and identify connections between concepts as they prepare for exams and can simultaneously serve as a measure of meaningful learning through counting the number of connections that students make when responding to the CE (Shaw, 2023).

Setting

This study was conducted at a medium-sized, research-intensive university located in the upper Midwest. Data was collected from students (N = 79) enrolled in a one-semester Survey of Organic Chemistry course offered in the Fall 2019 semester with the approval of the university's institutional review board (Protocol SM20060). All students in the course consented to participating in the study. However, if a student had chosen not to participate in the study, they still could have completed the CEs for bonus points, but we would not have collected their response data for analysis. Because two students dropped the class partway through the semester, their responses for the first exam are included in this study; however, we did not collect responses from them for the rest of the CEs or the Student Perception of CEs Survey. Therefore, for CE2–CE4 and the survey, only 77 students were included in this study. This three-credit course was delivered through two weekly 75-minute sessions for 15 weeks. Commonly declared majors for students enrolled in the course included engineering, animal sciences, biological sciences, agriculture and biosystems, and crop and weed sciences. Furthermore, most students were in their second through final years of degree completion.

This one-semester course used the 6th edition of Brown and Poon's Introduction to Organic Chemistry textbook and covered chapters 1–10 and 12–14. Therefore, the following topics were addressed in the course: covalent bonding and shapes of molecules, acids and bases, alkanes and cycloalkanes, alkenes and alkynes, reactions of alkenes and alkynes, chirality, haloalkanes, alcohols, ethers and thiols, benzene and its derivatives, amines, aldehydes, ketones, carboxylic acids, and functional derivatives of carboxylic acids. Overall, student grades were based on three midterm exams (100 points each), a comprehensive final exam (125 points), homework (50 points), and participation in in-class and online assessments and activities (25 points).

This course was held in a SCALE-UP (Student-Centered Active Learning Environment for Undergraduate Programs) classroom (Foote et al., 2016) with 15 round tables and approximately three to six students sitting at each table; students were free to choose where they sat. During the semester, we utilized the Classroom Observation Protocol for Undergraduate STEM (Smith et al., 2013), also known as COPUS, to obtain a COPUS profile of what the instructor and students were doing during the course periods. The COPUS analysis was completed via four classroom observations taken approximately one month apart. The decision to perform four observations was based on the work of Stains et al. (2018), in which they stated that, based on their data, at least four observations were necessary for reliable teaching characterization.

The obtained COPUS data was then analyzed using an online COPUS Analyzer tool (https://www.copusprofiles.org), which indicated that the course was classified as a cluster 5: a student-centered course utilizing a large amount of group work (Stains et al., 2018). During each observation, students were primarily observed listening, working in groups, answering questions, and answering clicker questions. Similarly, the instructor was primarily observed doing real-time writing, lecturing, moving between groups, posing questions, and following up on those questions.

Implementation of creative exercises (CEs)

The CEs were incorporated as bonus problems, worth up to five points each, in each of the three course exams and the final exam. Prior to the first exam, students completed one introductory CE (provided in Fig. 1) as an in-class group activity, in which they worked both individually and with their peers at their table. After students completed this introductory CE activity, the instructor led a class discussion on the correctness of the student-generated responses and then provided examples of additional correct responses that had not been mentioned. This first CE was incorporated as a class activity to ensure that all students were familiar with CEs and how to correctly answer them. We did not expect students to have prior exposure to CEs, but because the students came from different disciplines and educational backgrounds, some may have completed CEs previously. However, within the chemistry and biochemistry department at this institution, this was the first course to integrate CEs.
Fig. 1 CE prompt used in the classroom activity.

After learning how to respond to CEs during the in-class CE activity, students then completed a CE on each of the three course exams and on the one final exam. The instructions on all four of the exams were as follows: “Write down five correct, distinct, and relevant facts about [the following]: Five (5) statements will get you full credit for the problem, which is worth a total of 5 points. The information you use should be information you learned in chemistry courses.” (Lewis et al., 2011). A total of four CEs were utilized in the course exams; they are shown in Table 1.
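The scoring rule above (five correct, distinct, relevant statements earn the full five bonus points, and incorrect statements are not penalized) can be read as a simple capped count. The sketch below is an illustrative reading of that rubric, not code used in the study; the `score_ce` helper and its code labels are hypothetical.

```python
# Illustrative sketch of the CE scoring rule: each relevant, distinct,
# and correct statement earns one point, capped at five. The per-statement
# codes are assumed to come from the grader's classification.

def score_ce(statement_codes, max_points=5):
    """Return a CE bonus score from per-statement grading codes
    ('correct', 'incorrect', 'irrelevant', or 'unclassifiable')."""
    n_correct = sum(1 for code in statement_codes if code == "correct")
    return min(n_correct, max_points)

# A student listing seven correct facts still earns the five-point cap,
# and incorrect statements simply do not add points.
print(score_ce(["correct"] * 7))                       # -> 5
print(score_ce(["correct", "incorrect", "correct"]))   # -> 2
```

Because errors carry no penalty, this rule rewards breadth of connection-making rather than caution, which is the design intent discussed in the theoretical framework.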

Table 1 Descriptive statistics of the scores of the CE prompts used on each of the four exams
Exam #  Provided prompt  Focus of CE  N  M (SD)  Med.  Mode
1  [structure image d4rp00310a-u1]  Structure  73  3.41 (1.62)  3  5
2  [reaction image d4rp00310a-u2]  Reaction  70  3.37 (1.53)  3  3
3  [reaction image d4rp00310a-u3]  Resonance structure of intermediate  69  2.79 (1.60)  3  2
4  [reaction image d4rp00310a-u4]  Reaction  73  2.89 (1.52)  3  4


For each test, several students did not complete the CE, thus the reported number (N) varies and represents the number of students who attempted each test's CE. While neither author was the instructor for this course, the first author graded the student responses for the CE prompts, and the second author provided feedback and guidance on the grading decisions. The graded responses were then provided to the students when the exams were returned.

Prior to grading the student responses, the responses were first photocopied for future analysis. The obtained responses were then deidentified by assigning a randomly generated number for each student that was used in place of their name throughout the semester. The collected student responses were then coded for both topic and correctness by the first author. The assigned codes were then checked by the second author. Any discrepancies in decisions were discussed and resolved.

Data analysis

Examining the relationship between CE scores and exam scores

Analysis of the quantitative data was conducted using StataIC 16 software (StataCorp, 2019). To identify the relationship between the CE assessment score and the overall score of the corresponding exam, the Pearson product-moment correlation coefficients were calculated for each of the CE-exam pairs.
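The study computed these coefficients in StataIC 16; the equivalent computation can be sketched in Python as below. The score lists are invented for illustration and are not the study's data.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two paired score lists."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical CE bonus scores (0-5) paired with exam totals (0-100)
# for one CE-exam pair.
ce_scores = [5, 3, 4, 2, 5, 1, 3, 4]
exam_scores = [92, 70, 85, 61, 95, 55, 74, 88]
print(round(pearson_r(ce_scores, exam_scores), 3))
```

In Stata itself, the corresponding commands would be `correlate` or `pwcorr` with the `sig` option to obtain the p-values reported in Table 2.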

Coding of student responses by topics and correctness

Student CE responses were thematically analyzed and subsequently grouped with other statements that addressed the same themes. For each CE, a category miscellanea was created for unique incorrect, irrelevant, or unclassifiable statements that did not fit into any of the other categories.

Student responses within each category were then classified as correct, incorrect, irrelevant, or unclassifiable, consistent with the procedure outlined by Ye and Lewis (2014). A response was coded as irrelevant if it was not relevant to the prompt or if it was not distinct from another response provided by the same student. A response was classified as unclassifiable if the statement was too short to identify what the author was trying to convey, such as “Na+ masks the negative charge” for the fourth CE. In this instance, what the “negative charge” refers to and what is meant by “being masked” is unclear due to the brevity of the statement.

Visualizing connections made between concepts

To illustrate the connections made between concepts, a visual map for each CE was generated with Gephi software (https://gephi.org), using the method reported by Gilewski et al. (2019). To generate each of the Gephi images for the CEs, two files were prepared and uploaded into the Gephi software. The first file contained the information for the nodes and the second file contained the information for the edges. The file for the nodes included the list of topics and the weight for each topic as calculated from the total number of students who mentioned that topic. The file for the edges contained the list of topics and the frequencies of connections between each topic. Within the visual map generated by Gephi, the size of the node corresponded to the number of students mentioning the concept and likewise the width of the edges connecting the nodes corresponded to the number of students who made connections to both of the concepts.
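Assuming the coded data can be represented as a set of topics per student, the two Gephi input files described above (a node table and an edge table) could be prepared as in the sketch below. The student IDs, topic names, and file names are illustrative; Gephi's import accepts CSV node tables with Id/Label/Weight columns and edge tables with Source/Target/Weight columns.

```python
import csv
from collections import Counter
from itertools import combinations

# Hypothetical coded data: each student's set of topics addressed in one CE.
student_topics = {
    "s01": {"conformation", "stability", "nomenclature"},
    "s02": {"conformation", "nomenclature"},
    "s03": {"stability", "nomenclature"},
}

# Node weight = number of students who mentioned the topic.
node_weight = Counter(t for topics in student_topics.values() for t in topics)

# Edge weight = number of students who mentioned both endpoint topics;
# sorting each pair makes the undirected edge direction-independent.
edge_weight = Counter(
    tuple(sorted(pair))
    for topics in student_topics.values()
    for pair in combinations(topics, 2)
)

with open("nodes.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["Id", "Label", "Weight"])
    for topic, weight in sorted(node_weight.items()):
        w.writerow([topic, topic, weight])

with open("edges.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["Source", "Target", "Weight"])
    for (a, b), weight in sorted(edge_weight.items()):
        w.writerow([a, b, weight])
```

Once imported, Gephi sizes each node by its Weight column and each edge by its Weight column, reproducing the node-size and edge-width conventions described above.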

Evaluating student perception of CEs

To assess student perception of CEs, a Student Perception of CEs Survey was administered at the end of the semester after students had completed the first three exams but before the final exam, which contained the last CE. The Student Perception of CEs Survey items, illustrated in Box 1, were developed through combining survey items previously reported by Gilewski et al. (2019) with two independently prepared survey items. The obtained student responses to the survey items were analyzed using thematic coding, with initial codes based on those reported by Gilewski et al. (2019).

Box 1 Student perception of CEs survey items

These questions refer to the extra credit questions on exams where you are asked to “write down as many distinct, correct, and relevant facts about…” We’ll refer to these questions as Creative Exercises for the purpose of this survey.

1. Did you attempt to earn extra credit by answering the Creative Exercises questions?

If not, what was the primary reason for not attempting to earn the extra credit?

2. Do you think the Creative Exercises helped you make connections among the content in this class? Please explain why you chose your answer to this question.

3. How easy did you find it to answer the Creative Exercises?

4. How does this style of questions help you understand chemistry conceptually?

5. Given a choice, would you want Creative Exercises incorporated as part of the assignments and test questions in future courses? You may also want to include any suggested revisions to the Creative Exercises here or on the other side of this page.


Results and discussion

RQ1. What evidence exists for the validity of the data collected from creative exercises in a Survey of Organic Chemistry course?

The validity of the data obtained from CEs has previously been evaluated for both undergraduate general chemistry (Lewis et al., 2011; Ye and Lewis, 2014; Gilewski et al., 2019) and biochemistry (Nix et al., 2023) courses. However, the validity of an item's data is dependent on both the item itself and the target population. Because these were newly developed CE items and because we wanted to use them with a new target population (students enrolled in a Survey of Organic Chemistry course), we first sought to evaluate both (1) evidence based on test content and (2) evidence based on the relationship between the assessment and other variables in order to support the validity of the inferences from the data. These two aspects of validity were chosen for analysis due to their relevance for decisions about future implementations in organic chemistry courses and thus are important considerations for instructors interested in using these CE prompts within their own courses.
Evidence for validity of the data based on the content of the CE. To support the validity of the collected data, prior to administration, the content of each CE was evaluated by two organic chemistry professors, the second author and the course instructor, to ensure that the CEs used were scientifically accurate and had appropriate content coverage and level of difficulty. Both experts agreed that the CEs met the criteria for scientific accuracy and suitable content coverage and difficulty, and therefore no further revisions were necessary.

Furthermore, the authors then compared each CE's focus with the content of the corresponding exam that was prepared by the course instructor to ensure consistency between the assessed topics. Each 100-point exam contained a series of multiple choice, true-false, and short answer questions along with the extra credit CE. The first exam's CE (CE1) required that students analyze a Newman projection of an alkane; this was similar to one exam question that required students to identify the relationship between two alkanes that were drawn as Newman projections. The second exam's CE (CE2) required that students analyze a hydrochlorination reaction of 1-pentene. Likewise, one of the second exam's questions asked students to predict the product for a hydrobromination reaction. The third exam's CE (CE3) required that students describe the resonance structures of the intermediate from the electrophilic aromatic substitution (EAS) of toluene with Br2/FeBr3. Similarly, this exam asked students to analyze the EAS reaction of anisole, identify the major product, and circle the most stable intermediate resonance structures. Finally, the CE on the final exam (CE4) asked students to analyze an SN2 substitution reaction. While this exam's questions addressed EAS reactions and various addition to alkene reactions, the exam did not directly address SN2 reactions. However, one question did ask students to assign R or S configuration to six different molecular structures. While not directly asked about on the final exam, substitution reactions such as the SN2 reaction were taught in the course during the chapter on alkyl halides. Overall, the developed CEs were found to be of an appropriate level of difficulty and consistent in focus with the test content, thus providing evidence for the validity of the data.

Evidence for validity based on the relationship between the assessment and other variables. Convergent evidence refers to how well an assessment relates with other measures of the same construct. To evaluate the convergent evidence of validity for this prompt, the Pearson product-moment correlation coefficients were calculated between the score students received on the CE and that of the corresponding exam, which are provided in Table 2. For each CE, the correlation coefficient indicated a statistically significant moderate relationship between the two scores, thus providing further evidence for the validity of the data.
Table 2 Correlation of CE score to corresponding exam score using Pearson's coefficients (r)
  CE1 CE2 CE3 CE4
N 73 70 69 73
r 0.570 0.396 0.432 0.339
p <0.0001 0.0004 0.0002 0.0031


When compared with prior studies, the correlation coefficients are similar, albeit slightly lower, in value to those observed by Lewis et al. (2011) for students completing CEs as part of an exam in general chemistry. However, Shaw (2023) observed a stronger correlation between the midterm exam and the CE on that exam for inorganic chemistry (r = 0.76); in that study, though, the CE was graded as part of the exam, whereas in our study the CE was graded only for bonus points, which may have reduced some students’ effort when completing the CEs, as a few students indicated in their survey responses. For example, one student who indicated that the CEs were unhelpful wrote the following in response to whether the CEs helped them make connections: “Not really, by the time I reached the end of the test, I just kind of guessed because I knew it wouldn't hurt me to.” Therefore, future studies should explore the impact of scoring methods on the validity of student responses to CEs.

RQ2. What connections are organic chemistry students making between concepts in their CE responses? Is there a difference between connections made for high and low-scoring student responses?

To answer the second research question, student responses were first coded by topic addressed and then grouped with other responses addressing the same theme. A complete list of all coded student responses and their assigned classifications is provided in the ESI.

A visual map illustrating connections between the various concepts addressed in all student responses was generated for each CE using Gephi (https://gephi.org/). In each of the visual maps, the size of the node for each concept corresponded to the number of students who addressed that topic with larger nodes corresponding to more responses. Similarly, the width of the edges (the line connecting two of the nodes) was proportional to the number of students who addressed both connected topics, with wider lines indicating more students addressed both concepts in their CEs. Although students were not directly asked to make connections between concepts, for students to be able to answer the question, connections between concepts must have been made, even if only subconsciously (Gilewski et al., 2019).

As expected, due to the limited content knowledge at the start of the semester, responses for the first CE (CE1), which prompted students to analyze the structure of an alkane using a Newman projection, exhibited the fewest concepts, with a total of 14 different concepts addressed, as shown in Fig. 2. CE1 was attempted by 73 students, resulting in a total of 388 responses. The most common topics that students addressed, indicating that they were the most readily available conceptions, included describing the conformation, which some students addressed in multiple responses (n = 93, 24%), addressing the atomistic or substituent level (n = 66, 17%), addressing the stability (n = 45, 12%), and converting to alternative representations (n = 39, 10%). Analysis of the edges indicated that the most frequently made connections for this CE were between describing the conformation and addressing either its conversion to alternative representations, its stability, or its atomistic/substituent level.


Fig. 2 Visual map illustrating students’ linking of concepts observed in the first creative exercise (CE1).

After analyzing the overall connections that students made in CE1, as shown in Fig. S2 (ESI), we compared the connections made by students who scored four or more on CE1 with those made by students who scored zero to three. We found that correctly describing the conformation of the molecule had the highest frequency in both student groups. Interestingly, only students who provided at least four correct statements addressed the energy of the molecule, and all did so correctly. Finally, it was notable that only students who scored three or fewer on the CE provided incorrect statements about the types of bonds, the rotation of the molecule, the description of the model, the properties of the molecule, and the molecular geometry of the molecule, and they were also the only ones to make incorrect miscellaneous statements.

Subsequently, as illustrated in Fig. 3, responses for the second CE (CE2), which prompted students to consider a hydrohalogenation reaction, garnered a greater number of concepts, with a total of 19 different concepts addressed. CE2 was attempted by 70 students, resulting in a total of 369 responses. The most addressed topics for CE2 included nomenclature of the reactant and/or product (n = 82, 22%), classifying the chemical reaction (n = 59, 16%), mechanism (n = 45, 12%), stereochemistry (n = 33, 9%), and addressing the atomistic or substituent level (n = 23, 6%). The most common connections between concepts, represented by the edges (Fig. 3), were between nomenclature and either classifying the chemical reaction, addressing the stereochemistry, or identifying the mechanism, and between classifying the chemical reaction and addressing the stereochemistry.


image file: d4rp00310a-f3.tif
Fig. 3 Visual map illustrating students’ linking of concepts observed in the second creative exercise (CE2).

Further analysis of the responses from students who scored four or more and those who scored zero to three on CE2 again indicated differences in their response patterns, as shown in Fig. S4 (ESI). The task with the highest percentage of correct answers from both groups was correctly assigning the proper nomenclature to the molecules. However, the starkest contrast between the two groups was in their ability to correctly classify the reaction. While commonly addressed by both groups, 53% of the students who scored four or more correctly classified this reaction, whereas 53% of the students who scored zero to three incorrectly classified the reaction in one of their responses. Furthermore, only students who scored three or fewer incorrectly addressed the topics relating to the atomistic/substituent level, the types of bonds in the molecules, the identification of nucleophiles and electrophiles, the bonding rules, the configuration, and the stability and properties of the molecules.

As shown in Fig. 4, responses for the third CE (CE3), which prompted students to consider the mechanistic steps of an electrophilic aromatic substitution (EAS) reaction, resulted in the greatest number of concepts, with a total of 21 different concepts addressed. Overall, 69 students completed CE3, resulting in a total of 345 responses. The most addressed topics for CE3 included stability (n = 70, 20%), arene substitution patterns (n = 44, 13%), directing groups (n = 42, 12%), resonance structures (n = 38, 11%), and addressing the atomistic or substituent level (n = 28, 8%).


image file: d4rp00310a-f4.tif
Fig. 4 Visual map illustrating students’ linking of concepts observed in the third creative exercise (CE3).

Overall, students tended to score lower on CE3, potentially because the prompt showed only the resonance structures of a reaction intermediate. A variety of literature reports indicate that students struggle with understanding and explaining resonance structures (Betancourt-Pérez et al., 2010; Finkenstaedt-Quinn et al., 2020; Xue and Stains, 2020; Braun et al., 2024), and student responses to CE3 provide further evidence of this difficulty.

To accommodate these overall lower scores, CE3 responses were grouped into those that provided at least three correct responses versus those that provided only zero to two correct responses (Fig. S6, ESI). Student responses in both groups exhibited the highest rate of correctly addressing the topics of arene substitution patterns, resonance structures, and directing groups. Interestingly, only students who provided at least three correct responses correctly addressed the topic of the atomistic/substituent level, indicating that although it can be perceived as an easier topic, students still held incorrect conceptions about it, which CEs can help elicit. Finally, it was again observed that only students who scored two or fewer provided incorrect statements about a variety of topics, including the types of bonds, the identification of leaving groups, the description of bonding, reaction schemes, mechanism, energy, reaction rates, and other statements classified as miscellaneous.

Responses for the fourth CE (CE4), which prompted students to consider an SN2 substitution reaction, resulted in the same number of concepts as CE3, with a total of 21 different concepts addressed, as illustrated in Fig. 5. Overall, 73 students completed CE4, resulting in a total of 352 responses. The most addressed topics for CE4 included classification of the reaction (n = 61, 17%), addressing the configuration (n = 50, 14%), addressing the atomistic or substituent level (n = 39, 11%), assigning stereochemistry (n = 27, 8%), classifying the structure (n = 26, 7%), identifying the leaving groups (n = 21, 6%), and providing a description of bonding (n = 20, 6%). As shown in Fig. 5, CE4 resulted in the greatest number of edges, indicating that students continued to address a wider range of topics by the end of the semester. The most frequently co-addressed concepts included the classification of the reaction, the configuration, and the atomistic or substituent level.


image file: d4rp00310a-f5.tif
Fig. 5 Visual map illustrating students’ linking of concepts observed in the fourth creative exercise (CE4).

Finally, analysis of students scoring four or more and those scoring zero to three points on CE4 again indicated differences in the topics students chose to address and in the accuracy with which they addressed them (Fig. S8, ESI). For both groups, the most frequent correctly addressed topics included the classification of the reaction, the atomistic/substituent level, stereochemistry, and the classification of structure. However, only students who scored between zero and three points made connections, albeit through incorrect statements, to nomenclature, the mechanism, reaction energy diagrams, and other miscellaneous statements. This may indicate that students who scored lower on the CEs have less developed cognitive structures for this core chemistry content.

Overall, the theme of each CE greatly influenced the student responses. However, throughout all four CEs, students frequently identified features to classify or describe at the atomistic or substituent level. This could be due to the ease of providing these details or to experience providing similar responses on prior CEs. However, it should be noted that although this concept category is often assumed to be easier than others, on each of the CEs approximately half of the responses in this category were marked as either incorrect or irrelevant, which helped to identify the misconceptions that students possess about these fundamental concepts.

RQ3. What common misconceptions do students in organic chemistry possess?

One key advantage of open-ended prompts, such as CEs, is the ability to probe student misconceptions that may not surface with close-ended prompts. Thus, to answer the third research question, we analyzed the student responses to each of the CEs for both accuracy and trends in the incorrect responses provided, with the complete list of classified statements provided in the ESI. Because the prompts were open-ended, the reported percentages below indicate the number of students who provided a particular response and are not representative of the percentage of students who may hold that incorrect conception. Therefore, future studies exploring the commonality of these conceptions may be warranted.

CE1, which prompted students to analyze the structure of an alkane using a Newman projection, resulted in 388 responses from 73 students. Understanding representational models such as Newman projections requires students to develop representational competencies so that they can use representations to think about, communicate, or create meaning about a phenomenon (Kozma and Russell, 2005; Ward et al., 2023). Unfortunately, many students struggle with interpreting chemical representations (Ward et al., 2023) and with preserving spatial relations when translating between representations (Padalkar and Hegarty, 2015). Prior studies assessing students’ conceptions about Newman projections found that students struggle with assigning stereochemistry (Mistry et al., 2020), draw incorrect forms of the Newman projection skeleton or use incorrect formulas (Farhat et al., 2019), do not consider the free rotation of the C–C bond and instead view the structure as fixed (Boukhechem et al., 2011), lack understanding of what the Newman projection represents (Boukhechem et al., 2011), and struggle with translating from dash-wedge notation to a Newman projection when the Newman projection undergoes extensive rotation (Olimpo et al., 2015).

Within our study, one misconception observed in CE1 was the application of alkene properties to an alkane molecule by assigning the configuration of the substituents as either E or Z or as cis or trans. This was observed in multiple situations, including students directly addressing the configuration of the compound (n = 16, 22%) or assigning the configuration within the name of the compound (n = 2, 3%). Another misconception, also observed in studies by Farhat, Stanford and Ruder (2019) and Ward et al. (2023), was that students did not count the carbons that were not explicitly shown in the projection. Thus, students indicated there were only 4 or 5 carbons instead of the expected 6 because they missed the carbon denoted by the front atom, the back atom, or both. This was observed when students provided the chemical formula or number of atoms present (n = 8, 11%), classified the structure (n = 2, 3%), or assigned the name of the compound (n = 2, 3%).

CE2, which prompted students to consider a hydrohalogenation reaction, resulted in 369 responses from 70 students. While limited in number, prior studies assessing students’ conceptions about hydrohalogenation reactions found that students misclassify them as substitution reactions (Sendur and Toprak, 2013), possess incomplete understanding of resonance structures and the effect of resonance stabilization on carbocation formation (Finkenstaedt-Quinn et al., 2020), may be able to draw the mechanism correctly without understanding the meaning behind the mechanism (DeGlopper et al., 2022), and misuse the term carbocation rearrangement by instead linking it to perceptual attributes such as substituent distribution and functional group changes (Graulich and Bhattacharyya, 2017).

The most common misconception observed in CE2 was students identifying the reaction as either an SN2 (n = 14, 20%) or SN1 (n = 6, 9%) reaction, which is in accordance with findings previously reported by Sendur and Toprak (2013). Additionally, some students again provided incorrect nomenclature when trying to name the compounds (n = 8, 11%), with several students attempting to assign either the R, S, or E configuration to the product (n = 4, 6%). Finally, another common misconception was that students incorrectly assigned the R configuration to the stereogenic carbon (n = 7, 10%), which was also observed in the naming of the compound and in their reporting of its configuration (n = 6, 9%).

CE3, which prompted students to consider the mechanistic steps of an electrophilic aromatic substitution reaction (EAS), resulted in 345 responses from 69 students. Reaction mechanisms and resonance have been identified as two of the more difficult organic chemistry concepts (Duis, 2011). Prior studies assessing students’ conceptions about aromatic compounds found that students view resonance structures as separate oscillating entities (Taber, 2002; Duffy, 2006), think the ring drawn in the structure of benzene indicates there is an electron reservoir inside the ring (Taber, 2002; Carle and Flynn, 2020), and categorize benzene as an alkene (Sendur, 2020). Additionally, a prior study by Duffy that assessed students’ conceptions about EAS reactions found that students viewed EAS reactions as an addition of a substituent to the ring instead of as a substitution reaction and identified the reaction as SN1 because they viewed the breaking apart of bromine as the first step of the mechanism (Duffy, 2006).

For CE3, the most common student misconception was indicating that “intermediate B” was the most stable intermediate (n = 18, 26%). Another common misconception was referring to the intermediate resonance structures as products (n = 11, 16%). Additionally, students again provided incorrect nomenclature when trying to name the compounds (n = 5, 7%), such as “1-bromo 2 methylcyclohexene” or “1-bromo, 2-methyl, 2,5-cyclohexene.” Finally, several students referred to the reaction as SN1 or E1 (n = 5, 7%) instead of as an EAS reaction, a misconception previously observed by Duffy (2006).

Lastly, CE4, which prompted students to consider an SN2 substitution reaction, resulted in 352 responses from 73 students. SN2 reactions have previously been identified by organic faculty as one of the more difficult organic chemistry concepts (Duis, 2011). Prior studies assessing students’ conceptions about SN2 reactions found that students often memorize trends for identifying good leaving groups without learning the underlying principles (Popova and Bretz, 2018; Dood et al., 2020), conflate SN2 and SN1 reaction mechanisms (Crandell et al., 2020), associate reaction schemes presented in wedge-dash notation with substitution reactions and reaction schemes in planar notation with elimination reactions (Bucat, 2004; Ladhams Zeiba, 2004), believe that stereochemistry is conserved during the SN2 reaction (Cruz-Ramírez De Arellano and Towns, 2014), and assume that the presence of aprotic solvents indicates that an elimination reaction occurred (Cruz-Ramírez De Arellano and Towns, 2014).

Within this study, one of the more commonly observed misconceptions was that the sodium first reacts with the bromine to remove it from the molecule (n = 8, 11%). Another common misconception was that DMSO is a catalyst for the reaction (n = 8, 11%). As was also observed in the CE1 responses, some students again tried to describe the configuration of the chiral substituent as either cis or trans (n = 4, 5%), indicating the persistence of this misconception. Finally, all students who tried assigning the IUPAC name provided incorrect nomenclature (n = 4, 5%), such as naming the reactant “3-bromylcyclohexane” or “R-2-bromo-propane attached to a cyclohexene,” suggesting that students continued to have difficulties with naming structures even at the end of the semester.

RQ4. What are students’ perceptions of CEs and their contribution to the connections made throughout the semester?

Student perceptions of the CEs were evaluated based on their responses (n = 52, 68% response rate) to an end-of-semester Student Perception of CEs Survey, which is shown in Box 1. Among those who responded to the survey, only one student indicated in question 1 that they did not attempt the CEs, explaining that they focused on the exam instead; their survey response was therefore excluded from the analysis. Among those who indicated they had attempted the CEs (n = 51), 37 students (72%) indicated the CEs helped them make connections across the course content areas, 5 (10%) indicated the CEs did not help them make connections, 8 (16%) indicated the CEs both did and did not help them make connections, and 1 (2%) indicated that they only guessed on the CEs. Those who indicated the CEs both did and did not help them make connections (n = 8, 16%) gave a variety of reasons: the CEs helped them remember things but not learn things; they sometimes guessed or wrote down whatever came to mind most readily; or the CEs were difficult because concepts not understood when first learned are difficult to apply to later problems. Furthermore, 49 students reported their perceived difficulty of completing the CEs in question 3, with 18 students (37%) indicating they felt the CEs were easy, 20 (41%) indicating they felt the CEs were of mixed difficulty, and 11 (22%) indicating they felt the CEs were hard.

Student responses (n = 51) to questions 2, 3, and 4 (shown in Box 1) were then coded together as reflecting an overall “helpful” (n = 38, 74%), “unhelpful” (n = 2, 4%), “both helpful and unhelpful” (n = 10, 20%), or “undefined” (n = 1, 2%) perception of how completing the CEs impacted their understanding of, and connections between, course concepts. The single response classified as “undefined” indicated that the student guessed on the CEs but felt that the CEs could have helped them make connections. Responses classified as “helpful” perceptions of CEs (n = 38, 74%) were those that found the CEs to enhance understanding, build connections between concepts, allow for flexibility in their responses, or promote metacognition. Responses classified as “unhelpful” perceptions of CEs (n = 2, 4%) described the CEs as challenging due to their open-ended nature and lack of a specific problem, or stated that the student did not put enough effort into the class to be able to make the connections. Responses classified as “both helpful and unhelpful” perceptions of CEs (n = 10, 20%) either addressed both aspects in their response to one survey item or addressed “helpful” features in one survey item and “unhelpful” features in another.

Using themes originally described by Gilewski et al. (2019), student responses classified as “helpful” (n = 38) or “both helpful and unhelpful” (n = 10) were analyzed together to identify which helpful subthemes they addressed. The resulting subthemes were “knowledge integration” (n = 38), “comprehension and awareness” (n = 19), and “flexibility” (n = 17). Because some students addressed multiple concepts in their responses to the three questions, the combined values for these themes add up to more than the 48 analyzed responses. Student responses classified as “knowledge integration” described making connections to concepts learned previously in chemistry. These included responses such as “It makes me understand how most of the material is connected and related to each other. All chemistry builds on each other” and “it brings all information used on the test and past test [sic] into one question.” Responses classified as “comprehension and awareness” addressed students’ conceptual understanding of the material and included responses such as “I can put two and two together and get a better understanding” [sic] and “It gives a different way to understand material.” Finally, responses classified as “flexibility” addressed the benefits of the open-ended nature of CEs and how it allowed students to draw on the knowledge they already had. These included responses such as “since they weren't asking a specific question, it was easier to draw on information previously learned” and “the open-endedness gave me a lot of freedom.”

To identify what led students to feel that the CEs were unhelpful, the student responses classified as “both helpful and unhelpful” (n = 10) were further analyzed. Two students indicated that the CEs helped them remember concepts from previous chapters but did not help them learn the material, which is understandable given that the CEs were administered as part of the exams in this study. Two students found them unhelpful because they would just write down the easiest concepts that came to mind, and another two students felt they were unhelpful because it is hard to answer the prompt without understanding the chemistry concepts; both types of responses provide further evidence that CEs have potential for use in identifying learning progressions. In addition, two students indicated that the CEs were unhelpful because they only guessed on them since they were bonus points. Finally, one student felt that the CEs were sometimes hard to understand, and the last student felt that whether the CEs helped depended on the exam but did not explain why.

Question 5 of the survey asked students whether they would like to see CEs implemented in assignments and tests in their future courses. Of the 49 students who responded to this prompt, 22 (43%) indicated that they would like to see CEs incorporated in future courses without specifying a preferred method of implementation. An additional 6 students (12%) indicated they would, but only as assignments, whereas 2 students (4%) indicated they would, but only on tests. Furthermore, another 14 students (27%) indicated they would, but only as bonus problems. Finally, four students (8%) indicated that they would not like to see CEs incorporated in future classes, and another 3 students (6%) were unsure. Interestingly, seven students (14%) cited the impact of CEs on grades as their reason why CEs should be kept as bonus points on either tests or assignments. This may be because only five statements were required for full credit, so each statement accounted for 20% of the points awarded. Given these responses, further research on how the scoring of CEs affects student perceptions of them is warranted. Overall, similar to the findings of Gilewski et al. (2019), most students viewed the incorporation of CEs favorably and would like to see them implemented in future courses, indicating that CEs enhanced their understanding, helped build connections between concepts, and allowed for flexibility in their responses.

Implications for teaching and research

CEs are open-ended prompts that allow instructors to uncover student conceptions about course material that a traditional close-ended assessment might not reveal. CEs can serve as a valuable early diagnostic tool for identifying at-risk students or incorrect student conceptions that need to be addressed. In addition, the open-ended nature of CEs allows instructors to identify the conceptions students prioritize or have most readily available (Mai et al., 2021), which can be used to help establish student learning progressions. Furthermore, CEs require students to write out their conceptions in response to a prompt, giving students an opportunity to practice using proper chemistry terminology and giving instructors insight into which terms students grasp and which they struggle with.

Because CEs ask students to identify what they have learned in the course that can be applied to the prompt and do not penalize incorrect connections, they can help support a growth mindset in students. Helping students develop and maintain a growth mindset is crucial because whether students hold a growth or fixed mindset impacts their persistence, their willingness to tackle difficult tasks, and ultimately their academic success (Limeri et al., 2020; Naibert et al., 2024; Santos and Mooring, 2024). Similarly, it has been found that academic performance in organic chemistry impacts student mindset, with students who struggled in the course shifting towards a fixed mindset (Limeri et al., 2020). Thus, using assessments such as CEs, which help students see their growth in knowledge and allow them to stretch their conceptions without fear of losing points, may be beneficial in supporting a growth mindset.

While used as part of the course exams in this study, CEs have also been used in other chemistry education studies as formative assessments via either homework or classroom activities (Gilewski et al., 2019; Mai et al., 2021; Nix et al., 2023; Shaw, 2023). When CEs are used as formative assessments, the reliability and validity of the collected data have been reported to be stronger for in-class CEs than for take-home CEs, likely because students can look up answers to the prompt outside the classroom (Lewis et al., 2011). Therefore, it is recommended that instructors give special consideration to how they will implement CEs into their course activities based on the instructional role they wish the CEs to fulfill. Finally, to alleviate student concerns when CEs are used as part of a summative assessment, it is recommended that instructors first incorporate CEs within group activities or assignments so that students gain experience in answering the prompts before the exam.

In this study, students scored lower on CE3, likely because the prompt showed only the intermediate resonance structures of an EAS reaction and thus required students to be familiar with both the reaction and its intermediate to provide meaningful statements. When implementing CEs, instructors must therefore be mindful of the complexity of their prompts and tailor the content and complexity to fit both the course needs and the learning objectives.

When creating CEs, it is recommended that the instructor set the number of correct statements required for full credit at roughly one-third to one-half of the number of correct statements the instructor can identify (Ye and Lewis, 2014). Consistent with previous studies, we found it best to generate a written list of correct responses prior to grading the CEs and to add new correct student responses to the list as they are identified; this eases grading and promotes consistency, especially when multiple graders are involved. In addition, to minimize grading time for large classes, instructors can have students self-grade or peer-grade by providing an answer key and then going over it as a class or via a discussion board in case students have responses not already on the key.
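As a minimal sketch of this grading workflow (the key entries and student statements below are hypothetical, not taken from the study), each response is checked against a running answer key, and unmatched statements are set aside for instructor review so the key can grow as grading proceeds:

```python
# Hypothetical running answer key built from the instructor's list of
# correct statements; it grows as new correct responses are approved.
answer_key = {"anti conformation", "staggered", "most stable conformation"}

def grade(statements, key, max_credit=5):
    """Score one CE response against the key; return the score plus any
    unmatched statements so the instructor can review them and, if judged
    correct, add them to the key for consistency across graders."""
    matched = [s for s in statements if s.lower() in key]
    unmatched = [s for s in statements if s.lower() not in key]
    return min(len(matched), max_credit), unmatched

score, to_review = grade(["Staggered", "anti conformation", "has 7 carbons"],
                         answer_key)
# score == 2; "has 7 carbons" is queued for review, and if approved it is
# added back: answer_key.add("has 7 carbons")
```

Capping the score at `max_credit` mirrors the full-credit threshold described above, and the review queue keeps the written key authoritative when multiple graders share it.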

While CEs allow incorrect conceptions and struggling students to be identified, their open-ended nature means they do not indicate how many students hold the identified incorrect conceptions. Therefore, as recommended by Ye et al. (2015), instructors can use the student CE responses to generate a Measure of Linked Concepts assessment in which students mark each selected student response as either true or false, allowing instructors to gauge how common the incorrect conceptions are. When used in combination with metacognitive activities, the incorporation of these Measure of Linked Concepts assessments was found to improve students’ final exam scores in general chemistry courses (Gilewski et al., 2022). Therefore, future work should evaluate their impact on student learning within organic chemistry courses.
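Once each CE-derived statement has been marked true or false by every student, gauging the commonality of a conception reduces to a simple tally. The sketch below uses entirely hypothetical marks for one statement:

```python
from collections import Counter

# Hypothetical true/false marks from eight students for one CE-derived
# statement (here, a statement embodying a known misconception).
statement = "The reaction shown proceeds by an SN1 mechanism"
marks = ["true", "false", "true", "false", "false", "true", "true", "false"]

tally = Counter(marks)
endorsement = tally["true"] / len(marks)  # fraction endorsing the statement
# With these marks, endorsement == 0.5, i.e. half the class endorses the
# misconception encoded in the statement.
```

Repeating this tally per statement yields the prevalence estimate that the open-ended CE responses alone cannot provide.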

For discipline-based education researchers, CEs are a useful tool for revealing both the conceptions students hold about a topic and the connections students make between topics. The open-ended nature of the prompts has been found suitable for eliciting students’ readily available conceptions about a topic, and CEs are quick to administer and evaluate, allowing researchers to gather responses from many students rather than just a select few, as in the case of interviews. The student-generated responses can then be used in the design of a myriad of assessment tools. One key advantage of CEs is that they capture student conceptions with a range of accuracies, allowing for the generation of assessments containing distractors with varying levels of correctness. These generated assessments may then be used to measure student learning progressions. Similarly, researchers can administer similar CEs at selected timepoints and conduct a longitudinal study to measure student learning progressions. While CEs have primarily been used within the chemistry education community, further research should explore their modification for other fields to probe student conceptions both within and across disciplines.

Limitations

Several key limitations of this study must be acknowledged to aid instructors in deciding whether to implement CEs in their own courses: the study was conducted at a single institution; not all student responses could be uniquely categorized; unknown factors may have affected students' decisions to complete the CEs; growing student familiarity with completing CEs is a potential confounding variable; and the selected CE topics influenced the student responses obtained.

Study was only conducted at a single institution

This study was conducted within a single course at one research-intensive institution, and therefore the results may not be generalizable to other settings. Future studies should investigate the implementation of CEs across multiple institution types to allow for further generalization of these results.

Difficulties in categorizing some of the student responses

In this study, because students were not specifically prompted to justify their responses, the depth of their responses varied, and some statements were too brief to be categorized. We therefore recommend that future implementations explore the impact of requiring students to include a justification for each response to further aid in identifying student conceptions. In addition, when students provided more elaborate answers, their responses sometimes contained both correct and incorrect statements. While we chose to award points for the correct portions, other faculty implementing CEs will need to decide how to handle such responses before assigning the CEs.

Impact of unknown factors on decision to complete the CEs

This study did not explore why some students chose not to complete the CE on the exam. As noted previously, not all 79 students in the course completed each CE; the completion rate for the CEs ranged from 87% to 92%. Different students skipped each CE, with several students not completing one or two of them. Because it was typically the lower-achieving students who did not complete the CEs, this limits our ability to gain a complete picture of the connections that students were making. Therefore, future studies should both uncover what contextual factors impact students’ decisions about whether to complete the CEs and evaluate student responses when the CEs are offered for course credit instead of bonus points, which should enhance the response rate.

Potential of student familiarity with completing CEs being a confounding variable

While we observed an increase in the number of concepts addressed on each CE as the semester progressed, which we attribute to students learning more concepts, the increase may also reflect students becoming more comfortable with answering CEs. Furthermore, we did not evaluate whether students had previously completed CEs in other courses. Although CEs were not administered in other chemistry courses at our institution, the students who participated in this study came from a variety of majors and backgrounds and may have completed CEs in other college or high school courses. Therefore, it is recommended that future studies give students multiple opportunities to complete CEs early in the semester to help eliminate this potential confounding variable.

Impact of selected CE topics on student responses

In this study, we observed that the topics students chose to address depended in part on the focus of the CE. Because only four CEs were analyzed, we could not evaluate all of the connections that students were making between course concepts. Therefore, future studies should incorporate weekly CEs covering a broad range of topics so that the progression of connections between concepts over the semester can be further evaluated.

Conclusions

In this study, CEs were incorporated into a one-semester Survey of Organic Chemistry course to identify what connections between chemistry concepts students made and what conceptions students held about the course content. The CEs revealed that the number of topics students addressed for each CE increased throughout the semester, indicating that students were making a growing number of connections between current and past material. The CEs also uncovered misconceptions about nomenclature and assigning configurations that persisted throughout the semester. In addition, students who scored higher versus lower on the CEs made different connections, which may be indicative of differences in their knowledge structures. Finally, students in this study viewed the CEs favorably, with most indicating that they would like to see CEs implemented in future courses. It is therefore our hope that organic chemistry instructors can use CEs both to stimulate meaningful learning and as an assessment for measuring student learning in their courses.

Ethical considerations

This study was conducted with approval from the Institutional Review Board (Protocol SM20060) of North Dakota State University and all students consented to participate in the study.

Data availability

The data supporting this article have been included as part of the ESI.

Conflicts of interest

There are no conflicts to declare.

Acknowledgements

This study was supported by the National Science Foundation (2411805, DUE-1560142, and DUE-2021285), the Department of Chemistry and Biochemistry of North Dakota State University, and under EPSCoR Track-1 Cooperative Agreement OIA #1946202. In part, this research was also supported by NDSU's NDUS ECOR Award FAR0035075. The authors wish to thank Dr Gregory Cook for his help and willingness to include the use of CEs in his course and the students for participating in the study. We would also like to thank Amanda Lam for her help with analyzing the student responses to the Student Perception of CEs Survey questions. We also want to give a special thanks to Kristina Caton from the NDSU Center for Writers for her help with revising this manuscript for clarity.

References

  1. Anzovino M. E. and Bretz S. L., (2016), Organic chemistry students’ fragmented ideas about the structure and function of nucleophiles and electrophiles: a concept map analysis, Chem. Educ. Res. Pract., 17(4), 1019–1029 10.1039/c6rp00111d.
  2. Betancourt-Pérez R., Olivera L. J. and Rodríguez J. E., (2010), Assessment of Organic Chemistry Students’ Knowledge of Resonance-Related Structures, J. Chem. Educ., 87(5), 547–551 DOI:10.1021/ed800163g.
  3. Boukhechem M. S., Dumon A. and Zouikri M., (2011), The acquisition of stereochemical knowledge by Algerian students intending to teach physical sciences, Chem. Educ. Res. Pract., 12(3), 331–343 10.1039/C1RP90040D.
  4. Bowe K. A., Bauer C. F., Wang Y. and Lewis S. E., (2022), When All You Have Is a Covalent Model of Bonding, Every Substance Is a Molecule: A Longitudinal Study of Student Enactment of Covalent and Ionic Bonding Models, J. Chem. Educ., 99(8), 2808–2820 DOI:10.1021/ACS.JCHEMED.2C00188.
  5. Braun I., Lewis S. E. and Graulich N., (2025), A question of pattern recognition: investigating the impact of structure variation on students’ proficiency in deciding about resonance stabilization, Chem. Educ. Res. Pract., 26, 158–182 10.1039/D4RP00155A.
  6. Bucat R., (2004), Pedagogical Content Knowledge as a Way Forward: Applied Research in Chemistry Education, Chem. Educ. Res. Pract., 5(3), 215–228 10.1039/B4RP90025A.
  7. Burrows N. L. and Mooring S. R., (2015), Using concept mapping to uncover students’ knowledge structures of chemical bonding concepts, Chem. Educ. Res. Pract., 16(1), 53–66 10.1039/c4rp00180j.
  8. Cakir M., (2008), Constructivist Approaches to Learning in Science and Their Implications for Science Pedagogy: A Literature Review, Int. J. Environ. Sci. Ed., 3(4), 193–206.
  9. Carle M. S. and Flynn A. B., (2020), Essential learning outcomes for delocalization (resonance) concepts: How are they taught, practiced, and assessed in organic chemistry? Chem. Educ. Res. Pract., 21(2), 622–637 10.1039/C9RP00203K.
  10. Cooper M. M., Stowe R. L., Crandell O. M. and Klymkowsky M. W., (2019), Organic Chemistry, Life, the Universe and Everything (OCLUE): A Transformed Organic Chemistry Curriculum, J. Chem. Educ., 96(9), 1858–1872 DOI:10.1021/acs.jchemed.9b00401.
  11. Crandell O. M., Lockhart M. A. and Cooper M. M., (2020), Arrows on the Page Are Not a Good Gauge: Evidence for the Importance of Causal Mechanistic Explanations about Nucleophilic Substitution in Organic Chemistry, J. Chem. Educ., 97(2), 313–327 DOI:10.1021/acs.jchemed.9b00815.
  12. Cruz-Ramírez De Arellano D. and Towns M. H., (2014), Students’ understanding of alkyl halide reactions in undergraduate organic chemistry, Chem. Educ. Res. Pract., 15(4), 501–515 10.1039/C3RP00089C.
  13. DeGlopper K. S., Schwarz C. E., Ellias N. J. and Stowe R. L., (2022), Impact of Assessment Emphasis on Organic Chemistry Students’ Explanations for an Alkene Addition Reaction, J. Chem. Educ., 99(3), 1368–1382 DOI:10.1021/acs.jchemed.1c01080.
  14. Dood A. J., Dood J. C., Cruz-Ramírez De Arellano D., Fields K. B. and Raker J. R., (2020), Analyzing explanations of substitution reactions using lexical analysis and logistic regression techniques, Chem. Educ. Res. Pract., 21(1), 267–286 10.1039/c9rp00148d.
  15. Duffy A. M., (2006), Students’ ways of understanding aromaticity and electrophilic aromatic substitution reactions, PhD Dissertation, University of California, San Diego. Available at: https://escholarship.org/uc/item/8mb6v54x (accessed 27 October 2024).
  16. Duis J. M., (2011), Organic chemistry educators’ perspectives on fundamental concepts and misconceptions: an exploratory study, J. Chem. Educ., 88(3), 346–350 DOI:10.1021/ed1007266.
  17. Farhat N. J., Stanford C. and Ruder S. M., (2019), Assessment of Student Performance on Core Concepts in Organic Chemistry, J. Chem. Educ., 96(5), 865–872 DOI:10.1021/ACS.JCHEMED.8B00913.
  18. Fautch J. M., (2015), The flipped classroom for teaching organic chemistry in small classes: is it effective? Chem. Educ. Res. Pract., 16(1), 179–186 10.1039/C4RP00230J.
  19. Finkenstaedt-Quinn S. A., Watts F. M., Petterson M. N., Archer S. R., Snyder-White E. P. and Shultz G. V., (2020), Exploring Student Thinking about Addition Reactions, J. Chem. Educ., 97(7), 1852–1862 DOI:10.1021/acs.jchemed.0c00141.
  20. Flynn A. B., (2015), Structure and evaluation of flipped chemistry courses: organic & spectroscopy, large and small, first to third year, English and French, Chem. Educ. Res. Pract., 16(2), 198–211 10.1039/C4RP00224E.
  21. Foote K., Knaub A., Henderson C., Dancy M. and Beichner R. J., (2016), Enabling and challenging factors in institutional reform: the case of SCALE-UP, Phys. Rev. Phys. Educ. Res., 12(1), 010103 DOI:10.1103/PhysRevPhysEducRes.12.010103.
  22. Gilewski A., Litvak M. and Ye L., (2022), Promoting metacognition through measures of linked concepts with learning objectives in introductory chemistry, Chem. Educ. Res. Pract., 23(4), 876–884 10.1039/D2RP00061J.
  23. Gilewski A., Mallory E., Sandoval M., Litvak M. and Ye L., (2019), Does linking help? Effects and student perceptions of a learner-centered assessment implemented in introductory chemistry, Chem. Educ. Res. Pract., 20(2), 399–411 10.1039/c8rp00248g.
  24. Graulich N., (2015), The tip of the iceberg in organic chemistry classes: how do students deal with the invisible? Chem. Educ. Res. Pract., 16(1), 9–21 10.1039/C4RP00165F.
  25. Graulich N. and Bhattacharyya G., (2017), Investigating students’ similarity judgments in organic chemistry. Chem. Educ. Res. Pract., 18(4), 774–784 10.1039/C7RP00055C.
  26. Grove N. P. and Bretz S. L., (2012), A continuum of learning: from rote memorization to meaningful learning in organic chemistry, Chem. Educ. Res. Pract., 13(3), 201–208 10.1039/C1RP90069B.
  27. Grove N. P., Hershberger J. W. and Bretz S. L., (2008), Impact of a spiral organic curriculum on student attrition and learning, Chem. Educ. Res. Pract., 9(2), 157–162 10.1039/B806232N.
  28. Gupte T., Watts F. M., Schmidt-Mccormack J. A., Zaimi I., Gere A. R. and Shultz G. V., (2021), Students’ meaningful learning experiences from participating in organic chemistry writing-to-learn activities, Chem. Educ. Res. Pract., 22(2), 396–414 10.1039/D0RP00266F.
  29. Johnstone A. H. and Otis K. H., (2006), Concept mapping in problem based learning: a cautionary tale, Chem. Educ. Res. Pract., 7(2), 84–95 10.1039/B5RP90017D.
  30. Kalpana T., (2014), A Constructivist Perspective on Teaching and Learning: A Conceptual Framework, Int. Res. J. Soc. Sci., 3(1), 27–29.
  31. Kozma R. and Russell J., (2005), Modelling students becoming chemists: developing representational competence, in Gilbert J. K. (ed.) Visualization in Science Education, Springer, pp. 121–145.
  32. Ladhams Zeiba M., (2004), An investigation of teaching and learning processes in the study of reaction mechanisms in organic chemistry, PhD Dissertation, School of Biomedical and Chemical Sciences at the University of Western Australia. Available at: https://research-repository.uwa.edu.au/en/publications/teaching-and-learning-about-reaction-mechanisms-in-organic-chemis, (accessed 27 October 2024).
  33. Lewis S. E., Shaw J. L. and Freeman K. A., (2010), Creative Exercises in General Chemistry: A Student-Centered Assessment, J. Coll. Sci. Teach., 40(1), 48–53 DOI:10.2307/42992839.
  34. Lewis S. E., Shaw J. L. and Freeman K. A., (2011), Establishing open-ended assessments: investigating the validity of creative exercises, Chem. Educ. Res. Pract., 12(2), 158–166 10.1039/c1rp90020j.
  35. Limeri L. B., Carter N. T., Choe J., Harper H. G., Martin H. R., Benton A. and Dolan E. L., (2020), Growing a growth mindset: characterizing how and why undergraduate students’ mindsets change, Int. J. STEM Educ., 7(1), 35 DOI:10.1186/s40594-020-00227-2.
  36. Lopez E., Kim J., Nandagopal K., Cardin N., Shavelson R. J. and Penn J. H., (2011), Validating the use of concept-mapping as a diagnostic assessment tool in organic chemistry: implications for teaching, Chem. Educ. Res. Pract., 12(2), 133–141 10.1039/c1rp90018h.
  37. Mai A., George-Williams S. R. D. and Pullen R., (2021), Insights into Student Cognition: Creative Exercises as an Evaluation Tool in Undergraduate First-year Organic Chemistry, Int. J. Innov. Sci. Math. Educ., 29(3), 48–61 DOI:10.30722/IJISME.29.03.004.
  38. Mistry N., Singh R. and Ridley J., (2020), A Web-Based Stereochemistry Tool to Improve Students’ Ability to Draw Newman Projections and Chair Conformations and Assign R/S Labels, J. Chem. Educ., 97(4), 1157–1161 DOI:10.1021/ACS.JCHEMED.9B00688.
  39. Naibert N., Mooring S. R. and Barbera J., (2024), Investigating the Relations between Students’ Chemistry Mindset, Self-Efficacy, and Goal Orientation in General and Organic Chemistry Lecture Courses, J. Chem. Educ., 101(2), 270–282 DOI:10.1021/acs.jchemed.3c00929.
  40. Nesbit J. C. and Adesope O. O., (2016), Learning with Concept and Knowledge Maps: A Meta-Analysis, Rev. Educ. Res., 76(3), 413–448 DOI:10.3102/00346543076003413.
  41. Ngai C. and Sevian H., (2018), Probing the Relevance of Chemical Identity Thinking in Biochemical Contexts, CBE—Life Sci. Educ., 17(4), ar58 DOI:10.1187/cbe.17-12-0271.
  42. Nix C. A., Hughes H. and Saitta E. K. H., (2023), Exploration of Student Approaches to Creative Exercises in Undergraduate Biochemistry, J. Chem. Educ., 100(10), 3784–3794 DOI:10.1021/acs.jchemed.3c00175.
  43. Novak J. D. and Cañas A. J., (2006), The Origins of the Concept Mapping Tool and the Continuing Evolution of the Tool, Inf. Vis., 5(3), 175–184 DOI:10.1057/PALGRAVE.IVS.9500126.
  44. Olimpo J. T., Kumi B. C., Wroblewski R. and Dixon B. L., (2015), Examining the relationship between 2D diagrammatic conventions and students’ success on representational translation tasks in organic chemistry, Chem. Educ. Res. Pract., 16(1), 143–153 10.1039/C4RP00169A.
  45. Padalkar S. and Hegarty M., (2015), Models as feedback: developing representational competence in chemistry, J. Educ. Psychol., 107(2), 451–467 DOI:10.1037/a0037516.
  46. Popova M. and Bretz S. L., (2018), Organic Chemistry Students’ Understandings of What Makes a Good Leaving Group, J. Chem. Educ., 95(7), 1094–1101 DOI:10.1021/ACS.JCHEMED.8B00198.
  47. Santos D. L. and Mooring S. R., (2024), The complexity of chemistry mindset beliefs: a multiple case study approach, Chem. Educ. Res. Pract., 25(4), 1210–1228 10.1039/D4RP00068D.
  48. Schroeder N. L., Nesbit J. C., Anguiano C. J. and Adesope O. O., (2018), Studying and Constructing Concept Maps: a Meta-Analysis, Educ. Psychol. Rev., 30(2), 431–455 DOI:10.1007/S10648-017-9403-9.
  49. Schwendimann B. A., (2015), Concept maps as versatile tools to integrate complex ideas: from Kindergarten to higher and professional education, Knowl. Manage. E-Learn., 7(1), 73–99 DOI:10.34105/j.kmel.2015.07.006.
  50. Sendur G., (2020), An examination of pre-service chemistry teachers’ meaningful understanding and learning difficulties about aromatic compounds using a systemic assessment questions diagram, Chem. Educ. Res. Pract., 21(1), 113–140 10.1039/C9RP00080A.
  51. Sendur G. and Toprak M., (2013), The role of conceptual change texts to improve students’ understanding of alkenes, Chem. Educ. Res. Pract., 14(4), 431–449 10.1039/C3RP00019B.
  52. Shaw J. L., (2023), Promoting Meaningful Learning through Incorporation of Creative Exercises in Inorganic Chemistry, J. Chem. Educ., 100(1), 69–79 DOI:10.1021/acs.jchemed.2c00598.
  53. Šket B., Glažar S. A. and Vogrinc J., (2015), Concept maps as a tool for teaching organic chemical reactions, Acta Chim. Slov., 62(2), 462–472 DOI:10.17344/acsi.2014.1148.
  54. Smith M. K., Jones F. H. M., Gilbert S. L. and Wieman C. E., (2013), The classroom observation protocol for undergraduate stem (COPUS): a new instrument to characterize university STEM classroom practices, CBE Life Sci. Educ., 12(4), 618–627 DOI:10.1187/cbe.13-08-0154.
  55. Stains M., Harshman J., Barker M. K., Chasteen S. V., Cole R. and DeChenne-Peters S. E. et al., (2018), Anatomy of STEM teaching in North American universities, Science, 359, 1468–1470 DOI:10.1126/science.aap8892.
  56. StataCorp, (2019), Stata Statistical Software: Release 16.
  57. Stewart J. J. and Dake G. R., (2019), Activating Students’ Prior Knowledge Using a Bridge Activity as an Initial Interactive Discussion in a Flipped Organic Chemistry Course, J. Chem. Educ., 96(11), 2426–2431 DOI:10.1021/ACS.JCHEMED.9B00370.
  58. Taber K. S., (2002), Compounding quanta: probing the frontiers of student understanding of molecular orbitals, Chem. Educ. Res. Pract., 3(2), 159–173 10.1039/B2RP90013K.
  59. Torres D., Pulukuri S. and Abrams B., (2023), Step Back, Translate, and Extend: An Instructional Framework for Enhancing Knowledge Transfer and Self-Efficacy Across Chemistry Courses, J. Chem. Educ., 100(12), 4696–4706 DOI:10.1021/acs.jchemed.3c00964.
  60. Trigwell K. and Sleet R., (1990), Improving The Relationship Between Assessment Results And Student Understanding, Assess. Eval. High. Educ., 15(3), 190–197 DOI:10.1080/0260293900150302.
  61. Vachliotis T., Salta K. and Tzougraki C., (2014), Meaningful Understanding and Systems Thinking in Organic Chemistry: Validating Measurement and Exploring Relationships, Res. Sci. Educ., 44(2), 239–266 DOI:10.1007/S11165-013-9382-X.
  62. Wang Y. and Lewis S. E., (2020), Analytical chemistry students’ explanatory statements in the context of their corresponding lecture, Chem. Educ. Res. Pract., 21, 1183–1198 10.1039/d0rp00063a.
  63. Ward L. W., Rotich F., Hoang J. and Popova M., (2023), Representational Competence Under the Magnifying Glass – The Interplay Between Student Reasoning Skills, Conceptual Understanding, and the Nature of Representations, in Graulich N. and Shultz G. (ed.) Student Reasoning in Organic Chemistry: Research Advances and Evidence-based Instructional Practices, Royal Society of Chemistry, pp. 36–55.
  64. Warfa A. R. M. and Odowa N., (2015), Creative exercises (CEs) in the biochemistry domain: an analysis of students’ linking of chemical and biochemical concepts, Chem. Educ. Res. Pract., 16(4), 747–757 10.1039/c5rp00110b.
  65. Wong R. M., Sundararajan N., Adesope O. O. and Nishida K. R. A., (2021), Static and interactive concept maps for chemistry learning, Educ. Psychol., 41(2), 206–223 DOI:10.1080/01443410.2020.1761299.
  66. Xue D. and Stains M., (2020), Exploring Students’ Understanding of Resonance and Its Relationship to Instruction, J. Chem. Educ., 97(4), 894–902 DOI:10.1021/acs.jchemed.0c00066.
  67. Ye L., Eichler J. F., Gilewski A., Talbert L. E., Mallory E. and Litvak M. et al., (2020), The impact of coupling assessments on conceptual understanding and connection-making in chemical equilibrium and acid–base chemistry, Chem. Educ. Res. Pract., 21, 1000–1012 10.1039/d0rp00038h.
  68. Ye L. and Lewis S. E., (2014), Looking for links: examining student responses in creative exercises for evidence of linking chemistry concepts, Chem. Educ. Res. Pract., 15(4), 576–586 10.1039/c4rp00086b.
  69. Ye L., Oueini R. and Lewis S. E., (2015), Developing and Implementing an Assessment Technique to Measure Linked Concepts, J. Chem. Educ., 92(11), 1807–1812 DOI:10.1021/acs.jchemed.5b00161.

Footnote

Electronic supplementary information (ESI) available. See DOI: https://doi.org/10.1039/d4rp00310a

This journal is © The Royal Society of Chemistry 2025