Abstract: Writing plays an integral role in any disciplinary course setting. In the sciences, WAC and WID initiatives primarily focus on using writing to deepen student understanding of scientific concepts. Scholars, however, have paid less attention to how writing may facilitate an understanding of the link between concepts and their quantitative expressions and applications. This study therefore takes as its subject the use of writing to encourage metacognition. Using a speak-aloud protocol, we taped forty students as they talked through their approaches to answering math and science questions. Analysis of the transcripts revealed three distinct cognitive processes demonstrated by students. We discuss the characteristics of each cognitive process and, drawing on both our analysis of the students' transcripts and our investigation of new approaches to encouraging metacognition in the sciences, suggest concrete ways to encourage metacognition through writing in science classrooms. Our study adds to the general discussion of how writing might facilitate student transfer between scientific concepts and their quantitative applications.
Although writing plays an integral role in courses offered in most, if not all, disciplines, many science students view writing as an activity that is relevant to scientific practice only in specific circumstances, such as lab reports (Beall, 1998; Driskill et al., 1998). Theoretical treatments of knowledge generally (Foucault, 2002) and science specifically (Latour, 1987), as well as guides for professional scientists (Montgomery, 2003), however, describe the integral nature of writing and communicating in both the practice and public discourse of science. Unfortunately, the inherent relationship between the kinesthetic and formal, technical aspects of science and their theoretical and quantitative meanings is not always obvious to undergraduate students. Often, professors incorporate writing into their science classrooms to reinforce basic comprehension of foundational scientific concepts (Beall, 1998; Driskill et al., 1998).
Writing across the curriculum (WAC) and writing in the disciplines (WID) initiatives tend to focus on using writing as a means to deepen student understanding of concepts and rhetorical practices in the discipline (Fulwiler and Young, 2000; Kokkala and Gessell, 2003). However, as Powell (1985) and others, notably Childers and Lowry (2006), observe, student writing in the sciences requires attention to visual and other concerns not generally addressed in mainstream composition theory. Beall (1998) integrated informal, reflective writing into chemistry classes as a means of considering concepts. Similarly, Driskill et al. (1998) found that students could use writing to better understand scientific concepts in a nontraditional chemistry class that omitted laboratory work, as did Jamison (2000), who explored the rhetoric of math in a course that emphasized conceptual understanding and omitted quantitative problem-solving.
Jamison (2000) concludes that students employ a cognitive process of translation to succeed in undergraduate science, observing that students must learn the superficial rhetorical practices (how things are said) of a field before they can understand content (what is being said), which is in turn necessary to overall comprehension (why it is said); introductory science courses, he notes, focus on these formal techniques (p. 45).^{[1]} Similarly, Sokoloff and Thornton (1997) developed interactive lecture demonstrations that make concepts visible to students and led to a seven-fold increase in student understanding and performance.^{[2]}
In the current study, we examine student thought patterns in quantitative problem solving in order to identify thinking and writing practices that might foster conceptual understanding of quantitative applications. To do so, we investigate student responses for evidence of metacognitive competence. We based our observations on a series of student interviews conducted at Hofstra University. Students involved in the study were enrolled in first- and second-year chemistry courses at the university and received a nominal stipend for their participation. This study received IRB approval.
We obtained 40 tape recordings made by students who had completed or were enrolled in first- and second-year general, inorganic, and/or organic chemistry courses as they answered quantitative questions. The questions were provided in hard copy, with multiple-choice answers. Each student had access to a calculator and a periodic table of the elements. The students had as much time as they liked to complete the problems.
This methodology was designed to provide a window on student thinking. All questions drew on the baseline knowledge covered in the first semester of a typical freshman chemistry course. Students also answered mathematical competency questions dealing with skills essential to solving chemistry problems, taken from the Scholastic Aptitude Test.^{[3]} The chemistry questions tested students' ability to calculate moles and grams, molarity, limiting reagents, and stoichiometry (see Appendix 1). The problems were chosen to be progressively more difficult and to build on the principles tested in earlier questions. If a student could not answer a question, he or she was asked to provide a possible approach to solving the problem.
Approximately five recordings were unusable because the students ceased speaking while answering the study questions. Relatively few students were able to complete all problems correctly. Fewer than half of the students who provided usable data answered all the chemistry questions correctly and less than a third answered all the math questions correctly.^{[4]}
During review of the usable transcripts in consultation with an expert in the field of writing and science pedagogy,^{[5]} two highly distinct patterns of student responses emerged. Students either exhibited an understanding of the way that problems were expressions of scientific concepts and phenomena—what we understand as a metacognitive apprehension of the relationship of the quantitative expression to its conceptual referent—or they attempted to solve problems mechanically. We found that students who solved problems correctly tended either to self-check or annotate their responses while students who had more difficulty solving the problems tended to apply algebraic, symbolic, or algorithmic methods without expressing an understanding of why these methods might work. Individual students did not consistently follow one type of practice; however, the use of purely superficial means of problem solving was strongly associated with wildly incorrect answers. If the patterns of behavior that informed largely successful attempts at problem solving can be used as potential rubrics for student learning, then WAC and WID approaches could be used to adapt and incorporate these rubrics for classroom practice and writing assignments.
Several students appeared trapped in purely quantitative approaches to specific problems, mimicking the type of superficial performance Beall (1998) describes in laboratory settings. These students expressed little or no understanding of real-world or qualitative concepts when problem solving. Inadequate comprehension of the qualitative underpinnings of quantitative expressions and poor mathematical skills frequently overlapped in these students. However, a significant minority of students were able to apply mathematical skills in the context of chemistry problems even when they could not answer the SAT math questions.
One of the baseline questions, for example, provided the information that one stone equals 14 pounds and asked students to calculate half a pound in stones (1/28). One student who fluidly performed conversions from grams to moles to grams in association with a chemistry question was unable to convert pounds to stones when looking at an SAT question:
(Reads question four times). I'm really not sure what this question means. (Rereads question). If 1 stone is 14 pounds, ½ of a pound is obviously less than the 1 stone, so I don't get how that makes any sense. (Rereads question). I don't know this one. I'm going to skip it for now.
Interestingly, this student did not immediately give up, rereading the question repeatedly. She also clearly understood that ½ pound would be less than a stone, but she seemed to miss the requirement for dealing with fractions when she found it confusing to convert from a larger unit of measure to a smaller one.^{[6]} (Students understood that a stone was, like a pound, a measurement of weight, and not a physical entity.)
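The arithmetic this question requires is a single division, structurally identical to the dimensional analysis in a gram-to-mole conversion. A minimal sketch of the conversion (illustrative only; the study presented the question on paper, not as code):

```python
# Unit conversion by dimensional analysis: 1 stone = 14 pounds.
POUNDS_PER_STONE = 14

def pounds_to_stones(pounds):
    """Convert a weight in pounds to stones by dividing by the factor."""
    return pounds / POUNDS_PER_STONE

# Half a pound is a fraction of a stone, so the result must be less than 1.
half_pound_in_stones = pounds_to_stones(0.5)   # 0.5 / 14 = 1/28
```

The same divide-by-the-conversion-factor move underlies the gram-to-mole conversions this student performed fluently in the chemistry questions.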
Some students, whom we termed symbolic workers, tended to assign arbitrary symbols to terms when working on problems because they associated this practice with high-school algebra. Symbolic workers often did not recognize when they were going astray because their intellectual methods were grounded in superficial markers of quantitative practice. Wildly incorrect answers often resulted from this method of problem solving.
Ashley, a second-year chemistry student, was unable to calculate a ratio between two terms given only as parts of other ratios in question one. In her tape recording, she calls this question "nerve-wracking" and moves on, returning after successfully answering chemistry questions that also require proportions and conversions.
Let's flip that one over … I think. Maybe I don't want that. … Let's throw some X's in there. I like algebra. We know that there are five times as many. … I'll do Y here and have 5Y to 2Y. … I don't even know what I'm doing. Let's just say if we wanted Y, divide by five. … [Inaudible] I don't even know what I'm doing. I really don't. I feel this is very simple and I just can't figure it out ….
Ashley felt that a clearly algebraic problem—"if there were letters in there"—would be easy to solve. Ironically, Ashley successfully performed the same operations in solving chemistry problems, for which she exhibited an understanding of the context and underlying principles. In the context of the test question, which described the ratios between postage stamps rather than chemicals, Ashley did not see the underlying mathematical similarity between all problems dealing with ratios. Although she understood that some type of mathematical connection should be made, she fell back onto the comfort of an algebraic method rather than probing the numerical relationships between the ratios given in the problems.
Some students used algorithmic methods as a crutch without reflecting an understanding of the conceptual basis for the equations. These students attempted to derive formulas, perhaps modeling this behavior on college texts, in order to solve confusing problems. Christina attempted to write a formula to determine the outcome of a series in math question two rather than performing the individual calculations:
After the first bounce … it reaches a height … so, bounce one is equal to 125 inches. They want to know how many after bounce four. So, after the first bounce … so there's a relationship between the two; I just have to figure out how to write an equation to relate the two relationships.
Her attempts to create a formula prevented her from reaching the correct answer by applying basic arithmetic.
Similarly, Dan, a second-year chemistry student who answered all but one of the chemistry questions correctly, was unable to answer the second baseline math question. Like Ashley, he attempted to find an algorithm and forgot to apply his knowledge of physical reality:
(Reads question). I'll start with marking the main numbers that will help in solving that and write the main numbers. We know that after the first bounce it jumped to 2/5 of its first height meaning after four times it will bounce to 2/5 times four, so 2/5 times four equals to 1.6. The, okay, so we'll divide the 1.5, to 1.6, 125 after its first bounce, and we divide it by four times it bounced. That's 125 divided by 1.6. It's 78.125 inches.
Dan's approach took into account the idea that he should perform multiple operations to obtain the correct answer; however, he did not notice that he had replaced the height with the number of bounces in his algorithm and therefore arrived at a grossly incorrect answer.
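Assuming, as the transcripts indicate, that the ball reaches 125 inches after the first bounce and rebounds to 2/5 of its previous height each time, the question calls for repeated multiplication rather than a derived formula or Dan's division. A sketch of the basic arithmetic both students were reaching for:

```python
# Each bounce rebounds to 2/5 of the previous height (per the transcripts).
first_bounce = 125.0   # inches, height reached after bounce one
ratio = 2 / 5

height = first_bounce
for bounce in range(2, 5):   # bounces two, three, and four
    height *= ratio

# height is now 125 * (2/5)**3 = 8.0 inches after the fourth bounce
```

Three multiplications by 2/5 replace the equation-deriving that stalled Christina and the mistaken division that misled Dan.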
Some symbolic workers tended to be purely algorithmic in solving problems in chemistry. These students seemed to depend solely on rote training in both series of problems and therefore may have a generally poor understanding of math and science. Other students successfully applied mathematical principles in specific disciplinary contexts but were unsuccessful at dealing with the logic behind those principles, indicating a need for better metacognitive training.
The distinctive characteristic of the students we considered self-checkers is a consistent questioning of whether their work and answers make sense in the context in which they are working. In addition, self-checkers either backtracked or reviewed their steps to confirm their answers before moving to the next problem. The basis for self-checking can be either knowledge of the physical reality that informs the problem or an understanding of the algorithm that should be used to solve it.
For example, Christine quickly reversed course because her answer would not make sense:
[S]ince we have three moles of oxygen … no, that doesn't sound right though. I'm going to go with you can only make one mole because, for every three moles of iron you need one mole of oxygen and there are only four moles of iron ....^{[7]}
Christine was unsure of the precise algorithm but understood the proportions necessary to solve the problem. Self-checkers were often able to cross-check against qualitative or conceptual reality, then revise initially incorrect answers in response.^{[8]} Self-checking signals a facility for moving between two cognitive areas, the qualitative and the quantitative, and thus gestures toward metacognition.
When dealing with the stones to pounds conversion that stumped many students, Shannon used self-checking behavior, becoming quite annoyed at her own initial thought process, before solving the problem:
[N]o, no, no, cut that out. So 1 over 1 equals 14 over 1. …that times 14 will give you 1…will give you .5 so multiply each side by 1 over 28. So then you get 1 over 28 equals 1 over 2. So that is ½ a pound equals 1/28th of a stone. Cool. So I just had to think about it.
The "cut that out" in her tape recording indicates Shannon's realization that her initial strategy is faulty. Her vocalization is noteworthy as few students narrated their process so clearly.
Some students did not self-check in the math problems but did so in chemistry problems. One such student caught himself in computational errors because of his ability to check the sense of his answer against chemical concepts and physical reality.
That's 0.50. What? Grams of calcium carbonate. Again that's not in the answer. I must have made an error somewhere. I'm going to do this again. 56.519 is the initial mass minus 57.152, the difference is 0.633 grams. That's the amount of stuff that was left in the solution. Calcium doesn't leave the solution, well neither does water, the only thing that leaves is the carbon, I mean carbon dioxide. You throw in 0.853 grams of stuff and you're left with 0.633 of that stuff, after it's reacted with the antacid. 0.22 grams went somewhere. It dissolved in the air. It had to be carbon dioxide.
Although this student made some mistakes in computation, his practice of self-checking allowed him to correct these mistakes, based on his knowledge of how chemicals will function in the physical world.
Students who are familiar with the conceptual basis for common algorithms (or formulas) also perform a sort of reality check on their work. We differentiated students who apply their conceptual understanding of algorithms from those who base their checking on other types of knowledge because this conceptual understanding is more symbolic and therefore less intuitive than other forms of self-checking. In their responses to the baseline chemistry and math questions, these students, whom we classified as annotators, verbalized the scientific logic underlying the algorithms they use.
Almeida, a junior who had not taken chemistry since her freshman year, skipped several questions. When she did attempt a question, she was unsure of the exact formula, but she remembered that it should solve for the conservation of matter:
Yeah, not going to do that. It's been quite some time since I've taken chemistry. What I do remember is that outside of the reaction the products and the reactants have to equal the same amount of atoms so you would go about solving this problem by finding the amount of atoms of each element on each side of the equation and thereby adding numbers in front or to the entire relation to increase the amount of atoms to make it equal. Something about conservation of matter. Moving on.
Almeida exhibited metacognition when she identified both the type of mathematical process involved and its application, the conservation of matter, while considering how to solve the problem. The conservation of matter is a key concept in the physical sciences, and the algorithm for balancing formulas is a quantitative expression of this basic principle.^{[9]} Although Almeida did not remember the formula, she exhibited an understanding of both the conceptual principle and the mathematical, or algorithmic, requirements for completing the quantitative portion of the task.
A first-year chemistry student encountered a similar difficulty because she did not recall the algorithm to compute the number of moles in 5.3 grams of magnesium oxide. Instead of giving up like Almeida, she applied the conceptual basis of a mole, Avogadro's number, to solve the conversion.
That's not making sense. … I'm going over the periodic table. … Wait. I'm thinking I have to do the molecular mass for the whole thing …. 6.022 x 10^{23}, which is Avogadro's number. So that number of atoms or molecules equals one mole of that element or substance. Ok, so I have one molecule of magnesium, so, so I have 6.022 x 10^{23} and then I have two oxygens, which is … well that is a really humongous number ….
Eventually, Megan solved the problem because she applied her knowledge of the relationship between Avogadro's number and molecular mass. She was able to annotate the problem with the mathematical and scientific principles that form the foundation of the algorithm for converting to moles.
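Megan's conversion can be annotated in the same spirit. Assuming standard rounded atomic masses (Mg ≈ 24.31 g/mol, O ≈ 16.00 g/mol) and the formula MgO, a sketch of the gram-to-mole conversion she reconstructed from Avogadro's number and molecular mass:

```python
# Moles of magnesium oxide in 5.3 g: divide mass by molar mass.
ATOMIC_MASS = {"Mg": 24.31, "O": 16.00}   # g/mol, rounded standard values
AVOGADRO = 6.022e23                       # formula units per mole

molar_mass_mgo = ATOMIC_MASS["Mg"] + ATOMIC_MASS["O"]   # MgO: one Mg, one O
moles = 5.3 / molar_mass_mgo              # about 0.13 mol
formula_units = moles * AVOGADRO          # the "really humongous number"
```

The annotation in the comments, naming the principle behind each step, is precisely the behavior we observed Megan verbalizing.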
Formal aspects of the question initially confused another student, Alana:
I don't like how they give it in moles and not grams that is so odd. Um, I react 0.2 moles of calcium with excess oxygen to get calcium oxide. So, it's Ca plus O….um, Ca….again, I don't know if it's supposed to be Ca plus O2 or….I'm just going to do Ca plus O comes to CaO. Ok, and…..um, zero point two. I know going from the very beginning of the problem, I go from moles to grams every time, but that's just because I don't like working with it in moles.
Alana, like Ashley, expresses a preference for certain formal parameters and allows her practice to be complicated by this preference. Unlike Ashley, however, Alana applies her knowledge of the algorithm for determining a limiting reagent and comes up with an answer:
Ok, I react 0.2 moles of calcium with excess oxygen to get calcium oxide. You can't figure out a limiting reagent unless you have two … like, information on each one. What? Ok, this is … with a one to one ratio … ok, I'm just going to write that there isn't one.
Alana's answer here is a critique of the test question. Since she understood that the algorithm for identifying a limiting reagent requires her to choose the element in shortest supply, Alana elects not to choose an answer.
Similarly, Dan began to solve a problem about molarity with an equation that applies to the relationship between volume and molarity but then determined a better way to solve the problem. Dan also cross-checked his answer against the choices given on the problem set, exhibiting an understanding of the possible uses of the test questions in determining an answer.
We will use the equation C1B1=C2B2, which is a relationship between the volume and the molarity. We will first get how much moles we have from 1.928 potassium nitrate. … We won't use the equation C1B1=C2B2. We can use a different equation, that is volume divided by molarity. … Now let's try to find this answer here. … the answer that fits that is C, which is 7.627 x 10^{-2}, which is moving the dot twice to the left. C is the right answer.
Although choosing the closest answer from a multiple-choice list is not the best way to solve the problem, Dan used a series of considerations when identifying a solution, exhibiting an attention to all the information he is given and an understanding of the conceptual basis of the formulas he applied.
Kelly, an organic chemistry student, exemplified what we termed metacognition. She successfully calculated the molarity^{[10]} of potassium nitrate^{[11]} by exhibiting an understanding of the formula, the logic behind it, and a conceptual understanding of the physical relationships of the chemicals and the questions in the test. The problem provides potassium nitrate in grams and volume in milliliters, requiring a conversion.
I know that the equation for molarity is moles over liters. Since they give you grams, you can convert grams to moles, and since they give you milliliters, you can convert them to liters. First, I'm going to do grams to moles, so they tell me what we have 1.928 grams of NKO3. Now, just like I did in the problem before that, I'm going to calculate the molecular mass of KNO3, and then I'm going to divide 1.928 into that.
Kelly exhibited a metacognitive understanding by applying knowledge from her coursework and also by expressing the relationship between individual questions used in the problem set. In the previous problem, Kelly self-checked and discovered an error in converting from grams to moles by reminding herself of the proportional relationship between grams per moles and moles: "if I just divide that by five grams I'll get moles. Actually that's backwards. 169.8749 is grams per mole."
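Kelly's procedure can be traced numerically. The molar mass of KNO3 (about 101.1 g/mol) follows from standard atomic masses; the solution volume is not quoted in the transcripts, so the 250 mL below is an assumption, chosen because it reproduces the 7.627 × 10^{-2} answer Dan identifies for the molarity problem:

```python
# Molarity = moles of solute / liters of solution.
MOLAR_MASS_KNO3 = 39.10 + 14.01 + 3 * 16.00   # K + N + 3 O, g/mol, about 101.1

grams = 1.928              # mass of KNO3 given in the problem
volume_ml = 250.0          # assumed volume; not stated in the transcripts

moles = grams / MOLAR_MASS_KNO3   # grams -> moles, as Kelly describes
liters = volume_ml / 1000.0       # milliliters -> liters
molarity = moles / liters         # about 7.63e-2 M
```

Each line corresponds to one of the conversions Kelly narrates: grams to moles via molecular mass, milliliters to liters, then the molarity formula itself.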
Students who produced sensible narratives about problem-solving and grounded their observations in an understanding of the physical world or the theoretical underpinnings of common formulas were consistently more successful in problem-solving, even if they did not apply these skills to all problems. Applying an understanding of the workings of algorithms was a less common skill than applying an understanding of physical reality among these students.
Another interesting observation was the tendency of confused students to attempt to reinvent a certain type of academic practice, either by reframing problems as algebra or by deriving formulas as a first step in problem-solving. The language used by these students mimics that observed in student composition papers considered by foundational figures in composition theory, who note that students fall back on modeling academic language in order to mask ignorance or confusion (Bartholomae, 1997; Bazerman, 2005; Graff, 2003). Given the similarities in these two circumstances, we feel that the application of common methods in composition studies, as outlined by these authors and by Bean (1996), Harris (2006), and Graff (2003), could help students frame narrative competencies in problem-solving.
Based on our findings, we suggest that writing assignments that require students to account for the conceptual sense of calculations and the mathematical principles applied could benefit many students. Students would also benefit from writing assignments that required them to describe problems in terms of their relationship to the physical world. Assignments that encourage self-checking behaviors and annotation of algorithms would likely facilitate metacognition.
Given that many students exhibited a poor understanding of general application of certain mathematical principles, it is likely that they would benefit from writing essays that explain these relationships. Students could, for example, write short essays in which they identify categories of problems that can be solved with the same algorithmic approach and explain why this is the case. Such an assignment draws on Lakoff's (1987) work in cognitive science, a theory of framing that is also often used in composition studies, and is designed to aid students who attempt to plug numbers into formulae without understanding their choices. Another way to use the idea of framing in writing assignments is to ask students to write about whether it is necessary to derive equations for problems that can be solved in five computational steps or fewer. This type of assignment would encourage students to consider the practical and logistical requirements of developing formulas and applying simple arithmetical principles.
Some of these suggestions have been employed in applied science courses such as engineering. Case and Gunstone (2003), for example, have identified two styles of approaching engineering problems: deep and surface learning. Deep learning is encouraged by metacognitive learning activities, while surface learning is consistent with "plug and chug" (symbolic) approaches. Although the characteristics of each style are complex, deep learning generally involves investigating conceptual paradigms and their expression in quantitative terms, while surface learning is more concerned with completing tasks efficiently without achieving a clear understanding of the concepts in question. Case and Gunstone (2003) and Case and Marshall (2004) have identified writing-to-learn activities as key to promoting deep learning and its associated metacognitive properties. Case and Marshall (2004) included journaling as part of a second-year chemical engineering course to achieve metacognitive understanding of the subject matter. Case and Gunstone (2003) describe the use in class of conceptual questions that asked students to think through problem contexts. Describing classroom activities in a second-year chemical engineering class, they note, "the lectures involved students thinking about problem contexts, discussing these amongst themselves, and feeding back responses for the lecturer to use in the whole class context. Conceptual questions often started with the phrase 'what if …' or 'explain why …' and required students to reason through problems in which specific numerical values had not been provided" (Case & Gunstone, 2003, p. 805). Clearly, these researchers rely upon writing, in the form of journaling and communicating problem-solving approaches, as a means for achieving student understanding of the engineering concepts under consideration.
In a similar vein, Hanson and Williams (2008) used writing-to-learn activities as the structural backbone of their pedagogical approach in an introductory engineering statics class. While they could not assert a direct improvement in students' successful completion of problem sets, they did find that some students became more self-aware about the problems and limitations of their learning processes in engineering. Commenting on the effectiveness of their "explain a problem" type of assignment, Hanson and Williams (2008) note, "The … assignment does appear to help students achieve the self-assessment objectives: students discover what they do and do not know, and students recognize the difference between understanding how to solve a problem and blindly plugging numbers into formulas" (p. 522). In this way, students achieve a metacognitive understanding of their own learning processes, one that may lead them to seek assistance more effectively from tutors and professors in the sciences.
Similarly, studies of the application of writing in mathematics pedagogy support our view and suggest approaches similar to the ones outlined above. Shield and Galbraith (1998) defined a series of activities as necessary for "effective knowledge": writing a generalized statement of a procedure, demonstrating it, linking it to prior knowledge, and justifying its use. Pugalee (2004) notes that writing, as opposed to speaking, was more beneficial both for problem solving and for obtaining evidence of metacognition. Miller (1992) used impromptu writing prompts in algebra classes to produce texts that enabled teachers to clarify classroom lecture topics and to reinforce quantitative principles. Impromptu student writing also alerted teachers to gaps in students' understanding and command of terminology. Miller further observed that students were less likely to understand abstract principles than more concrete examples, an observation we made as well.
Finally, compositionists might benefit from studies conducted by science educators in order to understand and align our "Writing in the Disciplines" curriculum with the cognitive issues of translation that arise in the sciences. This kind of attention to the cognitive context of applied science is a crucial step in enabling faculty in both composition and science to appreciate more fully the particular learning issues that trouble students in this area.
Afful, Joseph Benjamin Archibald. (2006). Introductions in examination essays: The case of two undergraduate courses. Across the Disciplines, 3. Retrieved from https://wac.colostate.edu/atd/articles/afful2006.cfm
Bartholomae, David. (1997). Inventing the university. In Victor Villanueva, Jr. (Ed.), Cross-talk in composition theory: A reader. Urbana, IL: NCTE.
Bazerman, Charles. (2005). Reference guide to writing across the curriculum. New York: Parlor Press.
Beall, Herbert. (1998). Expanding the scope of writing in chemical education. Journal of Science Education and Technology, 7(3), 259-270.
Bean, John. (1996). Engaging ideas: The professor's guide to integrating writing, critical thinking, and active learning in the classroom. San Francisco, CA: Jossey-Bass.
Berkenkotter, Carol. (2000). Writing and problem solving. In Toby Fulwiler & Art Young (Eds.), Language connections: Writing and reading across the curriculum (pp. 33-44). Fort Collins, CO: WAC Clearinghouse Landmark Publications in Writing Studies. Retrieved from https://wac.colostate.edu/books/language_connections/
Case, Jennifer, & Gunstone, Richard F. (2003). Approaches to learning in a second year chemical engineering course. International Journal of Science Education, 25(7), 801-819.
Case, Jennifer, & Marshall, Della. (2004). Between deep and surface: Procedural approaches to learning in engineering education contexts. Studies in Higher Education, 29(5), 605-615.
Childers, Pamela, & Lowry, Michael J. (2006). Connecting visual and written text in science. Across the Disciplines, 3. Retrieved from https://wac.colostate.edu/atd/visual/childers_lowry.cfm
Cooper, Duane A. (2002). Assessing writing across the curriculum. The Mathematics Teacher, 95, 170-72.
Driskill, Linda, Lewis, Karin, Stearns, Jenny, & Volz, Tracy. (1998). Students’ reasoning and rhetorical knowledge in first-year chemistry. Language and Learning Across the Disciplines, 2(3), 3-24.
Duffy, Andrew. Interactive lecture demonstrations. Retrieved from http://buphy.bu.edu/~duffy/ILD.html
Foucault, Michel. (2002). The archaeology of knowledge. New York: Routledge.
Fulwiler, Toby, & Young, Art. (2000). Language connections: Writing and reading across the curriculum. Fort Collins, CO: WAC Clearinghouse Landmark Publications in Writing Studies. Retrieved from https://wac.colostate.edu/books/language_connections/
Graff, Gerald. (2003). Clueless in academe: How schooling obscures the life of the mind. New Haven: Yale University Press.
Hanson, James E., & Williams, Julia M. (2008). Using writing assessments to improve students’ self assessment and communication in an engineering statics course. Journal of Engineering Education, 97(4), 515-529.
Harris, Joseph. (2006). Rewriting: How to do things with texts. Logan, UT: Utah State University Press.
Heckelman, Ronald J., & Dunn, Will-Matthis, III. (2003). Models in algebra and rhetoric: A new approach to integrating writing and mathematics in a WAC learning community. Language and Learning Across the Disciplines, 6(3), 74-88.
Herrington, Anne. (1981). Writing to learn: Writing across the disciplines. College English, 43(4), 379-387.
Jamison, Robert. (2000). Learning the language of mathematics. Language and Learning Across the Disciplines, 4(1), 45-54.
Kokkala, Irene, & Gessell, Donna A. (2002-2003). Writing science effectively: Biology and English students in an author-editor relationship. Journal of College Science Teaching, 32(4), 252-257.
Lakoff, George. (1987). Women, fire, and dangerous things: What categories reveal about the mind. Chicago: University of Chicago Press.
Latour, Bruno. (1987). Science in action: How to follow scientists and engineers through society. Cambridge, MA: Harvard University Press.
Law of conservation of matter. (2005). Science course module: Integrated physics and chemistry. Retrieved from the University of Houston, College of Education website: http://atlantis.coe.uh.edu/texasipc/content.htm
Lerner, Neal. (2007). Laboratory lessons for writing and science. Written Communication, 24(3), 191-222.
McLeod, Susan, & Maimon, Elaine. (2000). Clearing the air: WAC myths and realities. College English, 62(5), 573-583.
Merrill, Yvonne. (2004). Writing as situated thinking in general education. Across the Disciplines, 1. Retrieved from https://wac.colostate.edu/atd/articles/merrill.cfm
Miller, Diane L. (1992). Teacher benefits from using impromptu writing prompts in algebra classes. Journal for Research in Mathematics Education, 23(4), 329-340.
Montgomery, Scott L. (2003). The Chicago guide to communicating science. Chicago: University of Chicago Press.
Moore, Randy. (1994). Writing to learn biology: Let’s stop neglecting the tool that works best. Journal of College Science Teaching, 23(5), 289-295.
Poe, Mya. (2000). On writing instruction and a short game of chess: Connecting multiple ways of knowing and the writing process. Language and Learning Across the Disciplines, 4(1), 30-44.
Powell, Alfred. (1985). A chemist’s view of writing, reading and thinking across the curriculum. College Composition and Communication, 36(4), 414-418.
Pugalee, David. (2004). A comparison of verbal and written descriptions of students’ problem-solving processes. Educational Studies in Mathematics, 55(1/3), 27-47.
Rutz, Carol, & Grawe, Nathan D. (2009). Pairing WAC and quantitative reasoning through portfolio assessment and faculty development. Across the Disciplines, 6. Retrieved from https://wac.colostate.edu/atd/articles/rutz.cfm
Shield, M., & Galbraith, P. (1998). The analysis of expository writing in mathematics. Educational Studies in Mathematics, 36(1), 29-52.
Sokoloff, David R., & Thornton, Robert K. (1997). Using interactive lecture demonstrations to create an active learning environment. The Physics Teacher, 35(6), 340-347.
Stromberg, Arnold J., & Ramanathan, Subathra. (1996). Easy implementation of writing in introductory statistics courses. The American Statistician, 50(2), 159-163.
Walvoord, Barbara E. (1996). The future of WAC. College English, 58(1), 58-79.
Base-line math questions
Please show your work (the following two questions are taken from the SAT math section)
Conversion questions
Chemistry Questions
How many moles in
If I start with 4 moles of iron and 3 moles of oxygen, how many moles of Iron (III) oxide can I make?
If I start with 4 moles of iron and 4 moles of oxygen, how many moles of Iron (III) oxide can I make? What is my limiting reagent?
Determine the limiting reagent in the following problem
Combination problems (limiting reagent, calculation of moles, grams)
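For readers outside chemistry, the arithmetic behind limiting-reagent questions like those above can be sketched briefly. This sketch is ours, not part of the original workbook; it assumes the balanced reaction 4 Fe + 3 O2 → 2 Fe2O3, and the function name is illustrative:

```python
def fe2o3_yield(moles_fe, moles_o2):
    """Return (moles of Fe2O3 produced, limiting reagent) for 4 Fe + 3 O2 -> 2 Fe2O3."""
    from_fe = moles_fe * 2 / 4   # 4 mol Fe yields 2 mol Fe2O3
    from_o2 = moles_o2 * 2 / 3   # 3 mol O2 yields 2 mol Fe2O3
    # The reagent that supports the smaller yield runs out first.
    if from_fe <= from_o2:
        return from_fe, "Fe"
    return from_o2, "O2"

# 4 mol Fe and 3 mol O2 are in exact stoichiometric ratio:
print(fe2o3_yield(4, 3))  # (2.0, 'Fe')
# With 4 mol O2, the iron still runs out first; oxygen is in excess:
print(fe2o3_yield(4, 4))  # (2.0, 'Fe')
```

The comparison of the two candidate yields is exactly the reasoning the workbook questions ask students to narrate in prose.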
Taken from "Chemistry: The Basics Chem 130 Workbook," by William Sweeney, with contributions from Pamela Mills, Chemistry Department, Hunter College, CUNY
Fall 2000
[1] Jamison's chosen methodology for making the syntactical structure explicit is to have students write mathematical definitions. He considers this a crucial exercise, since learning mathematical language is roughly equivalent to learning a foreign language, and the basis of both is the construction of viable definitions: "Early in the semester I present the students with a list of roughly twenty common geometrical terms, such as, circle, square… and for homework, ask them to write out definitions" (53). Jamison provides students with a list of criteria for composing "good definitions."
[2] David R. Sokoloff describes the format of an interactive lecture demonstration as follows:
1. Professor describes the experiment and carries it out without recording data.
2. Students record their predictions of the outcome on a Prediction Sheet.
3. Peer discussion follows, with the students discussing their predictions in small groups.
4. Professor engages the class, soliciting predictions and highlighting common predictions.
5. Students record their final prediction on the Prediction Sheet (this is collected).
6. The experiment is run. Real data is recorded and plotted by the computer, with the results displayed graphically for all to see.
7. Professor engages the class, discussing what students say about their predictions and focusing in particular on any common misconceptions. Students record the results on a Results Sheet, which they keep.
8. Professor discusses variations of the experiment and similar physical situations based on the same underlying concepts. (1997, 340)
[3] Although degree of difficulty is always hard to determine, the SAT quantitative section is composed of questions that gradually increase in difficulty and demand progressively more mathematical knowledge. The questions we chose were among the first asked on a recent SAT exam.
[4] When we reviewed a sample of ten sets of written notes from the test-takers, we found that two students answered all of the math questions correctly; one answered all of the math questions incorrectly; and the remaining seven answered, on average, three out of the four math questions correctly, most often missing the first question. As for the chemistry questions, three of the ten students answered all of the chemistry questions correctly, while the rest answered an average of six out of the nine questions correctly.
[5] As a graduate student, Dr. Calvin was awarded a "Writing Across the Curriculum" fellowship at the City University of New York to study the implementation of writing in introductory undergraduate physics and astronomy courses. Since becoming a faculty member at Sarah Lawrence College, he has gone on to develop the Student Scientist Model of pedagogy, which incorporates multiple modes of writing into courses at all levels.
[6] A colleague in mathematics commented that the actual nature of the things being measured should have no bearing on quantitative ability. She even suggested substituting pigs and piglets for stones and pounds in future iterations of this conversion question.
[7] Please note that we have italicized all examples of self-checking and bolded examples of annotation in our discussion of the student responses.
[8] While reviewing a preliminary draft of our article, Scott Calvin noted the following about Christine's answer: "Notice that she doesn't quite get this one right! Aside from the fact that she made some mistakes leading up to this passage, she doesn't consider that moles can be fractional. With four moles of iron and a 3:1 ratio, she can make 4/3 moles of Fe3O, which itself is not the right chemical formula. Nevertheless, the result of her self-checking is an answer that is closer than the wrong answer she was initially heading toward (3 moles of Fe3O), because she could tell that the 3 moles of Fe3O 'didn't make sense.'"
[9] Briefly, the Law of Conservation of Mass (or matter) states that mass is neither created nor destroyed in an ordinary chemical reaction (Law of Conservation, 2005).
[10] Molarity is moles of solute per liter of solution.
[11] The question asks: if 1.928 g of KNO3 is dissolved in enough water to make 250.0 mL of solution, what is the molarity of the potassium nitrate?
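For readers who want the arithmetic behind this molarity question, a minimal worked sketch follows; it is ours, not the workbook's, and assumes standard atomic masses (K 39.10, N 14.01, O 16.00 g/mol):

```python
# Molarity = moles of solute / liters of solution.
grams_kno3 = 1.928
molar_mass_kno3 = 39.10 + 14.01 + 3 * 16.00   # 101.11 g/mol for KNO3
liters_solution = 250.0 / 1000                 # 250.0 mL -> 0.2500 L

moles_kno3 = grams_kno3 / molar_mass_kno3
molarity = moles_kno3 / liters_solution
print(round(molarity, 4))  # about 0.0763 M
```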
Rich, Jennifer, Miller, Daisy & DeTora, Lisa. (2011, June 27). From concept to application: Student narratives of problem-solving as a basis for writing assignments in science classes. Across the Disciplines, 8(1). Retrieved from https://wac.colostate.edu/atd/articles/richetal2011.cfm