

It's Not That They Can't Read; It's That They Can't Read: Can We Create "Citizen Experts" Through Interactive Assessment?

Abstract: Examining deficits in the development of adolescent literacy, this article explores conceptions of literacy not as a "simple" exercise that merely requires layperson understandings of words, but rather, as research shows, as a deeper action that requires disciplinary knowledge and standing. Vygotsky's distinction between tool users and sign users further explicates why educators should not consider true literacy to be the acquisition of terminology, but rather the internalization of language such that the language actually changes paradigms and shapes thought. As one potentially new way to cultivate literacy in students, this article describes a study of interactive writing assessment. Students who summatively assessed each other's work also reported improvement in their ability to read more deeply, presumably because of their inclusion in a disciplinary activity. Involving students in writing assessment can potentially result in dramatic gains in their reading abilities because doing so invests students in (1) the reading of course texts as resources for their own papers, (2) the reading of other students' papers, (3) the reading of how other students read the course texts, and (4) the discussion of meaning making about all of the above.

Background

The picture of adolescent literacy is as clear as it is dark: Few students demonstrate deep reading skills (Graham & Perin, 2007); according to the National Assessment of Educational Progress (NAEP), only 5% of tested students could satisfactorily comprehend a given text, identify the piece's point, and make meaningful connections between one text and another (NCES, 2007). Even setting such deeper reading aside, the NAEP's Grade 12 Reading and Mathematics 2009 National and Pilot State Results found that "The average reading score [for 12th graders] in 2009 was higher than in 2005 but lower than in 1992" (p. 1) and that only "[t]hirty-eight percent of twelfth-graders performed at or above the Proficient level in reading in 2009" (p. 1).

But proficient at reading what?

Referencing Renaissance Learning's (2012) report on the books read by almost 400,000 students in grades 9–12 in 2010–2011, Sandra Stotsky notes that "the average reading level of the top 40 books is a little above fifth grade (5.3 to be exact)" (p. 4). As Stotsky observes, "a fifth-grade reading level is obviously not high enough for college-level reading. Nor is it high enough for high school-level reading, either, or for informed citizenship" (p. 4).

Yet while informed citizenship is an important long-term goal, the lack of deep reading poses other, more immediate problems. Temporarily setting aside the obvious importance of literacy for its own sake, students' frail reading abilities impact other aspects of their academic work, especially their writing. As part of The Citation Project, in "Writing from Sources, Writing from Sentences" (2010), Rebecca Moore Howard, Tricia Serviss, and Tanya K. Rodrigue studied 18 student papers and all of their referenced sources in order to determine the caliber of the in-text references made to the source material. They broke the references down into four categories: summary—"restating … with fresh language and reducing … by at least 50%"; paraphrasing—"restating … though sometimes with keywords retained from that passage"; patchwriting—"reproducing source language" with minor modifications to wording and structure; and copying—"exact transcription" (p. 181).

Howard, Serviss, and Rodrigue reference research showing that summary emerges from deep comprehension (Angelil-Carter, 2000; Brown & Day, 1983; Harris, 2006), but that patchwriting signifies a lack of understanding (Angelil-Carter, 2000; Roessig, 2007; Roig, 2001). Viewing their research through that lens, the authors determined that the student writers of the 18 papers generally did not comprehend the works they cited: Although 89% of the papers contained "one or more incidences of patchwriting," none of the papers contained any summary (p. 182). (All of the papers also contained paraphrasing, and 72% of the papers also involved "one or more incidences in which direct copying is not marked as quotation" (p. 182).) This finding is consistent with other research by Horning (2010), whose review of the literature revealed that "it is at least in part the lack [of strong reading skills] that leads to inadvertent plagiarism" (p. 154).

Such high incidences of patchwriting, not to mention uncited copying, lead to the all-too-frightening extrapolation that students lack the ability to read texts for even minimal levels of comprehension. Supposing that's the case (and that students aren't just lazy in applying otherwise capable reading skills), it raises an obvious question: Why? We can attribute at least part of students' failure to read to how faculty understand students' reading abilities. Too many faculty members hold "simple views of reading" that approach adolescent and older readers as if they already learned to read in early grades and need no further development of more complex textual skills (Greenleaf, Schoenbach, Cziko, & Mueller, 2001, p. 84; Gough & Tunmer, 1986; Gough, 1983).

Literacy Means More Than Just Reading

But authentic literacy isn't that simple. Moje, Young, Readence, & Moore (2000), for example, identify an essential difference between reading words and more meaningful disciplinary literacy, arguing that for "many youth who have mastered the basic processes of reading and writing by the time they reach fourth grade, there is still much to learn about the practices associated with literacy, especially the ones unique to different disciplines, texts, and situations" (p. 401). Based on analyses of contemporary research, Biancarosa & Snow (2004) similarly criticize conceptions of reading as a simple, finite skill, arguing that "most older struggling readers can read words accurately, but they do not comprehend what they read, for a variety of reasons" (p. 8).

Chief among those reasons: the complex relationship between reading and subject matter, a relationship that spurs many contemporary researchers to call for viewing the teaching of subject matter as an act of teaching literacy itself (Moje, 2008; Conley, 2008; Shanahan & Shanahan, 2008). That view reflects a growing recognition that literacy requires understandings of discipline (Heller & Greenleaf, 2007; Best, Rowe, Ozuru, & McNamara, 2005) and genre (Bawarshi, 2003). "Construing comprehension" disciplinarily, "as part of domain knowledge" rather than as a skill that exists prior to or independently of domain knowledge, "may mean that teachers would attend to the complexities of teaching [reading comprehension] as part of their academic instruction" (Ehren, 2009, pp. 192-193).

To clarify that distinction between reading words and real literacy, we should consider Gee's (2004) discussion of "specialist language"—the language particular to any field or subculture that exists outside the common vernacular. "It is obvious," Gee contends, "that once we talk about learning to read and speak specialist varieties of language, it is hard to separate learning to read and speak this way from learning the sorts of content or information that the specialist language is typically used to convey. That content is accessible through the specialist variety of language and, in turn, that content is what gives meaning to that form of language. The two—content and language—are married" (pp. 17-18). Thus, Gee contends that language, terminology, definitions, and the like gain meaning only through context and culture, which means that reading any word beyond its pedestrian definition requires situatedness within a discipline. Such a situated conception of literacy recalls Fish's (1980) point that although a "text" comes with literal meanings, its deeper meanings depend entirely on the context in which the word emerges. Both Gee and Fish therefore unpack Biancarosa & Snow's point about how older readers can read a word for its more literal meaning but fail to comprehend that word's deeper meaning and value within a given discipline.

And doesn't the lack of deeper reading comprehension also speak to why students engage in patchwriting? If students cannot deeply comprehend the real meaning of what they read as it applies to the disciplinary construct, doesn't it also make sense that they would forgo attempts to summarize effectively? In other words, if they lack the ability to summarize the deep meaning of a text, why not at least incorporate seemingly meaningful snippets of the text into their own work so as to manufacture a paper that looks disciplinary but is not disciplinary at all?

A Path Forward: Modeling Disciplinary Literacy Practices

As evidence for my contention that students turn to patchwriting because they intuitively understand that they cannot engage the subject matter in a truly meaningful way, I turn to Vygotsky's (1978) distinction between language that plays an external role and language that transforms thought internally. Vygotsky articulates that distinction in language use through the concepts of the tool and the sign:

A most essential difference between sign and tool, and the basis for the real divergence of the two lines, is the different ways that they orient human behavior. The tool's function is to serve as the conductor of human influence on the object of activity; it is externally oriented; it must lead to change in objects. It is a means by which human external activity is aimed at mastering, and triumphing over, nature. The sign, on the other hand, changes nothing in the object of a psychological operation. It is a means of internal activity aimed at mastering oneself; the sign is internally oriented. (1978, p. 55)

Thus tools, though lacking in internal value and unable to shape thought, serve their purposes by affecting the external world. Signs, in contrast, literally change how the thinker views the world.

Consider, for example, the concept of "experiment." Students entering a scientific discipline assuredly possess some layperson understandings of "experiment," a definition perhaps akin to loosely playing around with different things to see what works. When initially required to complete an "experiment" in a science course, the students might work from their more vernacular conceptions of what "experimenting" entails. When discovering conflict between their vernacular understandings and disciplinary understandings, students will complete the "experiment" according to the professor's specifications: They will follow the prescribed steps. They will use the prescribed materials. They will move through the scientific process. "Experiment," though perhaps taking on some new, more scientific meaning, nevertheless will function as a tool, as that thing to do when doing "science," if not just as that thing to do in order to get a good grade in the science course. But at that stage, "experiment" will not function internally. It will be a word and concept to employ for external result.

Eventually, however, students who persist in the sciences will come to adopt "experiment" for purposes beyond the external. They will start to think in terms of "experiments," and they will come to appreciate the social currency that an effective "experiment" holds in the scientific community. They will begin to think more through the scientific method by looking at the world in terms of testable hypotheses. They will view the world more in terms of "experiments" that they could conduct. They will find internal, intellectual rewards in "experiments," and those internal rewards will match or exceed the value of external rewards, i.e., grades.

When that transition occurs, and it will be a continuing process, students will move from using "experiment" as a tool for success in a class to using "experiment" as a sign that shapes how they interact with the world. The notion of "experiment" will come to hold meaning for them. It will, in Vygotsky's words, "come to organize the child's [or student's] thought, that is, become an internal mental function" (1978, p. 89).

If developing this "internal mental function" defines the true goal of literacy, we must exercise great care in how we define and teach "content." Consider that content-oriented educators often lack the requisite expertise to facilitate student literacy skills (Jetton & Dole, 2004); that, if they do possess the skills, they do not exercise them (Jetton & Dole, 2004; Pressley, 2002); or that "supports for students in the forms of explicit reading instruction" might even be "diminishing" (Jetton & Alexander, 2004, p. 15).

But then we should remember that there are paths forward: Modeling disciplinary literacy practices and meaning construction produces dramatic effects on students' literacy abilities (Monte-Sano, 2011; Hynd-Shanahan, Holschuh, & Hubbard, 2004; Rouet, Favart, Britt, & Perfetti, 1997). That research affirms both the need for and the value of moving students from tool users to sign users, and it raises an important question: Since content cannot exist independently of the literacy that understands (or produces) that content (e.g., there is no such thing as an "experiment" aside from how a discourse community understands it), isn't the process of teaching "content" always really the process of teaching literacy? Shouldn't "content" educators really only worry about teaching the process of literacy in their discipline?

Returning to the "experiment" example, a literacy approach to content means that teaching "experiment" as mere content will always suffer relative to teaching "experiment" as a construct within a discipline, one that requires a disciplinary literacy to appreciate. Content literacy about "experiment" becomes a discussion about what constitutes best practices in "experiment" and what would "really" constitute a "good experiment," i.e., what "experiment" really means, something only those who are highly literate in that discipline can define.

Conversely, without substantive guidance in how to read like a member of the discipline reads, students are unlikely to move beyond surface reading and patchwriting. After all, if students can "read" words but not disciplinarily "comprehend" them, then summary becomes impossible. Faced with an inability to summarize but a requirement to do so, it makes logical (but not ethical) sense that students will embrace what they read as tools—passages they can use for external purposes, such as meeting page-length requirements—rather than as signs—meaningful concepts that shape their vision of the world itself.

If we therefore conceptualize literacy as a movement from tool reading to sign reading, the imperative question is this: How do we make students sign readers? To that end, there are far too many valuable resources on teaching literacy to list here. Yet I want to add a perhaps unlikely pedagogy for teaching reading to that list: interactive writing assessment.

Peer Assessment and the Practices of the Discipline

In 2010, I conducted a study to determine how my writing students' conceptions of grading and writing changed over a semester in which they participated in the summative assessment of one another's writing. Specifically, I was interested in determining if participating in summative assessment increased or decreased the importance students placed on the grades they received.

The study was conducted at a private college in the Northeast that U.S. News and World Report ranks as one of the top ten Regional Universities in the North. Demographically, the college serves a student body of about 5,000 who hold an average high school GPA of 3.3, and 54% of the student body is female. On the very first day of their first-year writing course, the eight women and six men enrolled were told that they would interactively and summatively assess, i.e., grade, one another's papers through a guided process involving teacher input and a common rubric. The interactive grading process moved from whole-class workshops, to group workshops, to paired assessment, to self-assessment.

The study, which received exempt IRB approval, involved four touch points during the semester, each one asking the students to complete a written survey, comprising both qualitative narrative and quantitative measures, about their experience in interactive assessment. The completed surveys were held anonymously by the dean's office, out of my purview, until the deadline for changing grades had passed. All of the students who completed the surveys earned an A for 10% of their course grade; I determined their "completion" merely by noting that they submitted a document to the folder that a student later sealed and delivered to the dean's office. On the day of the first survey, students also submitted a consent form to the dean's office, and I had no knowledge of who consented or refused consent until after the grade-change deadline. All of the students in the course consented to be part of the study.

Although the study in no way sought to examine how students' reading skills developed through interactively assessing one another's writing, interesting results nevertheless emerged. Perhaps in part because the students used a common body of texts (partially submitted by the students themselves), all of the students spoke directly to reading text differently as a result of their involvement in summative assessment. All spoke to deeper abilities to read text (e.g., "I am learning how to read" more critically), and some even spoke to how the process allowed them to deepen their conceptions of textuality: "I like reading what other people have written because it is interesting to see how different … papers are when we have a limited amount of articles to use." Thus, students engaged reading not as the mere ingestion of words but as comprehension and interpretation, and they equally interpreted and analyzed one another's interpretations of text. The interactive assessment process therefore sparked a fourfold reading experience: (1) the reading of course texts as resources for their own papers, (2) the reading of other students' papers, (3) the reading of how other students read the course texts, and (4) the discussion of meaning making about all of the above. All of those processes explain why one student asserted that the process "made [him] an astute reader."

Viewing my study's results through Vygotsky's tool/sign lens, the interactive assessment process helped create astute readers by placing students in a construct where they could not merely read words, nor merely place those words in their own texts. Rather, although I gave no explicit reading instruction on the course texts, the summative assessment of one another's papers helped transform students from tool users to sign users because, based on my observations and their reports, assessing interpretations forced students to change how they interacted with text:

First, it habituated them to become active readers of one another's work. They could not merely read the work and ingest the words; in order to grade one another's work, they needed to become meaning-makers who could analyze the construction of ideas relative to established disciplinary standards.

Second, and as a result of the first, students learned to engage course texts more critically because interactive assessment forced them to examine how their peers constructed meaning from those texts. As such, the students came to see the course texts, as well as other students' invocations of those texts, as symbolic constructs that held varying values of meaning relative to the course (read: disciplinary) standards. In other words, by critiquing their peers' use of course texts, students saw that texts could yield different meanings of varying validity and value; they saw that there were interpretations the class and discipline would embrace or reject.

All told, interactive assessment made students into valued agents within the discipline itself; it taught them to use texts for disciplinary purposes, to read and apply concepts (such as grading standards and textual interpretation) as members of the discipline do, and to think like disciplinary members. And they had to do that: If they didn't, they would not grade one another fairly, which meant that they would not receive fair grades themselves.

In other words, the students could not rely on a vernacular understanding of "experiment." Rather, through a guided process, they not only came to understand how the discipline employed texts and terms but also participated in the construction of meaning. The process of assessing one another, of assessing the value of one another's interpretations of student text and course text alike, made them not just understand "experiment"; it made them internalize "experiment" as a worldview.

That said, it seems imperative to note that, aside from grades on the papers, which improved dramatically from the start of the course to the end, the study contained no independent assessment of deep reading ability. Therefore, although improved grades on papers that required deep interpretation of text are an indicator that such deeper interpretation actually occurred, some of this report also relies on the students' perceptions of their development as readers rather than on external evidence of that growth.

Furthermore, it is clear that future studies must more explicitly and directly investigate how students' reading abilities change through interactive assessment. Studies that test how students read course texts before and after a semester of interactive assessment could prove valuable in determining the full worth of interactive assessment as a literacy pedagogy.

Finally, it seems worthwhile to note some limitations of the study, such as the small sample size, the lack of significant ethnic diversity among the students, and the fairly affluent nature of the college. All of my conclusions need to be tested in other collegiate environments.

Nevertheless, the study suggests the power that comes when faculty involve students in disciplinary activity, in this case the assessment of writing, something that makes them privy to, if not active agents of, the kind of specialized understanding (and power) that enables disciplinary agency.

To borrow Sidler's (2005) phrase, bringing students into disciplinary thinking means not only helping them understand the language of the discipline but also developing them to "become responsible 'citizen-experts' within their communities" (p. 49), much like students who possess the ability to assess one another's texts and one another's interpretations of course texts. The implications therefore seem clear: Teaching students to understand the conventions of a discipline, though inarguably valuable, might not go far enough; if we really want students to comprehend in that word's deeper meanings, we need to involve students in the practices of the discipline. Making students "citizen experts" means that they will participate in the ownership of that discipline and its literacy practices; it means, therefore, that we cannot conceptualize content as existing outside of how expertly anyone, students included, can participate in the literacy practice that unrelentingly conceptualizes—creates—that content.

References

Angelil-Carter, Shelley. (2000). Stolen language? Plagiarism in writing. New York: Longman.

Bawarshi, Anis. (2003). Genre and the invention of the writer: Reconsidering the place of invention in composition. Logan, UT: Utah State University Press.

Best, Rachel M., Rowe, Michael, Ozuru, Yasuhiro, & McNamara, Danielle S. (2005). Deep-level comprehension of science texts: The role of the reader and the text. Topics in Language Disorders, 25(1), 65.

Biancarosa, Gina, & Snow, Catherine. (2004). Reading next--A vision for action and research in middle and high school literacy: A report to Carnegie Corporation of New York. Washington, DC: Alliance for Excellent Education.

Brown, Ann L. & Day, Jeanne D. (1983). Macrorules for summarizing texts: The development of expertise. Journal of Verbal Learning and Verbal Behavior, 22(1), 1-14.

Conley, Mark W. (2008). Cognitive strategy instruction for adolescents: What we know about the promise, what we don't know about the potential. Harvard Educational Review, 78(1), 84-106.

Ehren, Barbara J. (2009). Looking through an adolescent literacy lens at the narrow view of reading. Language, Speech, and Hearing Services in Schools, 40(2), 192-195.

Fish, Stanley. (1980). Is there a text in this class? The authority of interpretive communities. Cambridge, MA: Harvard University Press.

Gee, James. (2004). Situated language and learning: A critique of traditional schooling. New York: Routledge.

Gough, Phillip B., & Tunmer, William E. (1986). Decoding, reading, and reading disability. Remedial and Special Education (RASE), 7(1), 6-10.

Graham, Steve & Perin, Dolores. (2007). Writing next: Effective strategies to improve writing of adolescents in middle and high schools - A report to Carnegie Corporation of New York. Washington, DC: Alliance for Excellent Education.

Greenleaf, Cynthia L., Schoenbach, Ruth, Cziko, Christine, & Mueller, Faye L. (2001). Apprenticing adolescent readers to academic literacy. Harvard Educational Review, 71(1), 79-129.

Harris, Joseph. (2006). Rewriting: How to do things with texts. Logan, UT: Utah State University Press.

Heller, Rafael, & Greenleaf, Cynthia L. (2007). Literacy instruction in the content areas: Getting to the core of middle and high school improvement. Washington, DC: Alliance for Excellent Education.

Horning, Alice S. (2010). A potential (solution) to the plagiarism problem: Improving reading. Journal of Teaching Writing, 25(2), 143-175.

Howard, Rebecca M., Serviss, Tricia, & Rodrigue, Tanya K. (2010). Writing from sources, writing from sentences. Writing and Pedagogy, 2(2), 177-192.

Hynd-Shanahan, C., Holschuh, J. P., & Hubbard, B. P. (2004). Thinking like a historian: College students' reading of multiple historical documents. Journal of Literacy Research, 36(2), 141-176.

Jetton, Tamara, & Alexander, Patricia A. (2004). Domains, teaching, and literacy. In Tamara Jetton & Janice Dole (Eds.), Adolescent literacy research and practice. New York: Guilford Press.

Jetton, Tamara, & Dole, Janice. (Eds.). (2004). Adolescent literacy research and practice. New York: Guilford Press.

Moje, Elizabeth. (2008). Foregrounding the disciplines in secondary literacy teaching and learning: A call for change. Journal of Adolescent & Adult Literacy, 52(2), 96-107.

Moje, Elizabeth, Young, Josephine, Readence, John E., & Moore, David W. (2000). Reinventing adolescent literacy for new times: Perennial and millennial issues. Journal of Adolescent & Adult Literacy, 43(5), 400-410.

Monte-Sano, Chauncey. (2011). Beyond reading comprehension and summary: Learning to read and write in history by focusing on evidence, perspective, and interpretation. Curriculum Inquiry, 41(2), 212-249.

Pressley, Michael. (2002). Comprehension strategies instruction: A turn-of-the-century report. In Cathy C. Collins & Michael Pressley (Eds.), Comprehension instruction: Research-based best practices (pp. 11-27). New York: Guilford Press.

Roessig, Lesley. (2007). Making research matter. English Journal, 96(4), 50-55.

Roig, Miguel. (2001). Plagiarism and paraphrasing criteria of college and university professors. Ethics & Behavior, 11(3), 307-323.

Rouet, J. F., Favart, M., Britt, M. A., & Perfetti, C. A. (1997). Studying and using multiple documents in history: Effects of discipline expertise. Cognition and Instruction, 15(1), 85–106.

Shanahan, Timothy, & Shanahan, Cynthia. (2008). Teaching disciplinary literacy to adolescents: Rethinking content-area literacy. Harvard Educational Review, 78(1), 40-59.

Sidler, Michelle. (2005). Claiming research: Students as "citizen experts" in WAC-oriented composition. The WAC Journal, 16, 49-60.

Stotsky, Sandra. (2012). What should kids be reading? In What kids are reading: The book-reading habits of students in American schools (pp. 4-5). Wisconsin Rapids, WI: Renaissance Learning.

The Citation Project. (n.d.). http://site.citationproject.net

Vygotsky, Lev S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.

Complete APA Citation

Pearlman, Steven J. (2013, December 11). It's not that they can't read; it's that they can't read: Can we create "citizen experts" through interactive assessment? Across the Disciplines, 10(4). Retrieved from https://wac.colostate.edu/atd/reading/pearlman.cfm