
CCCC 2006 in Review

F11 WAC in the New Millennium: A Campus-Wide Survey of Attitudes and Practices

This panel told the story of five colleagues (officially titled "The UGA Writing Alliance") who realized that no full-scale survey of writing on their campus had ever been conducted, and so they joined forces to conduct one.

Christy Desmet, "What Came Before"

Desmet began by providing some historical precedents from what she described as "the first wave of WAC," including Mike Rose's 1979 article "When Faculty Talk about Writing" and C. W. Griffin's WAC surveys. What Rose, Griffin, and others found, said Desmet, was that nearly everyone surveyed in the early years of WAC valued argument over fact, style, and grammatical felicity and that nearly everyone agreed that writing was not the sole responsibility of English departments but a shared responsibility across the curriculum. No one, however, agreed on matters of style; views varied widely across the disciplines, and most respondents found the style taught in departments of English to be too "flowery" and suggested that students ought to be learning to write "clear and precise" prose instead. Based on the consistent evaluation of writing as important to all disciplines and the prevailing attitude that "writing is writing," WAC proponents began to seek a rubric for writing that could stretch across the curriculum, but such a utopian vision was never realized.

Desmet then described the circumstances at the University of Georgia that provided both the impetus and the support for The UGA Writing Alliance's own WAC survey. UGA's Writing Intensive Program (WIP), headed by Michelle Ballif, was founded in 1996; initially 18 sections, the program has grown to 42 sections within the College of Arts and Sciences. In 2000, UGA's Teaching Academy sought to learn more about writing and the teaching of writing. In 2004-2005, the National Survey of Student Engagement scores indicated that students report doing little writing in college after required first-year English classes and claim to write very few drafts when they do receive writing assignments in their college courses. In this same academic year, UGA's Task Force on General Education came up with several recommendations for faculty. Their number one recommendation was to incorporate more writing in the curriculum. Recommendation number ten was to offer more writing intensive courses throughout the university so that undergraduates would be exposed to significant writing assignments throughout their college careers. Finally, UGA's First-year Composition program had recently undergone a review of its own goals for writers and as a consequence had devised a new rubric and moved from a high-stakes final written essay exam to a portfolio model of assessment.

Parker Middleton, "Constructing the Survey" (aka "Making Sausage")

Middleton began with the good news—in the first large-scale survey of faculty attitudes towards writing on the UGA campus, the survey evidence indicated that faculty are assigning a lot of writing across the curriculum. The respondents also indicated that they were eager for assistance; they had questions about how spending time on teaching writing would affect their ability to cover the content of their courses, and they had general questions about how to work with student writing in their courses.

Middleton described the survey responses as a "snapshot" of faculty attitudes; 21.6% of the teaching faculty responded. However, Middleton remarked, those responses were "rich and varied."

Middleton then described the survey process itself. The survey was Web-based, since most faculty are used to doing things online, and it was conservatively designed, since Middleton's research suggested that plain surveys elicit higher response rates (Don Dillman is the "go-to guy" for Internet surveys, mentioned Middleton). The request for survey responses was initially sent to 1625 faculty in an email memo from the Vice President for Instruction with an embedded URL that redirected the recipient to the survey. This method of solicitation provided the survey team with some instant credibility. Of those people who began the survey, more than three-quarters completed it fully. While the survey team chose not to send out "nagging" follow-up emails, the survey did receive some additional publicity in the campus-wide faculty and staff newsletter and in the student newspaper.

According to Middleton, when writing the survey, the team tried to edit out as much discipline-specific prose as possible and tried to write questions that presented reasonable demands for the respondents and that fueled thought without co-opting responses or providing answers. In other words, explained Middleton, the team intentionally tried to design the survey as a "feel good" experience for the respondents. The team was gratified to see that many respondents wrote lengthy answers and also thanked them for conducting the survey.

Middleton concluded by returning to the primary survey finding—that most faculty do care about writing—and by stating the team's follow-up plans, which include face-to-face interviews with respondents, a student survey, and a query of non-respondents.

Deborah Miller, "What We Found: The Perspective of First-Year Composition"

Desmet and Hayes filled in for the absent Miller. Hayes began by displaying a chart of responses to the question, "How important is each of these qualities when you grade undergraduate writing in this course?" ("This course" refers to one that the faculty member had described in an earlier question). These were the survey responses in descending order of importance:

   Factual correctness (more than 75%)
   Quality of ideas or insight
   Quality of argument (described as a "distant 3rd")
   Support or evidence
   Quality of research
   Analytical sophistication
   Grammar, usage, punctuation
   Writing style
   Awareness of audience
   Disciplinary conventions (including documentation style)


Desmet then identified the UGA First-year Composition program's goals for writing, which she presented in descending order of importance.

The "B" threshold in UGA's FYC program is described as "Skillful/Persuasive" on the grading rubric. A student writer moves up into "Skillful/Persuasive" from "Credible/Competent/Complete" (or "C") when he or she demonstrates audience awareness and coherence.

When comparing the FYC writing goals to the goals of writing instruction in other departments represented by the survey's respondents, the team found that most respondents also valued highly the notion of writing as a "mode of learning" and recognized the importance of evidence and content. Some of the respondents valued writing as a process, collaborative writing (or teamwork), and evidence of the writer's own voice. Few of the respondents valued writing as a means of communication, which led the team to conclude that most respondents were not considering writing as a means of reaching a public (non-classroom) audience. To the team's surprise, "disciplinary conventions" were totally unrecognized by the survey's respondents.

Next, Desmet compared the kinds of writing done in FYC to the kinds of writing done across the curriculum. Students in UGA's FYC classes write 4-5 "processed" papers as well as journals, responses to discussion questions, quizzes, and exercises. In their e-portfolio, they compile two pieces of reflective writing (a reflective introduction to their portfolio and a reflection on a revision assignment), a peer review response, two pieces of personal writing (a biography and a "wild card" of their choice), two of their "processed" papers, and a multimedia exhibit.

University-wide, most writing assignments were described as short answer, followed by short (3-6 page) essays, presentations or reports, and essay exams. Long essays (> 5 pages), lab reports, research papers, journals, and WebCT postings were occasionally assigned, and portfolios, websites, and blogs/wikis/other multimedia genres were rarely assigned.

These results suggested to the team that on the one hand FYC needs to assess its role as a service course meant to prepare writers to do the kinds of academic writing they'll be assigned in other college courses, but on the other hand FYC also needs to think of its role as a leader and innovator when designing and assessing writing assignments.

In the interest of time, Christopher Hayes focused his presentation on an assessment of the comments on qualities of "good writing" rather than on the perspective of academic support units. Based on the team's interpretation of the respondents' comments, the faculty at UGA generally believe "good writing" shares certain qualities, which Hayes listed in descending order of importance.

Hayes next described some of the disciplinary differences. The Arts and Humanities faculty were the only ones who made comments about the aesthetic quality of the writing, using descriptive words such as "evocative," "delights," "craft," and "performance." These respondents valued such qualities above accuracy. The Education faculty valued the personal in writing and used such descriptive terms as "personality of the writer," "personal input," "voice and style," and "reveals convictions and beliefs." The Social Sciences faculty also valued personal voice, but were seeking a balance of the writer's thoughts and a survey of the literature. The Biological and Physical Sciences faculty valued factual accuracy and logical thought and organization most highly. These departments were the most concerned with proofreading and also placed a premium on proper citations and students' use of sources. The Professional faculty (including Law, Journalism, Family and Consumer Sciences, etc.) had a range of responses, but generally sought clear, concise writing that "engaged the reader." Comparing the range of definitions of "good writing" to the low ranking of "disciplinary conventions" as grading criteria suggested that the faculty value but may not be self-consciously aware of those conventional differences.

In addition to seeking descriptions of "good writing," the team sought to discover what resources faculty were using to assist student writers. According to the survey results, 50% of faculty were not referring students to any outside sources; 50% were recommending that the students meet with the course instructor or course TA; 30% were referring their students to the Writing Center; and 25% were referring their students to the tutoring resources available through the Division of Academic Enhancement. Most did not refer their students to the research librarians.

Hayes concluded that one of the outcomes of the survey might be to reacquaint the faculty with the available resources on campus so that the faculty will refer more of their students to these resources and/or refer them more often to these resources.

Michelle Ballif, "What We Found: The Perspective of WIP (Writing Intensive Program)"

Hayes read the absent Ballif's paper, which focused on the misperception that "good writing" (generally understood as clear, concise, and grammatically correct) transcends disciplinary differences and the proverbial sense that "writing skills" are "out there" and should be picked up by students before they enter college. Referencing Lee Ann Carroll's research in Rehearsing New Roles: How College Students Develop as Writers, Ballif explained that while many professors simply assume that students don't know "how to write," what they are failing to recognize is that college students are often being asked to write for situations they haven't encountered before; that is, students are being asked to complete increasingly complex and disciplinary-specific "literacy tasks" (to use Carroll's term). To complicate matters, most faculty, according to Ballif, are unable or unwilling to articulate what "good writing" means in their discipline or how "good writing" generates knowledge in their field. The survey of UGA faculty supported this conclusion, since the respondents showed little self-consciousness of disciplinary differences and less than 1% self-consciously specified a disciplinary characteristic of writing in response to the survey's questions. Therefore, Ballif concluded, amending faculty perceptions of the quality, or lack thereof, of student writing requires a revision of faculty perceptions and behaviors about "good writing." How, she wondered, might faculty members be taught to reconceive writing assignments as "literacy tasks" and to identify the "ways of knowing" unique to their fields?

One revision in thinking, Ballif suggested, might be brought about by the foci of university-wide writing programs. WAC, she contended, has more of a "write and stir" methodology and emphasizes the commonalities and the portability of "writing across the curriculum." WID, or writing in the disciplines, programs, on the other hand, emphasize the disciplinary nature of writing and invite students to engage in the production of disciplinary-specific scholarly knowledge.

In addition to being made aware of the acculturating function of writing as a way for students to become disciplinary practitioners in their fields, faculty members in all the disciplines need to be trained on how to assign and respond to student writing, concluded Ballif, and they need to be rewarded by the system for their efforts.

Question and Answer

When asked about the seeming disconnect between some of the responses the survey respondents provided to the choices supplied by the survey team and the respondents' written comments to the more open-ended questions, the panelists replied that they plan to follow up on those disconnects in the face-to-face interviews. They also want to follow up on some of the apparent differences between the respondents' definitions of "good writing" and the requirements of their writing assignments.

Another audience member suggested that even though the survey team tried to design the survey without any disciplinary specific biases, some of the terms they used (e.g., "audience awareness") might be skewed to the humanities. The panelists agreed that they might be so deeply embedded in their disciplinary conventions that they failed to see them and that they might not be giving the sciences, for instance, a "fair shake" in their interpretation.

— Alexis Hart

For more information on the CCCC 2006 convention,
visit the NCTE Web site at http://www.ncte.org/profdev/conv/cccc/.