WAC Clearinghouse home page

Writing Technologies and WAC: Current Lessons and Future Trends

Developing and Assessing an Online Research Writing Course

Abstract: "Developing and Assessing an Online Research Writing Course" discusses how the Writing Program at the University of California at Santa Barbara (UCSB) created and assessed a hybrid, online research writing course, Writing 50. The assessment, which is at the center of this piece, was dovetailed with assessment literature in the field of Composition and with previous assessment practices at UCSB. Drawing on surveys, student work, focus-group transcripts, and other materials gathered during the assessment of Writing 50, this piece ultimately offers a way of thinking about when it might make sense to engage in online instruction, how research writing does and does not get done online, the advantages that a hybrid, online class provides for peer review, and the fact that online courses allow students to do something they value and need to master, namely multitasking. The overriding point of this piece, however, is how we might, as a field, dovetail assessment to create a system of assessment that makes sense for online WAC/WID courses.

Introduction

Writing programs of all kinds are moving into online instruction. Sometimes the move is driven by market concerns (as Hohendahl, 2000, thinks is the case with writing instruction at the University of Phoenix); by concerns about the consistency of student experience across many sections of a given composition class (as Hochman, 2003, points out is one of the driving concerns with ICON/TOPIC at Texas Tech); or even simply because someone in the upper administration is demanding it. At the University of California at Santa Barbara (UCSB), the reason for getting into online writing instruction was even more prosaic: Summer Session administrators asked the then Writing Program Administrator, Susan McLeod, to set up an online version of a required sophomore- and junior-level class, Writing 50, because many students were finishing their studies without actually completing this required, research-based writing class. The new class, designed and implemented over the course of a year, involved a week of face-to-face meetings, eight weeks of online work, and a final face-to-face meeting.

However, what started out as a way to help students who were either turned away from the class (turn-away numbers for Writing 50 have averaged around 200 students a quarter for a number of years) or had "neglected" to take a required class became something quite different. Assessment of students' experience of the course was built in from the beginning to see if there was, as Thomas L. Russell (2001) argues in his book of the same title, "no significant difference" between students' experience of the online and face-to-face versions of Writing 50. The end result is that students, and even their teachers, concluded that there was "no significant difference" between the online and face-to-face courses in terms of student satisfaction, student experience, and even student learning.

But this is not to say that there weren't significant differences in the particular ways that students viewed their experiences. Students in the online classes valued the "flexibility" of these classes, the learning in the classes, and some even stated a preference for the "multitasking" that they were allowed to do in the classes; and many students in the face-to-face classes thought that taking this required course online might be "more fun" and "more convenient".

The students' responses are interesting and important, but UCSB's experience with developing and assessing Writing 50, and this paper, offer something even more significant: a way of thinking about when it might make sense to engage in online instruction, how research writing does and does not get done online, the advantages that a hybrid, online class provides for peer review, and the fact that online courses allow students to do something they value and, I will argue, ultimately need to master, namely multitasking. The overriding point of this piece, however, is how we might, as a field, dovetail assessment to create a system of assessment that makes sense for online WAC/WID courses.

The Background: UCSB's Writing Program, Writing 50 and Dovetailed Assessment

UCSB's Writing Program is an autonomous Writing Program, staffed mostly by lecturers, that has a WAC and WID emphasis. Our mission statement says that, "The mission of the Writing Program is to produce better undergraduate writers across the full range of academic disciplines, and to train graduate students from a variety of disciplines to teach writing effectively" ("Mission Statement," 2007), and we do both of these tasks through work in general education classes that emphasize writing in the disciplines, from Writing 2 (the required first-year composition class) to graduate courses that our faculty teach in writing studies in the Girvetz Graduate School of Education.

Writing 50, "Writing and the Research Process," is described in UCSB's online general course catalogue as "A writing course addressing the analytical skills underlying the research process of academic and professional communities" (2008). And, according to the Writing Program's outcomes statement on our homepage, the course is supposed to, among other things, teach students everything from how to "Identify and use the full range of university library services" to how to "Conduct a significant independent research project" ("Writing 50/50E Curricular Guidelines," 2006, p. 2). In short, the course is supposed to help all students learn how to do their own research. This happens in themed courses, with approved course themes ranging from "Immigrant Community and Culture" to "The Vietnam War" to "This Californian Life" ("Winter Course Schedule," 2008).

Thus, when the online course was to be taught for the first time in the winter of 2007, I, the instructor of record, went with an approved, classic Writing 50 theme, "Work in the 21st Century." This was, and is, a course dedicated to helping students understand the nature of work and their place in the world of work. Students read, for my class, economists like Robert Reich and Thomas Friedman, and they also read popular pieces such as excerpts of Studs Terkel's famed Working and selections of Gig, a text modeled on Working, but written with the contemporary job market in mind. Nonetheless, the bulk of their reading was their own research reading, and the bulk of their writing was based on the library, online, and primary research that they conducted.

However, I did not teach only an online section of Writing 50 in the winter of 2007; I also taught a face-to-face class. This class, while it made extensive use of computers, did actually meet every day in a classroom on the campus of UCSB. As I said earlier, there was ultimately "no significant difference" in students' perceptions of their learning, or even in the learning itself (Russell, 2001); however, we had to make sure of that, hence the pairing of an online and a face-to-face course taught by the same instructor.

This pairing also aligned with the methodology that Susan McLeod and Karen Lunsford had set up in an assessment grant for studying Writing 50. The grant was an attempt to look systematically at the affective and learning outcomes that are part of our general Writing 50 outcomes, and to do this we planned to follow best practices in writing program assessment and writing assessment generally.

In terms of Writing Program Assessment for WAC/WID programs, there is pretty general agreement that the enormous variability of programs demands that assessment of programs take into account the realities on the ground, that programmatic assessment must make use of multiple measures, and that the assessment has to take into account the many stakeholders involved in writing instruction across the campus (Condon, 2001; Fulwiler, 2000; McLeod, 2007). This vision of WAC programmatic assessment mirrors some of the general ideas that one sees in Writing Program Assessment in general, with researchers like Edward White (1996), Kathleen Blake Yancey (1999) and Brian Huot (2002) reminding the field that politics play a significant role in the way that writing assessment, at the programmatic level, gets done.

Thus, the assessment that Dr. McLeod and Dr. Lunsford set up for Writing 50 involved the course being taught as a pilot by me in the winter of 2007, by a set of three instructors in the spring of 2007, and then again by me in the summer of 2007. Each time the course was taught, the instructor taught a section of the online course alongside a section of the face-to-face course; thus, the program could see how the course was, and was not, working for a given teacher at a given time, making our assessment consistent. In terms of assessment tools, Dr. McLeod and Dr. Lunsford planned for several to be used. In each course, three pen-and-paper surveys were to be given to get at student attitudes and experiences around the class (see appendix A for the survey questions). Along with the surveys, all student work was to be collected, especially the final research paper of the quarter, which ran between 10 and 20 pages. Finally, every student involved in one of the Writing 50 classes being researched would have the opportunity to be part of a focus-group study on their experience with the class (see appendix B for focus group questions).

Out of all of this work, we learned a number of things about the Writing 50 courses taught online, which I will address in a moment. But the most significant thing we learned was how to begin to create an assessment for an online class that made use of previous assessments while still acknowledging the difference of online instruction. In 2005, Susan McLeod, along with Rich Haswell and Heather Horn, published the results of the assessment of Writing 2, our first-year required course, in College English. Their assessment work, like ours, involved the use of surveys and focus groups. Thus, there was a sort of dovetailing, a methodological linking, of previous and current assessments at UCSB that was native to the place being assessed. This idea, the appropriateness of dovetailing to a particular place, is a point that I will make more of at the end of the paper.

However, the survey that we used also drew very heavily on the work of Caroline Haythornthwaite and her compatriots who studied the LEEP program at the University of Illinois, Urbana-Champaign. According to Haythornthwaite (1998), the LEEP program is an "option of the Master of Science in Library and Information Science at the University of Illinois at Urbana-Champaign. The LEEP program offers a distance option for students with instruction delivered through communication and computer technologies, and through short, intensive on-campus meetings" (par. 1). The LEEP program, with its blend of face-to-face and online instruction, closely mirrored the way we planned to teach Writing 50, and its assessment structure was helpful as well. Thus we incorporated some questions directly from a LEEP questionnaire, hoping to dovetail our assessment structure not only with the previous work done on Writing 50 at UCSB, but with the larger category of work done in the field. I will discuss this dovetailing in more detail at the end of this piece—and "a dovetailed assessment system" is the phrase that I want to forward to describe the sort of work that might help us, as a field, assess the impact and effect of WAC/WID based, online classes.

Student Attitudes and Outcomes: Online vs. Face-to-Face Writing 50

There has been a good amount of research on student attitudes toward online classes (Federico, 2001; Liu, Papathanasiou, & Hao, 2001; Yudko, Hirokawa, & Chi, 2008), and a number of fine articles in the field of computers and writing have touched on the issue of student attitudes, particularly as they affect student performance and perceptions of classes (McGrath, 2001; Palmquist, Kiefer, Hartvigsen & Goodlew, 2008). Out of this research, there seems to be a sense that students are often drawn to online classes for the flexibility they provide, and that negative attitudes about classes often vary according to students' level of mastery of the technology and the degree to which their learning styles do, or do not, match the online environment they find themselves in (Federico, 2001; Palmquist et al., 2008).

Our research on students' attitudes and experiences comes out of a set of three surveys (the first given at the beginning of the class, one in the middle, and a final survey at the end) and focus-group interviews. Out of this work, we came to realize a number of things about students and their approach to online learning at UCSB, at least in Writing 50. In the surveys (which were given to 87 online students and 74 face-to-face students from the winter quarter of 2007 to the summer quarter of 2007), a couple of notable things about students' views of online classes came up. First, as indicated by the final survey, 100% of students owned some sort of computer, and all but one student had some sort of internet connectivity. The near universality of internet access and computer ownership shows that, at least on the campus of UCSB, there has been a major shift in access since the early 1990s, when folks like Cindy Selfe (1999) were writing about the "perils of not paying attention" to the way that computer use was stratified by race and class (p. 102). That stratification persists, but it seems that at UCSB, where the average familial income of incoming freshmen is $90,559 (UCSB Office of Budget and Planning, 2004), there has been a shift in access, and this shift mirrors the one seen nationally. The national shift is indicated by a rather comprehensive report by the National Telecommunications and Information Administration, A Nation Online: Entering the Broadband Age, which found that 54.6% of U.S. households had internet-ready computers in 2003, up from 18.6% of households seven years earlier (2004, par. 12).

The significance of this is that now, at least on the campus of UCSB, it seems like we can begin to offer online courses like Writing 50 without deep concerns about students not having access to one aspect of computer technology: the internet. In certain ways, this issue of nearly universal access for students in their homes is a crucial point. Without ready and easy access, students will struggle, obviously, to do the work of the class.

Besides the obvious need for access, there were other student needs that, according to our surveys, Online Writing 50 met. Two questions showed, principally in the online classes, that students saw real utility in being able to work in a setting that they chose and controlled. In Figure 1 below, you can see that students overwhelmingly thought that being able to do work according to their own schedule and from a location of their choice was very important to them.

Figure 1: Online vs. Face-to-Face Survey Answers

Question: "Online courses allow me to do my academic work according to my own schedule."

Online Courses (N=86 total responses to question):
  • Extremely Important 25 (29%)
  • Very Important 25 (29%)
  • Important 23 (27%)
  • Slightly Important 11 (13%)
  • No Importance 2 (2%)

Face-to-Face Courses (N=37 total responses to question):
  • Extremely Important 5 (13%)
  • Very Important 9 (24%)
  • Important 13 (35%)
  • Slightly Important 7 (19%)
  • No Importance 3 (8%)

Question: "Online courses allow me to do my academic work from any location."

Online Courses (N=87 total responses to question):
  • Extremely Important 20 (23%)
  • Very Important 32 (37%)
  • Important 20 (23%)
  • Slightly Important 11 (13%)
  • No Importance 4 (4%)

Face-to-Face Courses (N=37 total responses to question):
  • Extremely Important 4 (11%)
  • Very Important 9 (24%)
  • Important 16 (43%)
  • Slightly Important 4 (11%)
  • No Importance 4 (11%)
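One quick way to summarize the Likert responses in Figure 1 is to collapse the top three categories into a single "Important or higher" share. The short sketch below does this for the schedule-flexibility question; the counts come from the survey results reported above, while the helper function and its name are mine, offered for illustration only.

```python
# Illustrative summary of the Figure 1 Likert counts: what share of each
# group rated schedule flexibility "Important" or higher? The counts are
# from the survey reported above; the helper itself is not part of the study.

def share_important(counts):
    """counts: [Extremely, Very, Important, Slightly, No Importance].
    Returns the fraction of respondents answering 'Important' or higher."""
    return sum(counts[:3]) / sum(counts)

online = [25, 25, 23, 11, 2]       # N = 86 online responses
face_to_face = [5, 9, 13, 7, 3]    # N = 37 face-to-face responses

print(round(share_important(online), 2))        # 0.85
print(round(share_important(face_to_face), 2))  # 0.73
```

By this rough collapse, about 85% of online students, versus about 73% of face-to-face students, rated schedule flexibility as at least "Important."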

The significance of these findings is that they show a strong preference for self-directed learning, a preference often found in online classes; however, I think they also indicate a deeper shift in how students want to engage in writing in an online class.

In a revealing moment in a focus group discussion, the deeper reason for attitudes reflected in Figure 1 can be seen:

Interviewer: "Is there anything else that you'd like to tell us about the class? Or what you—or how you view the class?"

Respondent: [referencing working through multiple screens] "You're not completely focused but you do get all the information. It might fit our generation's mindset a little better."

Variations of this response were heard across many of the focus groups held at the end of all of the Writing 50 classes in this study. Students liked, and expressed a preference for, learning while "multitasking." This attitude, a preference for multitasking among the current generation, is explored by Marc Prensky in a 2001 article called "Digital Natives, Digital Immigrants." In this piece, Prensky says that current college students, who have grown up with digital technologies, are "digital natives" who are wired to learn differently from their teachers, the "digital immigrants" who have not grown up with cell phones, email, and computer gaming (2001, pp. 1-2). Prensky (2001) makes a key point for our discussion:

Digital Natives are used to receiving information really fast. They like to parallel process and multi-task. They prefer their graphics before their text rather than the opposite. They prefer random access (like hypertext). They function best when networked. They thrive on instant gratification and frequent rewards. They prefer games to "serious" work. (p. 2)

Prensky (2001) goes on to point out that our current students' learning preferences are a challenge to those of us, the digital immigrants, who are trying to get by in the digital age (pp. 3-4).

In terms of Writing 50, we have a model that seems, at least initially, to provide a way of appealing to some of the "digital natives'" preferences for multitasking and choice. If our students, as Prensky (2001) notes, "function best when networked," then maybe we need to begin to make sure that they have some opportunities to do this sort of networked learning, and to do it quickly (p. 2). However, I want to add one caveat: multitasking may not always be the best way for students to learn complex skills and ways of thinking. In a 2008 article dealing with laptop use and multitasking in a large lecture class, Carrie B. Fried found in her study that,

Students admit to spending considerable time during lectures using their laptops for things other than taking notes. More importantly, the use of laptops was negatively related to several measures of learning. The pattern of the correlations suggests that laptop use interfered with students' abilities to pay attention to and understand the lecture material, which in turn resulted in lower test scores. (p. 911)

Fried's findings point out something important: multitasking can result in poor work—which runs counter to the goals of any class. We want our students to learn, particularly in a writing class, in a deep and transformative way. It may be that multitasking can get in the way of that learning; however, such considerations might be beside the point.

I recently sat for a couple of hours in an interesting meeting in which Alan Moses, the head of UCSB's College of Letters and Sciences Instructional Technology Department, mentioned something in passing that struck me powerfully. He said that with wireless networking (which most cellular phone service providers now offer), teachers may be approaching a moment when turning the internet on or off is simply not possible. Basically, we are looking at a moment, very soon, when students' phones/PDAs/mp3 players will give them access to the internet and its attendant multitasking whether we, the teachers, want them to have that access or not. As teachers, we have an important decision to make: what will we do with multitasking students once students control whether the internet is accessible? It seems fair to say that the "genie is out of the bottle," as Alan Moses claimed, and now teachers, particularly writing teachers, need to figure out how to teach students who want, and need, to multitask in order to learn.

Student Learning: Student and Teacher Perspectives

It seems obvious that any writing class should help students learn how to become even better writers, and this is true of the work in Writing 50, both online and face-to-face. In fact, Writing 50 at UCSB is supposed to be tied to discrete outcomes, which are codified in the following 2006 outcomes statement from the curricular guidelines for Writing 50:

After taking Writing 50, students should be able to:

  1. Conduct a significant independent research project, including developing questions; designing and planning research; analyzing, contrasting and synthesizing multiple primary and secondary sources; and drawing conclusions.
  2. Recognize differences among disciplinary approaches to a topic.
  3. Analyze the theoretical and disciplinary perspectives and rhetorical strategies underlying texts through critical reading and thinking.
  4. Identify and use the full range of university library services.
  5. Use both general and specialized catalogs, indices, and bibliographies.
  6. Build discipline-specific search strategies.
  7. Conduct Web-based research efficiently and selectively.
  8. Locate books, reference texts, journal articles, and other resources in the library.
  9. Distinguish among various types of sources—such as primary and secondary, popular and peer-reviewed, reference and circulating—as they evaluate those sources.
  10. Integrate, cite, and document sources correctly.
  11. Offer generously and receive readily assistance in collaborative projects.
  12. Present the results of their research in a poised and professional manner without the fear of public speaking.
  13. See a bridge between the world within academe and the world beyond it.

("Writing 50/50E Curricular Guidelines," 2006, pp. 2-3).

Clearly, we're an ambitious program, and we want to accomplish quite a bit. However, I want to focus for a moment on how four of the above outcomes were met by most students, in both sections of the class.

From the students' perspective, they did meet the outcomes of Writing 50. In our third and final survey in each class, we asked the following questions:

  1. I feel like I have learned a good deal about conducting college level research.
  2. I feel like I have learned a good deal about how to write an effective research paper.
  3. I received help with my writing from the following people during the course. (Circle all that apply)
  4. The response I received from my peers was very important to the quality of my writing.
  5. The response I received from my teacher was very important to the quality of my writing.
  6. After Writing 50, I feel ready to conduct my own collegiate level research.

Each of the above questions, with the exception of question three, was answered on a Likert scale, and the clear majority of responses in each case fell into the "strongly agree" and "agree" categories. The responses to the third question (shown below in Figure 2) are interesting in and of themselves.

Figure 2: Question on Receiving Help with Writing

7. I received help with my writing from the following people during the course. (Circle all that apply)

Source of help            Online Class (N=87)    Face-to-Face Class (N=74)
A. My professor           79 (91%)               72 (97%)
B. My classmates          75 (86%)               51 (69%)
C. My family members      20 (23%)               12 (16%)
D. Former teacher(s)       5 (6%)                 2 (3%)
E. A friend               40 (46%)               28 (38%)
F. A significant other    12 (14%)                8 (11%)
G. An Online Writing Lab   3 (3%)                 3 (4%)
H. Librarian              12 (14%)               13 (18%)
I. Other ___________       0                      3 (4%)

The two largest categories of people from whom students received help with their writing, in both the online and face-to-face classes, were people in the class. And what is very interesting is that more students in the online classes said that they received help from their classmates: 75 (86%) of the students in the online classes indicated that they received help with their writing from classmates, versus only 51 (69%) of the students in the face-to-face classes.
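As an aside, the classmate-help gap in Figure 2 (86% online versus 69% face-to-face) can be checked with a standard Pearson chi-square test on the 2x2 counts. The sketch below is mine, offered for illustration only; it was not one of the study's assessment instruments, and the 3.841 cutoff is simply the conventional critical value for one degree of freedom at the .05 level.

```python
# Hedged illustration: Pearson chi-square on the Figure 2 classmate-help
# counts (online: 75 of 87; face-to-face: 51 of 74). Standard library only.

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Rows: online vs. face-to-face; columns: helped by classmates vs. not.
stat = chi_square_2x2(75, 87 - 75, 51, 74 - 51)
print(round(stat, 2))   # 7.02, above the 3.841 cutoff for df = 1 at p = .05
```

By this rough check, a gap of this size is unlikely to be chance alone, which squares with the teachers' impressions of online peer feedback.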

This belief by the students was mirrored by what my colleagues (Dr. Peter Huk, Professor Randi Browning, and Professor Kathy Patterson) and I discovered about the online classes. Our collective feeling was that peer feedback was more engaged, and frankly better, in the online courses than in the face-to-face classes. As my colleagues and I prepared presentations on our experience teaching online Writing 50 for various audiences, we grappled with why we felt the way we did, and with what could account for the level of engagement by students in online peer feedback. The answer was ultimately fairly simple: as peer reviewers, students had to communicate with each other through writing. So students often wrote more, and they wrote more specifically. Students in the online classes had to insert their comments, using tools such as the "insert comment" function in MS Word (which demands a certain precision in its use), and had to make social and intellectual connections with their peers' texts (two things that Karen Spear (1988) points out are very important in peer review) only in writing; students in the face-to-face classes did not.

However, peer review was not the only thing that students learned about in Writing 50. They also learned, both online and face-to-face, to use a variety of sources, including actual library resources. In our third and final survey of the quarter, we asked students to check off any research skills they thought they had acquired or perfected in Writing 50. The results, which can be seen below in Figure 3, are very interesting.

Figure 3: Research Skills Learned

1. During the Writing 50 course, I gained the following skills (circle all that apply):

Research Skill                                              Online Writing 50 (N=87)   Face-to-Face Writing 50 (N=74)
A. Locating books—finding electronic or print books         68 (78%)                   61 (82%)
B. Finding texts with an online library database            72 (83%)                   63 (85%)
C. Effectively using a free web search (i.e., Google,
   dogpile, etc.)                                           47 (54%)                   41 (55%)
D. Note taking                                              27 (31%)                   24 (32%)
E. Organizing research as preparatory to writing            54 (62%)                   48 (65%)
F. Creating a full 10-15 page research paper                60 (69%)                   59 (80%)
G. Citation                                                 59 (68%)                   59 (80%)
H. Editing for grammar                                      34 (39%)                   32 (43%)
I. Revision of my research                                  53 (61%)                   51 (69%)
J. Creating and exploring a research question               61 (70%)                   60 (81%)

It is interesting to note that 68 of the 87 online students (78%) felt that they gained skill in locating books for their research projects, and 61 of the 74 face-to-face students (82%) felt the same. There is not a great deal of difference between the numbers of students getting help with, and thereby using, library resources, and this is because of the construction of the class. All of the online classes had one meeting in the library, where the emphasis was on locating texts in the library. This is a key goal for Writing 50 (witness the fourth outcome for the class, that students will be able to "Identify and use the full range of university library services"), and it is a key goal for any research class. A great deal of research happens online, but some research still has to happen in the library itself.

What about the work that students did, the "significant independent research project" that they conducted and wrote about for the class? The answer, for us at this time, is somewhat provisional. Grades for the final sections of Writing 50, taught by me in the summer of 2007, are not significantly different, with the mean grade in the online section being 3.57 (on a 4.0 scale) and the mean grade in the face-to-face section being 3.45. These numbers align with my pilot sections of the course in the winter of 2007, where the online average grade was 3.86 and the face-to-face average was 3.63. There is, to quote Thomas L. Russell (2001) again, "no significant difference" between these grades, the averages generally ranging from an A- to an A.

Aside from grades, things get a little dicier in terms of actually determining student learning. Trying to parse out what student learning is, how it is identified, and what it looks like, particularly when looking at something as complex as collegiate research writing, is a difficult task. At the moment, Karen Lunsford, a fine set of graduate students (Catherine Zusky, Marthine Satris, and Susan Cook), and I have taken a first run at the data, using a rubric developed for Lunsford and Lunsford (2008). We have added to this rubric questions pertaining to the way that students use online and library sources and to the overall quality of their papers (e.g., coherence, order, depth, and grammar use).

Almost all of the students involved in the research seemed to use good-quality sources in their writing, most often drawing from the online databases (such as JSTOR and EBSCOhost) hosted by the UCSB library. This is the good news. The less good news is that students, and this should surprise no one, have some trouble handling quotations well (in terms of integrating them naturally into the text), and at times they even have trouble accounting for which ideas and language come from their sources and which come from them. These findings align with the work done by Rebecca Moore Howard on "patchwriting," which she defines (while quoting an earlier article) as "copying from a source text and then deleting some words, altering grammatical structures, or plugging in one-for-one synonym-substitutes" (1995, p. 788). A few students in both the online and face-to-face sections of Writing 50, even in the final class I taught, appear to be engaging in patchwriting; however, this is only one class, and our results are provisional.

Still, I think that it is fair to say that students, ultimately, are learning a good deal from both the online and face-to-face sections of Writing 50, and I say that with some confidence because of the assessment model that we have created for online Writing 50.

A Model of Programmatic Assessment: Ongoing, Pedagogically Oriented Assessment

The literature on programmatic assessment of WAC and WID programs is wonderfully varied, reflecting the tremendous variety of WAC and WID programs that now exist. However, as I mentioned at the beginning of the paper, there are some general ideas about WAC/WID programmatic assessment that seem to be agreed upon: assessment must take the realities on the ground into account, it must make use of multiple measures, and it has to account for the many stakeholders involved in writing instruction across the campus (Condon, 2001; Fulwiler, 2000; McLeod, 2007).

Thus, for the programmatic assessment that we did on Writing 50, we tried to account for the local setting and concerns, the needs of various stakeholders, what assessment could teach us about student learning, and how our assessment practices could be consistent with the political realities we face at UCSB.

The Beginnings: Local Concerns

It is important for WAC/WID programs to actually take into account the local needs for assessment, and as Charles Bazerman, Joseph Little, Lisa Bethel, Teri Chavkin, Danielle Fouquette, and Janet Garufis point out in their 2005 book, Reference Guide to Writing across the Curriculum,

The methods, motives, subjects, and audiences of the assessment and evaluation of WAC programs are as varied and difficult to define as the programs themselves. Because, as Toby Fulwiler points out, "the local conditions that gave rise to WAC programs were always quite specific," (Fulwiler & Young, 1997, p. 1), the assessing and evaluating of those programs is largely dependent upon the needs and desires of the participants in those local programs. (p. 124)

For the Writing Program at UCSB, there were several important local concerns that we needed to take into account as we planned both to teach Writing 50 online and to assess it. Our primary concern was that Writing 50 was, and is, heavily impacted in terms of student enrollment, and we wanted to provide another way to serve folks who were, quite simply, not able to get into the class.

However, we were also driven by Ed White's (1996) adage that "if you really value it, you will assess it," and his point that if you do not do writing assessment, it will be "done to you" (p. 9). Thus, we wanted our assessment approach to make sure that what we really valued, student learning and the experience of students in Writing 50, was accounted for, and we wanted to control the means of assessment so that assessment would not be "done to us." The end result was an assessment system that examined student learning and teacher effectiveness through a variety of instruments: surveys, focus group interviews, content analysis of student work, multiple-choice student evaluations of teacher effectiveness (or ESCI scores, the instrument used in all classes at UCSB), and journals kept by the teachers themselves. It was our hope that this multi-pronged assessment strategy would help us understand the local realities of Writing 50 instruction in an online environment, and our hopes were met. However, we wanted to make sure that we weren't the only folks learning about the work done in Writing 50.

The Stakeholders

The primary stakeholders—to use a term that Edward White favors but which Brian Huot contests (2002, p. 54)—to whom we were interested in reporting back were: Instructional Development (who granted us money to do the work with Writing 50); Summer Sessions at UCSB (who asked us to develop the course); our colleagues in the Writing Program at UCSB (who also teach Writing 50); and our students. There are of course others involved, but these are, from our perspective, the primary stakeholders.

To report back to Summer Sessions and Instructional Development at UCSB, we conversed informally with the leaders of both organizations, emailed them, and ultimately produced short reports on our work. Reporting back to the Writing Program was a bit more involved—since we are a large program of 36 active lecturers, 1 assistant professor, and 39 teaching assistants (UCSB Writing Program, Faculty list, 2007). So, we did several things. The other instructors involved in teaching the class (Prof. Browning, Prof. Patterson, and Dr. Huk) and I gave presentations to faculty at committee meetings; we presented at national conferences (including the CCCC in 2008) that our colleagues attended; and we did the most important communication work of all—we talked to our peers informally in the halls about the experience.

The end result of this work is that the UCSB Writing Program is now considering adding "Online Writing 50" to the permanent course offerings of the program, and, as of May 2008, preliminary approval has been granted by our in-house curriculum committee.

The final group of folks to whom we needed to report back was our students in Writing 50, in both the online and face-to-face classes. We could of course simply invite students to come to a presentation, but student needs for programmatic assessment are different from those of the other stakeholders. The truth is that the way we "report back" to our students is through instructional improvement based upon the assessment work that we have done in Writing 50. The online course that we are now able to offer students is, quite simply, our "report" to them.

As a result of focus groups and survey work, we think we have a pretty good sense of what students need and want in a hybrid, online course focused on research writing. First, students—as Edward White (1996) points out—probably are, in terms of writing assessment, more interested in their own experience than in larger programmatic issues. White goes on to mention one particularly important concern that students probably have with writing assessment: that it produce "data principally for the use of learners and teachers" (1996, p. 22). The data that we collected from surveys and focus groups aimed at students' experiences with Writing 50 did this. From student feedback we learned how to run better peer review—making the best use of threaded discussion boards and the "insert comment" function in MS Word; how to make effective use of real-time chats, or when to ditch said chats; and how to help students in an online situation make the best use of electronic and other library resources.

In terms of peer review, I mentioned earlier that all of the instructors involved in teaching and researching Online Writing 50 thought that the peer review done online was better and more engaged, with students getting more out of it. That is not just the opinion of the teachers. In one of the final focus groups, conducted after I taught the last online course in our pilot, a student said a very interesting, and indicative, thing. When asked to "compare this writing class to any other you've taken," the student said that the online course involved the "same type of research process," but that online "you're responsible for evaluating your peers' work as well." In both my online and face-to-face sections of Writing 50, I emphasized the importance of peer review, but I never said that students were "responsible" for evaluating peers' work. There is something in the nature of online peer review, which was one of the only ways students could connect in depth with each other, that seems to demand a commitment of students to each other. It is this sort of commitment that has convinced three out of the four folks involved in the pilot to move peer review to an online environment for all of their Writing 50 classes.

We also realized, from the focus groups and surveys, that students did not always get as much out of the online chats as we would have liked; however, that was in large part because of the clunky chat interface in Moodle, which proved slow and prone to crashing. Once we moved to AOL Instant Messenger chats, which the teacher would start, the student response to the chats was more positive.

Finally, we were most happy that our students really saw themselves as learning about the art and science of research in the class. According to data from our third and final survey in all of the classes, students felt that they learned a good deal about conducting collegiate research and locating the materials that such research requires. In Figure 3 above, you can see that over fifty percent of the students in both classes felt that they learned about locating texts for research, how to organize research, how to create a longer research paper, how to cite effectively, how to revise their research, and, very significantly, how to approach research through the lens of a research question. What is also interesting is that the skills that students felt they learned less about—"effectively using a free web search," "note-taking," and "editing for grammar"—are skills that we, the teachers, assumed were covered in our Writing 1 and Writing 2 classes, courses that students must either pass or test out of prior to taking Writing 50. The other skills, which are key to the outcomes of Writing 50, were, according to student perceptions, attained by the majority of students.

Ultimately, we created an assessment process, mixing qualitative and quantitative data, that allowed us to say that our online Writing 50 class was, at the very least, equivalent to our face-to-face class. However, we were able to do even more than that. We created a system of assessment that allowed student and teacher experience to be fully captured and reported back faithfully to a number of stakeholders. In short, we created what I would call a practically grounded assessment structure—an assessment structure that is "site based, locally controlled, and research-based" (Huot, 2002, p. 178). However, I want to make one other point: the assessment structure that we created evolved out of ongoing conversations between and among the folks involved in the teaching of Writing 50, along with programmatic administrators who understood what Writing 50 was about. The model that we created for assessment is a model based on faculty and student development. It is not the assessment that Edward White warns us about in Teaching and Assessing Writing (1998), assessment done to us rather than by us. It is a type of assessment that stems from a model of instruction that owes more to student learning than to simple, and simplistic, measures of student and teacher performance.

A Model of Online Instruction

By the 2006 Computers and Writing Conference, the emergence of research into hybrid classrooms—which Catherine Gouge (2006) describes as courses where "students and instructors meet part of the time on site and part of the time via a web-technology" (par. 2)—was evident. In the 2006 program, at least four individual presentations make reference to hybrid online instruction; the rest of the field, however, has been slower to actually write about hybrid learning spaces. Chris Anson (1999) has theorized about the possibility that hybrid courses could lead to "technology-enhanced learning," which involves the "thoughtful" use of computerized technologies in classes that meet face-to-face (p. 269). It is this sort of "thoughtful" use of computerized instruction that we tried to create in our online Writing 50.

Like Anson, we tried to thoughtfully blend electronic discourse like chats and discussion boards "into the existing curriculum in principled ways," so as to "not erode the foundations on which the teacher-experimenters already base their instructional principles" (Anson, 1999, p. 269). Thus, we created the model shown in Figure 4.

Figure 4: Class Schedule

Week One: Introduction to the teacher and the class—done face-to-face.

Weeks Two to Four: Online work, using the course management system (Moodle)—making careful use of online chats (30 minutes per week) and posting of student work, questions and activities to discussion boards.

Week Five: In addition to online work, students come to the library for an orientation and to touch base.

Weeks Six to Nine: Students continue to work online, using threaded discussions, chat, and even wikis.

Week Ten: Students meet for the last class to debrief and turn in final papers.


The structure that we created demanded that students spend some time together at crucial points: the beginning, middle, and end. Also, I and all the other teachers who taught Writing 50 had at least two conferences with online students—conducted either via AOL Instant Messenger or in our offices. The end result of this sort of hybrid course was that a majority of students felt they "knew their instructor" well in the online classes, with "no significant difference" between the online and face-to-face students' views on this matter (see Figure 5 below).

Figure 5: I Know My Professor Well

"I feel like I know my professor well."

Response                        Online (N=87)   Face-to-face (N=74)   Online %   Face-to-face %
A. Strongly Agree                    16                16              18.60%        21.33%
B. Agree                             46                43              53.49%        57.33%
C. Neither Agree nor Disagree        13                11              15.12%        14.67%
D. Disagree                           9                 5              10.47%         6.67%
E. Strongly Disagree                  2                 0               2.33%         0.00%
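The article does not state which statistical test lies behind the "no significant difference" claim; as an illustration only, here is a minimal Python sketch of one way such a claim could be checked from the raw counts in Figure 5: a chi-square test of homogeneity. The choice of test, and the 0.05 critical value for four degrees of freedom, are my assumptions, not a description of the authors' procedure.

```python
# Illustrative only: a chi-square test of homogeneity on the Figure 5 counts.
# The article does not specify its test; this sketch assumes chi-square.

# Raw counts, ordered Strongly Agree ... Strongly Disagree
online = [16, 46, 13, 9, 2]
face_to_face = [16, 43, 11, 5, 0]

rows = [online, face_to_face]
row_totals = [sum(r) for r in rows]                # per-group totals
col_totals = [sum(col) for col in zip(*rows)]      # per-response totals
grand = sum(row_totals)

# Chi-square statistic: sum over cells of (observed - expected)^2 / expected,
# where expected = row_total * column_total / grand_total
chi2 = sum(
    (obs - rt * ct / grand) ** 2 / (rt * ct / grand)
    for row, rt in zip(rows, row_totals)
    for obs, ct in zip(row, col_totals)
)

df = (len(rows) - 1) * (len(col_totals) - 1)  # (2-1) * (5-1) = 4
critical = 9.488  # chi-square critical value at alpha = 0.05, df = 4

print(f"chi2 = {chi2:.2f}, df = {df}")
print("no significant difference" if chi2 < critical else "significant difference")
```

On these counts the statistic comes out to roughly 2.7, well below the critical value, which is consistent with the article's conclusion; note, though, that the small expected counts in the Strongly Disagree column strain the usual chi-square assumptions, so this is a rough check rather than a rigorous analysis.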

Also, the structure allowed teachers to retain some of the instructional styles that they were used to, which makes sense when you consider that our online version of Writing 50 was a class developed because of faculty interest, not outside pressure. At the beginning of this paper, I mentioned that many writing programs begin online instruction for a variety of reasons: the possibility of making money, a demand from administrators, and even a desire by the faculty to experiment with their teaching. The creation of online Writing 50 comes out of a desire to do research and experiment with teaching; it comes out of a faculty development perspective—which makes perfect sense when you consider the WAC/WID nature of UCSB's writing program.

WAC/WID programs, as Susan McLeod (2007) makes clear, are programs dedicated to the sometimes difficult and rewarding task of faculty development (p. 98). Also, if Writing Program Administrators have any sophistication, they understand an important point that Joan Mullin (2001) makes in her chapter in WAC for the New Millennium: Strategies for Continuing Writing-Across-The-Curriculum Programs:

Faculty members do not want to be told how to teach their classes, how to write assignments, or how to evaluate assignments. While they may well solicit help for any of these—and many of them do—they do not want to be told they have to shift their way of thinking about writing, teaching or learning. (p. 186)

This fine bit of advice is something that Dr. McLeod seemed to have in mind as she went forward with online instruction: that faculty will invest much more interest and time into a curricular change that they see as coming from themselves. I, as the first person to teach the online version of Writing 50, very much felt this investment and commitment to the class.

Part of the model for developing an online class that we want to offer up is tied directly to our experience here at UCSB: faculty developed the class. My colleagues (Dr. Huk, Prof. Patterson, and Prof. Browning) and I decided on the texts that we would use (individually); the course management system we would use (collectively); and the way that we would report back on our experience (individually and collectively). And what's important to note here is that we decided these key things based on the sort of thoughtful, pedagogically centered thinking that Anson advocates in his work. Simply put, the teachers worked hard on this class, engaged in multi-pronged assessment, and thought hard about this class because it was their class.

Conclusion: The Construction of Dovetailed Assessment

At the beginning of the paper, I mentioned that the assessment system that arose out of our work on Writing 50 was dovetailed, in that it integrated with the work in the field on assessment of online writing classrooms, but also in that it dovetailed with previous programmatic assessment done at UCSB. McLeod, Horn, and Haswell (2005) focused on using qualitative (focus groups and instructor notes) and quantitative (questionnaires) research methods to understand how effective Writing 2, our required first-year composition course, might be (p. 562)[1]. In our research on online Writing 50, we attempted to use some of the same qualitative methodologies to explore how a new class might, or might not, be equivalent to the older, thoroughly assessed class. Our result is that there is "no significant difference" between the classes, and we would not be able to say that as definitively as we believe we can without having aligned our current assessment project with our previous one.

However, dovetailed assessment is not simply about aligning assessment projects internally—although that is important. It is also about aligning assessment practice with the "best practices" of assessment in the field. Thus, we attempted to align our assessment practice with what researchers in WAC/WID assessment say should happen in terms of keeping assessment local and focused on student experiences and outcomes (Condon, 2001; Fulwiler, 2000; McLeod, 2007).

Also, dovetailed assessment refers, in my mind, to the way that the class under assessment fits, or does not fit, with the assessment itself. The truth is that online Writing 50—a course that demands that students engage in extensive computer-based research and writing; demands that students engage in an independent research project; and demands, ultimately, that students be self-directed—is an excellent fit with online instruction. As Goldberg, Russell, and Cook (2003) point out in their meta-research on the effects of computers on writing:

when students write on computers, writing becomes a more social process in which students share their work with each other. When using computers, students also tend to make revisions while producing, rather than after producing, text. Between initial and final drafts, students also tend to make more revision when they write with computers. In most cases, students also tend to produce longer passages when writing on computers. (p. 20)

These points—that computers allow students to work socially, make more revisions, draft more, and write more—all align nicely with the work that any online writing course should do. Students in an online course need to feel, as they felt in our course, that they are connected to each other and to the professor, that they can, and should, write more, and that they need to revise more.

Thus, the final point about dovetailed assessment that I want to make is this: the assessment needs to fit the WAC/WID class at hand. And, more specifically, in terms of online courses, you need to ask whether your class needs to go online. Writing 50 clearly needed to go online to meet student need, and perhaps this is the most important point of any assessment project: student need has to be considered from the first moment. This is not an easy thing to keep in mind, but it is essential when doing dovetailed assessment of student learning—particularly in an online WAC/WID class.

Appendices

APPENDIX A: Survey Questions

Third Survey Instrument for Writing 50 Research

Basic Demographic INFORMATION (Circle your response)

  1. What is your class standing?
    1. Sophomore
    2. Junior
    3. Senior
    4. Continuing Student
  2. Have you conducted library research at the university level prior to enrolling in this section of Writing 50?
    1. Yes
    2. No
  3. How many times did you try to sign up for Writing 50 before you got in?
    1. 0
    2. 1
    3. 2
    3. 3
    4. 4
    5. 5
    6. 6
    8. Other (Please Specify) __________________________

Computer Skills and Access Questions

  1. Do you own a computer?
    1. Yes
    2. No
  2. If you own a computer, then what kind of computer do you own? (Circle all that apply)
    1. Laptop
    2. Desktop
    3. Other: _____________________________________________________________
  3. I own/have the following technologies and services. (Circle all that apply)
    1. My own internet connection
    2. My own printer
    3. My own website
    4. My own facebook, myspace, or other social networking page
    5. Microsoft Word (Not Microsoft Works)
    6. PowerPoint (or other presentation software)
  4. During the course, what computers did you use to do your work? (Circle all that apply)
    1. My own desktop computer
    2. My own laptop computer
    3. A computer at a friend’s house/room
    4. A computer at my parents’ house
    5. A computer at a computer lab at UCSB
    6. A computer at a library at UCSB
    7. A computer at another library off campus
    8. A computer at an internet café
    9. Other (If other, please write in where you have used a computer below)
  5. Which computer did you use most often to do your work for the course? (Please circle)
    1. My own desktop computer
    2. My own laptop computer
    3. A computer at a friend’s house/room
    4. A computer at my parents’ house
    5. A computer at a computer lab at UCSB
    6. A computer at a library at UCSB
    7. A computer at another library off campus
    8. A computer at an internet café
    9. Other (If other, please write in which computer you have used the most below)
  6. I consider myself to be
    1. Very computer literate
    2. Pretty computer literate
    3. Somewhat computer literate
    4. Barely computer literate
  7. Have you taken an online course before?
    1. Yes (If yes, on which topics? ____________________________________________)
    2. No

Experience with the Course (please circle your response)

  1. I know all of the names of the students in this class.
    1. Strongly Agree
    2. Agree
    3. Neither Agree nor Disagree
    4. Disagree
    5. Strongly Disagree
  2. I feel like I know my classmates well.
    1. Strongly Agree
    2. Agree
    3. Neither Agree nor Disagree
    4. Disagree
    5. Strongly Disagree
  3. I feel like I know my professor well.
    1. Strongly Agree
    2. Agree
    3. Neither Agree nor Disagree
    4. Disagree
    5. Strongly Disagree
  4. I feel like I have learned a good deal about conducting college level research.
    1. Strongly Agree
    2. Agree
    3. Neither Agree nor Disagree
    4. Disagree
    5. Strongly Disagree
  5. I feel like I have learned a good deal about how to write an effective research paper.
    1. Strongly Agree
    2. Agree
    3. Neither Agree nor Disagree
    4. Disagree
    5. Strongly Disagree
  6. I would recommend to others that they take this course.
    1. Strongly Agree
    2. Agree
    3. Neither Agree nor Disagree
    4. Disagree
    5. Strongly Disagree
  7. I received help with my writing from the following people during the course. (Circle all that apply)
    1. My professor
    2. My classmates
    3. My family members
    4. Former teacher(s)
    5. A friend
    6. A significant other
    7. An Online Writing Lab
    8. Librarian
    9. Other (Please indicate source here):
  8. I received help using computers from the following people during the course. (Circle all that apply)
    1. My professor
    2. My classmates
    3. My family members
    4. Former teacher(s)
    5. A friend
    6. A significant other
    7. An Online Writing Lab
    8. Librarian
    9. Other (Please indicate source here):
  9. The response I received from my peers was very important to the quality of my writing.
    1. Strongly Agree
    2. Agree
    3. Neither Agree nor Disagree
    4. Disagree
    5. Strongly Disagree
  10. The response I received from my teacher was very important to the quality of my writing.
    1. Strongly Agree
    2. Agree
    3. Neither Agree nor Disagree
    4. Disagree
    5. Strongly Disagree
  11. Per week, I spent the following total amount of time (in class and out of class) on my work for Writing 50.
    1. Strongly Agree
    2. Agree
    3. Neither Agree nor Disagree
    4. Disagree
    5. Strongly Disagree
  12. When I did my work for Writing 50, I was on task most of the time.
    1. Strongly Agree
    2. Agree
    3. Neither Agree nor Disagree
    4. Disagree
    5. Strongly Disagree
  13. During class time for Writing 50, I was on task all of the time.
    1. Strongly Agree
    2. Agree
    3. Neither Agree nor Disagree
    4. Disagree
    5. Strongly Disagree
  14. Coming into the class, my expectations for Writing 50 were: [Use the space below and/or the back of the page to create a short response. Use as much space as you need.]
  15. My following expectations for Writing 50 have been met. [Use the space below and/or the back of the page to create a short response. Use as much space as you need.]
  16. Did you ever consider dropping this course? If so, why? And why did you choose to continue with it? [Use the space below and/or the back of the page to create a short response. Use as much space as you need.]

Research Skills

  1. During the Writing 50 course, I gained the following skills (circle all that apply):
    1. Locating books—finding electronic or print books
    2. Finding texts with an online library database
    3. Effectively using a free web search (e.g., Google, dogpile)
    4. Note taking
    5. Organizing research as preparatory to writing.
    6. Creating a full 10-15 page research paper.
    7. Citation
    8. Editing for grammar
    9. Revision of my research
    10. Creating and exploring a research question
  2. After Writing 50, I feel ready to conduct my own collegiate level research
    1. Strongly Agree
    2. Agree
    3. Neither Agree nor Disagree
    4. Disagree
    5. Strongly Disagree

Thoughts about the Online Version of Writing 50

The following advantages have been associated with taking online courses. Please circle your opinion regarding the importance of each of these advantages.

  1. Online courses allow me to do my academic work according to my own schedule:
    1. No Importance
    2. Slightly Important
    3. Important
    4. Very Important
    5. Extremely Important
  2. Online courses allow me to do my academic work from any location:
    1. No Importance
    2. Slightly Important
    3. Important
    4. Very Important
    5. Extremely Important
  3. Without online courses I cannot take college courses because of the demands of my job or family responsibilities:
    1. No Importance
    2. Slightly Important
    3. Important
    4. Very Important
    5. Extremely Important
  4. Without online courses I cannot take college courses because of my age or disability:
    1. No Importance
    2. Slightly Important
    3. Important
    4. Very Important
    5. Extremely Important

The following disadvantages have been associated with taking online courses. Please indicate your opinion regarding the importance of these disadvantages:

  1. Absence of face-to-face conversations and socialization with other students.
    1. No Importance
    2. Slightly Important
    3. Important
    4. Very Important
    5. Extremely Important
  2. Absence of face-to-face discussions with faculty before, during and after class:
    1. No Importance
    2. Slightly Important
    3. Important
    4. Very Important
    5. Extremely Important
  3. I had difficulty in using the computer or Internet:
    1. No Importance
    2. Slightly Important
    3. Important
    4. Very Important
    5. Extremely Important

Other comments?

Is there anything else you'd like to say about your computer use or about your experience of the course?

APPENDIX B

Focus Group Questions

  1. What did you expect from this class?
  2. What are you taking away in terms of what you have learned?
  3. Compare this class to other writing classes you have taken.
  4. Where did you do most of the work for this class (e.g. dorm, library)?
  5. Did you have any difficulties accessing materials for the class?
  6. There were two versions of this class—on-line and face-to-face. Would you have preferred the other mode from the one you took? Why or why not?
  7. Is there anything else you’d like to tell us about this class?

References

Anson, Chris. (1999). Teaching and writing in a culture of technology. College English, 61(3), 261-280. Retrieved April 29, 2008, from the JSTOR database.

Bazerman, Charles; Little, Joseph; Bethel, Lisa; Chavkin, Teri; Fouquette, Danielle, & Garufis, Janet. (2005). Reference guide to writing across the curriculum. West Lafayette, IN: Parlor Press and The WAC Clearinghouse. Retrieved May 1, 2008, from https://wac.colostate.edu/books/bazerman_wac/

Computers and Writing Conference, 2006. (2006). Kairos, 11(2). Retrieved April 28, 2008, from http://english.ttu.edu/Kairos/11.2/topoi/rice/program.pdf

Condon, William. (2001). Accommodating complexity: WAC program evaluation in the age of accountability. In Susan McLeod, Eric Miraglia, Margot Soven, & Christopher Thaiss (Eds.), WAC for the new millennium (pp. 28-51). Urbana, IL: NCTE.

Federico, Pat-Anthony. (2000). Learning styles and student attitudes toward various aspects of network-based instruction. Computers in Human Behavior, 16(4), 359-379. Retrieved April 29, 2008, from Elsevier Science Direct database.

Fried, Carrie B. (2008). In-class laptop use and its effects on student learning. Computers and Education, 50(3), 906-914. Retrieved April 29, 2008, from Elsevier Science Direct database.

Fulwiler, Toby. (2000). Evaluating writing across the curriculum programs. In Susan H. McLeod (Ed.), Strengthening programs for writing across the curriculum. Available from https://wac.colostate.edu/books/mcleod_programs/#pub_info

Goldberg, Amie, Russell, Michael, & Cook, Abigail. (2003). The effect of computers on student writing: A meta-analysis of studies from 1992 to 2002. Journal of Technology, Learning, and Assessment, 2(1). Available from http://www.jtla.org

Gouge, Catherine. (2006). Writing technologies and the technologies of writing: Designing a web-based writing course. Kairos, 11(2). Retrieved April 28, 2008, from http://english.ttu.edu/Kairos/11.2/praxis/gouge/CompleteVersiontoPrint.htm

Haythornthwaite, Caroline. (1998). A social network study of the growth of community among distance learners. Information Research, 4(1). Available at http://informationr.net/ir/4-1/paper49.html

Hochman, Will. (2003). Session G.4 Improving student learning through objective assessment of student writing and redistribution of instructional tasks. Retrieved November 1, 2008, from http://web.archive.org/web/20040331020605/wac.colostate.edu/aw/reviews/cccc2003/viewmessages.cfm?Forum=8&Topic=62

Hohendahl, Peter Uwe. (2000). After three decades of crisis: What is the purpose of a Ph.D. program in foreign languages? PMLA, 115(5), 1228-1238. Retrieved November 1, 2008, from JSTOR: http://www.jstor.org

Huot, Brian. (2002). (Re) Articulating writing assessment for teaching and learning. Logan, UT: Utah State University Press.

Liu, Min, Papathanasiou, Erini & Hao, Yung-Wei. (2001). Exploring the use of multimedia examination formats in undergraduate teaching: results from the fielding testing. Computers in Human Behavior, 17(3), 225-248. Retrieved April 29, 2008, from Elsevier Science Direct database.

Lunsford, Andrea A. & Lunsford, Karen J. (2008). "Mistakes are a fact of life": A national comparative study. College Composition and Communication, 59(4), 781-806. Retrieved July 28, 2008, from http://www.ncte.org/portal/30_view.asp?id=129951

McGrath, Laura. (2001). Incorporating technology into the composition curriculum: Access, attitudes, and the new literacy. Kairos, 6. Retrieved April 28, 2008, from http://english.ttu.edu/kairos/6.1/binder.html?response/desmet/index.html

McLeod, Susan. (2007). Writing program administration. Available from https://wac.colostate.edu/books/mcleod_wpa/

McLeod, Susan, Horn, Heather & Haswell, Richard. (2005). Accelerated classes and the writers at the bottom: A local assessment story. College Composition and Communication, 56(4), 565-580. Retrieved July 28, 2008, from http://www.ncte.org/portal/30_view.asp?id=118043

Moore Howard, Rebecca. (1995). Plagiarisms, authorships, and the academic death penalty. College English, 57(7), 788-806. Retrieved May 11, 2008, from JSTOR: http://www.jstor.org

Mullin, Joan A. (2001). Writing centers and WAC. In Susan McLeod, Eric Miraglia, Margot Soven, & Christopher Thaiss (Eds.), WAC for the new millennium: Strategies for continuing writing-across-the-curriculum programs (pp. 179-199). Urbana, IL: National Council of Teachers of English.

National Telecommunications and Information Administration. (2004). A nation online: Entering the broadband age. Retrieved April 28, 2008, from http://www.ntia.doc.gov/reports/anol/NationOnlineBroadband04.htm

Palmquist, Mike, Kiefer, Kate, Hartvigsen, James, & Goodlew, Barbara. (2008). Contrasts: Teaching and learning about writing in traditional and computer classrooms. In Michelle Sidler, Richard Morris, & Elizabeth Overman Smith (Eds.), Computers in the composition classroom (pp. 251-270). Boston, MA: Bedford/St. Martin's.

Prensky, Marc. (2001). Digital natives, digital immigrants. Retrieved April 28, 2008, from http://www.marcprensky.com/writing/

Russell, Thomas L. (2001). The no significant difference phenomenon: Comparative research annotated bibliography on technology for distance education. Washington, DC: IDECC.

Selfe, Cynthia. (1999). Technology and literacy: A story about the perils of not paying attention. College Composition and Communication, 50(3), 411-436. Retrieved November 1, 2008, from JSTOR: http://www.jstor.org

Spear, Karen. (1988). Sharing writing: Peer response groups in English classes. Portsmouth, NH: Boynton/Cook Heinemann.

UCSB Office of Budget and Planning. (2004). Entering freshman characteristics. Retrieved April 23, 2008, from http://bap.ucsb.edu/IR/frosh/income.html

UCSB Writing Program. (2006). Writing 50/50E curricular guidelines. Retrieved April 23, 2008, from http://www.writing.ucsb.edu/curricularguides/WRITING%20502006.doc

UCSB Writing Program. (2007). Faculty list. Retrieved April 23, 2008, from http://www.writing.ucsb.edu/faculty.htm

UCSB Writing Program. (2006). Mission statement. Retrieved April 23, 2008, from http://www.writing.ucsb.edu/about_us.htm

UCSB Writing Program. (2008). Winter course schedule. Retrieved April 24, 2008, from http://www.writing.ucsb.edu/schedule2.htm

University of California at Santa Barbara. (2008). Online catalogue. Retrieved April 23, 2008, from http://www.catalog.ucsb.edu/2008cat/depts/writ.htm#WritiCourses

White, Edward. (1996). Power and agenda setting in writing assessment. In Edward White, William Lutz & Sandra Kamusikiri (Eds.), Assessment of writing: Politics, policies, practices (pp. 9-24). New York: MLA.

White, Edward. (1998). Teaching and assessing writing. Portland, ME: Calendar Islands Publishers.

Yancey, Kathleen Blake. (1999). Looking back as we look forward: Historicizing writing assessment as a rhetorical act. College Composition and Communication, 50(3), 483-503.

Yudko, Errol, Hirokawa, Randy & Chi, Robert. (2008). Attitudes, beliefs, and attendance in a hybrid course. Computers and Education, 50(4), 1217-1227. Retrieved April 29, 2008, from Elsevier Science Direct database.

Notes

[1] McLeod, Horn, and Haswell were engaged in a comparison of accelerated vs. "normal" Writing 2 courses, but the key point for this piece is the methodology they used to get at "multiple measures" and "multiple methods" to create a complex portrait of the class being assessed (2005, p. 561).

Complete APA Citation

Dean, Christopher W. (2009, January 19). Developing and assessing an online research writing course [Special issue on Writing Technologies and Writing Across the Curriculum]. Across the Disciplines, 6. Retrieved from https://wac.colostate.edu/atd/technologies/dean.cfm