CCCC 2001 in Review: B.10 The Right Start: A Coordinated Composition Placement Program at the University of Cincinnati

Marlene Miner: Chair
Maggy Lindgren
JoAnn Thompson
Marilyn Palkovacs
Rebecca Sutherland Borah

This well-put-together panel offered me the opportunity to (1) catch up with colleagues at my home institution, the University of Cincinnati, and (2) meditate a bit more on the nature, and difficulty, of assessing incoming student writing for placement purposes. Indeed, I was proud to see so many of my colleagues working so diligently on such an important task; I was also daunted by the scope of the work they have undertaken, and by the work that still lies before them.

Here's the story. English faculty at the University of Cincinnati, a university composed of seventeen (that's right, seventeen) different colleges with five separate English departments, have been attempting to coordinate placement efforts among the University's various programs. Such coordination is a necessity if course content is to be comparable, allowing students to transfer easily from one college to another. For instance, many students who begin in my college, University College, transfer to another U.C. college, such as Arts & Sciences or the College of Education, and we need to ensure that credit earned in our English courses matches both the requirements and the comparable courses of other colleges. Working toward such comparability reaches all the way back to the placement exam, where initial decisions are made about student skill levels and abilities. Considering U.C.'s unique situation, and the use of multiple placement prompts at various colleges, coordinating a common placement procedure and prompt is no small endeavor. Each English department has developed its own placement method, refined over years of use, so the construction of a single placement test has seemed, according to my colleagues, an act that Sisyphus might find daunting.

Putting their shoulders to the wheel (or boulder, as the case may be), representatives from those five English departments have been laboring to coordinate placement with the help of grant money and the time commitments of key faculty. Constructing several different "test" prompts, piloting them, and soliciting student and teacher feedback via various surveys have all been part of this task. The results were interesting if not conclusive. In general, the panel found little discernible difference among the various prompts tested. On one hand, this suggests that English faculty at U.C. are largely "on the same page," both in creating tests and prompts and in evaluating them. On the other hand, with no clear "winning" prompt, it's hard to know what to do next.

Indeed, even though I am not directly part of this endeavor at U.C., the session succeeded in getting me to consider other possibilities for increasing success in the "placement game." Certainly, some of our best and brightest have mulled over this question, and I won't rehearse the readily available sources here; suffice it to say that everything from online testing to full-scale portfolio assessment has been (and continues to be) tried and tested. No doubt my colleagues at U.C. will continue both their exploration of the available scholarship and their own scholarly examination of present and future data.

But this session offered me insight into something surprising: the possibilities of self-placement. Part of the data collected came from cards on which students were asked to place themselves in one of the various developmental or first-year writing courses offered at U.C. Such self-selections did not affect students' ultimate placement, but they were compared with where students "wound up," with surprising results: about 75% of the students "correctly" placed themselves. I couldn't help but wonder whether, given a little tweaking of the system, a higher percentage could be obtained. For instance, if students had more information about the courses offered, if they were asked to consider their high-school grades or achievements in writing, or if they had to complete a survey that prompted them to reflect on their experiences and skills as writers, wouldn't such tools yield an even greater correlation between self-placement and placement-testing results?

What could one do with such information? Do we hand over to our students responsibility for their own placement in writing courses? I'm not sure. It's well worth considering further how some schools do this, and with what success. At the very least, encouraging students to think about their own placement might prompt in them a greater concern for their work as writers, and a greater sense of responsibility for developing their writing skills. According to the session participants, it's hard to say without further research. Students may have more to tell us about their writing lives than we have yet imagined. [JA]
