Indirect Assessment of ULL's First-Year Writing Program

I recently received an Instructional Improvement Mini-Grant (link goes to a PDF) to do a fairly large-scale indirect assessment project for our writing program. My most excellent research assistant and I have distributed a survey to 512 students so far, with roughly 35 more surveys expected.

Here is our survey. You can see that it's based on the Consortium for the Study of Writing in College's survey instrument created by:

Charles Paine, Robert Gonyea, Paul Anderson, and Chris Anson, "Continuing the WPA-NSSE Collaboration: Preliminary Analysis of the Experimental Writing Questions and Next Steps," Council of Writing Program Administrators Conference, Denver, Colorado, 11 July 2008.

I've made a lot of changes: omitted some questions, reworded others, and added a few of my own. I wanted to distinguish, for example, between students taking our complete writing sequence and those who transferred in. I can already tell that interpreting the data will be challenging; we're going to have to cross-reference a lot of what we find in the surveys with the syllabuses from fall 2010 and spring 2011. For instance, one question asks, "During the current school year, for how many assignments did you: Narrate or describe one of your own experiences?" In my casual glancing, it seems that quite a lot of students are answering "No assignments," when my strong sense is that many teachers, especially in English 101, assign a personal essay or personal narrative. We'll have to go through the syllabuses and do some basic coding of what types of assignments teachers are giving, and how many teachers are giving each type.
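Once the syllabuses are coded, the tallying itself is simple. Here's a minimal sketch in Python of the two counts described above — assignments of each type, and how many teachers assign each type. The teacher names and assignment types below are entirely invented for illustration; they don't come from our actual syllabuses:

```python
from collections import Counter, defaultdict

# Hypothetical syllabus codes: (teacher, assignment_type) pairs.
# These data are invented for illustration only.
codes = [
    ("Teacher A", "personal narrative"),
    ("Teacher A", "rhetorical analysis"),
    ("Teacher B", "personal narrative"),
    ("Teacher C", "researched argument"),
    ("Teacher C", "personal narrative"),
]

# How many assignments of each type appear across all syllabuses?
assignment_counts = Counter(atype for _, atype in codes)

# How many distinct teachers assign each type?
teachers_by_type = defaultdict(set)
for teacher, atype in codes:
    teachers_by_type[atype].add(teacher)
teacher_counts = {atype: len(ts) for atype, ts in teachers_by_type.items()}

print(assignment_counts["personal narrative"])  # 3 assignments
print(teacher_counts["personal narrative"])     # assigned by 3 teachers
```

With a tally like this, we could check the survey responses against what the syllabuses actually show — if most syllabuses include a personal narrative, the "No assignments" responses would point to a problem with how students read the question rather than with what teachers assign.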

We've also been recording the number of students in each class we visit in a Google Docs spreadsheet. Our enrollment cap for first-year writing is appallingly high at 27 students (believe me, I've registered my disgust on more than one occasion). But one reason for this is the attrition rate -- teachers may start out with 27 students, but they finish the semester with far fewer. It's like overbooking flights, as one member of our first-year writing committee put it. This "start out with 27, end up with 17-18" idea holds up in our course-embedded assessment, for which we gather 100 samples of student writing. While one would think we'd only need papers from four sections of first-year writing, we always have to collect about seven sections' worth to be sure of reaching our 100-sample goal.
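The back-of-the-envelope arithmetic behind those section counts can be made explicit. The numbers below come straight from the figures in this post (cap of 27, roughly 15 students present near the end of the semester); the only "technique" is ceiling division:

```python
import math

CAP = 27          # enrollment cap per section
END_OF_TERM = 15  # rough end-of-semester attendance per section
GOAL = 100        # writing samples needed for course-embedded assessment

# If every section still held 27 students, four sections would suffice:
sections_at_cap = math.ceil(GOAL / CAP)

# With attrition, closer to seven sections are needed:
sections_after_attrition = math.ceil(GOAL / END_OF_TERM)

print(sections_at_cap, sections_after_attrition)  # 4 7
```

That gap between four sections and seven is the attrition rate made visible in our sampling workload.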

So we wanted to track the number of students in each class -- at least the number present on the day we visited. Most of our visits took place in late March and early April, so we got a decent sense of attendance toward the end of the semester, though of course I'm sure that in many of these classes, some students in good standing just happened to be absent that particular day.

We visited six sections of English 101. The number of students present ranged from 12 to 18, with an average of 15. Of course, students taking English 101 in the spring semester are likely to have either (a) failed English 101 the previous semester or (b) started out in Basic Writing, so we expect to see more attrition in this group.

We visited 23 sections of English 102 (we'll be visiting two more before concluding the data-collection phase of the project). The number of students present ranged from 13 to 23, with an average of 18.

I'll probably post more findings as we discover them. Do you know of anyone else who has done a local study using the Consortium for the Study of Writing in College's survey instrument? The Consortium gives permission for this kind of use, but I don't yet know of anyone else who has done it.