\"Writing.Com
*Magnify*
SPONSORED LINKS
Printed from https://writing.com/main/view_item/item_id/1153039-Case-Study
Item Icon
Rated: E · Essay · Writing · #1153039
Researched case study on curricula and assessment
DIFFERENT ASSESSMENT FOR DIFFERENT CURRICULA? A CASE STUDY


SYNOPSIS

Questions: How does assessment change based on the curricula of first-year writing courses? What influence does curriculum have on assessment? Should assessment depend on the curriculum of the first-year writing course? To determine how significantly curriculum guidelines drive instructor assessment practices, I analyzed the same instructor’s assessment practices in two first-year writing courses at two different institutions.
Analyses: I analyzed the course outcomes for the first-year writing course at each institution; each piece of student writing in relation to its institution’s curriculum; the assignments Scott created for each course; each piece of student writing in relation to the requirements of its assignment; and, finally, Scott’s comments on the student essays in light of both the course outcomes and his assignments. Together, these analyses allowed me to examine more thoroughly the influence curriculum has on assessment practices.
Overview of Findings: While I expected to see some small variations in Scott’s assessment depending upon the curricular outcomes, I had not been thinking in the broadest terms of assessment. What I discovered through this research is that while some assessment practices remain similar regardless of curricular outcomes, the approach taken with student papers must change as the curricular outcomes change from institution to institution. Scott could not have used precisely the same approach to assess both of these pieces of student writing and given each a fair and accurate assessment; this case study identifies many legitimate reasons for assessment practices to change, depending a great deal upon the curricula of different institutions.

RATIONALE

When considering writing assessment in first-year composition courses for this case study, my goal was to determine whether, and how, the curricula of first-year writing courses affect, and perhaps change, assessment practices. The instructor whose assessment practices I observed, Scott Schuer, teaches first-year composition at both Washtenaw Community College and Eastern Michigan University. I chose to analyze Scott’s assessment of student writing at Washtenaw and at Eastern, comparing his assessment practices in terms of the course outcomes at each institution. With this in mind, I have focused on writing assessment literature from Erika Lindemann, Brian Huot, Chris M. Anson, Edward M. White, Willa Wolcott, and Cheryl Glenn, Melissa A. Goldthwaite, and Robert Connors, in order to determine how Scott’s assessment practices may or may not represent the ideas about assessment presented in this literature, and whether any variations from those ideas were due to the curricular outcomes of each institution.

QUESTIONS
Some of the questions I considered when analyzing Scott’s assessment practices were: How does assessment change based on the curricula of first-year writing courses? What influence does curriculum have on assessment? Should assessment depend on the curriculum of the first-year writing course? I was interested in determining how significantly curriculum guidelines drive instructor assessment practices, and analyzing the same instructor’s assessment practices in two first-year writing courses at two different institutions seemed the most sensible way to make this determination.

METHODS
Scott and I have had many informal conversations regarding this case study. It was important to me, though, that he not hypothesize about how his assessment practices may or may not vary according to curriculum; I wanted his practices to speak for themselves as much as possible. Of course, since Scott knew exactly what materials I would be analyzing and why, it is hard to be certain how accurate a picture this gives. Did Scott alter his assessment practices in any way, knowing I would be analyzing them? I asked Scott to pick one student from each institution whose texts I would analyze; I told him that their overall skill levels were not important to the study, only that the students he chose be as similar as possible to each other in terms of skill. Did Scott choose students who are particularly strong, or students who tend to struggle with their writing, knowing that the students’ work would be analyzed as well? I wanted the students’ skill levels to be as similar as possible because I did not want varying degrees of skill to be any more of a factor in analyzing Scott’s assessment practices than they had to be.
Scott provided me with assignments and course outcomes for each course, as well as sample pieces of writing from both of the students he chose. In both cases, the sample writing pieces are revised drafts of the third writing assignment the students have done this semester. I do not know how these students are faring in their respective courses this semester – the only thing I know about these students is what I have learned about them through reading their essays and reflective letters.
In addition to the analyses described below, I analyzed assessment literature in relation to Scott’s assessment practices, asking: What are the authors saying about assessment? What are their recommendations for assessment? Do the authors suggest assessment practices change depending on the type of writing assignment? Do Scott’s assessment practices reflect the ideas found within this literature? These questions allowed me to look at Scott’s assessment practices through a variety of lenses, which was helpful when examining the assessment of different types of writing and writing assignments.

SAMPLE PIECES
The sample from Washtenaw Community College is a persuasive essay, which seeks to persuade the reader of the benefits of gaming (playing video games). This essay should make convincing arguments for gaming, using sources as evidence; it should consider opposing viewpoints, acknowledging the opposition, refuting the opposition whenever possible, and making concessions when refutation is not possible.
The assignment for the writing sample from Eastern Michigan University asks the student to describe his or her own literacy practices, as well as the literacy practices of the subjects of the reading the student chose to focus on. The student is then to determine how these practices fit within Sylvia Scribner’s “Literacy in Three Metaphors,” and to consider what educators should learn about creating curricula based on what the student has determined about his or her own literacy practices.

ANALYSES
Given the goal of this case study, it was imperative that I analyze a number of elements of these sample student essays. First, I analyzed the course outcomes for each of the first-year writing courses at their respective institutions, in order to compare their curricula. Following that, I analyzed each piece of student writing based on the curriculum of the institution at which the writing was done. In conjunction with this, I analyzed the assignments Scott created for each institution. The purpose of these analyses was to determine how Scott’s assignments addressed the outcomes for each institution, and how each student succeeded in reaching those outcomes in his or her writing. Next, I analyzed each piece of student writing based on the requirements of the assignment; finally, I analyzed Scott’s comments on the student essays in light of both the course outcomes and his assignments. Analyzing each of these elements allowed me to examine more thoroughly the influence curriculum has on assessment practices.

FINDINGS
Washtenaw Community College and Eastern Michigan University have similar outcomes for their first-year writing courses, with the exception of the thesis and argument outcome goals incorporated by Washtenaw. Because of this difference, Scott’s assessment of these sample pieces of student writing varies a good deal from Washtenaw to Eastern:
Similarities in assessment of each sample:
• Limited focus on conventions (acknowledges some surface errors, but does not make conventions a main focus);
• Asks for more details, more explanations, a more thorough analysis of each student’s topic;
• Asks for more clarity, specificity;
• Marginal and terminal comments;
• Complimentary, encouraging; points out what is working in the essay before discussing what still needs work.
Differences in assessment of each sample:
• More focused on conventions in the sample from Eastern Michigan University; however, there are more errors of conventions in that piece;
• Rubric for Washtenaw assignment and assessment, none for Eastern assignment;
• More detail about what needs work with content in the EMU piece; comments on WCC piece are typically more general in terms of what needs work.
Because the assignment for Scott’s EMU course looks for a more specific idea, Scott needs to be correspondingly specific about what still needs work in the student’s piece, and why it still needs work. Conversely, because of the nature of the persuasive essay for his WCC course, Scott can be more general about which elements of the student’s essay still need work.

ASSESSMENT LITERATURE IN RELATIONSHIP TO FINDINGS
From “Responding to and Evaluating Student Essays” by Glenn et al., I focused on the issues of marginal and terminal comments, and the advice the authors offer instructors using such comments in their assessment of student writing. For example, Glenn et al. suggest that marginal comments such as “logic,” “coh,” and “awk” may only serve to confuse and discourage students; being specific about what these comments mean, “such as ‘Evidence?’ or ‘Does this follow?’ or ‘Proof of this?’…can lead students to question their assertions more effectively than will a page of rhetorical injunctions” (146). In regard to terminal comments, the authors point out that, for the most part, terminal comments begin with praise, then suggest revisions for the parts of the essay that still need work. For both sample papers, Scott has followed these suggestions on how to use marginal and terminal comments effectively.
Scott also considers the purpose of the assignment when assessing student writing, and his assessment comments reflect the purpose of each assignment. As noted above, the recommendations Scott has made on the essay written by the EMU student are more specific, because that assignment calls for more specific details and textual analysis than the WCC essay assignment does. Lindemann asserts that “…Purpose governs how we express the message and how our audience is likely to respond” (224), and this is true both of the purpose of the assignment and of the purpose of assessment comments. Because the WCC assignment had a more specific style, Scott could be less specific in terms of content; the EMU assignment asked for more specific content, so Scott’s assessment comments were more specific as well.
In “Assessment and the Design of Writing Assignments,” White calls attention to the idea that the very act of creating a writing assignment is an act of assessment, in that “we are setting tasks for students, to whose work we – or someone else – will respond to in some way” (21). Because the curricular outcomes vary between the institutions at which Scott teaches, his assessment practices are automatically affected by those curricula. Washtenaw requires, in part, that its students be able to create and support a thesis and to present an argument: acknowledging other positions, taking a position, and using evidence to support that position. Scott must create an assignment at WCC that lets students practice these strategies. The EMU outcomes do not specifically require these strategies of students; in this way, Scott’s assessment practices are heavily influenced by each institution’s curricular outcomes.
Assignments are themselves a method of assessment, and the purpose of an assignment affects both what in a student essay is being assessed and how it is being assessed. In addition, the way students are taught shapes assessment and reflects the institution’s curricular outcomes. According to Wolcott, “the extent and quality of the instruction and practice that students receive in the modes of writing on which they are assessed can influence how well they perform on assessments” (165). To this end, Scott provides students at both WCC and EMU with multiple opportunities for revision, based on both in-class reader review and his responses to early drafts. Scott also uses in-class activities and prompts that allow students to consider the purpose of the assignment from a variety of angles and through a number of lenses, helping to ensure that students understand not only the assignment itself but, more importantly, why the assignment is important to them, both in their educations and in their lives.
As Brian Huot points out, there should be a direct connection between what and how we teach and what and how we respond to in student writing (112). This may seem obvious, but it is nonetheless important to consider how the assessment of student essays reflects the assessment built into the assignments for those essays. The variance in Scott’s marginal and terminal comments on the WCC and EMU student essays can be attributed to the curricular outcomes, and therefore the assignment standards, of each institution. Scott’s assignments, and therefore his assessment practices, depend a great deal upon the curricular outcomes of each institution.
I began the research for this case study with the assumption that Scott’s assessment practices would not change in any significant way, regardless of the curricular outcomes. I was thinking in the narrower terms of marginal and terminal comments – words of real praise and encouragement versus, as Lindemann puts it, words “to damn the paper with faint praise or snide remarks” (225) – rather than in terms of the myriad methods of assessment, such as assignments, activities, and teaching strategies. I’ve come to realize that not only should Scott’s assessment practices change depending upon curricular outcomes, but they must, if Scott is to practice assessment fairly and accurately. According to Anson:
It may be entirely appropriate…to use quite different response strategies as long as we know how to choose and apply them constructively. This is not to suggest that we don’t bring to our responses an overarching disposition or educational theory that guides our choices and sometimes makes us do similar things with different pieces of writing. But it allows us to admit some flexibility with which we can make informed choices about the strategies to employ for a specific piece of writing. (303)
This may be an oversimplification of Anson’s intention, not to mention of assessment theories and practices as a whole, but in no way could Scott use precisely the same approach to assess both of these pieces of student writing and give them each a fair and accurate assessment. Without even exploring the educational, cultural, or socioeconomic backgrounds of either of the students who wrote the sample essays analyzed here, this case study has acknowledged many legitimate reasons for assessment practices to change, depending a great deal upon the curricula of different institutions.

Works Cited
Anson, Chris M. “Reflective Reading: Developing Thoughtful Ways to Respond to Students’ Writing.” Evaluating Writing: The Role of Teachers’ Knowledge about Text, Learning, and Culture. Eds. Charles R. Cooper and Lee Odell. Urbana: NCTE, 1999. 302-324.
Glenn, Cheryl, Melissa A. Goldthwaite, and Robert Connors. “Responding to and Evaluating Student Essays.” The St. Martin’s Guide to Teaching Writing. New York: Bedford/St. Martin’s, 2003. 135-166.
Huot, Brian. “Reading Like a Teacher.” (Re)Articulating Writing Assessment for Teaching and Learning. Logan: Utah State University Press, 2002. 109-136.
Lindemann, Erika. “Responding to Student Writing.” A Rhetoric for Writing Teachers. 3rd ed. New York: Oxford University Press, 1995. 216-245.
White, Edward M. “Assessment and the Design of Writing Assignments.” Teaching and Assessing Writing: Recent Advances in Understanding, Evaluating, and Improving Student Performance. 2nd ed. San Francisco: Jossey-Bass Publishers, 1994. 21-51.
Wolcott, Willa. “Issues of Equity in Writing Assessment.” An Overview of Writing Assessment: Theory, Research, and Practice. Urbana: NCTE, 1998. 159-169.