ITEM 122-115-R0104 Basic Questions and Answers Attachment
COMPOSITION PROFICIENCY AND COLLEGE ADMISSIONS
by Mary Sheehy Moe, EdD
Member, K-12 Composition Proficiency Standards Committee, 1998 – 2000
Member, Writing Proficiency Steering Committee, 2000 – present
The issues surrounding composition proficiency and college admissions are numerous and complex. As I did when the Board of Regents last considered these issues in 2000, rather than provide a general letter of support as a member of the Steering Committee, I have prepared a document responding to the questions and concerns the Committee has heard most often about writing proficiency-based admissions. That has required writing at some length. However, the document is organized so that readers can skip entire sections, focusing only on the questions and answers of interest or concern to them.
1. What is “composition proficiency” at the college admissions level?
Upon admission to college, a student with composition proficiency is able to develop ideas in writing with relevant and clear examples and details; organize ideas coherently and strategically; and express them effectively by following conventions of syntax, usage, form, and mechanics.
2. Is composition proficiency an essential attribute of the student who seeks to engage in college-level studies?
Yes. Academic writing – the exploration, synthesis, review, analysis, and evaluation of the implications of ideas, events, and experiences – is the primary means through which students learn and demonstrate their learning in the college setting. It is a fundamental component of the general education core required for every baccalaureate degree. It is a common requirement for lower-level and advanced course work in virtually every discipline. Whether students are responding to an essay question on a history mid-term, preparing a mechanism description for an engineering class, taking the comprehensive examinations as the culminating demonstration of knowledge for a master’s degree, or completing a doctoral dissertation, they must be able to engage in academic writing. Conversely, if students are not writing to demonstrate, apply, and extend their knowledge in a variety of classes beginning with their first semester in college and continuing throughout their college years, it is difficult to argue that they are getting a good college education.
The emphasis on academic writing in college is not just a quaint relic of a bygone time. As the craft of the critical thinker, writing requires the student to demonstrate his or her intellectual interaction with the subject of study in greater depth than other measures allow. It is particularly well-suited to the expectations for research, citation, analysis, and criticism which epitomize college-level study. Because it is permanent and transmittable, written discourse can be examined at length and in detail by others, particularly by faculty, to give students the critical feedback they need to mature as thinkers.
3. Will an admissions requirement for composition proficiency limit access to the Montana University System?
In the broadest sense, no. The current policies for open admissions into the system’s many two-year programs—available in every community with an institution of the Montana University System, as well as in ten additional communities with community colleges or tribal colleges—will continue to provide students who are not proficient writers with access to the system. But strictly speaking, a composition proficiency requirement will undoubtedly delay some students’ access to and progress through the four-year colleges and universities in the MUS.
Of course, ensuring composition proficiency through admissions requirements is not new to the Montana University System. The long-standing requirements for ACT/SAT verbal scores were adopted in the belief that the scores ensured composition proficiency. Unfortunately, these multiple-choice tests of “verbal” skills do not focus primarily on students’ ability to write but on their mastery of syntax, usage, punctuation, and vocabulary. Such skills are important to effective expression, but they are not the only components or even the major components of proficient writing. Montana’s admissions requirements also include the College Preparatory Program, the minimum GPA, and the minimum class rank, all adopted in the belief that they, too, ensured that admitted students had the basic skills, including writing, to engage in college-level study. However, because these high school indicators reflect a broad range of learning experiences, grading factors, and achievement, they are not direct measures of composition proficiency.
Empirical and anecdotal data from within and beyond the MUS have established that the existing admissions requirements do not ensure the composition proficiency necessary for college-level studies. That is why, in 1998, the Board of Regents charged a committee of writing teachers from K-12 and higher education to explore proficiency-based admissions; adopted that committee’s recommendations in 2000 to identify or develop authentic assessments of writing proficiency to use for admission decisions; and charged a new committee to engage in the curriculum alignment, field study, and testing review that have taken place between 2000 and 2004.
The writing proficiency requirement, like the other admissions requirements, does not limit access as an act of elitism, but as an effort to do what is best for unprepared students, prepared students, the four-year colleges and universities in Montana, the taxpayers who support the system, and the value of the baccalaureate degree itself. Students who are not proficient writers upon admission to college cannot engage in the scholarly activity that is a basic foundation for college-level study. The harm that results when students are admitted to college without writing proficiency and retained semester after semester without becoming proficient extends beyond them. Inevitably, the college revises its expectations to accommodate these students, with the result that other students who are prepared for the rigors of college-level work are deprived of the intellectual engagement and growth that they deserve, and the diploma they receive has less value.
Seen in this light, an admissions requirement for writing proficiency does not limit access so much as it ensures that access to a college education is not illusory, an ultimately harmful pretense that one can engage in college-level study without being able to develop ideas effectively, support them convincingly, organize them coherently, or express them clearly.
4. What is an authentic writing assessment for college admissions?
To be authentic, the writing assessment should reflect the student’s ability to engage in the academic writing expected at the college level. The assessment should be constructed so as to allow students to demonstrate their proficiency by writing, rather than by identifying and “fixing” discrete components of writing that is not their own (as is usually the case in multiple-choice tests). The writing context (the topic, or “prompt,” the scenario, and the time allotted to write) should be designed to elicit a reliable sample of a prospective student’s writing ability. Finally, the writing assessment should be scored on the basis of generally accepted standards of composition proficiency at the college-entrance level.
5. Does any particular group or category of students score differently on authentic writing assessments?
Yes, in some instances. From 1998 to 2000, the K-12 Composition Proficiency Committee studied this issue extensively. As reported to the Board of Regents in 2000, the research literature suggested that, while particular classifications as groups do perform differently on authentic writing assessments, the differences are (1) statistically insignificant and (2) less pronounced in authentic writing assessments than in the more common multiple-choice tests of writing proficiency. Specifically:
· Females tend to perform better, although not significantly better, than males on essay-response assessments; males tend to perform better, although not significantly better, than females on multiple-choice assessments.
· African-Americans as a group score lower on composition-proficiency assessments, whether authentic essays or standardized multiple-choice designs. These differences are less pronounced when the test features authentic writing. Racial differences are not as pronounced as the gender differences. For instance, African-American females perform as well on authentic writing assessments as Caucasian males.
· Very little research specifically devoted to Native American performance on composition proficiency tests exists.
The pilot study of the Montana Writing Assessment from 2001 to 2003 added this information to the existing literature:
· As a group, students who identified themselves as American Indians received significantly lower scores on the Montana Writing Assessment than other groups did. (The number of students who chose not to disclose their race/ethnicity is large enough to confound analysis of the data.) In the three years of the pilot study, however, the mean scores of students who identified as American Indian rose each year.
· As a group, males received significantly lower scores on the Montana Writing Assessment than females did. For instance, in 2002, the mean score for girls was 3.34 and for boys, 2.99. In 2003, the mean score for girls was 3.46 and for boys, 3.07. Conversely, the 2002-2003 SAT Montana Score Report shows an average verbal score of 541 for males and 536 for females. (In 2002-2003, the SAT did not include an authentic writing assessment; these scores are from a multiple-choice test.)
Interestingly, a factor significantly affecting test scores on the Montana Writing Assessment had nothing to do with a protected classification, but with the time allotted for the test. There was no significant difference in performance between students who were given a 70-minute response time and those who were given a 40-minute response time in 2002. However, in 2003, students who were given a 40-minute response time scored significantly higher than those who were given a 30-minute response time. None of the students receiving the top scores – a 6.0 or a 5.5 – wrote in the 30-minute format.
This finding was one reason the Writing Proficiency Steering Committee concluded that the Montana Writing Assessment is a crucial measure to include in its recommendation for approved proficiency measures. Of the available instruments, the Montana Writing Assessment is the only recommended measure providing at least 40 minutes to write one essay. Moreover, because the MWA is administered with state-level P-20 oversight, it is the only recommended measure that can be “tweaked” to ensure that the test design and administration responds to Montana data and achieves Montana goals.
6. Does the writing assessment of composition proficiency put particular classifications of students at an unfair disadvantage?
The real issues lie beyond this question. If a testing situation requiring a written response to an impromptu question within a limited time frame places particular classifications of people at an unfair disadvantage, should all such tests be waived for individuals in these groups? If so, the revisions required in testing practices at the college level and for admission to practice in many fields would be extensive.
The essential question, though, is this: Is the disparity between the scores of these classifications of students and those of other groups (a) a function of the test design or administration, which fails to identify proficient writers, (b) the result of an immutable inability to write, or (c) the result of a lack of experience with, instruction in, and/or expectations for good writing?
The Montana Writing Assessment was carefully reviewed for cultural bias by an ACT group that included Montana reviewers. During the field study, a significant number of males and American Indians demonstrated proficiency on the Montana Writing Assessment every year. It appears, therefore, that the difference in scores is not a function of the test or of any innate or immutable disadvantage with respect to writing. In fact, from 2001 to 2003, as students and their teachers learned more about academic writing from the experience of testing, scoring, and discussing results, the scores of males and of American Indians, as well as of the general population of test-takers, improved steadily. This result, reflecting greater emphasis on writing, clearer expectations for writing, better instruction in writing, and improved student performance when writing, is exactly what proficiency-based admissions seeks to produce.
This factor is another reason the Writing Proficiency Steering Committee concluded that it is crucial to include the Montana Writing Assessment among the recommended measures for demonstrating composition proficiency. It is the only measure that provides the opportunity for practicing teachers at the high school and college levels in Montana to engage in a data-driven, results-oriented discussion about good writing and good writing instruction. That this discussion improves instructional practices and ultimately student performance is manifest in the field study results.
Still, we must not dismiss the concern about the performance of males and American Indians on the Montana Writing Assessment – or on the national writing tests that report similar results. Rather, we should respond to this disparity in performance with focused, concerted efforts to eliminate it. First, we should continue to ensure that approved writing assessments truly are culturally unbiased. Second, as long as this disparity exists, we should dedicate resources to develop, assess, and continually improve effective writing instruction and assessments for lower-performing groups. Writing ability is too important a skill – to learning in college and to success in life – to be left undeveloped in individual students or in particular groups of students.
7. What criteria are being used to identify measures that will be further reviewed to establish cut scores in the final recommendation for writing proficiency standards?
The Steering Committee has used the following criteria to evaluate testing instruments:
Validity: Does the test measure what it purports to measure?
Reliability: Is scoring consistent from exam to exam and from scorer to scorer?
Fairness: Does the test, through its design, content, administration, or scoring, place individuals or groups at an unfair disadvantage?
Expense: Is the test affordable for students? For the state?
Availability: Is the test readily available to students throughout the world? Can it be taken during the junior year? Can it be taken more than once in a limited time frame?
“Vetted”: Has the test itself been tested (field studies, cultural bias review, endorsement by accredited groups, etc.)?
K-12 Aligned: Does the test reflect Montana’s established outcomes for high school proficiency in writing as well as what Montana colleges expect at the entry level?
“Bonuses”: Does the test provide additional benefits important to the goals of P-20 education in Montana?
8. Why is the final recommendation on writing proficiency standards for admission being delayed?
The Steering Committee could make its recommendation today if the revised ACT and SAT that include an authentic writing sample were available. Because these tests are so well-known, thoroughly vetted, affordable, and accessible, the Committee is loath to exclude them from its recommendation. However, at the present time, neither the ACT nor the SAT provides a valid measure of writing proficiency, and without that valid measure, the Committee cannot support their use for determining proficiency-based admissions. Moreover, until the initial administration of the revised tests has been completed, the Committee has a weak foundation for recommending a score for admissions purposes.
The Committee anticipates that, when the SAT and ACT with authentic writing assessments become available in 2005, the Committee will recommend scores for five “tests” – the SAT, the ACT, the Advanced Placement exam (in either English Literature and Composition or English Language and Composition), the CLEP test, and the Montana Writing Assessment.
9. Do we need all these tests?
Probably not. It is likely that the ACT, the SAT, and the MWA alone would suffice to provide the information needed to determine writing proficiency in valid, reliable, fair, accessible, relatively inexpensive, vetted, beneficial ways. However, many high school students already take the AP exam and many other students take the CLEP tests; both are well-established and well-vetted. It seems sensible to include them so that students who have taken those exams are not tested redundantly.
Not all students would take all of these tests; most would take only one. Having a variety of options, however, is vitally important in this crucial area of college admissions. Montanans regard higher education as a stepping stone to the “successful life.” Loss of access to that life on the basis of a writing assessment is a high price to pay – unconscionable, if the assessment is not valid, fair, or reliable, or if it does not take sufficiently into account the pressure that a high-stakes test places on the students most desirous and least confident of that access. Therefore, multiple testing measures and the opportunity for retakes must be built into the recommended standards for writing proficiency. The immediacy, validity, early administration, and state control of the Montana Writing Assessment, balanced by the national reputations and broad availability of the other instruments, will give Montana students many and varied opportunities to demonstrate the writing skills that will be essential to the quality of education they receive in a Montana college and the likelihood of their success there.