














































































Dolores Perin, Julia Raufman, and Hoori Santikian Kalamkarian

December 2015

CCRC Working Paper No. 85
Address correspondence to:
Dolores Perin
Professor of Psychology and Education and Senior Research Associate, Community College Research Center
Teachers College, Columbia University
525 W 120th St, Box 174
New York, NY 10027
Email: perin@tc.edu
This research was funded by the Bill & Melinda Gates Foundation under the project entitled “Analysis of Statewide Developmental Education Reform: Learning Assessment Study.” The authors would like to thank the other members of the assessment team, Nikki Edgecombe, Susan Bickerstaff, and Madeline Joy Trimble, for their suggestions for the study and review of earlier drafts of this paper. Mark Lauterbach provided helpful assistance with the statistical analyses. We also appreciate the assistance of Rachel Kasten, Kristen Kurre, Mollie Book, Korleen Brady, and Carolyn Foster in scoring and data entry, and of Geremy Grant and Allyson Shaw in the analysis of student interviews.
Abstract

This paper reports findings from a researcher-practitioner partnership that assessed the readiness for postsecondary reading and writing demands of 211 students in developmental reading and English courses in two community colleges. An assessment battery was designed for the study, comprising two standardized tests and five project-developed tasks. The project-developed measures were two text-based writing tasks similar to those typically assigned in college classrooms (a summarization task and a persuasive essay), a self-efficacy scale, a teacher judgment questionnaire, and a qualitative student retrospective report. The text-based writing measures were keyed to high-enrollment, introductory-level general education courses that had significant literacy demands. The results pointed to areas where students needed improvement in order to be ready for literacy tasks at the introductory postsecondary level. There was a discrepancy between the relatively low reading and writing skills assessed through performance tasks and the relatively high student self-efficacy ratings and teacher judgments. This finding suggests that both students and teachers may hold unrealistically high confidence in students' ability to perform college-level reading and writing tasks. Correlations between assessment measures tended to be moderate, suggesting that the measures were tapping different skills. A series of hierarchical regressions modeling the text-based writing skills suggested that improvement in text-based summarization may require particular attention to reading comprehension skills, while improvement in text-based persuasive essay writing may depend more on developing general writing skills. Students' retrospective reports indicated that although participants had some difficulty stating the requirements of the summarization task, they described appropriate strategies to complete it.
Overall, the study’s findings point to the need to examine approaches to instruction, curriculum, course structure, and placement policy that may improve students’ college readiness.
1. Introduction

Despite completing secondary education, many students in the United States enter postsecondary institutions with low reading, writing, and/or mathematics skills, which greatly impedes their academic progress (J. Jackson & Kurlaender, 2014; Porter & Polikoff, 2012; Sparks & Malkus, 2013). In fall 2000, 42 percent of entering students at two-year public colleges and 20 percent of entering students at four-year public colleges enrolled in at least one developmental education course (Parsad & Lewis, 2003)^1 aimed at preparing students for the academic demands of postsecondary coursework. The current study focuses on reading and writing ability, which prior research indicates is problematic for a large proportion of postsecondary students. For example, in a sample of 57 community colleges in seven states, 33 percent of entering students were referred to developmental reading courses (Bailey, Jeong, & Cho, 2010). Information on the need for developmental English^2 courses is limited, but studies suggest that up to 35 percent of entering community college students are referred to such courses (Jenkins & Boswell, 2002; Perin & Charron, 2006). Thus, it is probable that at least one third of entering community college students require help with reading and/or writing skills if they are going to perform well in college-level courses. Although academic skills are not the only measure of college readiness, they are a central indicator (Armstrong, Stahl, & Kantner, 2015; National Center on Education and the Economy, 2013). Signs of college readiness include passing scores on reading, writing, and mathematics placement tests administered on entry to college, and passing grades in entry-level, college-credit English composition courses (Lym, 2014).
More generally, a well-prepared secondary education graduate has been characterized as one who "can qualify for and succeed in entry-level, credit-bearing college courses leading to a baccalaureate or certificate, or career pathway-oriented training programs without the need for remedial or developmental coursework" (Conley, 2012, p. 1). From the perspective of literacy, college readiness includes the ability to read analytically and critically, synthesize written information, and produce ideas in writing.

^1 Student-reported data yield a lower enrollment rate of 24 percent at publicly funded community colleges (Sparks & Malkus, 2013). This estimate should be regarded cautiously because it is based on self-report rather than on institutional data.
^2 Developmental English courses teach basic writing skills.
is discussed here in order to place the study in context. This model comprises four interacting components that are proposed to affect students’ ability to learn well in a postsecondary setting. Each component consists in turn of multiple subcomponents, many of which have been recognized as important by college English instructors (O’Neill, Adler-Kassner, Fleischer, & Hall, 2012). At the heart of Conley’s (2007) model is the “key cognitive strategies” component, or the work habits that support student learning. Key cognitive strategies include intellectual curiosity, an interest in inquiry, the ability to analyze and synthesize information, an understanding of the level of precision and accuracy needed to perform academic tasks, and the ability to solve problems. A second component in the model is “key content,” which covers academic content knowledge and basic reading, writing, and math skills. Conley (2007) identifies writing skill as being of central importance to this component of college readiness, especially because writing forms the basis of many assessments of knowledge in postsecondary courses: “Expository, descriptive, and persuasive writing are particularly important types of writing in college. Students are expected to write a lot in college and to do so in relatively short periods of time,” and the writing should display competent grammar, spelling, and use of language (p. 14). Besides writing ability, skills in research, reading comprehension, and math, as well as disciplinary content knowledge, feature in this second component. Interacting with key cognitive strategies and key content is a third component, “academic behaviors,” which signifies students’ ability to reflect on, monitor, and control their own performance. 
Also called metacognition (Nash-Ditzel, 2010), this component of college readiness covers understanding one’s own level of mastery of a skill—for example, through assessing one’s self-efficacy (Conley & French, 2014; Liao, Edlin, & Ferdenzi, 2014)—willingness to persist in difficult tasks, and an understanding of how to transfer skills to a new context. The final component in Conley’s (2007) college readiness model is “contextual skills and awareness,” or a student’s knowledge of the nature of college as an institution. This includes understanding academic norms and expectations as well as specific knowledge, such as of admissions, placement testing, and financial aid procedures.
The effectiveness of developmental education in promoting college readiness has been questioned in recent research (Hodara & Jaggars, 2014; Martorell & McFarlin, 2011; Melguizo, Bos, & Prather, 2011). Given the broad range of skills and behaviors required for college readiness, as detailed by Conley’s (2007) model, and the multiple social and educational needs of low-achieving students, it is difficult to pinpoint the causes of this problem. However, inadequacies in assessment methods used for course placement, course structure (including multicourse sequences requiring lengthy participation), and instructional approaches have been identified as contributing to low achievement rates (Grubb & Gabriner, 2012; Hughes & Scott-Clayton, 2011). If instructional improvements are to contribute to the effectiveness of developmental education, assessment methods will be of critical importance. Assessment and instruction are intertwined, and the design of effective instruction depends on detailed knowledge of students’ academic skills (Salvia, Ysseldyke, & Bolt, 2013). The current study focuses on Conley’s (2007) second component, key content, in its interest in assessing developmental education students’ reading and writing skills in order to gauge their level of college readiness.
3. Assessing Students' Readiness for College Literacy Demands

In this section, we describe traditional methods of assessing students' readiness for college-level reading and writing. We then provide a rationale for, and discuss prior research related to, the constructs we measured in the current assessment. As mentioned earlier, our study expands on traditional methods by using literacy tasks that are more authentic than those used in placement tests. Another way in which the current study expands knowledge of college readiness is by including self-efficacy ratings, teacher judgments, and retrospective reports.
3.1 Traditional Methods of Assessing College Readiness

The level of college readiness in a student population has typically been measured in three ways. The first is to count college developmental education referrals and/or enrollments. In community colleges, the main site of developmental education in the
in this domain (Armstrong et al., 2015). Although readiness for college literacy demands is typically operationalized in terms of placement test scores, the instruments used vary in constructs measured, and, overall, their predictive validity has been questioned (Hughes & Scott-Clayton, 2011). Thus, despite major concern in the United States over the lack of college readiness, there is no commonly agreed-upon definition of this construct that would be specific enough to translate to quantitative measures. At the same time, college instructors routinely form strong clinical judgments on whether the students in their classrooms can understand and apply material covered in their curricula (Perin & Charron, 2006). Besides teacher judgments, a conceptual framework for determining readiness for college reading and writing emerges from three sources of information: examination of text and writing assignments presented at the introductory college level (Holschuh & Aultman, 2009; MacArthur, Philippakos, & Ianetta, 2015); state, district, and/or college student learning goals for reading and writing (Barnett et al., 2012); and the Common Core State Standards for reading and writing (National Governors’ Association Center for Best Practices & Council of Chief State School Officers, 2010).
3.2 Constructs Used in the Current Study

College reading and writing. Given the generally poor alignment of secondary and postsecondary literacy demands (Acker & Halasek, 2008; Williamson, 2008), it is important to know what incoming college students are in fact able to read and write. College reading requires the use of complex cognitive processes, such as analyzing text to identify the most important information, utilizing background knowledge from specific content areas, interpreting language and vocabulary appropriately for the context, consciously using personal strategies for understanding new concepts, and drawing analogies between different pieces of information (Holschuh & Aultman, 2009; Macaruso & Shankweiler, 2010; Paulson, 2014; Wang, 2009). The ability to comprehend expository (i.e., informational) text is particularly important (Armstrong et al., 2015). At this level, competent reading depends on self-regulatory and metacognitive mechanisms, including the ability to set goals for reading a particular text, apply knowledge of text structure to the task of comprehension, assess one's understanding of information during the process of reading, and assess the trustworthiness of a particular text (Bohn-Gettler & Kendeou, 2014; Bråten, Strømsø, & Britt, 2009).
As with reading, college writing involves the use of strategies to ensure an appropriate response to an assigned prompt (Fallahi, 2012; MacArthur, Philippakos, & Ianetta, 2015). At the postsecondary level, the student writer is expected to understand the informational needs of a reader, and generate text appropriate to the purpose. At this level, students are expected to write discipline-specific texts that summarize, synthesize, analyze, and respond to information, and to offer evidence for a stated position (O’Neill et al., 2012). Further, it is expected that material be written in students’ own words, and that students provide citations for quotations in order to avoid plagiarism (Keck, 2014). Although reading and writing are often taught as separate subjects in postsecondary developmental education courses, in practice the two skills are closely related (Fitzgerald & Shanahan, 2000). In higher education, students need to integrate these skills; for example, at this level, writing assignments tend to be text-based (Carson, Chase, Gibson, & Hargrove, 1992; J. M. Jackson, 2009; McAlexander, 2003) and require critical reading of source text as the basis of a writing assignment (O’Neill et al., 2012; Yancey, 2009). The colleges in which the current research was conducted had integrated reading and writing instruction in single courses as part of statewide reform of developmental education. Two important types of college writing are persuasive writing and written summarization (Bridgeman & Carlson, 1984; Hale et al., 1996; Wolfe, 2011). A persuasive essay requires the writer to state and defend an opinion on an issue, and, at an advanced level, to acknowledge and rebut an opposing position (De La Paz, Ferretti, Wissinger, Yee, & MacArthur, 2012; Hillocks, 2011; Newell, Beach, Smith, & VanDerHeide, 2011). Summarization requires the condensation of information to main ideas (A. L. Brown & Day, 1983; Westby, Culatta, Lawrence, & Hall-Kenyon, 2010). 
When the material to be summarized is presented in written text, summarization requires both reading comprehension and writing skill (Fitzgerald & Shanahan, 2000; Mateos, Martín, Villalón, & Luna, 2008). Both persuasive writing and summarization are featured in the Common Core State Standards for College and Career Readiness and in college student learning outcomes. In the current study, students were asked to respond to two writing prompts, one requiring a summary and the other a persuasive essay, based on written source text.
Teachers have been found to be reliable judges of their students' reading ability (Ritchey, Silverman, Schatschneider, & Speece, 2015). An early review of studies of the relation between teacher ratings and student test scores reported correlations as high as r = .86, with a median correlation of r = .62 (Hoge & Coladarci, 1989). A more recent review corroborated this finding, with a mean effect size of .63 (Südkamp, Kaiser, & Möller, 2012). Teacher judgments of students' general writing ability were found to be moderate predictors of students' motivational beliefs about writing, which included self-efficacy (Troia, Harbaugh, Shankland, Wolbers, & Lawrence, 2013). However, it has also been reported that teacher ratings of reading skill are more reliable predictors of the performance of higher achieving students than of the performance of lower achieving students (Begeny, Krouse, Brown, & Mann, 2011; Feinberg & Shapiro, 2009). Further, despite statistically significant correlations between teacher judgments and students' reading scores, teachers tend to overestimate their students' actual ability, especially for students of average reading ability (Martin & Shapiro, 2011).

Students' insight via retrospective reports. First-person accounts of task performance may help in the interpretation of students' performance scores. For example, low scores may reflect low ability, lack of understanding of task requirements, lack of motivation, or some combination of these variables. For these reasons, student depictions of their own performance can provide important information on their level of college readiness. Two methods have been used to obtain first-hand accounts of performance: concurrent think-aloud statements and retrospective reports (Ericsson & Simon, 1993; Merchie & Van Keer, 2014). The former requires that the participant verbalize "online" the thoughts that are occurring and the strategies he or she is using while doing the task.
This method is cognitively demanding, requires much training, and may be unreliable for participants who have low verbal or metacognitive skills and related difficulty in self-reflection. The latter, retrospective reports, involves interviews or questionnaires administered after a task is completed (Aghaie & Zhang, 2012; Harrison & Beres, 2007). Here, the participant is asked specific questions about how he or she interpreted the task and what he or she did to perform it. Although retrospective reports may be subject to problems related to inaccurate memory, difficulty reflecting on one's own process, and/or social desirability (with the interviewee stating what he or she thinks the interviewer will value), the
advantages of obtaining first-person reports from students seemed to outweigh the disadvantages of this approach. Therefore, we employed retrospective reports in the current study to obtain students’ insights into their performance on one of the assessment tasks. Retrospective reports have been used to help explain performance on a wide range of reading and writing tasks. Most of the studies we identified in a literature search were conducted with children (e.g., Crammond, 1998; Farrington-Flint, Coyne, Stiller, & Heath, 2008; Farrington-Flint & Wood, 2007; Griva, Alevriadou, & Semoglou, 2012; Kwong & Varnhagen, 2005; Moore & MacArthur, 2012; Steffler, Varnhagen, Friesen, & Treiman, 1998), although several were conducted with college students, including both English language learners and typical university students (Chou, 2013; Kwong & Brachman, 2014; Plakans, 2008; Strømsø, Bråten, Britt, & Ferguson, 2013). No studies were identified where retrospective reports were obtained from postsecondary developmental education students.
4. Development of the Researcher-Practitioner Partnership
4.1 Characteristics and Benefits

The current study was conducted by a partnership between a university research center and two community colleges. Recent literature has identified researcher-practitioner collaboration as an important tool to support the development of effective policies and practices (Coburn, Penuel, & Geil, 2013; Torraco, 2014). Such collaboration may benefit both researchers and practitioners, given that participants not only take part in the investigation of issues and the application of strategies but also gain insight into each other's experiences and perspectives in their respective roles (Coburn et al., 2013). In addition, Torraco (2014) suggests that scholar-practitioner collaboration ultimately creates stronger programming because collaborators bring different sources of knowledge to the conversation. Specifically, such collaboration treats research and practice from the outset as distinct but equally valuable sources of knowledge, encompassing not only knowledge produced through research but also knowledge embedded in practice itself.
collaborated with practitioner partners, who were community college instructors and administrators in two colleges. The study was part of a long-term research alliance between CCRC and the community college system, situated in a southern state (the state and the participating colleges are anonymized). The system encompasses all of the community colleges in the state and has instituted centralized policies and prescribed student learning outcomes for developmental education. Within the last 10 years, the system has been implementing a statewide reform of developmental reading, writing, and mathematics courses. Importantly for the current study, the reading and writing courses have been combined in the form of single developmental English courses. Further, these integrated courses are taught in eight-week periods, replacing the prior 16-week courses. CCRC has partnered with the community college system to explore the nature, implementation, and early outcomes of the reform. The current study centers on two community colleges in the system. Each of these colleges served both urban and suburban areas. Senior administrators at both colleges committed to participation in the partnership for a minimum of two years. This relatively long-term commitment allowed the researcher and practitioner partners to work together throughout the development of the research design and the data collection phases of the study. The research conducted in this partnership focuses on student knowledge, competencies, and skills. Attention to these areas in developmental courses not only fills an important gap in the literature but also is useful within the context of a researcher-practitioner partnership.
When CCRC researchers presented the concept underlying the present study to senior college administrators, they responded positively and agreed to participate because they recognized it as an opportunity to obtain findings that can inform curriculum refinement in their developmental reading and writing courses. More specifically, the practitioner partners indicated that improved understanding of the skills and knowledge students gain in developmental courses would inform decision making on curriculum and pedagogy in the future. To leverage the partners’ respective expertise and produce mutually beneficial research, the partnership developed a process of communication that brought the participating researchers and practitioners together. On-site interviews with instructors of both developmental and college-level courses as part of a larger project verified the
importance of written summarization and persuasive writing, the two tasks used in the current study. At the beginning of the learning assessment study, each college named a research liaison and two developmental English instructors to serve as lead faculty partners. These individuals provided on-site logistical support, such as recruiting students to participate in assessments. Further, early in the development of the partnership, the college liaisons articulated to CCRC their respective colleges’ objectives for participating in the study. Both liaisons were deans, and they approached the study from the perspective of supporting the curricular and pedagogical development of their developmental English courses. Moreover, lead faculty allowed researchers to observe multiple class sessions in order to familiarize themselves with instructional practices, faculty styles, and student participation in classrooms in the study sites. Lead faculty and research liaisons also contributed to the selection and design of the assessment instruments used in the study. There was a large amount of communication between the CCRC researchers and the college leads from the beginning of the development of the partnership. An important event in the collaborative process was a one-day retreat, which was held at one of the two colleges; the research team and the faculty leads and research liaisons from both colleges attended the retreat. The retreat included a broad discussion among participants of the study’s goals and workshop sessions for in-depth discussion of the assessments. In particular, at the retreat, the research team worked closely with the practitioner partners to assess the appropriateness of the written text that the researchers proposed to use in the text-based writing tasks, the nature of the writing prompts, attributes of students’ writing to be evaluated, and the assessment administration procedures. 
There was a great deal of discussion on the nature of the prompts, and especially on how critical thinking could be assessed. The wording of the prompts was carefully crafted in this collaborative procedure. As the assessment was being developed by the partnership, two practical constraints became apparent. First, the instructors did not wish to give up classroom time for the research, given their need to meet curricular goals and prepare students for tests. Second, the instructors did not wish the students to be subjected to what they considered to be an excessive amount of testing. Keeping these constraints in mind, the partners agreed on an assessment battery consisting of a mixture of standardized and researcher-developed
assessment data based on 12th-grade end-of-year standardized test norms, as well as on the proficiency of research samples that were as similar as possible to the current participants.
Explanation: We interviewed some of the participants in order to seek their perceptions of their ability on the written summarization task we used in the assessment. The interview was developed as a retrospective report in which students described their experiences of writing a summary shortly after completing the task. In particular, we were interested in whether the students understood the nature of summarization and the strategies they used to summarize a written text.
5.2 Participants and Setting

The participants were 211 students attending developmental education courses in two community colleges (which we refer to as College 1 and College 2) in a southern state. College 1 was situated in a mid-sized city and served an urban population. Enrollment in college-credit courses was 17,937; 53 percent of students were male, 52 percent were White, 23 percent were Black or African American, 53 percent were aged 24 or below, and 7 percent had registered for developmental reading and English courses. College 2 was located in a suburb of a small city and served an urban–suburban community. College-credit enrollment was 7,676, with 39 percent male students, 30 percent White students, 28 percent Black or African American students, and 12 percent registered for developmental reading and English courses. According to centralized state policy, all developmental reading and writing instruction was integrated in single developmental reading and English courses, which were taught in an eight-week, compressed acceleration model (Edgecombe, 2011). This policy was part of an ongoing statewide restructuring of the state's developmental reading, writing, and mathematics program. At the time of data collection, College 1 was integrating reading and writing and using the accelerated time frame for the first time, and College 2 had been doing so for several years. There were three levels of the integrated developmental reading and English course, and study participants attended the intermediate and top levels. At both colleges, the largest proportion of developmental reading and English enrollments (65 percent at College 1 and 56 percent at College 2) was in the top-level courses, which were one level below college-credit English. The state mandated that the student learning goals listed in Box 1 be addressed at all levels of the developmental reading and English curriculum.