Federally Mandated State Tests May Be Gone, but There Are Other, Better Ways Educators Can Assess Students During the Shutdown
- #COVID19 may have canceled end-of-year tests, but educators already know they are only one source of info and the type of assessment actually most removed from the day-to-day teaching and learning @LauraMSlover
- Interim (periodic tests to check progress toward end-of-year goals); formative (given during instruction to gauge understanding); and diagnostic (to pinpoint strengths and gaps to guide instruction) are all still key to gathering student data during #COVID19
In this time of educational upheaval, superintendents and school leaders are reconsidering curriculum, instruction and professional learning plans, given state-mandated school closures through the end of the school year and the impact on the typical testing regimen. Many leaders rely on data from annual state summative assessments and local interim assessments to gather information about student learning to drive academic decisions.
Many district leaders meet regularly with their academic teams to develop, monitor and refine their district strategic plan, and school-based leaders do the same with corresponding school-based plans. Teachers use data from a full spectrum of assessments to inform instructional decisions. But now, with schools closed, some of these tests won’t be administered. Many leaders want to continue to use data-driven decision-making but recognize they are in uncharted territory.
School closures across the country have prompted many in public education to fret about the possible loss of annual academic assessments and the student performance data they provide. These end-of-year assessments mandated by federal law garner so much attention because they’re one of the few consistent touch points for comparable education data that help the public monitor schools’ performance. But as educators already know, end-of-year tests are only one source of information — and they are the type of assessment that is actually most removed from the day-to-day teaching and learning. When it comes to informing instruction, very little will be lost if we take a pause on annual testing.
Traditionally, schools and districts use a number of different types of assessments — interim (periodic tests to check progress toward end-of-year goals); formative (given during instruction to gauge understanding); and diagnostic (to pinpoint strengths and gaps to guide instruction) — to inform daily instruction as well as to determine student placement, drive planning, ensure common expectations and determine whether students are on track to meet learning goals.
For example, some districts use teacher-driven portfolios (a compilation of academic work), exit tickets (quick checks for understanding at the end of a lesson), quizzes, and good old-fashioned book reports and research papers, as well as interim assessments they purchase from private testing companies to gauge growth at intervals across the year. Increasingly, districts and schools are turning to curriculum-aligned assessments — formative and interim — that provide fine-grained information about student learning that is directly tied to instruction.
These are the data sources that will matter most in this moment of educational upheaval — and beyond. During this extended period of remote learning, schools will need to take different approaches to teaching and learning. And that means thinking differently about assessment as well. The good news is that there are a number of ways to collect data on students through methods that are more closely connected to learning.
What constitutes a quality test?
Educators have responded heroically to the new normal. Some have quickly set up distance learning opportunities and others have created and distributed learning packets. And parents and family members have become homeschool teachers overnight. As teachers, students and parents settle into remote learning, they will need quality assessments that can inform instruction and provide cycles of feedback. Whether they are delivered as part of a digital curriculum, via an online platform or via pencil and paper packets, there are ways to gather the necessary information using high-quality tools.
What constitutes quality? Quality assessments for learning are valid: they fairly measure the content and skills students have been taught, and they provide information that supports appropriate inferences about student performance. Quality tests are also aligned to curriculum, meaning they measure what’s been taught; reliable, meaning the results are trustworthy and would be consistent if a test were given multiple times; and fair and unbiased, meaning they give all test takers an equivalent opportunity to show what they know and are able to do. Fairness is more critical now than ever. In addition to allowing appropriate accommodations for students who need them, assessments should be culturally responsive and should recognize that at-home testing conditions can affect individual students’ ability to demonstrate what they have learned.
This means analyzing the data from assessments taken during home instruction in light of uneven testing conditions (such as not having internet access, a device, or even a regular quiet place to do work), and potentially gathering data from multiple sources before making any high-stakes decisions. As with any distance learning system, cheating is also possible. There are many tools that teachers and administrators can use to mitigate this issue, and this is likely to be a growing area of interest for test developers and platforms as the use of distance learning expands.
Quality assessments also must be useful — and should inform supports for students both now and when schools reopen. For example, educators can and should continue to use frequent formative assessment practices to check for understanding and find and address misconceptions. Designs for these formatives can make use of available technologies to set age-appropriate expectations and gather reliable data on what students are learning. These could include video presentations, audio recordings and different formats for writing assignments (including production of multimedia texts). These different formats may allow for greater cultural responsiveness and adherence to universal designs for learning, which maximize opportunities for all students to learn. Indeed, this sort of rethinking of assessments can support competency-based learning approaches that are consistent with ongoing shifts in the educational landscape.
Educators can continue to use interims to check for progress toward end-of-year expectations, particularly if students have online access. With the loss of state assessment data, these tests will play an even more important role. Well-designed interims provide an opportunity to gauge the depth and breadth of gaps in key concepts/core content for the current course of study that may impact future learning. In other words, since concepts build progressively from one to the next, weak student understanding of some initial concepts will have ramifications for future learning for years to come. Good use of interim assessments can ensure that students have a firm foundation for building more advanced learning, inform decision-making about which resources to highlight for use at home now, and inform instructional planning for when students return to campus. They are one of the most important tools because they can be used now to track progress and used again when classes start up to measure where students are, effectively getting double duty out of one type of assessment.
When schools reopen
This last point is critical. When schools and districts reopen their doors to students — whenever that may be — educators will need to quickly gauge where students are. Interim assessments will be able to help them quickly identify students and groups that have gaps — and then diagnostic assessments can go deeper to pinpoint what students know and where they have learning gaps and need additional support. Think of medicine: You might be given a high-level test to identify potential issues, and then a more detailed follow-up test if more information is needed. Now more than ever, we’ll need to know each student’s strengths and needs so we can serve them optimally. Flexible diagnostic tools that measure skills and knowledge provide teachers with a fairly quick way to group students by their instructional needs.
Yet gathering the data is only a first step. Educators will also need to analyze the data and form a plan to meet students’ needs — especially considering the learning time lost to the global pandemic. This rapid response will require an understanding of how to translate data from different assessments into action. This kind of “assessment and data literacy” is already an area for growth for many schools and districts.
So as districts and schools launch their remote learning models, they should also ensure that they have assessments to measure student progress. What’s more, they need to remember that without analysis and subsequent adjustments, data aren’t especially useful. Schools must prepare educators to use the data from these assessments and have a plan in place so that educators are ready to address learning gaps and support all students when the time comes.
Will there be perfect information? Absolutely not, but assessment information never is perfect. Students are learning through different modes due to inequities in instructional plans, access to internet and devices, and parental support. Now is not the time for perfect — it’s the time for action.
Laura Slover is the CEO of CenterPoint Education Solutions, a nonprofit that supports schools, districts and states in designing and implementing aligned systems of high-quality curriculum, assessments and professional learning. She is the former CEO of Parcc Inc., which designed the multi-state Partnership for Assessment of Readiness for College and Careers test.