Earlier this summer, we attended the National Conference on Student Assessment (NCSA), hosted by the Council of Chief State School Officers. The audience for this conference is national assessment experts, policymakers, and education organizations. The 2019 conference’s theme was “Measure What Matters, and Create Accountability for Equity.” Highlights included tracks on improving assessments, educator training and development, and piloting innovative assessment systems. Ensuring equity in assessment design and analysis was also an important issue echoed throughout the conference and in conversations we had with assessment experts and designers.
At iNACOL, we are reflecting on concepts to transform systems of assessments that can better serve all students, such as improving formative assessments with educator training, improving assessment literacy, new ways of approaching assessment validity and comparability, and updates on implementation of the Innovative Assessment Demonstration Authority (IADA).
Getting Formative Assessments Right
Getting formative assessments right was a constant theme across several sessions. "Formative assessment" has become a common term in classroom practice, but with differing definitions and practices. Dylan Wiliam, a professor at University College London, stated that formative assessments are not an end in themselves but support a broader system of assessments designed for various learning and feedback goals. Wiliam also urged that we be clear that formative assessment is a process, not an end goal or an end-product assessment.
According to Andy Middlestead from the Michigan Department of Education, formative assessments in Michigan are used by students and teachers to elicit information that supports students in becoming more self-directed learners. Middlestead stated, "Formative assessments are not: a benchmark/interim/diagnostic test. It is not a thing you can buy." Instead, formative assessment is a conversation between educators and students, and the feedback from the assessments should help move learning forward.
Building Educator Capacity in Assessment Literacy
Transforming assessment systems requires investments in teacher training and development, especially in assessment literacy. For example, educators need the training and capacity to create formative assessments and to understand where students are in their learning trajectory, as well as what needs to happen next for students to succeed.
Michigan is an example of a state that is investing in educator training in assessment literacy.
The Michigan State Board of Education adopted assessment literacy standards and is working to develop standards for teacher preparation programs. Under the Michigan educator evaluation system, every administrator is now an evaluator, which means that assessment literacy is ever more important to the state and stakeholders. Finally, Michigan is also developing an assessment credential system for educators.
States Piloting Innovative Assessments
The Every Student Succeeds Act (ESSA) provides flexibility for states to redesign systems of assessments to better align with student-centered learning. IADA allows states to develop next-generation systems of assessments and pilot them on a smaller scale in select districts. This is an opportunity for states to assess their results and continuously improve as they scale statewide. NCSA offered multiple sessions on states participating in IADA, and they are highlighted below.
Louisiana’s proposal to pilot an innovative assessment was the first to be approved by the U.S. Department of Education. Louisiana proposed a new English Language Arts assessment called LEAP 2025, which would test all grade 7 students in a pilot district. The assessment consists of five units; a district is required to teach three of them, one of which is state-mandated and the other two chosen at the district’s discretion. Some pilot highlights include:
- Integration: Students are assessed throughout the year through three end-of-unit assessments and one end-of-year essay question.
- Equity: Teachers are focusing on developing background knowledge so that no student is at a disadvantage due to lack of life experiences.
- Local control: Local school districts continue to decide which books are used during instruction and which assessments students take.
In May 2018, Georgia passed SB 362, which established the state-level assessment pilot; Georgia’s application was subsequently approved under the IADA by the U.S. Department of Education. Georgia has three state-approved pilots, NAVVY, Cobb Metrics, and GMAP, each piloting in separate districts and regions across the state. NAVVY began in rural Georgia with 10 districts implementing it, and the consortium has since grown to 17 districts. The two main goals of the assessment are to provide immediate data and to reclaim instructional time. NAVVY is built to assess each student’s competency on each standard of each unit in every grade. Educators have flexibility in deciding when students are tested, and each student receives three opportunities to demonstrate competency. This serves two purposes: each time students are tested, the school gets formative information on students’ mastery, and the teacher is empowered to personalize lessons for student learning and accountability. Students who do not demonstrate competency within three tries do not receive credit. Educators also receive data in real time; as soon as students finish the exam, teachers have access to their results. Georgia is considering three approaches to collecting and reporting summative scores:
- Reporting the percentage of standards on which students demonstrated competency
- Weighting the standards, potentially by importance, to yield a weighted percentage achieving competency
- Keeping the high-dimensional, multivariate profiles of competencies intact and seeing how they map onto the current state achievement levels
In the fall of 2018, New Hampshire was granted authority under the IADA to continue its PACE (Performance Assessment of Competency Education) program. In the beginning, only whole districts could use PACE; now individual schools can apply to the New Hampshire Department of Education to take part. Approximately 10,000 students in grades 3 through 8 take the PACE exam. PACE gives teachers the flexibility to determine who takes the exam and when and where it is taken. To ensure that teachers are prepared and qualified to leverage this flexibility, New Hampshire is prioritizing assessment literacy for educators and is working with the Center for Assessment to provide professional development for educators doing this work. The assessments are developed at both the local and the state level, including some common exams that teachers in all pilot districts create together and that all pilot districts must then administer to students. While districts have the flexibility to determine when students take the exam, participating schools and districts must submit an assessment schedule that outlines when it will be given. New Hampshire highlights many opportunities under the PACE program, from educator involvement in creating the exams and high-quality professional development around assessment literacy to the amount of data and feedback that both teachers and students receive from the embedded assessments throughout the year.
North Carolina was recently approved for the innovative assessments pilot program. The North Carolina Personalized Assessment Tool (NCPAT) will rely on a customized end-of-year assessment for each student. The customized assessment will include test questions that measure a student’s achievement and respond to that student’s scores on formative assessments taken during the school year. North Carolina plans to begin piloting the test in the 2020-2021 school year.
Natalie Truong is Policy Director, and Alexis Chambers is Policy Associate at iNACOL.