Ohio Accountability System Thoughts

For what it’s worth, here are my responses to questions that were posed to testing committee members this week in preparation for our next session:

With a focus on designing an ideal system of state assessments, what are the key issues for you going forward? What areas need further exploration?

Is it possible to use existing diagnostic assessments for summative and accountability purposes as well?  For example, ACT Aspire, which provides a range of results correlated to predicted ACT college and career ready outcomes, could be used in a summative system where spring-to-spring scores are compared and credit is given for growing students, similar to the construct of the K-3 literacy measure on the LRC.  This data could also be disaggregated by subgroup for ESEA accountability purposes.
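To make the construct concrete, here is a minimal sketch of how spring-to-spring growth credit might be computed and then disaggregated by subgroup. The scores, the growth threshold, and the subgroup labels are invented placeholders; they are not ACT Aspire's actual scales or the LRC's actual business rules.

```python
# Hypothetical sketch: spring-to-spring growth credit, disaggregated by subgroup.
# Scores, the growth threshold, and subgroup labels are illustrative only.
from collections import defaultdict

students = [
    # (student_id, subgroup, spring_2014_score, spring_2015_score)
    ("s01", "economically_disadvantaged", 418, 426),
    ("s02", "economically_disadvantaged", 430, 431),
    ("s03", "students_with_disabilities", 405, 417),
    ("s04", "all_other", 440, 452),
]

GROWTH_THRESHOLD = 6  # assumed scale-score gain needed to earn growth credit

def earned_growth_credit(prior, current, threshold=GROWTH_THRESHOLD):
    """A student earns credit if the year-over-year gain meets the threshold."""
    return (current - prior) >= threshold

# Aggregate credit rates by subgroup for ESEA-style disaggregated reporting.
credited = defaultdict(int)
counted = defaultdict(int)
for _, subgroup, prior, current in students:
    counted[subgroup] += 1
    if earned_growth_credit(prior, current):
        credited[subgroup] += 1

for subgroup in counted:
    rate = credited[subgroup] / counted[subgroup]
    print(f"{subgroup}: {credited[subgroup]}/{counted[subgroup]} earned growth credit ({rate:.0%})")
```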

Reducing the overall amount of testing, while not tying the hands of districts to administer diagnostic assessments that inform teaching and learning, is the gold-standard outcome for the testing committee. The focus moving forward should be finding a suite of assessments that can serve multiple purposes, allowing districts to choose their tool and administer a single assessment whose data can be used in a variety of ways.

The frustration is that the data from PARCC/OCBA will in no way be able to inform instructional decision making for students at the point when teachers are equipped to make a difference with them.  The key to ending over-testing is to admit that the argument of ‘testing the full range of the standards’ is a false choice that inappropriately limits the decisions that could be made to reduce the overall testing burden on students.  If current diagnostic platforms (Aspire, STAR, MAP) have college and career ready metrics and student growth data baked into their systems, and if their data can be validated against the actual outcomes of students who have taken the assessments and transitioned to college/career (and remained remedial free), then layering on another assessment that ONLY measures the CCSS is an extreme example of redundancy.

The issue for exploration is why Ohio has insisted on continuing down the path with PARCC in the name of school accountability when there already exists a range of assessments that can measure students’ ongoing journey toward college and career readiness.  What matters is whether students are college and career ready, not whether they took tests that measured the full range of the standards.  If systems already exist that can answer the first question, and if longitudinal data studies validate the claims these assessments make regarding readiness at X, Y, or Z performance levels, then an additional system that measures the standards first and then layers on a CCR score is unnecessary.

What are two issues that you would like ODE to speak to at our next meeting?

1. Is it possible to move to a system where ODE allows districts to choose among multiple assessments for accountability purposes, and can ODE crosswalk the scores to ensure continuity of performance levels across the measures (this is the model from the Third Grade Guarantee alternate assessment structure)? A sketch of what such a crosswalk might look like follows this list.

2. What is ODE’s ability to re-submit the ESEA waiver to account for either recommendations from the testing committee or legislation signed into law by the Governor?

3. What constraints are placed on the committee by the current PARCC/AIR contracts?  Is there room to make significant recommendations for Fall 2015, or are we boxed in by contracts that will not expire for several more years?

4. In light of the sub-optimal testing conditions many students faced electronically (an inordinate number of distractions due to error codes and general tech hiccups), will ODE advocate for school districts and work with the legislature to treat the 2014-2015 data as a full-year field test data set and not use it for grading purposes on the local report card?  Using this year's data for grading is unfair to school districts and will paint a false picture of district performance in a year with too many variables in the data to rely on its statistical validity.

5. What is ODE willing to do regarding testing and the multiple graduation pathways at the high school level?  As ODE is all about ‘choice’, it seems logical that once a student has demonstrated that s/he is remedial free and has qualified for a graduation pathway, s/he should be able to choose what additional assessments to take.  For example, if a student scores a 27 composite on the ACT as a freshman and meets the remedial free pathway on a college and career ready assessment, why should s/he have to take additional assessments that measure college and career readiness?  The opt-out movement will stay alive and active in Ohio unless this issue is resolved, especially in high performing school districts.
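Referring back to question 1, here is a minimal sketch of what a score crosswalk across district-chosen assessments could look like. The vendor names come from products already mentioned above, but every cut score and performance-level label below is a made-up placeholder; a real crosswalk would have to come from formal linking/equating studies, not hard-coded guesses.

```python
# Hypothetical crosswalk: vendor scale scores -> a common set of performance levels.
# The level labels and cut scores below are invented placeholders for illustration.
PERFORMANCE_LEVELS = ["Limited", "Basic", "Proficient", "Accelerated", "Advanced"]

# For each vendor, the minimum scale score needed to reach each level (assumed values).
CUT_SCORES = {
    "ACT Aspire": [400, 415, 426, 434, 442],
    "NWEA MAP":   [190, 205, 218, 228, 236],
    "STAR":       [600, 750, 875, 960, 1050],
}

def crosswalk(vendor: str, scale_score: float) -> str:
    """Map a vendor scale score to the highest common performance level it reaches."""
    cuts = CUT_SCORES[vendor]
    level = PERFORMANCE_LEVELS[0]
    for cut, name in zip(cuts, PERFORMANCE_LEVELS):
        if scale_score >= cut:
            level = name
    return level

print(crosswalk("ACT Aspire", 428))  # -> "Proficient" under these placeholder cuts
print(crosswalk("NWEA MAP", 231))    # -> "Accelerated" under these placeholder cuts
```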

Remaking Education?

Linked below is an article that captures the thoughts of the Ohio Superintendent of Public Instruction on college and career readiness.

My major concern is that while on the surface he talks about 21st century learning, students going deeper with the curriculum, improved rigor, etc., his remedy for our educational ‘woes’ will not measure these ‘solution’ skills.
In Ohio, not only are we going with PARCC to assess (in a standardized way) the common core, but there is also a push for standardized end of course exams and standardized assessments at two points in time for all courses in order to measure teacher effectiveness.
Isn’t this over-reliance on standardized tests and the narrowing of the curriculum what the common core and the focus on 21st Century skills were supposed to get away from?
From my vantage point, it looks like we are heading for a huge spike in standardized testing, not a reduction.
And, if you think that teachers will embrace the types of creative reforms meant to produce flexible, creative, authentic thinkers (the kind whose work can’t be outsourced or automated, if you are a Dan Pink fan) given this new onslaught of testing, I’d think again.
In Ohio, current law mandates that 50% of a teacher’s evaluation be tied to measures of student growth by the 2013-2014 school year. In this economy, if your job is on the line, what do you think is going to happen? More teaching to the test, more top-down instruction, less creativity, fewer opportunities for students to demonstrate mastery in unique and creative ways (21st Century), and more student disengagement than ever.
At some point this tension between the authentic, immersive, engaged world that students live in outside of school and the authoritarian, standards driven, narrow approach that schools take in the name of proving mastery is going to have to come to a head.
We cannot continue to talk about promoting digitally proficient, flexible, creative students while measuring them in ways that do not promote these values.
If traditional brick and mortar institutions continue with the current head-in-the-sand approach, competition from electronic providers who understand how to leverage the world of our digitally native students will threaten to overwhelm the traditional system.
A middle-of-the-road solution would be to require a portable electronic portfolio, tied to the common core, that would contain specific requirements (with flexible options) to demonstrate mastery of content at each grade level. By the time students reach 12th grade, this portfolio would serve as a rich senior capstone experience that demonstrates growth over time and authentically measures students’ strengths. While there would be expenses on the front end in terms of planning and implementation, this would allow teachers to embrace the types of authentic, non-linear practices that our students need to thrive in an age where “learn-unlearn-relearn” must be the focus in our hyper-speed global economy.
None of the above can happen if standardization is the only focus.
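As a thought experiment, here is a minimal sketch of what the data behind such a portable portfolio could look like. The field names, artifact types, and standard codes are hypothetical; the point is only that growth over time and flexible demonstrations of mastery can live in one portable record tied to the standards.

```python
# Hypothetical sketch of a portable K-12 portfolio record tied to standards.
# Field names, standard codes, and artifact types are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Artifact:
    grade: int                 # grade level when the work was produced
    standard_codes: List[str]  # standards the artifact addresses
    artifact_type: str         # "essay", "video", "model", "performance", ...
    title: str
    reflection: str            # student's own account of what the work shows

@dataclass
class Portfolio:
    student_id: str
    artifacts: List[Artifact] = field(default_factory=list)

    def growth_by_standard(self, code: str) -> List[Artifact]:
        """Return artifacts for one standard, ordered by grade, to show growth over time."""
        return sorted(
            (a for a in self.artifacts if code in a.standard_codes),
            key=lambda a: a.grade,
        )

portfolio = Portfolio(student_id="s01")
portfolio.artifacts.append(
    Artifact(grade=9, standard_codes=["CCSS.ELA-LITERACY.W.9-10.1"],
             artifact_type="essay", title="Argument essay on water policy",
             reflection="First attempt at sourcing and rebutting counterclaims.")
)
for a in portfolio.growth_by_standard("CCSS.ELA-LITERACY.W.9-10.1"):
    print(a.grade, a.title)
```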

http://www.dispatch.com/content/stories/editorials/2011/12/15/revamping-education.html


(Mis)Information

To be competitive in the global age in which we find ourselves, flexible, adaptable thinking must be the hallmark of the student processes that are embedded into lessons on a daily basis. Students must be able to develop unique, creative solutions to authentic problems as a result of their learning. This is the heart of Project Based or Problem Based learning. While there is underlying knowledge that can serve as a foundation for these types of experiences (and can be tested using selected response items), the emphasis should be on the application of knowledge in new and unrelated circumstances. The current generation of standardized accountability measures does not test for this type of learning and has had the effect of diminishing students’ creative problem-solving capacities. The new generation of assessments must emphasize authentic demonstrations of problem-solving ability over students’ ability to pick the one right answer out of four in a test item bank. The world we live in contains multiple pathways to arriving at satisfactory results. Why do we still insist on pigeonholing our kids by teaching them to always look for only one correct answer? The past 20 years of this emphasis have created a generation of students who are not divergent thinkers and who are dependent on others to tell them the correct answer.

What is scary is that educational leaders have also been boxed in by the emphasis on testing. This recent article demonstrates the damage that the overemphasis on standardized tests has caused to school leaders:

http://www.thisweeknews.com/content/stories/dublin/news/2011/11/29/dublin-schools-expect-benefits-from-new-state-tests.html
Here are two quotes from the article (which directly contradict one another):
“What will happen is a shift to more authentic tests,” Axner said. “We’ll get away from more standardized tests that don’t allow our kids to think.” (This is a reference to the new PARCC assessments that will measure the common core in Ohio)

Then, there is this:

“But secondly, the plan is instead of the district waiting 60 days for the results, you’ll have the student results within 60 seconds. … That will improve the ability to provide intervention and remediation.” (The only types of items that deliver this kind of instant feedback are selected response items)

Mr. Axner espouses getting away from standardized testing, but in the VERY SAME ARTICLE speaks of the benefits of standardized testing. (If you’re not from Ohio, he leads one of the best districts in the state).

What this proves is that breaking free of the paradigm of standardized testing is extremely difficult if you live within that paradigm (and this is yet another threat to the long-term existence of brick and mortar school districts if leaders lack the ability to work outside of it).

The Test

OK, so I’m worried…

Ohio has just announced that PARCC will be the test vendor for the new Common Core assessments.
My concern is that their idea of 21st century assessment (how many times will they use the word ‘innovative’ in their literature?) will be to take 20th century selected response items and computerize them. This would lead to more of the same in the classroom (a narrowing of the curriculum in order to ensure that students only pick the one correct response), which is incompatible with the 21st century skills students desperately need to compete in the global economy.
The new assessments will drive instruction and learning for better or worse. I hope (but doubt) that policy makers will think carefully about the damage they are inflicting by measuring accountability using single, constrained metrics.
Here is a slide from the November 2011 PARCC PowerPoint describing the efficiencies of their system:

PARCC’s assessment will be computer-based and leverage technology in a range of ways:

Item Development
– Develop innovative tasks that engage students in the assessment process
Administration
– Reduce paperwork, increase security, reduce shipping/receiving & storage
– Increase access to and provision of accommodations for SWDs and ELLs
Scoring
– Make scoring more efficient by combining human and automated approaches
Reporting
– Produce timely reports of student performance throughout the year to inform instruction, interventions, and professional development

I’ve taken the liberty of redlining the codewords for multiple choice items (in my opinion).
Here’s a portion of another slide about the types of items PARCC will be using:
Summative Assessment Components:
Performance-Based Assessment (PBA) administered as close to the end of the school year as possible. The ELA/literacy PBA will focus on writing effectively when analyzing text. The mathematics PBA will focus on applying skills, concepts, and understandings to solve multi-step problems requiring abstract reasoning, precision, perseverance, and strategic use of tools
End-of-Year Assessment (EOY) administered after approx. 90% of the school year. The ELA/literacy EOY will focus on reading comprehension. The math EOY will be comprised of innovative, machine-scorable items

Some lingering questions: How many PBA questions will there be? What percentage of a student’s summative score will the PBA questions count for? Will there be anything other than selected response questions on the EOY assessment?
The common core espouses collaborative, authentic project based and problem based learning. This will not occur if the assessments students are subject to, and teachers are held accountable for, value single pathways to correctness.
Is anyone else worried?