Team of Teams

It’s been a few months since I’ve had the time to write on the blog. Starting a new job will do that to you, especially when the new job is being THE leader. Over the past five years I’ve done a lot of thinking and reading about leadership, but when the buck really does stop with you, it takes the internal dialogue to a whole new level. As an aside, I love my new job. It’s a challenge every day, and there are certainly difficult moments, but it gives me the opportunity to put into practice ideas and concepts that I’ve been interested in trying for a while.

One of my top talking points when working with my leadership team is how we reduce siloed behavior and thinking among teachers, so that their potential can be unleashed and student learning can increase exponentially as a result of collective consciousness and action. One of the many unfortunate byproducts of the teacher accountability movement tied to student growth scores is the increased propensity for individual action as a tool for self-preservation. If you don’t know what I mean, spend a few days in a building during the fall when the approval of student learning objectives is taking place… you’ll quickly understand.

In any given school day, a number of evidence points about student performance go unconnected simply because our teachers don’t interact with one another in ways that lead to sharing information that benefits learning. All too often, the precious collaboration time teachers have is usurped by conversations that are neither instruction focused nor growth oriented.

The only way to move the needle on practices that inhibit both student and adult growth alike is to focus on the power of leadership in creating a culture of empowered decision making and shared, collective action. As education has become more complex, the need for leaders who understand their role in crafting a culture conducive to a different kind of decentralized thinking is imperative. As a leader, it matters how I grow and empower those who work under me, for the success of the organization begins and ends with my ability to move others out of their silos.

As I’ve been wrestling with how to achieve this goal, I’ve been reading (and just finished) “Team of Teams” by General Stanley McChrystal. It is the type of book that gets better as the pages turn. The first half uses the lens of the Iraq war and the early struggles of 2003-2004 to paint a picture of how traditional, hierarchical leadership fails to keep pace with the changed realities of the informationally dense, interconnected world in which we find ourselves. McChrystal patiently sets the parameters of the problem in the first half of the book, and then spends the second half exploring the changes in philosophy and behavior that led ultimately to the team of teams that changed the trajectory of the war by 2007.

While this is not a book about education in any real sense, there are applicable lessons for principals and superintendents throughout. Often with leadership books the text starts strong but wanes as the book moves on and the author begins to repeat him/herself. ‘Team of Teams’ is just the opposite. I found myself thoroughly understanding the problem about 100 pages in, and was wondering when I would get to the ‘answer’ part of the book. Rest assured, you do get there, and when you do, it’s the type of read you’re going to finish in one sitting because the material is so strong.

If you are an educational leader looking to release untapped potential in your District or organization, I would highly recommend exploring this book.  Below are a few of my favorite quotes from the last section of the book where things really got good.

“As our environment erupts with too many possibilities to plan for effectively, we must become comfortable sharing power.” (p. 212)

“The speed and interdependence of our current environment means that what we cannot know has grown even faster than what we can (which means…) The role of the senior leader (is) no longer that of controlling puppet master, but rather that of an empathetic crafter of culture.” (p. 222)

“As a leader (the) most powerful instrument of communications is (your own) behavior.” (p. 226)

“‘Thank you’ became my most important phrase, interest and enthusiasm my most powerful behaviors.” (p. 228)

“Gardeners plant and harvest, but more than anything, they tend…Regular visits by good gardeners are not pro-forma gestures of concern – they leave the crop stronger. So it is with leaders.” (p. 229)

“Creating and leading a truly adaptive organization requires building, leading, and maintaining a culture that is flexible but also durable.” (p. 231)

“The leader’s first responsibility is to the whole.” (p. 232)

“A leader’s words matter, but actions ultimately do more to reinforce or undermine the implementation of a team of teams.  Instead of exploiting technology to monitor employee performance…the leader must allow team members to monitor him.  More than directing, leaders must exhibit personal transparency.  This is the new ideal.” (p. 232)

“As the world becomes more complex, the importance of leaders will only increase.  Even quantum leads in artificial intelligence are unlikely to provide the personal will, moral courage, and compassion that good leaders offer.” (p. 232)

“A gardening approach to leadership is anything but passive. The leader acts as an ‘Eyes-On, Hands-Off’ enabler who creates and maintains an ecosystem in which the organization operates.” (p. 232)

Recommendations for a Post-PARCC Ohio

Should the H.B. 64 Legislative Conference Committee choose to maintain the prohibition on PARCC currently included in the House and Senate versions of H.B. 64, Ohio will need to quickly find new assessments for the 2015-2016 school year or risk losing significant federal funding. While time is short for making extremely significant decisions about the future of testing in Ohio, there is a solution that has the potential to resolve the current testing dilemma while meeting the needs of groups at the local, state, and federal levels.

The assessment proposal below seeks to do the following:

  1. Provide recommendations on comparable assessments that can be used for accountability purposes in Ohio.
  2. Reduce the amount of testing time for students.
  3. Allow for assessments that can be used for both diagnostic purposes (to inform instruction) and accountability purposes (to measure the overall impact of instruction).
  4. Provide local choice and control in selecting assessments that meet specific local context needs.

It is recommended that the Ohio Department of Education be directed to:

  1. Develop a list of comparable assessments that can be used to assess reading and math in grades 3 – 8 and once in high school per the current requirements of NCLB.
  2. Revise Ohio’s ESEA waiver to seek approval for the utilization of multiple assessments in meeting the testing mandates of ESEA.
  3. Utilize existing science tests in grades 5, 8, and physical science, as well as social studies tests in grades 4, 6, U.S. History, and Government that have been developed by the American Institutes for Research.  However, the Department should be directed to condense these assessments into a single testing window, or secure an alternate test vendor capable of providing science and social studies assessments in a single window.

It is recommended that the Department include the following assessments on the list of choices for Districts to utilize in order to meet mandated testing requirements:

  1. ACT Aspire
  2. NWEA MAP
  3. STAR
  4. The Iowa Assessments
  5. Acuity (Grades 3 – 8)
  6. Terra Nova (Grades 3 – 8)
  7. i-Ready

For accountability purposes, the Department should be directed to establish scale score cut points at the Limited, Basic, Proficient, Accelerated, and Advanced performance levels for all comparable assessments that local districts may choose to administer. Additionally, Districts should be given flexibility to administer assessments in a manner that meets local conditions, provided that data is reported to the Department through EMIS by a late spring/early summer date established by the Department. (Note: This precedent has already been established by the Department’s permissive stance on multiple assessments to measure the Third Grade Reading Guarantee.)
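
To make the crosswalk idea concrete, here is a minimal sketch of how scale scores on any approved vendor assessment could be banded into Ohio’s five performance levels. The level names come from the proposal above; the cut scores themselves are invented placeholders, since the actual values would have to be established by the Department for each assessment.

```python
import bisect

# Hypothetical cut scores for a single vendor assessment. These numbers are
# placeholders only; the Department would establish real cut scores per test.
CUT_SCORES = [400, 425, 450, 475]  # lower bounds of Basic, Proficient, Accelerated, Advanced
LEVELS = ["Limited", "Basic", "Proficient", "Accelerated", "Advanced"]

def performance_level(scale_score: int) -> str:
    """Band a vendor scale score into one of Ohio's five performance levels."""
    return LEVELS[bisect.bisect_right(CUT_SCORES, scale_score)]

print(performance_level(398))  # Limited
print(performance_level(430))  # Proficient
print(performance_level(480))  # Advanced
```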

For value-added data, the Department should be directed to contract with Battelle for Kids to expand the list of vendor assessments eligible for extended value-added reporting to include all assessments on the Department-approved list. http://portal.battelleforkids.org/Ohio/measures/value-added_information/SAS_EVAAS_Calculations.html

For funding, the Department should be directed to reimburse local districts, on a per pupil basis, up to the per pupil dollar amount spent on PARCC in the 2014-2015 school year.

A common sense approach to the potential PARCC replacement issue is to leverage existing assessments used in Ohio schools for both formative and accountability purposes. Because the proposed assessments are already aligned to both the Common Core and college/career readiness, it would be a mistake for Ohio to adopt another state’s pre-PARCC assessments as an interim solution. If the goal is to reduce the overall amount of time students spend testing during the school year, then allowing districts to use existing assessments both to inform and to measure the impact of instruction is the right adjustment to make for the students of Ohio.

Another Way To Think About Shared Attribution

What if it were possible to create a shared attribution measure for teachers based on the K-3 literacy composite score? I asked ODE; they said no, but the idea is still worthy, so here it is…

On page three of the Guidance About Shared Attribution (http://goo.gl/dZ7IlV), the following statement is made regarding a potential benefit of this approach:

“Shared attribution is included as a local measures option to allow districts to create a team atmosphere, which reinforces the fact that all teachers within a building or district are working toward common goals.”

In the spirit of this desired outcome, is it possible to create a shared attribution option that is linked to the K-3 Literacy Measure? A teacher’s growth rating would be calculated in the same manner that points are assigned for value added under shared attribution:
 
Building K-3 Literacy Grade: A / B / C / D / F
Points Awarded:              5 / 4 / 3 / 2 / 1
Under this option, a teacher would receive 5 points if their building received an A for K-3 literacy, 4 points for a B, and so on. These points would then be converted into the 600-point scale system as one measure in determining a teacher’s overall SGM.
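
As a back-of-the-envelope illustration, the sketch below shows the grade-to-points step and one hypothetical way those points could feed into the SGM composite. The A-through-F mapping comes straight from the table above; the weight given to this local measure is an assumption for illustration only, since districts set their own measure weights and ODE never approved this option.

```python
# The A-F to 5-1 mapping comes from the proposal above. The local-measure
# weight is purely an assumption for illustration.
GRADE_POINTS = {"A": 5, "B": 4, "C": 3, "D": 2, "F": 1}

def shared_attribution_rating(building_grade: str) -> int:
    """Return the 1-5 rating a teacher would earn from the building's K-3 literacy grade."""
    return GRADE_POINTS[building_grade.upper()]

def sgm_contribution(building_grade: str, weight: float = 0.10) -> float:
    """Hypothetical weighted contribution of this measure to a teacher's overall SGM."""
    return shared_attribution_rating(building_grade) * weight

print(shared_attribution_rating("B"))  # 4
print(sgm_contribution("B"))           # 0.4 on the composite scale
```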
 
Likewise, local districts could also choose to award points based on the District report card grade on the K-3 measure.
 
As Dr. Ross has made the third grade guarantee a cornerstone of his superintendency, an approach that unites teachers in ensuring that as many students as possible move from off track to on track in reading before the end of third grade is one that should be given serious consideration by the Department.

Whose Honesty Gap?

I got a call from the local newspaper today asking for comments on the “Honesty Gap” report that was recently released by Achieve: http://www.achieve.org/files/NAEPBriefFINAL051415.pdf

Below is the follow-up email I sent as a supplement to my phone comments on the matter. (It should raise a HUGE red flag when a major Common Core and PARCC proponent misuses NAEP data in an effort to demonize schools and make its product and solution ‘look’ superior to what preceded it.)

Just following up on our conversation from earlier today. The link below is to an article from the Center for Public Education that tackled the issue between NAEP proficiency levels and state proficiency levels all the way back in 2008 (in other words, this is not a new issue, but rather a recurring accountability skirmish). I would highly encourage you to read the entire article and reference the data tables and figures.

A major point in the article is the fact that the NAEP Validity Studies Panel determined that the most appropriate score comparison is NAEP basic to state proficient (paragraph 6 under the ‘Why is there a discrepancy?’ heading).

The article does a nice job explaining that proficient on NAEP versus proficient on state assessments is an apples and oranges comparison (my analogy). A truer apples-to-apples comparison would be NAEP basic to state proficient.

At the bottom of the article there are several data table links that show the score comparisons (NAEP proficient to state proficient and NAEP basic to state proficient).

The Achieve study attempts to take states to task for the ‘honesty gap,’ when the actual lack of honesty is on the part of the study’s authors, who use comparison labels in ways they were not intended.

Also, for what it’s worth, Achieve has a major stake in painting states as low performing, for they are the group behind not only the Common Core but also PARCC (the test that replaced the OAAs and OGT). It should come as no surprise that there is an agenda behind the ‘honesty gap’ report.
From the Achieve website: http://www.achieve.org/history-achieve

2009: Work begins on the development of the Common Core State Standards; Achieve partners with the National Governors Association and Council of Chief State School Officers on the Initiative and a number of Achieve staff and consultants serve on the writing and review teams.

2010: The final Common Core State Standards are released; Achieve begins serving as Project Management Partner for the Partnership for Assessment of Readiness for College and Careers (PARCC).

Please feel free to use any of this information in your story.

Thanks

Ohio Accountability System Thoughts

For what it’s worth, here are my responses to questions that were posed to testing committee members this week in preparation for our next session:

With a focus on designing an ideal system of state assessments, what are the key issues for you going forward? What areas need further exploration?

Is it possible to use existing diagnostic assessments for summative and accountability purposes as well? For example, ACT Aspire, which provides a range of results correlated to predicted ACT college and career ready outcomes, could be used in a summative system where spring-to-spring scores are compared and credit is given for growing students, similar to the construct of the K-3 literacy measure on the LRC. This data could also be disaggregated by subgroup for ESEA accountability purposes.
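
A minimal sketch of what such a spring-to-spring growth-credit calculation could look like follows. The data layout and the expected-gain threshold are assumptions for illustration only; any real implementation would use the vendor’s own growth norms.

```python
# Sketch of a spring-to-spring growth-credit calculation, loosely modeled on
# the K-3 literacy construct. The expected gain is an invented threshold.

def growth_credit(score_pairs, expected_gain=2.0):
    """Percent of students whose year-over-year gain met the expected growth.

    score_pairs: list of (last_spring_score, this_spring_score) tuples.
    """
    met = sum(1 for prev, cur in score_pairs if cur - prev >= expected_gain)
    return 100.0 * met / len(score_pairs)

# Example: three of four students met the assumed expected gain
print(growth_credit([(410, 415), (420, 421), (430, 434), (400, 403)]))  # 75.0
```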

Reducing the overall amount of testing, while not tying the hands of Districts to administer diagnostic assessments that inform teaching and learning, is the gold standard outcome for the testing committee. Finding a suite of assessments that can be used for multiple purposes, while allowing districts to choose their tool and administer a single assessment whose data can be used in a variety of ways, should be the focus moving forward.

The frustration is that the data from PARCC/OCBA will in no way be able to inform instructional decision making for students at the point when teachers are equipped to make a difference with them. The key to ending over-testing is to admit that the argument of ‘testing the full range of the standards’ is a false choice that inappropriately limits decisions that could reduce the overall testing burden on students. If current diagnostic platforms (Aspire, STAR, MAP) have college and career ready metrics and student growth data baked into their systems, and if their data can be validated against the actual outcomes of students who have taken the assessments and transitioned to college/career (and remained remedial free), then layering on another assessment that ONLY measures the CCSS is an extreme example of redundancy.

The issue for exploration is why Ohio has insisted on continuing down the path with PARCC in the name of school accountability when a range of assessments already exists that can measure students’ ongoing journey toward college and career readiness. What matters is whether students are college and career ready, not whether they took tests that measured the full range of the standards. If systems already exist that can answer the first question, and if longitudinal data studies validate the claims these assessments make regarding readiness at X, Y, or Z performance levels, then having an additional system that measures standards first and then layers on a CCR score is unnecessary.

What are two issues that you would like ODE to speak to at our next meeting?

1. Is it possible to move to a system where ODE allows Districts to choose among multiple assessments for accountability purposes, and can ODE crosswalk the scores to ensure continuity of performance levels across the measures (this is the model from the Third Grade Guarantee alternate assessment structure)?

2. What is ODE’s ability to re-submit the ESEA waiver to account for either recommendations from the testing committee or legislation signed into law by the Governor?

3. What constraints are placed on the committee by the current PARCC/AIR contracts?  Is there room to make significant recommendations for Fall 2015, or are we boxed in by contracts that will not expire for several more years?

4. In light of the sub-optimal testing conditions many students faced electronically (an inordinate number of distractions due to error codes and general tech hiccups), will ODE advocate for school districts and work with the legislature to treat the data from 2014-2015 as a full-year field test data set and not use the data for grading purposes on the local report card? Using this data for grades would be unfair to school districts and would paint a false picture of District performance in a year with too many variables in the data to reliably count on its statistical validity.

5. What is ODE willing to do regarding testing and the multiple graduation pathways at the high school level? As ODE is all about ‘choice,’ it seems logical that once a student has demonstrated that s/he is remedial free and has qualified for a graduation pathway, s/he should be able to choose which additional assessments to take. For example, if a student scores a 27 composite on the ACT as a freshman and meets the remedial free pathway on a college and career ready assessment, why should s/he have to take additional assessments that measure college and career readiness? The opt-out movement will remain alive and active in Ohio unless this issue is resolved, especially in high performing school districts.

HB 7, Safe Harbor, and Recommendations on Data for Accountability

With the recent passage and enactment of House Bill 7, there are now limited safe harbor provisions in place for both school districts and students as they relate to PARCC data for the 2014-2015 school year. I applaud the efforts of the legislature to ensure that reasonable protections are put in place in response to a testing system that is unproven and riddled with implementation issues. Given the logistical challenges of delivering the PARCC assessments online this year, and the resulting skepticism about the validity and reliability of the test results directly linked to these challenges, I am calling on members of the legislature and the Governor to go even further and extend safe harbor to include a moratorium on the use of any PARCC-related data on the local report card for the 2014-2015 school year.

The issue at hand is that data from these unproven assessments will still be used in a high stakes manner for all local school districts across the state. Below are several evidence points that highlight the disconnect between safe harbor as now enacted and the exposure Districts still have relative to results from this flawed system:

From EdConnect – March 23, 2015

As part of safe harbor, the 2015 Ohio School Report Card will not have an overall letter grade or letter grades for the six groupings of measures called components. All other results and letter grades will be reported on the 2015 Ohio School Report Card, just like previous years.

From the ‘Guidance on Safe Harbor’ – March 24, 2015

As Ohio transitions to new state tests, safe harbor gives schools, teachers and students time to adjust to the new tests. In most cases, there will no longer be consequences tied to the results of the state tests given in the 2014-2015 school year. The consequences of state tests usually impact the following school year. Therefore, a safe harbor on tests given in the 2014-2015 school year will affect consequences in the 2015-2016 school year.

Working backwards from the second quote, it is completely erroneous to assume that the only consequences tied to LRC information come in the form of state and federal sanctions.  There are real consequences for school districts in the form of economic development harm caused by low letter grades.  Businesses make decisions about where to locate, families make housing decisions, and voters make levy decisions based in large part on the letter grades that are attached to district and building report cards.  If the state is willing to grant safe harbor to teachers and students, it should also hold districts harmless and grant full safe harbor from the use of PARCC data on the report card for this school year.

Based on these evidence points, it is my belief that this year should be considered a full statewide field test of the PARCC system, and the results should not be utilized at all in the state accountability system. The position statement from Mason City School District also clearly articulates this position:

We also believe Ohio should avoid using any of this year’s results for accountability purposes, and instead view 2014-15 results as a true field test with no state-issued report card that assigns Performance Index (PI) scores and grades on subcategories.

Given the degree of consternation many Ohio House and Senate members have expressed in response to PARCC, it is only reasonable to conclude that the data from this system will be problematic this year, and holding districts accountable for these results is patently unfair. The data will in no way reflect the quality of the educational experiences districts provide students, and no discernible educational value can be gleaned from the results due to the numerous implementation issues related to the PARCC rollout in Ohio.

Edcamp Columbus 2015

At Edcamp Columbus I will be leading a feedback and idea generation session for the Ohio Senate Testing Advisory Committee.  I feel fortunate to have been selected to serve on this committee, and I’m looking forward to interacting with participants at Edcamp in order to bring as many voices into the advisement process as possible.

Senator Lehner has indicated that it is her expectation that the committee get right to work on problem solving and recommendation development. Therefore, my goal is to use the Edcamp work session as a forum for generating and capturing concise thoughts in three distinct but connected areas that can be used to inform the work of the committee:

1. Succinctly identify logistical challenges of the current assessment system
(Subtext – Construct an argument as to why the current state is undesirable)

2. Identify potential future assessment alternatives

3. Brainstorm ways in which these alternatives could be utilized in an accountability system that meets the needs of students, teachers, parents, and policy makers
(Subtext – How is this future state a more desirable place than the current state?)

Discussion Guiding Caveats

Existing Diametric Forces – The desire for an assessment system that informs teaching and learning vs. the present accountability system that ranks, sorts, and compares.

Challenge – Is there a way to create a new system that simultaneously meets the needs of these divergent concepts so that they are no longer in opposition to one another?

The Google Doc that will be used to record ideas from this session can be found HERE

Ohio’s Disconnect Between Testing and Educational Options

As a school leader responsible for delivering Ohio’s statewide assessments, I have been living with a great deal of ambivalence about the current direction of the State regarding testing and accountability. While I agree that the old OAA/OGT system lacked rigor in the realm of cut scores and performance levels, the new system is an extreme overcorrection that encroaches on far too many instructional days and is being delivered in a manner that will not yield reliable results. While I believe in accountability, and think that testing should play a part in the accountability system, I also believe unequivocally that the PARCC tests, in their current form, are not the appropriate solution for Ohio and cannot be sustained in their current state.

The disconnect I see is in the rhetoric from State leadership regarding choice in educational options for students on one hand, while demanding standardized accountability experiences in the form of testing requirements on the other. To exacerbate this disconnect, there is a willingness in the legislature to allow private school students one set of testing expectations for graduation, while holding public school students to a far different, less flexible set of expectations. While some would argue that there is choice in the current system, given the three graduation pathways Ohio has laid out, I would argue that this is a false set of choices given the expectation that all students participate in PARCC/OCBA tests.

The recent letter from State Superintendent Ross and Chancellor Carey on College Credit Plus crystallized the issue for me. Below is the closing paragraph from the letter:

[Screenshot: closing paragraph of the College Credit Plus letter from Superintendent Ross and Chancellor Carey]

I believe that the “greater array of options and choices for (students’) futures” should not just apply to College Credit Plus, but should be expanded to the tests that are used in the accountability system in the State of Ohio. The current one-size-fits-all approach with PARCC is placing an undue burden on school systems and students alike. Schools currently have a wide variety of assessment tools in place that measure student progress and growth over time. These assessments (e.g., STAR, NWEA MAP, ACT Aspire) take far less time to administer, are aligned to the Common Core, and provide growth data that can be used for value-added purposes.

In the coming weeks it is expected that a commission of Ohio educators will be formed to explore the current state of standardized testing in Ohio and will be charged with making recommendations regarding the future direction of testing as it relates to accountability. This committee has the potential to offer common sense solutions that will allow all interested parties to have their needs satisfied in the areas of testing and accountability. I hope that this group, when convened under the direction of Senator Lehner, will explore choice options that should be made available to all schools in the State of Ohio. If this group is able to capitalize on the work underway at the Ohio Department of Education to develop a list of exams that can be used for multiple purposes, and can put forth recommendations that maintain accountability structures while offering relief from the current model, it will be a win for all students and educators across the State of Ohio.

#ohedchat OETC 15 Kickoff Chat

Every year I go to OETC, I wish I had done a bit more pre-planning, given the vast scope of choices available during each time slot. With this in mind, this week’s #ohedchat will focus on the OETC 15 agenda and getting ready for three awesome days of learning.

Even if you can’t attend (which unfortunately is my lot in life this year), you can still participate by sharing which sessions interest you and any resources you have that could be of use to others interested in those sessions and topics.

The chat will be broken down by day and time. For each time slot, share about the sessions you plan on attending (or what looks interesting if you are not going). Also, if there is a session you are steering away from because you have tons of knowledge/resources on that subject, share that as well.

If you are of the OETCx persuasion on Wednesday, you can share what you would attend on Wednesday if you weren’t ‘un-conferencing.’

Schedule:

9:00 – Welcome/Introductions

9:05 – Tuesday Pre-Conference (Both Morning and Afternoon Sessions)
Tuesday Conference Agenda

9:12 – Wednesday Morning Sessions (Prior to 12:00 Noon)
Wednesday Conference Agenda

9:20 – Wednesday Afternoon Sessions (After 12:00 Noon)
Wednesday Conference Agenda

9:28 – Thursday Morning Sessions (Prior to 12:00 Noon)
Thursday Conference Agenda

9:36 – Thursday Afternoon Sessions (After 12:00 Noon)
Thursday Conference Agenda

9:45 – Wild Card Question – There are lots of IIS Vendors at OETC this year. What solution has your District chosen and why?

9:50 – Thanks for participating in #ohedchat and enjoy #oetc15 if you’re attending

#ohedchat Clean Out The Bookmarks Edition

This coming Monday (January 19th) I’ll be hosting #ohedchat from 9 – 10PM EST.  As I was thinking about a topic, I came across this mess in my bookmarks:

[Screenshot: a cluttered, overflowing bookmarks bar]

It occurred to me that I have a ton of valuable resources packed away in this jumble of bookmarks, just wasting away in a pile of digital clutter. What to do with all this stuff and how to organize it better is another blog post entirely, but it did get me thinking that a cool #ohedchat exercise would be for participants to consider the questions for the chat and dig through their own digital archives to re-discover and re-connect with resources that were once their ‘go-to’ tools before they were replaced by something else. For me, one thing I did was look back at all of my bookmark folders from past conferences, and I found awesome tools that I never had the time to explore once the conference was over. This week’s #ohedchat will be an opportunity to open up your digital attic wherever it is (bookmarks, Pearltrees, Pinterest, Google Now, etc.) and share tools you have packed away that can make a difference for you and others. Happy exploring!

#ohedchat
Monday, January 19th
9 – 10PM EST

9:00 – Welcome and introductions

9:01 – All responses tonight should include a resource from your digital archive of tools, so be digging through those bookmarks when responding!

9:03 – Q1
We all have digital clutter in our lives. What tools/strategies do you use to organize your professional digital resources?

9:08 – Q2
Share a link to a favorite professional teaching/learning video that has impacted you, then share why you selected it.

9:14 – Q3
Share a link to a favorite non-tech education related website, then share how you use it or how it has impacted you professionally.

9:20 – Q4
Share a link to an influential article you re-discovered through your bookmark exploration. How did this article impact you?

9:26 – Q5
Share a link to your favorite Google site for tips/tricks. Explain what about the site made you want to share it.

9:32 – Q6
Share a link to your favorite non-Google tech site. In what ways are the tools on this site impactful?

9:38 – Q7
Open Mic – Share one final resource you re-discovered as part of your digital archive exploration that you plan on bringing to the top of your go-to list.

9:44 – Bonus Question
Share an artist, album, or song that connects (in some loose way) with tonight’s topic.

9:50 – Thanks for participating in #ohedchat tonight!