
Gunlock Talk

Mr. Gunlock,

According to your opinion piece in the Dayton Daily News on March 22, 2017, you are still locked into the mindset that Ohio students are failing, and that we, the truly committed and concerned, are not willing to set ‘high enough bars’.

You callously, and continuously, combat the majority voices of primary stakeholders (literally thousands of students, parents, educators, principals, counselors, and superintendents) who are, and have been, against this entire high-stakes testing mess in Ohio.

For the record, I believe that high school credit hours should stand as the requirements for high school diplomas.

In vain, while you sat on the Ohio State Board of Education, we who work in the education field or who are raising children in public schools sought to bring you data and research proving the ills and harms of this obsessive testing culture. We carried first-hand experiences and observations to you through countless emails, editorials, phone calls, blogs, webinars, meetings, protests, civil disobedience, and committee hearings, only to be met time and time again with your haughty disregard and disrespect. We were defeated before we began to speak, for you had already set it in your mind that we, the teachers and parents, do not desire success for those we have based our whole careers and lives around.

So I thought, in this instance, to speak in your language of boxes and numbers, since you seem to relish those more than real-life stories. I am frankly very tired of this “we are failing” talk. It is a false narrative for a slew of reasons.

For the sake of this weary argument about cut scores, and about what scores Ohio students must earn to represent what in your eyes counts as success, I offer some numbers for you. These are comparative, side-by-side sets of data from the first, last, and only year (2014-15) that Ohio took PARCC tests. I believe PARCC scores are the cut scores to which you were referring in your last-ditch chance to redeem your stance. (Or perhaps they were the AIR cut scores that had to be modified post factum?) At any rate, with either set of data I am certain I could make my point.

If one reads through this entire 67-page PDF of data charts, one would quickly see, as I did, that in comparison to the other states that took PARCC tests in the 2014-15 school year, Ohio in fact did great!

[Chart: combined Accelerated and Advanced percentages, Ohio vs. other PARCC states, 2014-15]

For my response here to you, Mr. Gunlock, I have taken the combined numbers of the 4th and 5th levels (Accelerated and Advanced) and pieced together, from three pages of the PDF, the other states’ averages right next to Ohio’s averages.

All the red percentages indicate that the other states’ combined averages were below the numbers Ohio produced. The two blue percentages show where Ohio’s numbers were slightly below the combined averages (Grade 5 and Grade 8, by 0.9% and 0.1% respectively).

Someone who enjoys tooling around with all of these numbers could, I suppose, come up with a bunch of comparison bites. But ultimately these numbers show what they show: scores of tests, period, the end.

They do not inform instruction. They do not inspire students. They do not build community support. They confuse and complicate communication about the real causes of achievement gaps. They cost too much time and energy. They squander far too many resources and public monies.

A high school diploma IS symbolic. It represents 11,700 hours a student spent with passionate professionals and peers. It represents 702,000 minutes of memories a child stored up. It is a holistic accomplishment mark at nearly a fifth of a century of a person’s life. It should not be reduced to data digits.

A professor once taught me that all data is skewed in the same manner that all maps reveal some distortion. Perspective matters. One cannot accurately assess the education field from afar. I think tremendous insight and enlightenment could be gained if people wore education-expert ‘fitbits’ on their wrists. Instead of counting steps, the device would count teaching encounters with children. One could offer input into education policy only after logging a set number of direct teaching interactions with Ohio’s youth and little learners.

Mr. Gunlock, you and C. Todd Jones have relieved yourselves of service on the State Board of Education. I hope, as my graduating class of 2031 might sing to you, that you both are able to “Let It Go!”

Sincerely,

Kelly A. Braun,
Mom of a 2018 graduate (my youngest of five),
PreK Lead Teacher,
Badass Teachers Association Admin,
& Ohio BATs Admin


Oh Really, Gunlock?

I find it very interesting that Mr. Gunlock is still trotting out the same tired old rhetoric about the OGTs measuring eighth-grade knowledge even after he quit the state school board mid-term. We have asked for proof of this claim over and over, yet he has failed to provide it. I would love to read any information you could provide on the topic, Mr. Gunlock. Of course, this is not even the real problem anyway. The problem is anyone who thinks that a child’s score on some tests is a true indicator of his readiness for the future. You did not take exit exams, Mr. Gunlock, yet I am sure that you consider yourself a success.

Why do we continue to insist on giving these exams and tying them to graduation when study after study shows that this is not only unnecessary but can actually be harmful? We are one of only fourteen states in the entire country that require children to pass exams to graduate. Are the thirty-six other states full of kids who are not prepared for the future? Of course not.

As for business leaders complaining about a lack of qualified candidates for the workplace, I have yet to hear that from an actual business leader. A 2014 study done by the National Association of Colleges and Employers and published in Forbes magazine found that the top ten skills employers seek are:

  1. Ability to work in a team structure
  2. Ability to make decisions and solve problems (tie)
  3. Ability to communicate verbally with people inside and outside an organization
  4. Ability to plan, organize and prioritize work
  5. Ability to obtain and process information
  6. Ability to analyze quantitative data
  7. Technical knowledge related to the job
  8. Proficiency with computer software programs
  9. Ability to create and/or edit written reports
  10. Ability to sell and influence others

Is there any proof that these new tests measure ANY of these qualities? How can we know for sure when no one has ever seen these tests?

I would also like to know why you left the state school board so abruptly, Mr. Gunlock. Was it because you couldn’t bear the thought of admitting that perhaps you got it wrong? That maybe, just maybe, these kids are not the problem? That the fact that only 24% of students scored proficient or above on the Geometry test may have more to do with the actual test than with the kids themselves, regardless of how you feel about their only needing to answer 35% of the questions correctly? I am inclined to believe the children who took these tests when they tell me that there were questions on topics they had not yet covered in class. Of course, that would only make sense given that, despite what the name implies, these end-of-course tests are given in March and April.

Is it fair to take PARCC’s word about its cut scores even though its tests were deemed so poor that we dropped them after only one year? How about switching testing vendors after one year while simultaneously raising the cut scores on the new tests? Or giving these same kids three different sets of Math and ELA tests in three years while providing little to no information to districts about these ever-changing testing requirements?

Does it matter that some students took these tests online while others used paper and pencil? While PARCC claims that it did a study and found no discernible difference, the results from states around the country say otherwise. As a matter of fact, Derek Briggs, professor of research and evaluation methodology at the University of Colorado at Boulder, who also happens to serve on the technical advisory committee for both PARCC and Smarter Balanced (whose tests are created by the same vendor we now use for all of our state tests), is quoted as saying, “In the short term, on policy grounds, you need to come up with an adjustment, so that if a [student] is taking a computer version of the test, it will never be held against [him or her].” Yet we are still holding current juniors responsible for the results of these tests that nearly everyone else was given a safe harbor from.

As Representative Fedor said recently, the adults got it wrong, not our children. I am incredibly grateful to her and to the remaining school board members who recognize that we have a serious problem here that needs to be addressed so that the 35,000 current juniors who are not on track to graduate next year get the opportunity to do so.

Written by Mandy Jablonski, Ohio BAT
