Friday, November 21, 2014

Grand Ledge Named to the 5th Annual AP Honor Roll

Grand Ledge Public Schools is one of 28 school districts in the State of Michigan, and one of 547 school districts in the U.S. and Canada, being honored by the College Board with placement on the 5th Annual AP® District Honor Roll for increasing access to Advanced Placement (AP) course work while simultaneously maintaining or increasing the percentage of students earning scores of 3 or higher on AP Exams.

This is the 3rd time in the past 5 years that Grand Ledge Public Schools has been recognized on this Honor Roll! Reaching these goals indicates the district is successfully identifying motivated, academically prepared students who are ready for the opportunity of AP. Since 2012, Grand Ledge Public Schools has increased the number of students participating in AP while improving the number of students earning AP Exam scores of 3 or higher.

Last year 244 GLHS students completed 331 AP exams.  Of those, over 70% earned scores of 3 or higher!

Proud To Be A Comet!


The complete 5th Annual AP District Honor Roll can be found at:

Thoughts on M-STEP and Beyond



On November 13, 2014, the Michigan Department of Education (MDE) released long-awaited details on the spring 2015 student assessment. The MDE heralded the release as “great news for our local districts” because “they have been very anxious to hear what the new assessment will be.” What we heard is that the Michigan Student Test of Educational Progress (M-STEP) is a combination of Smarter Balanced Assessment test items and some state-developed items, and that for 11th graders the test will also include the ACT Plus Writing and WorkKeys. What we heard is that, for the first time, most districts will be required to administer the assessment online. What we heard did little to ease my anxiety as a high school principal.
 
In fairness to the MDE, the M-STEP is a reaction to changes in Michigan law. You may recall that the need for a new assessment arose when legislators halted the MDE's original plan of adopting the Smarter Balanced Assessment this year, derailing what the department and some local districts had been working toward for three years. In short, the M-STEP is a stopgap assessment created in response to a legislative directive that forced the MDE to produce a retooled, state-developed assessment for the spring of 2015. That same directive also mandates that the MDE put out a new request for proposals for a longer-term solution beginning in the 2015-16 school year. The results of that request, and the “long-term” solution, remain to be seen.

While, according to the MDE’s press release, “the new assessment meets all of the requirements put into law by the legislature,” I have to wonder at what cost. As mentioned, this will be the first time that all statewide assessments will be administered online. The MDE says not to worry: 80 percent of schools in the state meet the minimum technology requirements, and for those that don’t, there will be a paper-and-pencil version of the test. But I do worry. In theory, my district does meet the minimum technology requirements for the assessment. On paper, we have the hardware and bandwidth capacity to administer the test. In reality, that hardware and bandwidth are used daily for instruction. There is not a computer lab that goes unused. Devoting these resources to testing 11th graders disrupts instruction for all grade levels during the testing window.

This disruption is made greater by the length of the test. The entire M-STEP will take longer for districts to administer than the old Michigan Merit Exam (MME); all told, the 11th grade assessment comprises 8 partial days of testing, or approximately 16 hours. Initially, it was said that some portions of the ELA and math exams were optional, and that schools could cut out roughly 5 hours of testing by completing only the mandatory portions. Since then, the MDE has “reconsidered” and has made it clear that the entire test is mandatory and that there will be “consequences” for schools that choose not to administer all portions of the M-STEP. No matter how you slice it, the M-STEP signifies a substantial increase in testing time for students: an increase of 5 partial days and up to 8 hours. Again, the result is a significant disruption to the educational process for ALL students in a school.

There are other concerns too: the effect of increased time spent on “high-stakes” testing for 11th graders, an issue magnified in many schools throughout the state by the number of students who will be taking AP exams in May; the impact of the new online testing format on student performance; the impact on school and district scores and rankings; the apparent devaluing of the ACT. The list goes on. All of this raises the question: is this good for kids? My answer is NO, for all of these reasons. Yet this spring we, like every other high school in the state, will administer the M-STEP. Between now and then we will continue to prepare our students the best we can. That is all we can do…for 2014-2015.

So let’s turn our attention to 2015-2016 and beyond. As school leaders we have the right and the responsibility to speak up. It is vital for us to have a voice on issues such as standardized testing, accountability, and the like. As the legislature considers a direction and a “long-term solution” for student assessment in Michigan, it is imperative that we as school leaders play a role. We know what is best for kids. We need to be part of the process! We need to be heard! If not, we have only ourselves to blame!

Monday, November 17, 2014

Talking Baseball!


Last week Major League Baseball (MLB) unveiled its post-season awards. Once again I found myself drawn to the discussion about who should be named the American League Most Valuable Player (MVP). For the third year in a row, a member of my beloved Detroit Tigers was a finalist. This year it was Victor Martinez, and for the third year in a row the Angels’ outfielder Mike Trout was the primary competition for one of my beloved Tigers!

Truth be told, I didn’t care who won. For me, the debate was not about who is the better player. Both are among the best in the game at their respective positions. Both had great seasons. Both are valuable to their teams. As a Tiger fan I would love to have Mike Trout on my team. My guess is that many Angels fans wouldn’t mind if their team had Victor Martinez. The true debate was around the measure. What is the best way to determine a player’s value? What are the truly important stats?

Martinez’s supporters, like Miguel Cabrera’s in 2012 and 2013, tend to be traditionalists. They would argue that he deserves the award because he had the better year based on the statistics we are all familiar with. He had better offensive numbers: more runs batted in (RBIs), more home runs (HRs), and a better batting average (BA). To them, these stats tell the story.

Trout’s supporters are “new school.” They argue that there is more to the story than home runs and batting average. They say that Trout’s “value” cannot be measured by “typical” statistics alone. They say that there are many factors that should be considered when measuring a player’s effectiveness. They cite sabermetrics, a term coined in 1980 by renowned baseball author and researcher Bill James. James and others created new statistics with which to measure players’ productivity beyond the traditional batting average and ERA. In this school of thought, “new” statistics such as batting average on balls in play (BABIP), wins above replacement (WAR), and equivalent average (EqA) are just as valuable as the traditional stats when it comes to assessing a player (or a team).
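
For readers curious how the “new” stats differ from the familiar ones, here is a minimal sketch in Python using made-up stat lines (not actual 2014 numbers). It compares traditional batting average to BABIP, which excludes home runs and strikeouts so that it measures only what happens when a ball is put in play.

```python
# Illustrative only: hypothetical players, not real 2014 stat lines.

def batting_average(hits, at_bats):
    """Traditional stat: hits per at-bat."""
    return hits / at_bats

def babip(hits, home_runs, at_bats, strikeouts, sac_flies):
    """Batting average on balls in play: (H - HR) / (AB - K - HR + SF)."""
    return (hits - home_runs) / (at_bats - strikeouts - home_runs + sac_flies)

# Two hypothetical players with identical .300 batting averages.
player_a = dict(hits=150, home_runs=40, at_bats=500, strikeouts=150, sac_flies=5)
player_b = dict(hits=150, home_runs=2, at_bats=500, strikeouts=40, sac_flies=5)

for name, line in [("Player A", player_a), ("Player B", player_b)]:
    avg = batting_average(line["hits"], line["at_bats"])
    bip = babip(**line)
    print(f"{name}: BA = {avg:.3f}, BABIP = {bip:.3f}")

# Same batting average, noticeably different BABIP, because home runs and
# strikeouts never put a ball in play -- one reason "new school" analysts
# look past the traditional numbers.
```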

A similar debate revolves around schools in our state. What is the best way to measure the effectiveness or “value” of a school or district? Each summer, the Michigan Department of Education (MDE) unveils its “post-season awards.” Schools are ranked and labeled. “MVPs” are awarded and lauded. The awards are based on traditional statistics, including student participation in state assessments, student proficiency on state assessments, graduation or attendance, and district reporting on school improvement plans (SIPs) and teacher effectiveness. At the high school level, the key indicators include proficiency on the MME, ACT scores, and graduation rates. Improvement, or Adequate Yearly Progress (AYP), in student proficiency is also taken into account.

Unfortunately, the system is flawed. The fact is that schools demonstrating higher proficiency on the state assessments can be rated less effective than schools demonstrating lower proficiency, and schools that show improvement in their scores can be ranked lower than schools whose scores drop. A system that so fundamentally miscommunicates to the public, to parents, and to school staff is misleading. Many assume, incorrectly, that schools rated yellow are worse than schools rated green, and that all schools rated yellow are the same. Unfortunately, many assume this is the whole story. Of course, those of us who work in schools know this is not the case.

With this in mind, I wonder what “stats” should be kept for schools each season. What is a true measure of a school or district’s worth? Like those who subscribe to sabermetrics, I would argue that there are many factors that should be considered when measuring the effectiveness of a school. The “traditional statistics” are important; however, they do not tell the whole story. Some districts in Ohio recognize this and have begun publishing their own “Quality Profiles” in conjunction with, or in spite of, the Ohio Department of Education (ODE). They provide stats on National Merit qualifiers, scholarship offers, performing arts awards, etc., in an effort to tell the whole story. Maybe it’s time we do the same in Michigan. Maybe we should talk about SIECAs (students involved in extracurricular activities), or perhaps OFAS (opportunities for advanced studies), when we consider our post-season awards.