Saturday, October 11, 2014

NGSS Assessments and the Sword of Damocles

Inevitably, the question of assessments comes up when I talk to teachers, parents, and administrators about the NGSS.  We have lived with the education world's variation of the "Sword of Damocles" hanging over our heads for the last 12 years, so it is only natural to be concerned about the next iteration.  My response may seem flip, but it is nonetheless accurate.

For elementary science, the ultimate assessment is called middle school.  

Far more than ever, the standards are developmental.  The physics instruction that starts in kindergarten builds through third grade, middle school, high school, and then life.   Without that base knowledge from kindergarten, it becomes more difficult to catch up at each step.   I then step off my soapbox and start talking about what I know and what I suspect.

The National Academies Press released a report earlier this year entitled "Developing Assessments for the Next Generation Science Standards."  Before it was released, I participated in a webinar focused on the details of this report.  Perhaps the most salient detail was that our "current assessment model was a non-example."   So, what is our current model?  Like everyone else's, ours relies on the ubiquitous #2 pencil.

 

This type of assessment was what it was: a means to an end.  Did it really measure what students "knew and were able to do," or just what they could memorize?

Now, I told you that story to tell you this one.  Once upon a time in Maryland, we had the Maryland School Performance Assessment Program (MSPAP).  According to the state website, the assessment measured three things:

  1. How well students solved problems cooperatively and individually.
  2. How well students applied what they learned to real world problems.
  3. How well students could relate and use knowledge from different subject areas.
At the time, most teachers did not like them because they required a lot of setup and management.  However, with a few exceptions, they would also say that the assessments really did measure what students knew and were able to do.  The downside was that individual accountability was difficult.

Reading between the lines of the NAP report, it feels like the next assessments will focus on performance but will be delivered in the digital world.

Earlier this week, I stumbled across the Technology & Engineering Literacy Assessment (TEL).  Given the proliferation of computers in schools and the gauntlet that PARCC and Smarter Balanced have thrown down to be fully digital, I would start looking at the TEL as an example of what an NGSS-based assessment system will look like.


Finally, the future of science education comes down to one brutal question: will any science assessment count?  No matter how well the assessment is developed, if schools do not see it on an equal level with Common Core assessments, then will it really matter?  Unfortunately, we have moved from an intrinsically motivated institution to an extrinsically motivated one.  In the end, do we want the Sword of Damocles hanging over our heads or not?