Sunday, October 9, 2011

PSA screenings and STEP assessments


The big health news of the past week, especially if you’re a guy, is the release of new recommendations for how to screen for prostate cancer. To quickly summarize, until now, the basic advice was for all men 50 and over to have a PSA test each year. The PSA test is a blood test that measures certain proteins that can signal the presence of cancer. If the PSA score is over a certain threshold or has risen by a certain percentage, a biopsy is performed to confirm the cancer. Treatment options include removal of the prostate and localized radiation. Both often lead to severe side effects, but when caught early, prostate cancer is rarely fatal.

But the revised recommendations from the United States Preventive Services Task Force are that healthy men not get the PSA test. Ever. The reasoning is that the PSA test does not do a good job of identifying who actually has cancer, and the combination of the PSA test and the biopsies that often follow does not do a good job of sorting which cancers are truly life-threatening and which are so slow-growing that they pose no real threat. While use of the PSA test has gone up dramatically in recent years, as has treatment of prostate cancers, there has been little effect on prostate cancer deaths.

It seems counterintuitive that having less information can lead to better health, but that’s what the task force is saying. As Shannon Brownlee and Jeanne Lenzer write in the New York Times, “if there is one lesson from the P.S.A. test, it is that more information and intervention do not always lead to less suffering.” It’s not too much of a stretch to see the significance of these themes to the use of assessments in classrooms:
  • More information is not always better. Teachers and school leaders have a limited amount of time and energy to give tests and analyze data. And it’s pretty much zero-sum. Adding another assessment means cutting something else, or at least reducing the amount of attention you can put into other assessments. One of the big reasons I love the STEP literacy assessment is that it assesses all the key areas of early literacy, including letter identification, letter-sound correspondence, concepts of print, phonics, and reading comprehension. So instead of giving five different literacy tests to our new kindergarten students, we give one. When we want to see if our students are on track to meet their end-of-year reading goals, we look in one place. We talk about one assessment with families. This focuses our teaching, gives everyone a common language, and means we can actually use our results to drive instruction.
  • It’s ok to say no. The prostate task force is basically telling patients and doctors to say no to taking the PSA test. It’s out there. Some people are taking it. But you don’t have to. And that same message should apply to school leaders and teachers. The default position should be that we are not adding any additional assessments, not a “one more can’t hurt” attitude. One of the great things about being a school leader at KIPP is that we’re part of a national network of great schools and smart people who are suggesting new ways to do things all the time. This is super helpful, but just because using the EMDA math assessment works for someone else’s school doesn’t mean it’s a good idea for us. At KPEA we would need to be blown away by a new assessment to add it to our portfolio. And that hasn’t happened in our first two years.
  • Assessment data needs to be actionable. One of the big issues with the PSA test is knowing what to do with the results. There is no proven threshold to indicate when cancer is present, so it’s hard to know definitively what next step makes the most sense. And often that means erring on the side of treating a possible cancer as aggressively as possible. Having a teacher give an assessment that doesn’t give back actionable information is a waste of time. When teachers at KPEA create the quick interim assessments that we give every 2-3 weeks to spot-check whether kids are learning what we have most recently taught, we push ourselves to think about questions like:
o   Would knowing how many kids get this question right impact my planning and teaching?
o   What would I do differently if most of my students get this right? Or get it wrong?
o   How would I use this information to plan intervention groups?
o   Do I already know this information from another source? Is asking this question going to give me a better read on my students’ skills?

If the conversation at our assessment planning meeting shows that a particular question doesn’t give us new knowledge or that the results wouldn’t change how we teach moving forward, we don’t ask it. Because we’re thoughtful about our assessments, we get back high-quality information about what our kids know and don’t know and can adjust our teaching to make sure all our students get the instruction they need.  
