

How do we make assessment more like a Fitbit?

By Deb Sabin, Amplify ELA’s Chief Academic Officer

I love my Fitbit. I wear it all the time. It works behind the scenes, tracking my steps, calories burned, and heart rate. I get ongoing insights on how active—or lazy!—I am on a given day. And it works—when I see I’ve taken only 2,400 steps at 4:30pm, I take a walk around the block. Thanks, Fitbit!

And, as it turns out, Fitbit also gives me great ideas. Last year, as I speed-walked to rack up those 10,000 steps, something occurred to me: How can we make our assessment more like a Fitbit? How can assessment run steadily in the background, giving teachers the information they need to make choices and respond to the needs of their class? At Amplify, our goal is to give teachers more time to do the great teaching that supports strong literacy. But more and more, it seems assessment interrupts, or even replaces, literacy instruction. That won’t help teachers or students reach their goals. We need assessment to measure the impact of our instruction while it’s happening, while we can do something about it!

Another analogy: my husband’s recent shoulder injury. He dutifully did the physical therapy he was prescribed, but things only got worse—until he finally got an MRI. It turned out a whole new issue had developed! Waiting that long for the MRI—the summative assessment—kept us from seeing how things were going and adjusting in real time. You see my point: regular measures of the impact of instruction are vital in just the same way.

More and more classrooms are providing teachers with good tools for measuring learning. The problem is that this turns an increasingly large share of classroom time into something that looks like an assessment. Rather than coaching a group of students as they practice selecting relevant evidence, teachers spend more time watching their students take more tests.

To solve this problem, here are the questions we should be asking:

What is the data capturing?
A. Learning achieved, or
B. Learning in progress?

How often can I collect and analyze data?
A. At learning end points, with analysis required afterward, or
B. During learning, for quick strategizing?

Where is the assessment placed?
A. At the end of practice, or
B. Within practice?

How does it support student practice?
A. It provides information but assumes practice, or
B. It is part of practice?

If you answered “B” to each question, that’s formative assessment done right. In the next post, we’ll talk about how, and why, it works.

And now, another walk around the block.
