
I’ve been driven by the language of professional learning communities for the better part of the past six years. Having been hired to work at a school that opened as a PLC, I knew that my principal had a simple expectation for our staff: that we would work collaboratively to identify and then amplify instructional strategies that produced results for our students. We weren’t going to cross our fingers and hope that students mastered the essential skills in our curriculum. Instead, we were going to systematically investigate our practice and use data to drive every decision.

For my learning team — a group of highly motivated sixth-grade language arts and social studies teachers — the challenge of using data to inform our practice was both new and exciting. We recognized that for most of our careers we had been making decisions based on nothing more than instinct and intuition, and while our instincts were often accurate — we were accomplished teachers, after all — we believed that there was room for improvement as long as we were willing to start squeezing information out of the numbers available to us.

Inspired by Rick DuFour and Bob Eaker, our professional learning team was “hungry for facts and constantly in search of meaningful data.”1 We’d even embraced Geri Parscale’s argument that student-level interventions are the cardiopulmonary resuscitation (CPR) of a professional learning community:

CPR…is directive, timely, targeted, systematic, and administered by trained professionals. When someone collapses in the presence of one of these trained professionals, immediate action is taken to avoid permanent damage. Similarly, when children are dying academically, we must approach them with the same sense of urgency.2

But like many professional learning teams, we had almost no practical experience with using data to drive decisions. None of us were experts in disaggregating the information generated by yearly end-of-grade exams. None of us were confident in our ability to write questions that reliably measured the specific objectives we were trying to test, and none of us had the time to standardize our grading practices to ensure that an A was an A, regardless of which classroom students were assigned to. Knowing that there were gaps in our professional know-how, we turned to our administration for help.

That’s when one of the members of our school’s leadership made an all-too-common decision: She showed up at our next team meeting with a data notebook — an empty three-ring binder with a set of dividers labeled “state assessments,” “district assessments,” “team assessments,” and “classroom assessments” — for every teacher. “Now you can begin to look for patterns in your data!” she said. “Just keep records of every assessment that you give and you’ll be ready to go.”

While her efforts were well-intentioned and supported by research — data notebooks and data walls are common practices in many schools — they left us overwhelmed. To begin with, just writing down assessment scores in the proper place in our data notebooks was a time-consuming task. Then, “looking for patterns” in our data became a frustrating process of flipping through pages and pages of scores, trying to mentally calculate what it all meant. There was no easy way to sort students into subgroups or to automatically pull out the kids who were struggling with individual objectives. There was no easy way to calculate classroom averages for comparison. There was no easy way to tie a student’s previous performance to their current work to look for signs of progress, and there was no way to efficiently report what we were learning to anyone — parents, teachers on our team, principals, other support professionals within our school, or students.
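For readers who want to see how little computation this actually required, here is a minimal sketch in Python, with hypothetical students, objectives, scores, and a hypothetical mastery cut score, of the two summaries we could never produce efficiently by hand: a class average for each objective and a list of the students who had not yet reached mastery.

```python
# A minimal sketch of the summaries a paper data notebook made impractical.
# Students, objectives, scores, and the mastery cut score are all hypothetical.

scores = [
    {"student": "Alicia", "objective": "Main idea", "percent": 92},
    {"student": "Alicia", "objective": "Inference", "percent": 58},
    {"student": "Marcus", "objective": "Main idea", "percent": 71},
    {"student": "Marcus", "objective": "Inference", "percent": 64},
    {"student": "Priya",  "objective": "Main idea", "percent": 85},
    {"student": "Priya",  "objective": "Inference", "percent": 49},
]

MASTERY = 80  # assumed cut score for "mastered"

for objective in sorted({record["objective"] for record in scores}):
    results = [r["percent"] for r in scores if r["objective"] == objective]
    average = sum(results) / len(results)
    struggling = [r["student"] for r in scores
                  if r["objective"] == objective and r["percent"] < MASTERY]
    print(f"{objective}: class average {average:.1f}%; "
          f"needs more practice: {', '.join(struggling) or 'no one'}")
```

Even a simple spreadsheet can produce these same two summaries; the point is that a binder full of handwritten scores cannot.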

As a result, it wasn’t long before we suffered complete data collapse. Because the tools that we were working with were woefully inadequate for the task that we were trying to complete, we quickly realized that the process was impossible and stopped working with numbers for good, going back to trusting our guts and simply hoping that our work was producing results.

The solution for avoiding data collapse in your school is really quite simple: Schools need to invest in some form of student response system — also called responders or clickers — that teachers can use to collect, manipulate, and report evidence of learning in a timely way. Only then can data truly inform decision-making.

The benefits of student responders

My experiences with student response systems began after a friend working for a technology company found an extra set of Promethean’s ActiVote responders lying around the office and lent them to me for a year. Almost immediately, I found my assessment practices changing for the better. I began asking several quick multiple-choice questions each class period designed to do nothing more than give me immediate information about whether or not my students had a firm grasp on the essential skills that I was trying to teach. While there was nothing sophisticated about the questions that I was asking, there was a new level of sophistication in my instructional decisions.

On more than one occasion, the feedback that I received from my responders surprised me. Sometimes my students mastered content more quickly than I expected, allowing me to move on sooner than planned. Other times, content that I assumed was easy and approachable left my kids baffled, requiring extra practice that I never would have guessed was necessary. Regardless of the reason for my surprise, the end result was informed decisions and immediate improvements in teaching and in learning — something that just wasn’t possible with my data notebook.

I was even more surprised by the impact that instant access to data had on my students. Like most response systems, the Promethean ActiVotes that I was using allowed me to post colorful graphs instantly. Before long, my classes were collectively hooked, trying hard as a group to earn high percentages simply because they took a measure of pride in knowing that they had mastered something new together. When given the chance to work on questions with partners, I’d hear them talking with far more care than ever before, determined to get their answers right for their classmates. Perfect scores were met with widespread celebration and low scores were met with pleas for second chances to prove what they knew.

Responders also started to change the nature of the conversations that we were having as a class. Questions like, “Why do you think our class is struggling with this particular concept?” or “Why did so many people give the same wrong answer to this question?” introduced new measures of metacognitive reflection into my classroom as students began to look for patterns in their misconceptions. For me, each of these conversations became a de facto window into the minds of my students. As I learned to rely on their insights about the causes for intellectual stumbles, my students began to understand that self-assessment can play an important role in their own growth as learners.

Finally, regular use of my ActiVotes also ensured that individual students were getting targeted, daily feedback about the skills that they had yet to master — along with a forum for making safe comparisons between their own levels of competence and the levels of competence reached by their peers. While wrong answers are never fun for kids, they are important sources of information about the specific areas where continued focus and attention are needed.

Responders allowed my students to receive that information immediately — rather than having to wait the week or two that I usually need to grade, return, and review classroom assignments — and then to take independent action: attending a working lunch review session, asking for extra practice, or finding help from peers and parents. Like no other tool that I’ve ever used, responders empowered my students to take ownership of their own learning by automatically generating the kind of individualized feedback that I struggle to provide to every child on a daily basis.

Tips for using student responders

While there are few tools that I would recommend as highly as student responders, there are a few tips that I’d pass on to any teacher interested in working with student response systems. They include:

Be careful not to limit your questions to the lower levels of Bloom’s Taxonomy.
One of the traps that I found myself falling into during the year that I spent with responders in my classroom was over-relying on simple knowledge and application questions in my assessment work with students. Because student response systems lend themselves to multiple-choice questions, and because writing multiple-choice questions that force students to think at the higher end of Bloom’s Taxonomy can be a challenging task, I initially saw the level of questioning in my classroom drop after starting my work with responders.
Over time, I began to spend more time deliberately crafting higher-end questions for use in my classroom. Likert-style prompts that asked students to make judgments and to evaluate potential solutions — along with questions that forced students to apply skills that we’d practiced in class before generating an answer — became more common. While it often meant that I’d ask fewer questions during the course of each class period, I soon had a bank of question styles that I could draw from to ensure that good questions stayed at the forefront of my instructional practice.
Be sure that you can tie every question to a particular skill that students must master.
One of the real advantages of having access to student response systems is that classroom results are automatically dumped into spreadsheets that can be sorted and manipulated easily. Promethean’s ActiVote tool even color-codes student responses — green for right answers and red for wrong answers — making it simple to get a quick sense of question mastery, both at the individual and whole-class level.
This functionality, however, is only useful if each question is tied to a specific skill that students must master. When I was methodical about crafting questions designed to measure individual skills, I found that I could quickly make instructional choices — which skills needed to be retaught to my entire class, which students needed to be regrouped for extra practice, and which students could move ahead because they had a firm grasp on the concepts holding other kids back. The moral of the story: Taking advantage of the data generated by student responders requires a bit of deliberate frontloading by classroom teachers.
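As a rough illustration of what that frontloading buys you, here is a short Python sketch. It is not tied to Promethean’s actual export format; it assumes a hypothetical CSV export with one row per student response, tagged with the skill each question measures, and shows how those tags turn raw clicker results into a reteach-or-regroup decision.

```python
# A rough sketch, assuming a hypothetical CSV export with one row per student
# response and a "skill" column naming the objective each question measures.
# The column headings, names, and 70% thresholds are assumptions, not the
# actual output of any particular response system.
import csv
import io
from collections import defaultdict

export = io.StringIO("""student,question,skill,correct
Jordan,Q1,main idea,1
Jordan,Q2,main idea,0
Jordan,Q3,inference,1
Sasha,Q1,main idea,1
Sasha,Q2,main idea,1
Sasha,Q3,inference,0
""")

# skill -> student -> list of 1 (right) / 0 (wrong) answers
by_skill = defaultdict(lambda: defaultdict(list))
for row in csv.DictReader(export):
    by_skill[row["skill"]][row["student"]].append(int(row["correct"]))

for skill, students in sorted(by_skill.items()):
    rates = {name: sum(answers) / len(answers) for name, answers in students.items()}
    class_rate = sum(rates.values()) / len(rates)
    if class_rate < 0.7:  # assumed whole-class reteaching threshold
        print(f"{skill}: reteach to the whole class ({class_rate:.0%} correct)")
    else:
        regroup = [name for name, rate in rates.items() if rate < 0.7]
        print(f"{skill}: regroup for extra practice -> {', '.join(regroup) or 'no one'}")
```

The specifics will differ by product; the design point is simply that once every question carries a skill tag, the sorting and regrouping described above becomes a few lines of computation instead of an evening with a binder.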
Always ask questions regarding the reasons for results.
Easily the most interesting by-product of my work with student responders was the increased commitment to metacognition — knowing about knowing — enabled by instant access to results. Listening to my students talk through the reasons for both their correct and incorrect answers was exciting simply because I could see that they were making connections between content and their own thinking. That understanding alone will result in more responsible and self-aware learners.
What’s more, I gained new insights into why students — as individuals and as groups — were struggling with the concepts that we were studying in class. Before adopting responders, I thought I knew the reasons behind the common misconceptions that my students held about the topics we were studying. After adopting responders, I became more systematic about asking my kids what it was that they didn’t quite get — after all, we had daily evidence to look at together — and I was often surprised by what I learned.
Use web-based products like Poll Everywhere.
Now don’t get me wrong: Promethean’s ActiVote responders did me right! They were a durable tool that stood up to the demands of my middle grades students. They also came packaged with a great software program that automated the recording and reporting of results — something no three-ring binder could ever do — and allowed me to manipulate data easily. They’re a great tool, but they’re also a great tool that costs well over $1,000 per classroom to implement and — like any piece of hardware — they’re a great tool that will one day be outdated.
Which is why I typically recommend that schools and districts look into purchasing subscriptions to web-based polling products like Poll Everywhere. Generally offering the same features as student response systems that require hardware purchases — reporting at the individual and class level, automatic generation of charts and graphs, and the ability to download spreadsheets that can be manipulated — web-based polling products accept responses from students in two ways: through the Internet (perfect for schools where classrooms have access to multiple computers) or through text-message responses (perfect in middle and high schools, where many students already have cell phones and text-messaging plans).
What makes web-based polling products a more sensible purchase is that a year’s subscription is generally just a small fraction of what a class set of student responders costs — Poll Everywhere’s most popular service is priced at $129 per teacher — and the products are constantly being improved. So instead of the buyer’s remorse that you’ll inevitably feel after dropping over a thousand dollars on a tool that will be antiquated in no time, consider investing in a tool that can grow with your students and your school.

Decisions informed by data

Regardless of which student response system you choose to invest in, take action — and take action now! Responsible instruction depends on decisions informed by data. Those decisions, however, are impossible until teachers are provided with the kinds of tools that make high-quality, ongoing, and immediate formative assessment possible.