Student Response Systems



What

A student response system (SRS) is a tool that instructors can use to receive immediate feedback on teaching and learning and to promote active learning. Sometimes referred to as a classroom response system, clicker, or personal response system, the term typically refers to a handheld device called a clicker, but it also includes note cards, phones, and computers. SRSs are used in small or large classrooms to record attendance, engage students in lectures, ensure key points are understood, give low-stakes quizzes, or ask for opinions or attitudes that students might not normally share comfortably. The types of questions asked can range from multiple choice, true/false, numerical answer, and short answer to matching, clicking on an image, or drawing. Using the responses, an instructor may move on with the material, provide more instruction if students are not meeting a learning goal, or jump-start a discussion. The possibilities are endless, since SRSs can be used in conjunction with other teaching tools such as Think-Pair-Share, Peer Instruction, and Jigsaw.


Introduction

Some refer to the original SRS as the student’s face, voice, or hand: a means of responding to instructors’ questions. While those mechanisms are still used today, newer systems improve teaching and learning for both the instructor and the student. The instructor can capture a summary of the class’s understanding and further discuss a topic that was misunderstood, more easily start a discussion based on responses, or gather lasting data for instructors and students to use for classroom improvement and studying. The anonymity possible with these systems allows students to answer more comfortably, and the immediate feedback helps them better understand where they stand in relation to the content and the class. There are many different types of systems, and depending on your purpose and classroom, one may be more appropriate than another. Research suggests that SRSs increase attendance, class preparation, and participation, and decrease attrition rates (Beard, Morote, & Volcy, 2013; Burnstein & Lederman, 2001; Lopez-Herrejon & Schulman, 2004; Owens et al., 2004). The mixed results in SRS efficacy research prompt a necessary clarification: the key to successful SRS implementation lies in developing good questions and in how the questioning is enacted (linked to grades, with follow-up through appropriate responses, probing questions, or discussions). Just remember to be consistent and capitalize on the learning opportunity that SRSs can provide.


Implementation

  1. Determine the purpose of your questions: taking attendance, checking for completion of pre-class work, gauging student learning during class, keeping students’ attention during lecture…
  2. Determine what type of questioning would best suit your learning goals and enhance student learning: multiple choice, true/false, numerical, short answer, drawing, point identification.
  3. Research and determine which system is best for your course (type of questioning/purpose) by talking with colleagues, the CTL, and your department.
  4. Write effective questions and develop follow-up questions/responses and activities surrounding your questions as appropriate.
    1. Completion of pre-class work or follow-up questions
      1. (Students are expected to know what the mean is and how to calculate it.) What is the mean of these numbers? 14, 14, 14, 15, 16, 17, 17, 18, 18. A) 14, B) 15.12, C) 15.88, D) 16.
      2. To which position do your hands return after throwing an offensive punch? A) Guard, B) Pyramid, C) Resting
    2. Just-in-time teaching:
      1. What are the stoichiometric coefficients for the following chemical reaction? [chemical reaction] A) #, #, #, #, B) #, #, #, #, C) #, #, #, #, D) #, #, #, #,
      2. How confident do you feel in your ability to successfully achieve [selected learning outcome]? A) Very sure I have achieved it, B) Pretty confident I can in the near future, C) I think I will need much more practice/discussion, D) I am completely lost.
    3. Application questions or discussion-starter questions:
      1. An ice cube is floating in a glass of water. As the ice melts, what will happen to the water level? A) Not enough information provided, B) The water level will decrease, C) The water level will stay the same, D) The water level will increase
      2. Scenario …… What would happen to ___? (or What would you do?)
      3. What do you think is the most interesting part about this artifact? A) color, B) layout, C) materials, D) message
  5. Test the system well before class and then again on the exact setup you will use during class (i.e., in the classroom). The computer, the classroom set-up, or the wireless network in your classroom may affect your SRS. Remember, fewer technical problems lead to students having higher confidence in you.

Frequently Asked Questions

What options do I have to stop this behavior? My students are not taking it seriously:

  • Students are seemingly not even reading the question and just responding.
  • Students are responding for friends using multiple devices.
  • Students are responding inappropriately.

Options to address these behaviors:

  • Eliminate the possibility of anonymous responses.
  • Incentivize responses (see the grading question below for more details).
  • Take roll the old-fashioned way once you notice the numbers do not match up (or every so often, unannounced). Or take a picture of the class, explain that you can match faces with names, and make clear that anyone who responds while absent will be significantly reprimanded (cheating reported to the university, a failed assignment, zero for the associated points…). You can also give students responses they can use when asked to take someone else’s clicker to class.
  • Use an SRS with GPS capability to identify where students are located.
  • Use the responses in a different way: start a discussion, ask specific students to share their answers and explain their reasoning to the class, or have students move to an area of the room by answer choice (As here, Bs there, and so forth) to discuss their responses…

How much of their grade should be based on SRS responses?

  • Many studies suggest a small incentive related to SRS responses, sometimes rolled in with participation points. There is a balance to strike: enough points to prompt students to take it seriously, but not so many as to prompt excessive stress, anxiety, or cheating. The 5% principle, making SRS responses worth 5% of the final grade, has been shown to raise attendance without prompting too much anxiety.
  • You may want to build a “do no harm” clause into your grading if you choose to award points to students; this helps mitigate the effects of technical difficulties, excused absences, or class cancellations.
    • This could mean that if students obtain 95% of the total possible SRS points, they get 100% of the credit related to the clickers (if they get 80%, 80/95 = 84% of the credit related to clicker questions).
    • Extra credit: Students could get up to 5% extra credit on a test or their final course grade based on responses.
  • Weighting the questions: Students could receive credit for simply responding, or you could give more points for correct answers. There could be credit for responding the first time but more credit for responding correctly the second time (there are many different formulas you can use); a short sketch of one possible calculation follows this list. What works for your class might not work for the next, so don’t hesitate to contact the CTL with any questions you have.
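
To make the arithmetic above concrete, here is a minimal sketch, in Python, of how a “do no harm” cap and a simple participation-plus-correctness weighting could be computed. The function names, the 95% threshold, and the point values are illustrative assumptions based on the examples above, not features of any particular clicker system.

```python
# Illustrative sketch only: a "do no harm" cap (95% of possible SRS points
# counts as full credit) and a simple participation-plus-correctness weighting.
# All names, weights, and thresholds are assumptions, not a specific SRS feature.

def clicker_grade(points_earned, points_possible, cap=0.95):
    """Return the SRS portion of the grade as a fraction between 0.0 and 1.0.

    Earning `cap` (e.g., 95%) of the possible points already counts as 100%,
    so technical glitches or an excused absence do no harm.
    """
    if points_possible == 0:
        return 1.0  # no SRS points offered yet; award full credit
    return min(points_earned / (cap * points_possible), 1.0)


def question_credit(responded, correct, participation=1.0, correctness_bonus=1.0):
    """Credit for one question: points for responding, extra for a correct answer."""
    if not responded:
        return 0.0
    return participation + (correctness_bonus if correct else 0.0)


if __name__ == "__main__":
    # Mirrors the example above: earning 80% of the points yields 80/95 ≈ 84%.
    print(round(clicker_grade(80, 100) * 100))   # -> 84
    print(round(clicker_grade(96, 100) * 100))   # -> 100 (capped at full credit)
    # One question: 1 point for responding, 2 points total if also correct.
    print(question_credit(responded=True, correct=True))   # -> 2.0
```

Whatever scale or weights you choose, the main point is consistency: keep the same cap and weighting all semester so students know exactly how their responses count.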

Other Resources

Points or no points for SRS responses:
Grading incentives paper review
The Chronicle of Higher Education article on cheating related to clickers

Vanderbilt’s Classroom Response Systems Guide

EDUCAUSE Review: clickers in the classroom

Carnegie Mellon White Paper on Classroom Response Systems

Article from University Business on SRS versatility and its relation to retention rates


References

Beard, K., Morote, E., & Volcy, K. (2013). Effects of a student response system on pre-class preparation, learning and class participation in a diverse classroom. Teaching and Learning in Nursing, 8(4), 136-139.

Burnstein, R. A., & Lederman, L. M. (2001). Using wireless keypads in lecture classes. The Physics Teacher, 39, 8-11.

Bruff, D. (2009). Teaching with Classroom Response Systems: Creating Active Learning Environments. San Francisco, CA: Jossey-Bass.

Lopez-Herrejon, R. E., & Schulman, M. (2004). Using interactive technology in a short Java course: An experience report. Annual Joint Conference Integrating Technology into Computer Science Education: Proceedings of the 9th annual SIGCSE conference on Innovation and technology in computer science education.

Owens, K., McConnell, D. A., Steer, D., Van Horn, S., Knott, J., Borowski, W., et al. (2004). Changing pedagogy to include 61 ConcepTests and peer instruction in introductory geoscience courses: The impact on instructors and students. The Geological Society of America Annual Meeting (Denver). Abstract available at http://gsa.confex.com/gsa/2004AM/finalprogram/abstract_75839.htm


Cite this resource: Tomaswick, L. (2017). Assessing Student Learning – Student Response Systems. Kent State University Center for Teaching and Learning. Retrieved [today's date] from https://www.kent.edu/ctl/student-response-systems