iCard: The low-tech, low cost tool for assessing student engagement in large groups

In this post, I discuss my initial use of a simple coloured-card voting system for assessing student understanding in a large-group University setting. It is not a novel concept, but it is simple and effective.

There have been some amusing Twitter debates recently on the topic of ‘no hands up’ policies, and the contentious practice of putting questions to randomly selected pupils. Neither picking random students nor allowing students to elect to answer really addresses the issue of tutors assessing the understanding of entire groups. We get a potentially non-representative snapshot, and by carefully selecting who answers, or what question gets asked, we can convince ourselves that we are doing a great job. Working in a University setting with large groups, I am focussed on assessing whether everyone in the group understands, who doesn’t, and the reasons for any lack of understanding.

So how can all students be encouraged to participate in Q&A sessions, whilst allowing the tutor to assess the whole cohort’s understanding of key concepts? One approach, used particularly in Universities, is mobile technology. Apps such as Socrative allow the group to submit answers that can be displayed on screen, but they require users’ own devices, which may be OK for Universities but is not really applicable to schools. Similarly, Twitter can be used: students give an answer in class and the answers appear on screen. On the downside, the majority of replies are off-task, although once the novelty wears off, maybe this approach will have some merit. ‘Clicker’-type voting devices are another option, as they do not require the use of students’ own mobiles, but programming individual devices to individual students, if such information is needed, can be a barrier, especially for large cohorts.

I frequently put a question on screen, and then ask groups of up to 200 undergraduate students the following questions (in this order, with typical responses noted):

Hands up everyone who thinks the answer is true (25%)

Now hands up if you think it is false (25%)

Hands up who doesn’t know (25%). And as I lose the will to live…

Hands up who doesn’t care (25%).

If that doesn’t work, I ask for both simultaneously: left hand for true, right hand for false. And so on. Getting large groups to answer questions so that I can gauge levels of understanding of key concepts is not easy, especially in my admittedly didactic teaching sessions.

The iCard voting system: For when Technology-Enhanced Learning seems like an unnecessary evil

Expanding on the concept of left-hand or right-hand up is the simple use of coloured voting cards. Hold up a red or green sheet of A4 to test understanding of a key point. It allows the tutor to see who understands (or, if we are being pedantic, who thinks they know the answer), who doesn’t, and who is disengaged completely. However, a 50:50 question is not very informative. This is where the iCard comes in. Four (or more) pieces of coloured card (A6 should suffice), linked by a treasury tag, can be given out at the start of the session. Questions can then be put to the entire group, from simple recall of a straightforward fact that is central to understanding a concept, to a more testing question that requires a few minutes of working out.

Is it useful?

Teaching large groups of anything up to 200 renders ‘hands up…’ pretty useless. Even in a smaller group, getting everyone to consider the question, and seeing evidence of some effort by all students, is near impossible. In contrast, the iCard-type approach does work. My initial use of this was in an end-of-semester informal test with 60 students. Questions were projected via PowerPoint, each slide showing the question plus four colour-coded answers.

Initially, as anticipated, a minority did not engage well, but 90% were happy to answer all questions. To get the remaining 10% to engage, there had to be an element of bullying. These students soon found out that if they didn’t answer with the rest of the group, I would push them individually for an answer and then tell the whole group whether they were right or wrong. They quickly started answering with everyone else.

What did I learn from using the iCard?

1) Student misconceptions on ‘simple’ fundamental points. Some of the questions were deliberately simple, and I anticipated >95% of respondents giving the correct answer. By quickly scanning the room for incorrect answers, I could identify who got the answer wrong and, crucially, what the misconception was. By encouraging students to keep their card ‘close to their chest’, I could explain why the given answer(s) might be incorrect without necessarily highlighting students with wrong answers.

2) Identifying weaker students. As the majority of students answered correctly for each individual question, I could focus my attention on the incorrect answers. As expected, some weaker students consistently answered incorrectly. However, some students whom I had down as particularly strong were exposed as having gaping holes in their knowledge, sometimes on fundamental points.

3) Identifying topics that were poorly understood. Two topics out of 11 were particularly poorly answered. I can now look at, and take action on, a) how those topics were delivered, and/or b) whether some underpinning knowledge is missing from earlier in the course.
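Incidentally, a tutor who jots down the show of cards for each question can reconstruct these per-topic tallies after class. Here is a minimal Python sketch of the idea; the topic names, colours, and vote counts are all invented for illustration, not real class data:

```python
from collections import defaultdict

# Each entry: (topic, correct colour, votes seen for each colour).
# All figures below are made up for the sake of the example.
responses = [
    ("enzyme kinetics", "green", {"green": 52, "red": 5, "blue": 2, "yellow": 1}),
    ("enzyme kinetics", "blue",  {"green": 3,  "red": 4, "blue": 50, "yellow": 3}),
    ("gene regulation", "red",   {"green": 20, "red": 25, "blue": 10, "yellow": 5}),
]

totals = defaultdict(lambda: [0, 0])  # topic -> [correct votes, total votes]
for topic, correct, votes in responses:
    totals[topic][0] += votes.get(correct, 0)
    totals[topic][1] += sum(votes.values())

# Topics with a low proportion of correct cards are candidates for re-teaching.
for topic, (right, answered) in totals.items():
    print(f"{topic}: {right / answered:.0%} correct")
```

A spreadsheet would do the same job; the point is simply that a minute of tallying per question is enough to rank topics by how well they were understood.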

4) Poorly worded questions can trip up students. All questions should have only one correct answer. However, questions can be ‘read’ differently, and in two questions an ‘alternate reading’ would lead the student to answer incorrectly. By discussing why answers are wrong, the students could argue their case. As a result, I will re-write a few questions before using them again.

5) Students’ responses can spark debate over contentious points. As noted above, students are happy to argue with me if they think that I am wrong. Although the voting cards do not allow students to express views, or give complex and well-articulated answers, the voting cards are an ideal way to initiate subsequent debate.

I must stress that this is not a new concept, and it is certainly not my idea. The cards are so cheap and simple, yet seem to be effective, especially with carefully worded questions or tasks. This type of in-class formative assessment seems to be only sparsely used in Universities, where large-group teaching is common and where, if anything, mobile technology and clickers are being introduced more widely. With such diverse opinions on how tutors should assess student understanding during sessions before ‘moving on’, maybe it’s time to re-visit some old technology before blindly moving on to a technology-based approach.

Benefits include:

1) They are very cheap and easy to make

2) They are applicable wherever tutors are assessing a right/wrong or MCQ answer, or where specific, defined opinions of a group are being sought, and especially for large groups

3) There is no bureaucracy: obtaining anything that is remotely costly or technological can be a barrier to implementation, whereas a few sheets of coloured card is not.

4) Some tutors will always remain as technophobic as humanly possible, and a minority are particularly ‘risk averse’. Even technophiles have concerns that the time spent teaching students how to use any technology, and the risk of technology failure, may detract from the learning.

On the negative side:

1) There is no permanent record of who voted for which answer, only the tutor’s judgement on who or what to follow up.

2) You will get it in the neck from advocates of Technology Enhanced Learning

In summary, all students get the same questions and the same treatment, and there is less of a requirement to ‘pick on’ individual students. Please feel free to comment on potential uses and, importantly, limitations of use.

And here is me rambling on about it at a recent TeachMeet

About TheOtherDrX

Senior Lecturer in Biosciences. MSc Biosciences course leader and lecturer on topics such as Cell Biology, Molecular Pathology and Genetics. I manage a research team of PhD students and post-doctoral scientists working on novel anti-tumour drug combinations, nanotech-based delivery of anti-tumour agents, and artificial scaffolds for 3D cell culture studies as a replacement for animal-based studies. I also do a bit of STEM public engagement work with my Geiger counter.

6 Responses to iCard: The low-tech, low cost tool for assessing student engagement in large groups

  1. I have a set of these kinds of cards with red/orange/green and smiley faces. They are a bit more school-level in style but do the same job. I used to ask students to put them on their desks while they were working and update them every so often or as necessary. Other times we would hold them up as you described.

    Last week I learned about plickers.com from a twitter connection and I tried them out. They are printed cards that are essentially QR codes with four orientations. Each one is different, so you can assign students to a specific one. (Perhaps in a class of your size you could have students fill in a two-question Google form with their name and the number of the card that was passed out to them.) Then you ask a question and they hold the card up in the right orientation for their answer. You scan them with your phone or tablet and the results are collated and graphed instantly. What I like about them is that there is no tech in the hands of the students, that is, no need for a class set of devices. Only the teacher needs an app on their camera-enabled device. Have a look on their website, where a 21-second video shows so much better what this long paragraph was trying to describe!

    Good for you for using some formative assessment in your uni class. When I taught at uni (and as a uni student) I realised that those kinds of lecturers are not in the majority and there is not always a lot of support for new ideas.

    • TheOtherDrX says:

      Thanks for the kind comments and suggestions. The QR code scanning idea looks like it could be appropriate for some situations in the smaller groups that I occasionally teach, but I mostly have over 60 students spread throughout a large lecture theatre. I’ll definitely get the App and give it a go though, and test it at our staff technology-enhanced learning session. I tried Socrative in class, but this needed students to have Smartphones and was a distraction from learning, and the class stats element was really minor to me. The main reason why I really like this system is that there is no tech whatsoever, as I tend to get the cards out when I have 10 minutes spare at the end of a session. Yes, class stats give me a clearer idea of who is a weak student, but I’m now not primarily using this to initiate any intervention as such (if only I had the resources to do anything on a one-to-one basis…). The primary aim is for students to appreciate for themselves whether they are taking on board core concepts in class and, if not, to do something about it, which can be followed up at the next in-class formative test.
      Formative assessment in traditional lecture-style situations/subjects does seem to be rare, and from student feedback, it seems to be appreciated. On a very positive note, I am now starting to loan these out to other academics in the department.

  2. I agree that anything that involves all students with a phone in their hands is currently not an option for me. And yes, I also think the main benefit of these systems is for the student to know how well they are doing. I tried asking, “How was my pace today?” and that was really useful for me.

  3. Pingback: Should we adopt more active learning at the expense of cutting the STEM curriculum? | TheOtherDrX's Higher Education blog
