Learning and Teaching in the Sciences (conference report, part 2)

Professor Melanie Cooper from Clemson University, South Carolina, came to Leicester’s Learning and Teaching in the Sciences conference as part of a UK tour sponsored by the Physical Sciences Centre of the Higher Education Academy.  In her talk, Using technology to investigate and improve student problem-solving strategies, Prof Cooper began by drawing an important distinction between problems and exercises.  Often when people set ‘problems’ what they are in fact asking students to do is ‘exercises’: activities designed to train the participants to tackle similar future tasks in a formulaic way.  Problem-solving, by contrast, is about developing a range of skills that will equip students to “address novel situations and arrive at a suitable course of action” (Dudley Herron).  It is not, therefore, about knowing how to crank an equation to get the right answer.

In understanding how students approach problem-solving, there would clearly be huge value in observing them directly throughout the duration of a task.  Such ethnographic research methods, however, present a number of difficulties.  Firstly, the time required for the observations themselves, and all the more so for the subsequent evaluation, is a vast commitment.  Secondly, for reasons of practicality, observations tend to be based on relatively small numbers of individuals.

In her education research, Prof Cooper has been able to access a very much larger cohort (several thousand students at Clemson take general chemistry each year) and has got around the need for direct observation of the students at work by exploiting the IMMEX software, developed principally by Ron Stevens at UCLA.  Not to be confused with any similar-sounding floor-to-ceiling cinematic experiences, IMMEX stands for Interactive Multi-Media EXercises. An example of IMMEX use (in the context of genetics education) can be seen in the open access journal Cell Biology Education (see Stevens, Johnson and Soller, 2005).

IMMEX seems to involve some pretty fearsome computing, but I hope the following catches the essence of it.  Students carry out an on-line activity working from a single start-point towards a specific correct answer.  Along the way they can select from a number of briefing sheets, experimental results and other lab data relating to the problem in order to help them towards the solution.  Not all of the available information is equally valuable or necessary to complete the task.  The software records the route taken by each student from start to finish (a so-called ‘search path map’), and uses artificial neural network (ANN) clustering to identify and categorise common strategies.  The network has to be ‘trained’ by exposure to a large number of examples, and then generates a ‘topological map’, for example a 6×6 grid of ‘nodes’ in which similar approaches are clustered together.  Rather than contemplating 36 different approaches, the analyst can then rationalise these nodes into a smaller set of ‘states’ representing similar strategies and/or outcomes.  So, for example, a ‘novice’ strategy might be ineffective (i.e. the student is unable to solve the problem) and/or inefficient (i.e. they visit most or all of the pages before completing the task), whilst an ‘expert’ would take a more efficient and effective route encompassing only the necessary information sources.
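To make the idea of a ‘topological map’ a little more concrete, here is a minimal Python sketch of one common ANN clustering technique that produces exactly this kind of grid, a self-organizing map.  Everything here – the data, the page count, the parameters – is invented for illustration; the real IMMEX analysis works on much richer search path data, and I can’t say whether it uses precisely this variant, so treat this as a generic sketch rather than a description of the actual system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: each row is one student's 'search path map' reduced to a
# binary vector -- 1 if the student opened that information page, 0 otherwise.
n_students, n_pages = 200, 12
visits = (rng.random((n_students, n_pages)) < rng.random(n_pages)).astype(float)

# A 6x6 self-organizing map: 36 nodes, each with a weight vector in 'page space'.
grid_rows, grid_cols = 6, 6
weights = rng.random((grid_rows * grid_cols, n_pages))
coords = np.array([(r, c) for r in range(grid_rows) for c in range(grid_cols)])

def train_som(weights, data, epochs=50, lr0=0.5, radius0=3.0):
    """Classic Kohonen update: pull the best-matching node (and its grid
    neighbours) towards each sample, shrinking the neighbourhood over time."""
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)
        radius = max(radius0 * (1 - epoch / epochs), 0.5)
        for x in data[rng.permutation(len(data))]:
            bmu = np.argmin(((weights - x) ** 2).sum(axis=1))  # best-matching unit
            grid_dist = np.linalg.norm(coords - coords[bmu], axis=1)
            influence = np.exp(-(grid_dist ** 2) / (2 * radius ** 2))
            weights += lr * influence[:, None] * (x - weights)
    return weights

weights = train_som(weights, visits)

# Assign each student to a node; nearby nodes hold similar strategies and can
# later be grouped by hand into a handful of 'states' (novice, expert, ...).
assignments = np.argmin(
    ((visits[:, None, :] - weights[None, :, :]) ** 2).sum(axis=2), axis=1)
print(np.bincount(assignments, minlength=grid_rows * grid_cols)
      .reshape(grid_rows, grid_cols))
```

The final print shows how many students land on each of the 36 nodes; in the IMMEX work it is this kind of map that is then collapsed into the smaller set of strategy ‘states’ described above.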

The use of ANN clustering allows for helpful categorisation of students’ performance in a particular task, which can be used to provide them with formative advice on how they might improve their approach.  At this stage a second technique is used to predict and evaluate the changes in strategy that students make when offered the opportunity to undertake one or more similar tasks.  With sufficient data, Hidden Markov Modelling (HMM) can make statistical predictions about the likelihood that students using strategy X will use the same approach again next time, or whether they will switch to a different tack, and if so whether it will be strategy Y or strategy Z.  The challenge then is whether the interventions that we make can move the students on towards a better strategy.
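For some intuition about what the Markov modelling is doing, here is a small Python sketch.  It only estimates the visible strategy-to-strategy transition matrix from made-up sequences of attempts; a full Hidden Markov Model additionally treats the strategy as a hidden state that has to be inferred from the observed behaviour, which is well beyond this toy example – so this is an illustration of the underlying idea, not of the actual analysis.

```python
import numpy as np

# Hypothetical strategy labels per attempt, one sequence per student.
# 0 = novice, 1 = competent, 2 = expert (labels invented for illustration).
sequences = [
    [0, 0, 1, 1, 1],
    [0, 1, 1, 2, 2],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 1, 2],
    [2, 2, 2, 2, 2],
]
n_states = 3

# Maximum-likelihood estimate of the transition matrix: T[i, j] is the
# probability that a student using strategy i on one attempt uses j on the next.
counts = np.zeros((n_states, n_states))
for seq in sequences:
    for a, b in zip(seq, seq[1:]):
        counts[a, b] += 1
transitions = counts / counts.sum(axis=1, keepdims=True)

print(np.round(transitions, 2))
# Row 0 answers: given a 'novice' strategy now, how likely is the student to
# stay novice, move to competent, or jump to expert on the next attempt?
```

Comparing this matrix before and after an intervention is, in spirit, how one asks whether the intervention shifts students towards better strategies.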

Work by Prof Cooper and others has shown that individual students, be they ‘novice’, ‘competent’ or ‘expert’ at the outset, can improve their competence by repeating activities – but only up to a point.  After five attempts, or fewer, none of the participants working on their own showed any further improvement in either their ability or the strategies they employed.  How, therefore, can educators help students to make further refinements in their problem-solving abilities?

Melanie’s evidence shows that an answer lies in group work.  Working with others, particularly those with different approaches (see below), involves metacognition, i.e. it forces the students into explicit reflection about what they are doing.  Tackling a problem collaboratively exposes students to new ways of thinking and/or offers them clarity about why certain approaches are less useful.  What’s more, there is evidence that the improvements made through involvement in group work are retained if the students are subsequently required to work on their own again.

What group arrangements work best?  Groupings should be organised by the tutor, not left to the students to choose.  The maximum group size should be four, and the majority of the benefit can be achieved by students working in pairs.  At Clemson, they use the GALT (Group Assessment of Logical Thinking) test as an initial means of identifying the type of thinking employed by students – concrete (C), transitional (T) or formal (F), according to the Piagetian model.  Following the GALT assessment, students in Prof Cooper’s research were assigned to pairs covering all possible combinations: FF, FT, FC, TT, TC and CC.  It was clear from the research that certain combinations afforded greater improvement (to at least one of the pair); for example, ‘transitional’ students paired with ‘concrete’ improved the most.  Overall, female students improved more than males through the experience of group work, but there was no significant difference between single-sex and mixed-gender pairings.  Male students, incidentally, improved more as a result of using concept maps than as a result of participation in groups – but that’s probably a story for a different report.

Other talks at the Learning and Teaching in the Sciences conference, by Norman Reid and by Alan Cann, are discussed elsewhere on this site.