Learning and Teaching in the Sciences (conference report, part 3)

The fact that you are reading this blog entry at all means that you are already engaging with Web 2.0, which has been defined on Wikipedia as “a perceived second-generation of Web-based services such as social networking sites, wikis, communication tools, and folksonomies that emphasize online collaboration and sharing among users”.  In the third talk at the University of Leicester Learning and Teaching in the Sciences conference on 23rd May 2007, Alan Cann raised the potential impact of Web 2.0 technologies in science teaching.

Dr Cann began with the definition of Web 2.0 given in the previous paragraph, and illustrated how broadly we have come to accept interactive aspects of the web with reference to Amazon.  Ostensibly an online shop, Amazon offers us the opportunity to review the goods on sale, even allowing us to give critical reports.  Similarly, we are invited to rate the performance of sellers for whom Amazon has acted as middleman.

Alan highlighted the fact that, when asked to write an essay, the default strategy of today’s student is to turn to Google and Wikipedia. We may not like it, but this does not change the reality and, setting the pattern that was to run throughout his presentation, Dr Cann challenged us to think about ways that we can work with and develop students’ study habits rather than fighting against them. So, for example, we should teach students how to use Google more effectively to obtain the best quality information, rather than simply chastising them for using such a shoddy tool and brow-beating them into using the ‘proper’ search tools. This does not mean that we abandon training sessions on PubMed, Web of Science and the like; far from it. We start with Google and move on to the more professional tools as an extension of good practice.

Against that backdrop, what is the place of wikis, blogs, podcasts and the like in the teaching of science? Alan suggested that, used appropriately, these Web 2.0 technologies can be particularly helpful in engaging the ‘long tail’ of less able and less motivated students who do not respond well to traditional approaches. Clearly we need to adapt our writing style to suit the medium – the academic journal genre is not appropriate for blog entries, which need to be more bite-sized and engaging.

What about podcasts and ‘viral’ video? Dr Cann shared some insights from his personal experience and research projects conducted over the previous couple of years. Alan has been developing blogs, podcasts and online video for the public understanding of science (specifically microbiology), for use in teaching statistics to first-year undergraduates at the University of Leicester, and to share his virtual frogroom with fellow tropical frog enthusiasts. In doing so he has gathered both statistical data and qualitative comments from users concerning the relative merits of different approaches. His observations included:
(1) A general dislike for the ‘push’ model of subscription via RSS feed; people prefer to ‘pull’ material to their computer as and when it looks of interest to them.
(2) Students are happy to listen to ‘work’-related podcasts on their computer, but reserve use of their mp3 player for ‘entertainment’.
(3) More students watch online videos via YouTube, and the like, than listen to podcasts.

Dr Cann finished by reiterating the point that this is not a call for ‘dumbing down’ and that the intention was to offer Web 2.0 resources to students in addition to traditional approaches. The materials produced must remain academically robust, but should be offered in a format that is comfortable and familiar for 21st-century undergraduates.

Learning and Teaching in the Sciences (conference report, part 2)

Professor Melanie Cooper from Clemson University, South Carolina, came to Leicester’s Learning and Teaching in the Sciences conference as part of a UK tour sponsored by the Physical Sciences Centre of the Higher Education Academy. In her talk, Using technology to investigate and improve student problem-solving strategies, Prof Cooper began by drawing an important distinction between problems and exercises. Often, when people set ‘problems’, what they are in fact asking students to complete are ‘exercises’: activities designed to train the participants to tackle similar future tasks in a formulaic way. Problem-solving is about developing a range of skills that will equip students to “address novel situations and arrive at a suitable course of action” (Dudley Herron). It is not, therefore, about knowing how to crank an equation to get the right answer.

In understanding how students approach problem-solving, there would clearly be huge value in directly observing them throughout the duration of a task. Such ethnographic research methods, however, have a number of difficulties. Firstly, the time required for the observations themselves, and all the more so for the subsequent evaluation, is a vast commitment. Secondly, observations tend to be based, for reasons of practicality, on relatively small numbers of individuals.

In her education research, Prof Cooper has been able to access a very much larger cohort (several thousand students at Clemson take general chemistry each year) and has got around the need for direct observation of the students at work by exploiting the IMMEX software, developed principally by Ron Stevens at UCLA.  Not to be confused with any similar-sounding floor-to-ceiling cinematic experiences, IMMEX stands for Interactive Multi-Media EXercises. An example of IMMEX use (in the context of genetics education) can be seen in the open access journal Cell Biology Education (see Stevens, Johnson and Soller, 2005).

IMMEX seems to involve some pretty fearsome computing, but I hope the following catches the essence of it. Students carry out an online activity, working from a single start-point towards a specific correct answer. Along the way they can select from a number of briefing sheets, experimental results and other lab data relating to the problem in order to help them to the solution. Not all of the available information is equally valuable or necessary to complete the task. The software records the route taken by each student from start to finish (a so-called ‘search path map’), and uses artificial neural network (ANN) clustering to identify and categorise common strategies. The network has to be ‘trained’ by exposure to a large number of examples, and then generates a ‘topological map’, for example a 6×6 grid of ‘nodes’ where similar approaches are clustered together. Rather than contemplating 36 different approaches, these nodes can then be rationalised into a smaller set of ‘states’ grouping broadly similar strategies and/or outcomes. So, for example, a ‘novice’ strategy might be ineffective (i.e. the student is unable to solve the problem) and/or inefficient (i.e. they visit most or all of the pages before completing the task), whilst an ‘expert’ would take a more efficient and effective route encompassing only the necessary information sources.
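
The precise machinery inside IMMEX was not spelled out in the talk, so the following Python sketch is purely illustrative rather than a description of the real system. It assumes that a search path map can be reduced to a binary visited/not-visited vector of pages, and it clusters those vectors onto a 6×6 grid of nodes using a toy self-organising map; the page count, the randomly generated ‘students’ and the training schedule are all invented for the example.

```python
# Illustrative sketch only -- not the IMMEX implementation.
import numpy as np

rng = np.random.default_rng(0)

N_PAGES = 12   # hypothetical number of information pages in the task
GRID = 6       # 6x6 grid of nodes, as described in the talk

# Each student's 'search path map' reduced to a binary visited/not-visited
# vector (real IMMEX data also captures order and outcome).
students = rng.integers(0, 2, size=(500, N_PAGES)).astype(float)

# One weight vector per grid node, plus each node's grid coordinates.
weights = rng.random((GRID, GRID, N_PAGES))
coords = np.array([[i, j] for i in range(GRID)
                   for j in range(GRID)]).reshape(GRID, GRID, 2)

def best_matching_node(x):
    """Grid coordinates of the node whose weights are closest to x."""
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

# Training: pull the winning node (and its neighbours) towards each example,
# with a learning rate and neighbourhood radius that shrink over time.
EPOCHS = 20
for epoch in range(EPOCHS):
    lr = 0.5 * (1 - epoch / EPOCHS)
    radius = max(1.0, (GRID / 2) * (1 - epoch / EPOCHS))
    for x in students:
        bi, bj = best_matching_node(x)
        dist = np.linalg.norm(coords - np.array([bi, bj]), axis=2)
        influence = np.exp(-(dist ** 2) / (2 * radius ** 2))[..., None]
        weights += lr * influence * (x - weights)

# After training, each student lands on one of 36 nodes; nearby nodes hold
# similar strategies and can be merged by hand into a few broader 'states'.
node_counts = np.zeros((GRID, GRID), dtype=int)
for x in students:
    node_counts[best_matching_node(x)] += 1
print(node_counts)
```

The point of the sketch is simply that once every student is assigned to a node, the researcher is looking at a small map of strategy clusters rather than thousands of individual click-streams.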

The use of ANN allows for helpful categorisation of students’ performance in a particular task. This can be used to provide them with formative advice on how they might improve their approach. At this stage a second modelling technique is used to predict and to evaluate the changes in strategy that students make when offered the opportunity to undertake one or more similar tasks. With sufficient data, Hidden Markov Modelling (HMM) can make statistical predictions about the likelihood that students using strategy X will use the same approach again next time, or whether they will swap to a different tack, and if so whether it will be strategy Y or strategy Z. The challenge then is whether the interventions that we make can move the students on towards a better strategy.
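
To give a feel for the transition-prediction idea, here is a back-of-the-envelope Python sketch. The research described uses Hidden Markov Models, in which the true states are latent; for brevity this toy version treats the assigned strategy labels as directly observed and simply estimates a Markov transition matrix from repeated attempts. The strategy labels and the attempt sequences are invented for illustration.

```python
# Toy Markov-chain version of the strategy-transition idea (the real work
# uses Hidden Markov Modelling with latent states).
import numpy as np

STRATEGIES = ["novice", "competent", "expert"]   # illustrative labels only

# Hypothetical data: each row is one student's strategy label on
# successive attempts at similar problems.
attempts = [
    ["novice", "novice", "competent", "competent"],
    ["novice", "competent", "competent", "expert"],
    ["competent", "competent", "expert", "expert"],
    ["novice", "novice", "novice", "competent"],
]

index = {s: i for i, s in enumerate(STRATEGIES)}
counts = np.zeros((len(STRATEGIES), len(STRATEGIES)))

# Count observed transitions between consecutive attempts.
for seq in attempts:
    for current, nxt in zip(seq, seq[1:]):
        counts[index[current], index[nxt]] += 1

# Row-normalise to estimate P(next strategy | current strategy).
transition = counts / counts.sum(axis=1, keepdims=True)

for s in STRATEGIES:
    probs = dict(zip(STRATEGIES, transition[index[s]].round(2)))
    print(f"currently '{s}': {probs}")
```

With enough real sequences, the same kind of matrix is what lets the researchers ask whether a given intervention shifts the probabilities towards the more expert-like strategies.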

Work by Prof Cooper and others has shown that individual students, be they ‘novice’, ‘competent’ or ‘expert’ at the outset, can improve their competence by repeating activities – but only up to a point. After five attempts, or fewer, none of the participants working on their own exhibited any further improvement in either their ability or the strategies they employed. How, therefore, can educators help students to make further refinements in their problem-solving abilities?

Melanie’s evidence shows that an answer lies in group work. Working with others, particularly those with different approaches (see below), involves metacognition, i.e. it forces the students into explicit reflection about what they are doing. Tackling a problem collaboratively exposes students to new ways of thinking and/or offers them clarity about why certain approaches are less useful. What’s more, there is evidence that the improvements made through involvement in group work are retained if the students are subsequently required to work on their own again.

What group arrangements work best? Groupings should be organised by the tutor, not left to the students to choose. The maximum group size should be four, and the majority of the benefit can be achieved by students working in pairs. At Clemson, they use the GALT (Group Assessment of Logical Thinking) test as an initial means to identify the type of thinking employed by students – concrete (C), transitional (T) or formal (F), according to the Piagetian model. Following the GALT assessment, students in Prof Cooper’s research were assigned to pairs according to all possible combinations: FF, FT, FC, TT, TC and CC. It was clear from the research that there were distinct combinations that afforded greater improvement (to at least one of the pair). For example, ‘transitional’ students paired with ‘concrete’ students improved the most. Overall, female students improved more than males through experience of group work, but there was no significant difference based on whether pairings were single-sex or mixed. Male students, incidentally, improved more as a result of using concept maps than as a result of participation in groups – but that’s probably a story for a different report.

Other talks at the Learning and Teaching in the Sciences conference, by Norman Reid and by Alan Cann,  are discussed elsewhere on this site.

What advice would you give to students starting your course?

Each academic year since 2005, the Higher Education Academy in the UK has run an essay competition for current students to express their views on an aspect of teaching and learning.  In 2007, the theme was “What advice would you give to students starting your course?”  The top entries submitted by Bioscience students have recently been made available on the HEA Centre for Bioscience website.  The winning author Aneeqa Meedin, from the University of Sheffield, produced a thought-provoking ten commandments for biomedical science students.