Suggesting that student engagement might actually be a bad thing tends to get certain people’s dander up. There was a mild spat recently about Rob Coe reiterating that engagement was a ‘poor proxy’ for learning.
Carl Hendrick unpicked the problem further:
This paradox is explored by Graham Nuthall in his book ‘The Hidden Lives of Learners,’ (2007) in which he writes:
“Our research shows that students can be busiest and most involved with material they already know. In most of the classrooms we have studied, each student already knows about 40-50% of what the teacher is teaching.” p.24
Nuthall’s work shows that students are far more likely to get stuck into tasks they’re comfortable with and already know how to do as opposed to the more uncomfortable enterprise of grappling with uncertainty and indeterminate tasks… The other difficulty is the now constant exhortation for students to be ‘motivated’ (often at the expense of subject knowledge and depth) but motivation in itself is not enough. Nuthall writes that:
“Learning requires motivation, but motivation does not necessarily lead to learning.” p.35
Motivation and engagement are vital elements in learning, but it seems to be what they are used in conjunction with that determines impact. It is right to be motivating students, but motivated to do what? If they are being motivated to do the types of tasks they already know how to do, or to focus on the mere performing of superficial tasks at the expense of the assimilation of complex knowledge, then the whole enterprise may be a waste of time.
Even this mild critique annoyed some spectators, but now it turns out that engagement might actually be counter-productive. In a new report for the Brown Center, Tom Loveless takes apart the 2012 PISA findings to demonstrate that countries that do well on student motivation do poorly on maths attainment, and vice versa. Contrary to every intuition, student engagement and motivation may actually be retarding learning.
Loveless is a cautious analyst and is at pains to point out the risk of jumping to hasty conclusions on flimsy correlational evidence:
The analytical unit is especially important when investigating topics like student engagement and their relationships with achievement. Those relationships are inherently individual, focusing on the interaction of psychological characteristics. They are also prone to reverse causality, meaning that the direction of cause and effect cannot readily be determined. Consider self-esteem and academic achievement. Determining which one is cause and which is effect has been debated for decades. Students who are good readers enjoy books, feel pretty good about their reading abilities, and spend more time reading than other kids. The possibility of reverse causality is one reason that beginning statistics students learn an important rule: correlation is not causation. p 28
But that said,
Data reveal that several countries noted for their superior ranking on PISA—e.g., Korea, Japan, Finland, Poland, and the Netherlands— score below the U.S. on measures of student engagement. Thus, the relationship of achievement to student engagement is not clear cut, with some evidence pointing toward a weak positive relationship and other evidence indicating a modest negative relationship. p 27
Nations with the most confident math students tend to perform poorly on math tests; nations with the least confident students do quite well. p 28
He explains that, “U.S. students were very confident—84% either agreed a lot or a little (39% + 45%) with the statement that they usually do well in mathematics. In Singapore, the figure was 64% (46% + 18%). With a score of 605, however, Singaporean students registered about one full standard deviation (80 points) higher on the TIMSS math test compared to the U.S. score of 504.”
Loveless points out that within-country analysis reveals something concealed by between-country analysis:
In Singapore, highly confident students score 642, approximately 100 points above the least-confident students (551). In the U.S., the gap between the most- and least- confident students was also about 100 points— but at a much lower level on the TIMSS scale, at 541 and 448. p.29
But, “the least-confident Singaporean eighth grader still outscores the most-confident American, 551 to 541.”
He goes on to show that “enjoying math is not positively related to math achievement. Nor is looking forward to one’s math lessons.”
Now none of this proves that engagement and intrinsic motivation are actually bad for attainment, but serious doubts are cast on policy measures which seek to boost student engagement in the belief that results will improve. Suitably cautious, Loveless says, “PISA provides, at best, weak evidence that raising student motivation is associated with achievement gains. Boosting motivation may even produce declines in achievement.”
The report doesn’t really seek to unpick why all this might be so, but there does seem to be a link with the points Nuthall made above: what students enjoy may not actually involve much in the way of learning. Hard work often isn’t fun. If Rob Coe is right [UPDATE: I no longer think he’s quite right] that “learning happens when people have to think hard” then it’s small wonder that activities that prioritise engagement and motivation over ‘thinking hard’ don’t actually result in long-term retention and transfer between contexts.
There’s also compelling evidence that we’re pretty terrible at judging when we learn best. How would you go about learning a new skill, memorising dates, or learning how to solve an equation? Most of us tend to review the basics, complete a few related practice exercises, reach an acceptable level of proficiency, and then move on to the next topic. This approach is known as blocking, or massed practice. Massed practice allows us to focus on learning one topic or skill area at a time: the topic or skill is repeatedly practised for a period of time before you move on to another skill and repeat the process. Interleaved practice, on the other hand, involves working on multiple skills in parallel.
Even though laboratory tests and classroom trials have demonstrated that interleaving is clearly a more effective way to learn than massed practice, the fact that performance is lower during instruction fools us into believing that it must be ineffective. Even showing students evidence of improved test results can just lead to a Backfire Effect, with over 65 per cent of students simply discounting the evidence and continuing to do what they’ve always done. Part of the problem is that blocked practice boosts students’ performance during instruction, producing the ‘illusion of knowing’ and that warm, fuzzy feeling of cognitive ease. Time and again, we prefer what we believe to be true to the uncomfortable realities of what is actually true.
To sum up, I’m not saying engagement and motivation don’t matter at all – clearly they are important in all sorts of contexts – but the idea that there is any kind of direct link to achievement appears to be dubious. If you want to engage students because you want them to be more engaged, fine. But if you believe that engagement will automatically lead to better results you may well be mistaken.
Interesting post as always David, and looking beyond ‘conventional wisdom’. I wonder if there is any correlation between engagement and students continuing to study that subject at a higher level. Have you found any research on this? I’m always a bit wary of research that relates just to measured performance in tests.
The work of Professor Eric Mazur and Dr Derek Muller has found that confusion rather than clarity leads to better learning (as measured in tests!). I’ve outlined some of these ideas on a blog http://neilatkin.com/2015/02/21/confusion-vs-clarity-great-teachers-who-beat-themselves-up-and-poor-ones-who-think-they-are-great/
Hi Neil – no, I’m not aware of any research in this area – I suspect that engagement matters hugely in determining whether students will follow further study.
And I’m hugely wary of research which doesn’t link to test results. Anyone can make students feel good but that’s not the job of a teacher. Without an external measure of success it’s impossible to evaluate a ‘good idea’. It becomes unfalsifiable.
Thanks for the link.
I am not sure the OECD data reflects the complexity of the different cultures. In many Asian cultures the motivation for students is not about enjoying maths but about doing well so they can get into the best university and then create a new future for themselves. The South Korean system is an extreme example of this, where it is clear that if you don’t work hard and get the top marks you won’t get into one of the three top universities in the country. That is their motivation – not whether they like maths, or are even engaged in it. Worth reading this article … https://theconversation.com/south-korean-education-ranks-high-but-its-the-kids-who-pay-34430
I recommend reading Loveless’s paper. He also makes the point that in Finland students report a lack of engagement and motivation but do well on tests. These findings are not so easily dismissed as putting it down to South East Asian culture.
Yes, but it does point to the possibility that a range of other factors could be affecting engagement and “test” results. The questions asked of students at the time of testing are very limited and a lot seems to be drawn from them. I am not saying that engagement is important or unimportant, but I believe there are other aspects of learning in each country that make such an analysis questionable.
Of course the analysis is questionable: Loveless himself questions it and urges caution. Did you read through the report?
Yes. Further to my comment … http://www.scmp.com/lifestyle/family-education/article/1763278/western-schools-envious-east-asian-scores-global-exam-may
Hi David – great article, but you might want to fix typo here: “and motivation of thinking hard”, where “of” should be “over”.
Thanks David. I think this matter is slightly confused by the combination of two different conceptions of engagement. If I’m thinking hard about something, then surely by definition I must be engaged WITH it (if not engaged BY it). So in that sense engagement is indeed essential to learning.
It seems to me that ‘enjoyment and confidence’ point to the notion of being engaged ‘by’ something, which is then itself a poor proxy for engagement ‘with’ it. But simply then to say that ‘engagement’ might not be that essential for learning to take place could be a little misleading.
Of course, I can see that the vast majority of teachers combine ‘engagement’ and ‘enjoyment’ in the way you are critiquing it.
Have you seen https://steppingbackalittle.wordpress.com/2015/02/19/beyond-the-cult-of-engagement-part-1-the-problem/ ? In it I try to distinguish between different routes into being engaged ‘with’ something.
A neat distinction, thanks.
Chris, great clarity. I’ve signed up for your blog updates. My view, after reflecting on my time as a student, is that teachers spend a vast amount of energy thinking about what their students think; do they like this? are they all following? are they interested? etc. and I say “Stop!” Students don’t need teachers inside their heads all the time. And, by over-thinking this, teachers (especially ones the students like) end up creating a subtle awkwardness that students learn they can alleviate by acting engaged.
Students need/want to know what the teacher thinks about the topic… not what the teacher thinks about what the students think. Only then does each student feel free to make their own mind up about what the teacher thinks. So, teachers just need to think about why they care about their field, how they want to express that via teaching and what this looks like in the classroom.
By removing these walls between student and teacher, by halting the ‘are they engaged?’ worries, we’ve more chance of seeing actual person to person connection and therefore allowing deep thinking and therefore creating real engagement with the teacher’s topic of choice from both the students and the teacher.
Thanks very much Leah – I think based on what you’ve just said, that you might well also like my most recent post on ‘Stripping the Ideology from Differentiation’, as that also very much aims to stop the game of ‘cat and mouse’ with teachers trying to get inside pupils’ minds.
Chris
Very nice. Perhaps engagement could be defined as ‘the extent to which pupils think about what it is they need to learn’ rather than engagement being ‘on task and interested’. Long winded I know!
Love the “engaged WITH” versus “engaged BY” distinction. Very useful.
Hi David, really thought provoking. If thinking really hard is the key (and like you I think it is), then it’s easy to see how students can be engaged in something that doesn’t promote learning. Quietly cutting out some statements and then waiting patiently for the teacher to reveal the answer, head down solving 100s of simple equations that they can already do etc… (Although I guess you might argue that the second example could be part of over practising to build fluency). The question I’ve been pondering is “is it possible to be thinking really hard and not engaged?”.
No, I don’t think it is possible to be thinking hard and not engaged. It all depends on what you’re supposed to be engaged in and what you’re thinking hard about. You can certainly be thinking hard about something other than what a teacher is trying to engage you in.
So whilst a poor proxy for learning we might consider it a prerequisite? A little like good behaviour.
I always find your posts very interesting, and having children with whom I am now discussing motivation, it seems, anecdotally at least, that my eldest son (10) is utterly disengaged by easy topics. He frustrates his teachers as he doesn’t perform, but if he is given something difficult he will sit for 45 minutes unbroken working it through. Of course, learning is a personal path with thorns and stones on the way, but surely there is a reward for all this learning? So far reward has been little mentioned, but I am sure you have seen Dan Pink’s Motivation video. Even children need autonomy, mastery and purpose in their learning, but I doubt many teachers explain that education is a reward that feeds the mind and soul, not just something that gets you some qualifications on a piece of paper. For me engagement is the key in the lock for a topic, but opening the door and exploring what is beyond is the scary and challenging task that takes resilience and application. Might this have something to do with the Bristol University Neuroscience in Education team exploring the effectiveness of risk and competitiveness in the classroom?
Could boosting engagement and motivation of students also be considered to be an end of education in itself? If we have mathematically world class students who are at the same time so disengaged that they hate their lives (of which there are more than a few in South Korea), would we consider their education to be a success?
It could. I say as much in the blog. The Loveless report also considers Finland which also reports high levels of demotivation and lack of student engagement.
All of which could lead us to question the status of Pisa results as the ultimate arbiters of long term educational success… but to do so would be to open a can of worms so hard to close, that it would be rude to do it on this post! 😉
Oh, I’m not claiming PISA scores are the final arbiters of anything. But PISA provides a uniquely huge dataset. Even if we’re not at all sure that it tests anything worthwhile we can at least be sure that it provides broadly comparable results between very different territories.
I do agree with you David. It provides the most universal benchmark against which to measure any large-scale educational currents. It might well be that it is a side-street to the fullest fruits of a good education, but in the absence of a tighter conception of what this might be, and certainly in the absence of any global way of measuring it, we are right to use the PISA data.
Hi David. I was reading the Loveless article and was struck by his conclusion that “neither approach can justify causal claims”, where the approaches he means are using individual student data and using aggregated student data. I would be slightly critical of Loveless’ method, where, for example, he includes “school attendance” as a measure of student motivation towards maths – I’m pretty sure that is not what is mostly intended in discussions of engagement in the classroom. That’s not to criticise Loveless, but I wonder how you can take his careful analysis and lack of conclusive evidence one way or the other and infer throughout your post that there is a noteworthy negative correlation between engagement and learning. Inexplicably, you write that “student engagement and motivation may actually be retarding learning”. Did you really reach that conclusion on reading Loveless?
I enjoy your analyses in most cases but I have to ask: did you read Loveless on this occasion, or just copy and paste the graphs?
Yeah, I did read it. Sorry you don’t think much of my analysis, but the cautious nature of Loveless’s conclusions is something I referred to throughout the post – did you miss that? Just in case you skim read it, here’s the final paragraph of my post again:
To sum up, I’m not saying engagement and motivation don’t matter at all – clearly they are important in all sorts of contexts – but the idea that there is any kind of direct link to achievement appears to be dubious. If you want to engage students because you want them to be more engaged, fine. But if you believe that engagement will automatically lead to better results you may well be mistaken.
Even in the quote you pick out I used the word “may” to indicate tentativeness. Just to reiterate, I do not think, and have not said, that Loveless’s report proves anything. I just find it very interesting that engagement, often cited as a pre-condition for learning, appears not to matter quite as much as we thought and in fact might even prove counter-productive.
Hi David. With respect, “may” and “might” in this context are weasel words, as are “I just find it very interesting that”. What you “just find very interesting” is that engagement “in fact might even prove counter-productive” to learning. Well, the only research you cite establishes explicitly, unequivocally that engagement is not counter-productive to learning.
Learning Spy is at its best when it’s a platform for honest debate. I’m sorry to be so vociferous, but I feel very strongly that blogs like yours are our best hope of holding empty argument to account, not just casting about for evidence to support a world view.
Hi Peter – I’m genuinely confused by your ire – how does the research I’ve cited establish that engagement is NOT counter-productive to learning? Are you suggesting I’ve misread it? Or that I’m twisting something to prove a point? I’m really not – I’m very surprised by Loveless’s findings and cannot get my head around how you’ve managed to interpret them to mean the exact opposite of my reading.
Hi David. My ire is most probably misplaced – I really like your blog so I’m irritated at this piece of what I perceive to be poor interpretation.
If it helps to clarify my point, here’s my answer to your question.
Loveless’ research establishes that engagement is not counterproductive to learning in two ways, both of which I would hope you recognise in a critical analysis of his research. Incidentally, Loveless makes that point clearly, so none of this should be taken as criticism of his research or analysis.
1. His research finds only weak – mostly insignificant – correlations of any sort between “engagement” and “learning”. He alludes a few times to weak negative correlations that would, indeed, be surprising, were they to be substantiated. But he concludes, having considered all the evidence, that there is no evidence of correlation either way. That means he has found there is no relationship between engagement and learning outcomes. Or, in other words, Loveless has found that engagement is not counterproductive to learning outcomes. I’m not sure which part of this is in question, but please let me know if you disagree.
2. Loveless’ research uses certain points in the PISA data to analyse. I question just how much these points relate to Coe’s poor proxies (the excellent analysis that started your post). The best point I can find to make this point is this, from Loveless: “The -0.58 and -0.57 coefficients indicate a moderately negative association, meaning, in plain English, that countries with students who enjoy math or look forward to math lessons tend to score below average on the PISA math test.”
Coe is writing about teacher behaviour – how they are prone to think various forms of “engagement” might be evidence that their teaching has resulted in learning. I suppose, by definition, that Coe means by this that these are signals available to a teacher in a classroom. Loveless’ use of “enjoy math” or even worse “look forward to math lessons” as a proxy (no pun intended) for Coe’s “engagement” is a stretch. And that’s putting it kindly.
I do think you have seen some evidence for what you seem to want to be the case. I just don’t want you to take short cuts, if that’s not a horrendously arrogant thing to say (which it clearly is).
I want the analysts I read and engage with to dig deeper, is all.
I’m not at all sure your reading is any less influenced by bias than mine. Loveless clearly says “the relationship of achievement to student engagement is not clear cut, with some evidence pointing toward a weak positive relationship and other evidence indicating a modest negative relationship.”
Now I’m not claiming anything other than that this is not at all what we would expect to find. It may not be strong evidence, but it’s certainly surprising. I really can’t see how this leads you to conclude that there is “no evidence of correlation either way” – I will reread to see if I can see that stated anywhere.
As to your second point, I completely agree that Coe’s proxies are very different to the ones Loveless uses. My only reason for citing Coe’s research was to try to explain why there appears to be this negative (albeit very weak) correlation between achievement and motivation.
Thanks for your kind words about the blog. Much appreciated.