Lost in translation

It’s entirely my fault. I chose to subscribe to a daily email from Times Higher Education. No-one forced me. I could cancel at any time. It’s also my fault that I get snarky when they do one of these:

That’s from today’s THE email bulletin. The story to which it refers is behind a paywall, but my university subscription gives me access. If you don’t have access to the THE and your scepticism setting has been dialled back, you may find your brain registering the belief that

“teaching students online produces the same academic performance as face-to-face teaching”.

After all, that’s what you just read in the THE bulletin. If you are an inexperienced university manager, over-stretched educational policy maker, researcher for a free market think tank or EdTech entrepreneur, such a belief could prove unhelpful to the rest of us.

Turning to the story itself, in the THE, we are told:

“The results, which come at a time when universities across the world have had to shut their campuses and move to online instruction as a result of the coronavirus pandemic, found that students who were taught wholly online scored the highest on their average scores in assessments taken throughout the course. Those taught fully online scored, on average, 7.2 percentage points higher than the other two forms of learning.”

But if we look at the journal article on which this is based, we find that the researchers are quite clear about why the ‘fully online’ students did better. Unlike the students with whom they were being compared, the fully online students had three goes at each of the weekly assessment tasks. The students who were taught in ‘in-person’ or ‘blended’ modes did not have this advantage. The researchers describe this as “an artefact of the more lenient assessment submission policy for online students” (Chirikov et al., 2020, p2). In the journal article, this caveat comes immediately after the finding quoted above in the THE piece. Yet it didn’t make it into the THE or the THE’s email bulletin.

Casual readers of the email bulletin might also assume that the proposition about equivalent performance comes from a representative sample of comparisons between online and face-to-face teaching. They might be surprised to learn that the research was conducted in three Russian universities, in two courses: Engineering Mechanics and Construction Materials Technology. The online course instructors came from one of Russia’s “top engineering schools” while the instructors involved in the ‘in person’ and ‘blended’ modes worked in one of the students’ own universities. These are described in the journal article as ‘resource-constrained institutions’ and the researchers characterise the instructors in the ‘resource-constrained’ universities as having weaker educational backgrounds, fewer research publications and less teaching experience than the instructors from the ‘top engineering school’ (Chirikov et al., 2020, p2).

The THE article does not refer to any prior studies. The extensive review and meta-analysis work by Barbara Means and colleagues may as well not exist.

To be clear: I’m not criticising the researchers (Igor Chirikov and colleagues). They put a lot of care into this study and they are serious about improving access to STEM education opportunities. I just wonder how one of our ‘top trade papers’ (the THE) can provide such poor service to its industry, and what we can do to help improve the situation.

References

Chirikov, I., Semenova, T., Maloshonok, N., Bettinger, E., & Kizilcec, R. F. (2020). Online education platforms scale college STEM instruction with equivalent learning outcomes at lower cost. Science Advances, 6(15), eaay5324. doi:10.1126/sciadv.aay5324

Means, B., Bakia, M., & Murphy, R. (2014). Learning online: what research tells us about whether, when and how. New York: Routledge.

Means, B., Toyama, Y., Murphy, R. F., & Baki, M. (2013). The effectiveness of online and blended learning: a meta-analysis of the empirical literature. Teachers College Record, 115, 1-47. 

The books: 2010 to 2019

I’ve been very lucky (a) to have had the opportunity to work with some very productive and insightful co-authors and co-editors, and (b) to have had the luxury of support from the Australian Research Council and the Australian Learning and Teaching Council.

Ellis, R., & Goodyear, P. (2019). The education ecology of universities: integrating learning, strategy and the academy. Abingdon: Routledge/SRHE series. 252pp.

Ellis, R., & Goodyear, P. (Eds.) (2018). Spaces of teaching and learning: integrating research and practice. Singapore: Springer. 243pp.

Markauskaite, L., & Goodyear, P. (2017). Epistemic fluency and professional education: innovation, knowledgeable action and actionable knowledge. Dordrecht: Springer. 651pp.

Carvalho, L., Goodyear, P., & de Laat, M. (Eds.) (2017). Place-based spaces for networked learning. New York: RoutledgeFalmer. 288pp.

Carvalho, L., & Goodyear, P. (Eds.) (2014). The architecture of productive learning networks. New York: RoutledgeFalmer. 294pp.

Luckin, R., Puntambekar, S., Goodyear, P., Grabowski, B., Underwood, J., & Winters, N. (Eds.) (2013). Handbook of design in educational technology. New York: Routledge. 509pp.

Ellis, R., & Goodyear, P. (2010). Students’ experiences of e-learning in higher education: the ecology of sustainable innovation. Abingdon: Routledge. 232pp.

Goodyear, P., & Retalis, S. (Eds.) (2010). Technology-enhanced learning: design patterns and pattern languages. Rotterdam: Sense Publishers. 330pp.

Aligning education, digital and learning space strategies: an ecological approach

These are slides to accompany the presentation I made at ITaLI, University of Queensland, this morning.

Rob Ellis and I have a chapter on this in a new book edited by Ron Barnett and Norman Jackson.

Our book-length treatment was published earlier this year by Routledge.

Tasks, activities and student learning

Talk at ITaLI, University of Queensland, 7th November 2019

The following references are cited in the slides/talk. The slides themselves are here: Goodyear UQ 2019-Nov-07 condensed.

Bearman, M., & Ajjawi, R. (2019). Can a rubric do more than be transparent? Invitation as a new metaphor for assessment criteria. Studies in Higher Education, 1-10.

Beckman, K., Apps, T., Bennett, S., Dalgarno, B., Kennedy, G., & Lockyer, L. (2019). Self-regulation in open-ended online assignment tasks: the importance of initial task interpretation and goal setting. Studies in Higher Education, 1-15.

Biggs, J., & Tang, C. (2007). Teaching for quality learning at university: what the student does (3rd ed.). Buckingham: Open University Press.

Carvalho, L., & Goodyear, P. (Eds.). (2014). The architecture of productive learning networks. New York: Routledge.

Ellis, R., & Goodyear, P. (2010). Students’ experiences of e-learning in higher education: the ecology of sustainable innovation. New York: RoutledgeFalmer.

Forbes, D., & Gedera, D. (2019). From confounded to common ground: Misunderstandings between tertiary teachers and students in online discussions. Australasian Journal of Educational Technology 35(4). doi:10.14742/ajet.3595

Goodyear, P. (2015). Teaching as design. HERDSA Review of Higher Education, 2, 27-50. Retrieved from http://www.herdsa.org.au/system/files/HERDSARHE2015v02p27.pdf

Hadwin, A., & Winne, P. (2012). Promoting learning skills in undergraduate students. In J. R. Kirby & M. J. Lawson (Eds.), Enhancing the quality of learning (pp. 201-227). New York: Cambridge University Press.

Krippendorff, K. (2006). The semantic turn: a new foundation for design. Boca Raton FL: CRC Press.

Laurillard, D., Kennedy, E., Charlton, P., Wild, J., & Dimakopoulos, D. (2018). Using technology to develop teachers as designers of TEL: Evaluating the learning designer. British Journal of Educational Technology, 49(6), 1044-1058. doi:10.1111/bjet.12697

Shuell, T. (1986). Cognitive conceptions of learning. Review of Educational Research, 56(4), 411-436.

Suchman, L. (1987). Plans and situated actions: the problem of human-machine communication. Cambridge: Cambridge University Press.

Sun, S. Y. H., & Goodyear, P. (2019). Social co-configuration in online language learning. Australasian Journal of Educational Technology, 36(2), 13-26. doi:10.14742/ajet.5102

Wisner, A. (1995a). Understanding problem building: ergonomic work analysis. Ergonomics, 38(3), 595-605.

Wisner, A. (1995b). Situated cognition and action: implications for ergonomic work analysis and anthropotechnology. Ergonomics, 38(8), 1542-1557.

The Sydney Business School ACAD video (3 mins) is here: https://player.vimeo.com/video/302378219

Impact and engagement

A few notes to accompany a panel session at Deakin, organised by CRADLE 14th October 2019.

Although my publications are reasonably well-cited and I can say that some of my work is taken up by other academics, my impact on policy and practice is quite marginal. There are claims I could make about specific areas of change in curricula or in how teams approach the design of learning environments. But these claims feel patchy to me: important in a specific program or university, but nothing that would count as credible evidence of impact at scale.

However – and this is an example of shiftily turning a practical problem into an academic one – I am very interested in the pathways from research to policy and practice change. So, for example, I’ve been carrying out research on:

  • how university leaders construe the challenges of integrating educational, IT and physical infrastructure planning,
  • how to make educational design experience and design ideas easier to share and re-use,
  • how teams of academics and educational developers collaboratively design for students’ learning, and
  • what counts as ‘actionable knowledge’ in/for the design of programs of professional education.

I’ve also worked with AARE and other organisations, in Australia and elsewhere, on aspects of research policy: including approaches to the evaluation of research quality and impact and strategies for research capacity-building and engagement with ‘non-academic’ users of research.

At the panel session today, I summarised three ways that researchers in (higher) education tackle the challenges of ‘impact and engagement’. These descriptions are very broad brush, and not meant to offend. I’m calling them ‘thoughts and prayers’, ‘branding innovations’ and ‘research-practice partnerships’.

‘Thoughts and prayers’ is the default. A researcher writes up a study, an educational innovation or whatever, publishes a paper in a higher education journal and hopes that someone will read it, and be inspired to change what they do.

Much noisier and more visible is the work that goes on when a person or team coins a persuasive term and markets it hard. I will try not to be too cynical about this. Education is prone to fads and fashions, and a set of research-based ideas can be taken up quite readily if it is presented as a discrete and coherent whole. I’m sure we can all think of examples where a pithy phrase transforms into something that can be trade-marked, branded and/or sold as a commodity. Epistemic fluency, teaching-as-design, design thinking, evaluative judgement, feedback literacy, visible learning, productive failure, flipped classrooms; even such large and hairy mammoths as PBL.

Only a small proportion of these get a breakthrough into the mass market. However:

  • those that do make it big tend to be used to set the mould (or expectations, or standards) for what educational impact should look like and
  • educational practices and educational systems have shown they are capable of radically reinterpreting research-based interventions and actually realising something very different from what was tested in the original research, and
  • what is easy to pick up as a package is easy to drop as a package.

A recent article in the ‘Fairfax’ papers illustrates this, with Carol Dweck’s work on ‘growth mindset’ as the example. The original research was deep, painstaking and insightful. The educational take-up, around the world, has been widespread and enthusiastic. But implementations are many and varied, and some have moved a long way from anything Dweck would recognise. In other words, ‘implementation fidelity’ is far from guaranteed in educational systems, so the connections between research, practice and outcomes can be very tenuous.

This brings me to the third approach, which I’ll subsume under the heading of ‘research-practice partnerships’. There’s an excellent book on this by Bill Penuel and Daniel Gallagher. Rob Ellis and I summarised some of the ideas, customised for higher education, in the second half of our most recent book. The organising theme here is engagement, with impact on practice as one of the benefits – accompanied by a stronger, reciprocal role for practitioners in shaping research. The RPP idea has shaped some of our work in setting up the Centre for Research on Learning and Innovation at Sydney University, though we have a long way to go yet.

Such partnerships have also influenced a strand in my own research – such that I’ve chosen to research educational practices in which there’s a reasonable chance that research-based knowledge will prove useful. For example, common sense and good evidence suggest that teachers are much better placed to consult research when they are designing for learning (‘upstream’ of a learning activity) than when they are in the middle of a live teaching-learning event. In addition, when the right materials, tools or spaces become available at the right time, they are likely to become part of prevailing practices and have beneficial and sustainable effects. Research-based ideas that take on a material form, such that they can become entangled in – and reshape – existing practices, live a different life from those that sit silently in the literature. Hence, I’ve researched the dynamics of design teams’ working practices and have experimented with rendering research-based insights in readily materializable forms (such as design patterns).

My final point: I’ve had a close involvement in setting up and/or running four research centres in the last 30 years. I’ve been drawn to this mode of working for a number of reasons. But one of them is a realisation that the intensification of pressures on academic researchers means it’s not sensible to try to be outstanding at all aspects/phases of the research lifecycle. For one person to be energetically forming new ideas for projects, securing funding, recruiting and guiding research teams, writing for academic and practitioner audiences, overseeing a suite of dissemination activities, liaising closely with practitioner communities and policy-makers, making cases for internal resources, etc etc – that’s a recipe for burnout and disaster. We can’t all be good at all these things all of the time. Also, some of them really benefit from specialist skills. Hence: if you want to engage in a sustainable way in processes that are likely to improve the impact of your research, you are best advised to work closely with kindred spirits.

See also:

Recent articles on the LSE Impact Blog by John Burgoyne and Toby Green.

The UK REF Impact case studies from Durham on Threshold Concepts and Lancaster on Evaluative research improving policy and practice.

The DETYA report on The Impact of Educational Research – published in 2000, but thorough and full of insights.


New article: Instrumental genesis in the design studio

After a long wait, our paper on “Instrumental Genesis in the Design Studio” has just been published in the International Journal of Computer-Supported Collaborative Learning. For those without a library subscription, there’s free but read-only access here.

Abstract

The theory of Instrumental Genesis (IG) accounts for the mutual evolution of artefacts and their uses, for specific purposes in specific environments. IG has been used in Computer-Supported Collaborative Learning (CSCL) to explain how instruments are generated through the interactions of learners, teachers and artefacts in ‘downstream’ classroom activities. This paper addresses the neglected ‘upstream’ activities of CSCL design, where teachers, educational designers and educational technologists use CSCL design artefacts in specific design-for-learning situations. The paper shows how the IG approach can be used to follow artefacts and ideas back and forth on the CSCL design and implementation pathway. It demonstrates ways of tracing dynamic relations between artefacts and their uses across the whole complex of instrument-mediated activity implicated in learning and design. This has implications for understanding the communicability of design ideas and informing the iterative improvement of designs and designing for CSCL.

Educational design embedded in university teaching practices

Over the last few years, I’ve been claiming that there is a huge amount of educational design knowledge embedded in the working practices of experienced university teachers. This knowledge is very unevenly distributed and we need better ways of sharing it.

With colleagues Lucila Carvalho, Kate Thompson, Pippa Yeoman and others, I’ve tried to promote some ways of working on this problem. Among them is the ‘ACAD’ framework, which is meant to help designers think separately about – and then bring into some kind of harmony – task design, social design and the design (or setting in place) of material and digital tools and resources. In other words, design needs to attend to (a) what students are being asked to do, (b) how they should work together to do it, (c) what tools etc they’ll need (with some careful thought about what can be digital, what should be in material form and so on).

All of this design thinking needs to be understood as non-deterministic: design works indirectly – what students actually do at ‘learn time’ is what shapes the actual outcomes of the task they tackle. But that dependence on what students actually do doesn’t absolve the teacher-as-designer of the responsibility for thinking things through carefully. Far from it.

I’ve been writing and giving talks about this for 20 years or so. Sometimes people get it. Sometimes I feel they don’t. The ACAD framework and some of the thinking behind it can be found in my other design papers on this site. There’s also a really good new paper in the British Educational Research Journal by Lucila and Pippa.

But just a few minutes ago I read this post by Danica Savonick on the hastac website and it is just fabulous: both as an example of the careful thinking that has gone into the design and (selfishly) as an illustration of what we keep banging on about with ACAD.

Please take 5 and read it. You don’t need to be a literature teacher. You just have to care about students learning.

And, by the by, it’s a lovely illustration of what we talk about in our ACAD shtick.

Two key points:

1) You don’t need ACAD (or any formalised model of ‘how to do design for learning’) to come up with a design like the one Danica Savonick is sharing. I understand her example has emerged from her own practice and quite likely has evolved over a few trials. It’s what designers can do, without knowing they are designing or thinking of themselves as designers (or wearing black clothes). I see lots of academics solving very complex design problems without positioning themselves as designers or drawing on ‘how to design for learning’ texts or methods. NB in saying this, I’m not taking anything away from what Danica Savonick has designed. I don’t know her and for all I know she has some background in ID. (I just don’t think that’s the case though. The example reads like a pure distillation of knowledge accumulated in practice rather than anything inflected with justifications from learning and design theory.) Whatever, it’s a lovely piece of design.

2) Most of the knowledge bound up in the example is what design theorists Harold Nelson and Erik Stolterman call ‘knowledge of the real’ (rather than ‘knowledge of the (universally) true’). Of course, there is also ‘knowledge of the ideal’ – in the sense that Danica Savonick knows why this exercise is worth doing. But the design is replete with particulars – real things to get right – and has little truck with the illusory universal truths of learning theory. (“Group work is better than individual reflection”, “All classes should be flipped”, “Direct instruction beats discovery learning” etc.)

Nelson & Stolterman claim that design is the ‘first tradition’ in human development – before science and creative arts – and that it involves subtle inter-weavings of what is true, what is real and what is ideal. Skilled practice often involves design: we need to get better at recognising it and learning from it. Head for the hastac website now!


ACAD stands for ‘Activity-Centred Analysis and Design’


References/further reading


Nelson, H., & Stolterman, E. (2014). The design way: intentional change in an unpredictable world (2nd ed.). Cambridge MA: MIT Press.

Carvalho, L., & Goodyear, P. (Eds.). (2014). The architecture of productive learning networks. New York: Routledge.

Carvalho, L., Goodyear, P., & de Laat, M. (Eds.). (2017). Place-based spaces for networked learning. New York: Routledge.

Carvalho, L., & Yeoman, P. (2018). Framing learning entanglement in innovative learning spaces: Connecting theory, design and practice. British Educational Research Journal. Advance online publication. doi:10.1002/berj.3483