Working with the universities we have, not the universities we wish we had.

This is partly prompted by Jesse Stommel’s post here warning against pedagogical models.

Jesse has been a voice of reason and humanity in many debates about how we teach and how we relate to students in higher education – not just where technology plays a part, but more generally. He works in North America, and some of what he writes resonates very clearly with experiences in US universities. I have lots of friends and colleagues who work in US universities, and I read a lot of the research produced by US-based education, educational technology and learning sciences scholars. But all my direct experience of studying and working in higher education has been in Northern Ireland and England (1970s to 2003) and Australia (2003 onwards). There are important differences between countries in the ways higher education is organised, funded, described and experienced. There are also significant terminological differences, which can cause confusion, undermine mutual understanding, or simply make our worlds seem strange and even alien. Some of these differences are obscured on Twitter (and in the terser forms of social media more generally), and can generate a bit of irritation. (“That may be how they do things in the US, but it’s not like that here.”)

One of Jesse Stommel’s recurring pieces of wise advice is that we (teachers in higher education) should spend more time talking with our students, finding out about them, learning more about their real needs and desires, and about the lives they are leading before, outside and beyond university. He also talks about “teaching the students we have, not the students we wish we had.” I have no trouble identifying with these sentiments, and I suspect that’s also the case for many of my colleagues, even though we’re not closely involved in the American HE discourse to which these words are a reaction. For example, some of the debates about ‘rigour’, grading, making allowances for students’ circumstances, the power of administrators and the cultural variations between universities take on different forms in the UK and Australia. Though we ‘get’ the main points being made, they can feel exotic, and are expressed in ways that can leave us feeling well outside the worlds described.

All that said, I’m trying to unpick what it is that Jesse is deprecating in his article. I’ve come up with the following possible readings:

1) Beware of ed tech companies, consultants, gurus and other snake-oil salesmen bearing gifts. They may be offering you a colourful diagram but their true motives could cause damage to you, your wallet and the people and things you care about.

2) Beware of any attempts to simplify and over-systematise what you do. One size does not fit all. Watch out for administrators and other powers-that-(would)-be who may weaponise rubrics, models, quality frameworks and other paraphernalia in ways that stop you doing good work. Moreover, don’t do this to yourself: inflexible methods can become instruments of self-harm.

3) Beware of specific models. They may be wrong, outdated, or prone to being misunderstood or misapplied. Among those listed in Jesse’s article are: learning styles, Bloom’s taxonomy (original and revised), ADDIE, scaffolding, design thinking, Quality Matters, andragogy and HyFlex. 

4) Beware of all and any models. Model-based action/thinking takes you in the wrong direction. Start by talking to your students.

5) There is also a cynical reading: that undermining the legitimacy of models undermines the credibility of other influencers and strengthens the position of those whose brand depends upon being understood as deeply and inherently good and wise. 

Taking these one at a time.

1) I am in full agreement. Caveat emptor. Especially when someone else is buying with your money. 

2) Absolutely. But pause for a moment to consider how we decide when something is as simple as possible, but not simpler.

3) For sure. But pause for another moment to consider that (a) the things referred to in the list are of quite diverse kinds – are they all actually models? And (b) we evaluate models of the world and models to guide action by different criteria. How, for example, should we think about scaffolding? Should we be testing the validity of the science, from Bernstein, Luria and Vygotsky through Wood, Bruner & Ross and on into the hundreds of studies across psychology, educational technology and the learning sciences? Or should we think about scaffolding (and fading) as designable elements in a learning environment that we have a responsibility to help create? How should we think about design thinking? As the infantilising pastime popularised by fans of IDEO? (Actually, that’s defamatory to infants, who could teach most of the Silicon Valley celebrities a thing or two about ideation and empathy.) Or as a set of resources for people to work together and construct more just and sustainable ways of living? (I’m thinking of Ezio Manzini and Hilary Cottam here, but pick your own.)

4) Cards on the table. One big part of my work has been to create frameworks for analysing and designing complex learning environments. Working with some very creative and industrious colleagues, I’ve helped construct some ways to help other people think about learning and design. We’ve done some of this by designing, some by analysing existing designs, and some by researching how design is done and what design tools and methods help teachers (and others) in higher education to design better. Skin in the game. If you’re interested, you can read more about the approaches we’ve taken by following up other links from this site – some papers here; a good 3-minute video here.

But the main point I want to make is this. Yes, agreed, talk to students – early and often. But don’t kid yourself that this leads in any simple way to a plan of action. One of the subtle, unstated differences I pick up when I read some US-based commentators on good practice in higher education, and compare what they say with the lived experiences of university teachers here in Sydney, is that ‘talking to students’ is a more straightforward proposition when you have a class of 30 or 50 than it is with Biology 101 or Psych 101 here – with one or two thousand students in the class.

But let’s not get hung up on scale. Bring in the constraints of a curriculum that can’t be changed till the year after next, a squad of casual tutors who don’t get paid to attend course planning meetings and who may not know they have a job till the week the course starts, a set of teaching spaces last renovated in the 1950s, a digital and regulatory infrastructure that changes every couple of years, QA regimes that don’t measure what really matters, time-poor students who need to satisfice course requirements in order to juggle work and carer responsibilities, worsening job prospects, a risk-averse business sector, an anti-intellectual government and a tycoon-owned media whose business model depends on fanning new fronts in the culture wars. And Covid. And climate change. And colonialism.

About 20 years ago, I made a conscious choice to work with and for the teachers we actually have, in the universities we actually have. I hope that part of my work helps them, and their students, tool up for collective action to create worlds worth living in:  ‘new normals’ worth fighting for. But my work also needs to provide resources – including ‘tools to think with’ – that can be used next Monday, or perhaps the Monday after.  And to help with the complex challenges of distinguishing between what can be changed this week, this year, next year, and maybe never. Finding the edges of what can be changed, and how, is not always simple. Nor is it always easy to discern what should be changed, and what consequences may flow.

So, I guess I’m left wanting to say that I believe in the value of tools and methods that can help groups of people understand complex situations, and come to an agreement on how to move onwards. Talk is good. Raw observation and experience are good. But I’m not sure they are sufficient unto the day – especially Monday.    

Oh yes. I almost forgot.

5) I am good and wise enough to deprecate cynicism.

Lost in translation

It’s entirely my fault. I chose to subscribe to a daily email from Times Higher Education. No-one forced me. I could cancel at any time. It’s also my fault that I get snarky when they do one of these.

The offending item is from today’s THE email bulletin. The story to which it refers is behind a paywall, but my university subscription gives me access. If you don’t have access to the THE, and your scepticism setting has been dialled back, you may find your brain registering the belief that

“teaching students online produces the same academic performance as face-to-face teaching”.

After all, that’s what you just read in the THE bulletin. If you are an inexperienced university manager, over-stretched educational policy maker, researcher for a free market think tank or EdTech entrepreneur, such a belief could prove unhelpful to the rest of us.

Turning to the story itself, in the THE, we are told:

“The results, which come at a time when universities across the world have had to shut their campuses and move to online instruction as a result of the coronavirus pandemic, found that students who were taught wholly online scored the highest on their average scores in assessments taken throughout the course. Those taught fully online scored, on average, 7.2 percentage points higher than the other two forms of learning.”

But if we look at the journal article on which this is based, we find that the researchers are quite clear about why the ‘fully online’ students did better. Unlike the students with whom they were being compared, the fully online students had three goes at each of the weekly assessment tasks. The students taught in ‘in-person’ or ‘blended’ modes did not have this advantage. The researchers describe this as “an artefact of the more lenient assessment submission policy for online students” (Chirikov et al., 2020, p. 2). In the journal article, this caveat comes immediately after the finding quoted above in the THE piece. Yet it didn’t make it into the THE story or the THE’s email bulletin.
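To get a feel for how much work that caveat does, here is a quick illustrative simulation – my own toy numbers, not data from the study, and assuming for the sake of argument that the best of the three attempts is the one that counts:

```python
import random

random.seed(1)  # reproducible toy example

def attempt_score():
    """One attempt at a weekly task: a hypothetical score (mean 65, sd 12)."""
    return random.gauss(65, 12)

def best_of(n_attempts):
    """Best score across n attempts, clipped to the 0-100 range."""
    best = max(attempt_score() for _ in range(n_attempts))
    return max(0.0, min(100.0, best))

N = 100_000  # simulated students per submission policy
one_shot = sum(best_of(1) for _ in range(N)) / N
three_tries = sum(best_of(3) for _ in range(N)) / N

print(f"mean score, one attempt:    {one_shot:.1f}")
print(f"mean score, best of three:  {three_tries:.1f}")
print(f"gap from the policy alone:  {three_tries - one_shot:.1f} points")
```

On these toy assumptions, the submission policy by itself opens up a gap of roughly ten points – in the same ballpark as the reported 7.2 percentage points – without anyone having learned anything extra.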

Casual readers of the email bulletin might also assume that the proposition about equivalent performance comes from a representative sample of comparisons between online and face-to-face teaching. They might be surprised to learn that the research was conducted in three Russian universities, in two courses: Engineering Mechanics and Construction Materials Technology. The online course instructors came from one of Russia’s “top engineering schools”, while the instructors involved in the ‘in person’ and ‘blended’ modes worked in one of the students’ own universities. These are described in the journal article as ‘resource-constrained institutions’, and the researchers characterise the instructors in the ‘resource-constrained’ universities as having weaker educational backgrounds, fewer research publications and less teaching experience than the instructors from the ‘top engineering school’ (Chirikov et al., 2020, p. 2).

The THE article does not refer to any prior studies. The extensive review and meta-analysis work by Barbara Means and colleagues may as well not exist.

My point: I’m not criticising the researchers (Igor Chirikov and colleagues). They put a lot of care into this study and they are serious about improving access to STEM education opportunities. I just wonder how one of our ‘top trade papers’ (THE) can provide such bad service to its industry, and what we can do to help improve the situation.

References

Chirikov, I., Semenova, T., Maloshonok, N., Bettinger, E., & Kizilcec, R. F. (2020). Online education platforms scale college STEM instruction with equivalent learning outcomes at lower cost. Science Advances, 6(15), eaay5324. doi:10.1126/sciadv.aay5324

Means, B., Bakia, M., & Murphy, R. (2014). Learning online: what research tells us about whether, when and how. New York: Routledge.

Means, B., Toyama, Y., Murphy, R. F., & Baki, M. (2013). The effectiveness of online and blended learning: a meta-analysis of the empirical literature. Teachers College Record, 115, 1-47. 

Aligning education, digital and learning space strategies: an ecological approach

These are slides to accompany the presentation I made at ITaLI, University of Queensland, this morning.

Rob Ellis and I have a chapter on this in a new book edited by Ron Barnett and Norman Jackson.

Our book length treatment was published earlier this year by Routledge.

Tasks, activities and student learning

Talk at ITaLI, University of Queensland, 7th November 2019

The following references are cited in the slides/talk. Slides themselves are here: Goodyear UQ 2019-Nov-07 condensed.

Bearman, M., & Ajjawi, R. (2019). Can a rubric do more than be transparent? Invitation as a new metaphor for assessment criteria. Studies in Higher Education, 1-10.

Beckman, K., Apps, T., Bennett, S., Dalgarno, B., Kennedy, G., & Lockyer, L. (2019). Self-regulation in open-ended online assignment tasks: the importance of initial task interpretation and goal setting. Studies in Higher Education, 1-15.

Biggs, J., & Tang, C. (2007). Teaching for quality learning at university: what the student does (3rd ed.). Buckingham: Open University Press.

Carvalho, L., & Goodyear, P. (Eds.). (2014). The architecture of productive learning networks. New York: Routledge.

Ellis, R., & Goodyear, P. (2010). Students’ experiences of e-learning in higher education: the ecology of sustainable innovation. New York: RoutledgeFalmer.

Forbes, D., & Gedera, D. (2019). From confounded to common ground: Misunderstandings between tertiary teachers and students in online discussions. Australasian Journal of Educational Technology 35(4). doi:10.14742/ajet.3595

Goodyear, P. (2015). Teaching as design. HERDSA Review of Higher Education, 2, 27-50. Retrieved from http://www.herdsa.org.au/system/files/HERDSARHE2015v02p27.pdf

Hadwin, A., & Winne, P. (2012). Promoting learning skills in undergraduate students. In J. R. Kirby & M. J. Lawson (Eds.), Enhancing the quality of learning (pp. 201-227). New York: Cambridge University Press.

Krippendorff, K. (2006). The semantic turn: a new foundation for design. Boca Raton FL: CRC Press.

Laurillard, D., Kennedy, E., Charlton, P., Wild, J., & Dimakopoulos, D. (2018). Using technology to develop teachers as designers of TEL: Evaluating the learning designer. British Journal of Educational Technology, 49(6), 1044-1058. doi:10.1111/bjet.12697

Shuell, T. (1986). Cognitive conceptions of learning. Review of Educational Research, 56(4), 411-436.

Suchman, L. (1987). Plans and situated actions: the problem of human-machine communication. Cambridge: Cambridge University Press.

Sun, S. Y. H., & Goodyear, P. (2019). Social co-configuration in online language learning. Australasian Journal of Educational Technology, 36(2), 13-26. doi:10.14742/ajet.5102

Wisner, A. (1995a). Understanding problem building: ergonomic work analysis. Ergonomics, 38(3), 595-605.

Wisner, A. (1995b). Situated cognition and action: implications for ergonomic work analysis and anthropotechnology. Ergonomics, 38(8), 1542-1557.

The Sydney Business School ACAD video (3 mins) is here: https://player.vimeo.com/video/302378219

Impact and engagement

A few notes to accompany a panel session at Deakin, organised by CRADLE, 14th October 2019.

Although my publications are reasonably well-cited and I can say that some of my work is taken up by other academics, my impact on policy and practice is quite marginal. There are claims I could make about specific areas of change in curricula or in how teams approach the design of learning environments. But these claims feel patchy to me: important in a specific program or university, but nothing that would count as credible evidence of impact at scale.

However – and this is an example of shiftily turning a practical problem into an academic one – I am very interested in the pathways from research to policy and practice change. So, for example, I’ve been carrying out research on:

  • how university leaders construe the challenges of integrating educational, IT and physical infrastructure planning,
  • how to make educational design experience and design ideas easier to share and re-use,
  • how teams of academics and educational developers collaboratively design for students’ learning, and
  • what counts as ‘actionable knowledge’ in/for the design of programs of professional education.

I’ve also worked with AARE and other organisations, in Australia and elsewhere, on aspects of research policy: including approaches to the evaluation of research quality and impact and strategies for research capacity-building and engagement with ‘non-academic’ users of research.

At the panel session today, I summarised three ways that researchers in (higher) education tackle the challenges of ‘impact and engagement’. These descriptions are very broad brush, and not meant to offend. I’m calling them ‘thoughts and prayers’, ‘branding innovations’ and ‘research-practice partnerships’.

‘Thoughts and prayers’ is the default. A researcher writes up a study, an educational innovation or whatever, publishes a paper in a higher education journal and hopes that someone will read it, and be inspired to change what they do.

Much noisier and more visible is the work that goes on when a person or team coins a persuasive term and markets it hard. I will try not to be too cynical about this. Education is prone to fads and fashions, and a set of research-based ideas can be taken up quite readily if they are presented as a discrete and coherent whole. I’m sure we can all think of examples where a pithy phrase transforms into something that can be trade-marked, branded and/or sold as a commodity. Epistemic fluency, teaching-as-design, design thinking, evaluative judgement, feedback literacy, visible learning, productive failure, flipped classrooms; even such large and hairy mammoths as PBL.

Only a small proportion of these get a breakthrough into the mass market. However:

  • those that do make it big tend to be used to set the mould (or expectations, or standards) for what educational impact should look like and
  • educational practices and educational systems have shown they are capable of radically reinterpreting research-based interventions and actually realising something very different from what was tested in the original research, and
  • what is easy to pick up as a package is easy to drop as a package.

A recent article in the ‘Fairfax’ papers illustrates this, with Carol Dweck’s work on ‘growth mindset’ as the example. The original research was deep, painstaking and insightful. The educational take-up, around the world, has been widespread and enthusiastic. But implementations are many and varied, and some have moved a long way from anything Dweck would recognise. In other words, ‘implementation fidelity’ is far from guaranteed in educational systems, so the connections between research, practice and outcomes can be very tenuous.

This brings me to the third approach, which I’ll subsume under the heading of ‘research-practice partnerships’. There’s an excellent book on this by Bill Penuel and Daniel Gallagher. Rob Ellis and I summarised some of the ideas, customised for higher education, in the second half of our most recent book. The organising theme here is engagement, with impact on practice as one of the benefits – accompanied by a stronger, reciprocal role for practitioners to shape research. The RPP idea has shaped some of our work in setting up the Centre for Research on Learning and Innovation at Sydney University, though we have a long way to go yet.

Such partnerships have also influenced a strand in my own research – such that I’ve chosen to research educational practices in which there’s a reasonable chance that research-based knowledge will prove useful. For example, common sense and good evidence suggest that teachers are much better placed to consult research when they are designing for learning (‘upstream’ of a learning activity) than when they are in the middle of a live teaching-learning event. In addition, when the right materials, tools or spaces become available at the right time, they are likely to become part of prevailing practices and have beneficial and sustainable effects. Research-based ideas that take on a material form, such that they can become entangled in – and reshape – existing practices, live a different life from those that sit silently in the literature. Hence, I’ve researched the dynamics of design teams’ working practices and have experimented with rendering research-based insights in readily materialisable forms (such as design patterns).

My final point: I’ve had a close involvement in setting up and/or running four research centres in the last 30 years. I’ve been drawn to this mode of working for a number of reasons. But one of them is a realisation that the intensification of pressures on academic researchers means it’s not sensible to try to be outstanding at all aspects/phases of the research lifecycle. For one person to be energetically forming new ideas for projects, securing funding, recruiting and guiding research teams, writing for academic and practitioner audiences, overseeing a suite of dissemination activities, liaising closely with practitioner communities and policy-makers, making cases for internal resources, etc etc – that’s a recipe for burnout and disaster. We can’t all be good at all these things all of the time. Also, some of them really benefit from specialist skills. Hence: if you want to engage in a sustainable way in processes that are likely to improve the impact of your research, you are best advised to work closely with kindred spirits.

See also:

Recent articles on the LSE Impact Blog by John Burgoyne and Toby Green.

The UK REF Impact case studies from Durham on Threshold Concepts and Lancaster on Evaluative research improving policy and practice.

The DETYA report on The Impact of Educational Research – published in 2000, but thorough and full of insights.

 

New article: Instrumental genesis in the design studio

After a long wait, our paper on “Instrumental Genesis in the Design Studio” has just been published in the International Journal of Computer-Supported Collaborative Learning. For those without a library subscription, there’s free but read-only access here.

Abstract

The theory of Instrumental Genesis (IG) accounts for the mutual evolution of artefacts and their uses, for specific purposes in specific environments. IG has been used in Computer-Supported Collaborative Learning (CSCL) to explain how instruments are generated through the interactions of learners, teachers and artefacts in ‘downstream’ classroom activities. This paper addresses the neglected ‘upstream’ activities of CSCL design, where teachers, educational designers and educational technologists use CSCL design artefacts in specific design-for-learning situations. The paper shows how the IG approach can be used to follow artefacts and ideas back and forth on the CSCL design and implementation pathway. It demonstrates ways of tracing dynamic relations between artefacts and their uses across the whole complex of instrument-mediated activity implicated in learning and design. This has implications for understanding the communicability of design ideas and informing the iterative improvement of designs and designing for CSCL.

Educational design embedded in university teaching practices

Over the last few years, I’ve been claiming that there is a huge amount of educational design knowledge embedded in the working practices of experienced university teachers. This knowledge is very unevenly distributed and we need better ways of sharing it.

With colleagues Lucila Carvalho, Kate Thompson, Pippa Yeoman and others, I’ve tried to promote some ways of working on this problem. Among them is the ‘ACAD’ framework, which is meant to help designers think separately about – and then bring into some kind of harmony – task design, social design and the design (or setting in place) of material and digital tools and resources. In other words, design needs to attend to (a) what students are being asked to do, (b) how they should work together to do it, (c) what tools etc they’ll need (with some careful thought about what can be digital, what should be in material form and so on).

All of this design thinking needs to be understood as non-deterministic: design works indirectly – what students actually do at ‘learn time’ is what shapes the outcomes of the task they tackle. But that dependence on what students actually do doesn’t absolve the teacher-as-designer of the responsibility for thinking things through carefully. Far from it.
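For the concretely minded, here is a minimal sketch of what that three-way separation can look like when a design is written down. This is my own illustrative code, not an official ACAD tool, and the field names are just my shorthand:

```python
from dataclasses import dataclass, field

@dataclass
class LearningDesign:
    """Toy record of a design, split along the three ACAD dimensions."""
    task: str                 # what students are being asked to do
    social: str               # how they are expected to work together
    set_design: list = field(default_factory=list)  # material & digital tools, spaces, resources

    def review(self) -> str:
        """Bring the three dimensions back into view, side by side."""
        tools = ", ".join(self.set_design) or "(nothing specified yet)"
        return (f"Task:   {self.task}\n"
                f"Social: {self.social}\n"
                f"Set:    {tools}")

# A hypothetical first-year tutorial, for illustration only
design = LearningDesign(
    task="Critique two sample lab reports against the marking rubric",
    social="Work in pairs, then compare judgements as a whole class",
    set_design=["two anonymised reports", "rubric handout", "shared annotation doc"],
)
print(design.review())
```

Nothing in the code is clever; the point is only that keeping the three dimensions distinct makes it easier to notice when one of them has been designed by accident rather than on purpose.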

I’ve been writing and giving talks about this for 20 years or so. Sometimes people get it. Sometimes I feel they don’t. The ACAD framework and some of the thinking behind it can be found in my other design papers on this site. There’s also a really good new paper in the British Educational Research Journal by Lucila and Pippa.

But just a few minutes ago I read this post by Danica Savonick on the hastac website, and it is just fabulous: both as an example of the careful thinking that has gone into the design and (selfishly) as an illustration of what we keep banging on about with ACAD.

Please take 5 and read it. You don’t need to be a literature teacher. You just have to care about students learning.

And, by the by, it’s a lovely illustration of what we talk about in our ACAD shtick.

Two key points:

1) You don’t need ACAD (or any formalised model of ‘how to do design for learning’) to come up with a design like the one Danica Savonick is sharing. I understand her example has emerged from her own practice, and quite likely has evolved over a few trials. It’s what designers can do without knowing they are designing, or thinking of themselves as designers (or wearing black clothes). I see lots of academics solving very complex design problems without positioning themselves as designers or drawing on ‘how to design for learning’ texts or methods. NB: in saying this, I’m not taking anything away from what Danica Savonick has designed. I don’t know her, and for all I know she has some background in ID. (I just don’t think that’s the case, though. The example reads like a pure distillation of knowledge accumulated in practice, rather than anything inflected with justifications from learning and design theory.) Whatever, it’s a lovely piece of design.

2) Most of the knowledge bound up in the example is what design theorists Harold Nelson and Erik Stolterman call ‘knowledge of the real’ (rather than ‘knowledge of the (universally) true’). Of course, there is also ‘knowledge of the ideal’ – in the sense that Danica Savonick knows why this exercise is worth doing. But the design is replete with particulars – real things to get right – and has little truck with the illusory universal truths of learning theory. (“Group work is better than individual reflection”, “All classes should be flipped”, “Direct instruction beats discovery learning” etc.)

Nelson & Stolterman claim that design is the ‘first tradition’ in human development – before science and creative arts – and that it involves subtle inter-weavings of what is true, what is real and what is ideal. Skilled practice often involves design: we need to get better at recognising it and learning from it. Head for the hastac website now!

 

ACAD stands for ‘Activity-Centred Analysis and Design’

 

References/further reading

 

Nelson, H., & Stolterman, E. (2014). The design way: intentional change in an unpredictable world (2nd ed.). Cambridge MA: MIT Press.

Carvalho, L., & Goodyear, P. (Eds.). (2014). The architecture of productive learning networks. New York: Routledge.

Carvalho, L., Goodyear, P., & de Laat, M. (Eds.). (2017). Place-based spaces for networked learning. New York: Routledge.

Carvalho, L., & Yeoman, P. (2018). Framing learning entanglement in innovative learning spaces: Connecting theory, design and practice. British Educational Research Journal, 0(0). doi:10.1002/berj.3483