Ten years after

It’s ten years since the Labor government decided to kill the Australian Learning and Teaching Council (ALTC). Floods in Brisbane led the Gillard government to look for cuts in higher education to fund the repair of damaged infrastructure.

A short, sharp campaign by many of us involved in Australian higher education won a temporary reprieve for its funding. Fatally, the government decided to shift key functions of ALTC into the federal bureaucracy, which made it easier to defund in successive budgets.

What’s more, a government department (the Office for Learning and Teaching, or OLT) could never match ALTC’s ability to energise a community of activists willing to commit their own time to the collective improvement of teaching and learning. Every dollar of ALTC money generated many dollars’ worth of extra voluntary effort.

I’ve lodged a few resources from the time of the 2011 campaign here.

  • A selection of quotes from supporters.
  • Text I drafted arguing the case for the ALTC, which became the basis for a letter that Sally Kift and I wrote, published in The Australian.
  • Statements from the Joint Councils of Deans, the professional engineering and engineering education communities, the National Tertiary Education Union and the National Union of Students.

The Australian Association for Research in Education (AARE) was one of the first member organisations in the country to start mobilising support for ALTC. Its letter to the Prime Minister included the following paragraphs:

  • ALTC has played an extraordinarily beneficial role in Australian Higher Education in its very short history. It is hard to think of an organisation which has attracted such widespread support in the university sector – at both grassroots and senior management levels.
  • ALTC has emerged as a champion for collective action on the improvement of learning and teaching in higher education. It is contributing to the development of teaching methods and learning environments that support a wider range of student learning needs. In so doing, it helps universities become more inclusive institutions.

The Australian higher education press gave the campaign to save ALTC reasonable coverage, though to my eyes it looks rather lukewarm and disengaged.

John Ross, Campus Review, 31 Jan 2011: HE community fights ALTC closure

Joanna Mather, Australian Financial Review, 14 Feb 2011: Deans urge Gillard to save key teaching group

Since the closure of ALTC and the withering away of its successor (the OLT), people who care about the improvement of learning and teaching in HE have, from time to time, canvassed proposals for some new organisation or initiative in this important but neglected space. It turns out to be much harder to get a consensus around the design of something new than around the saving of something that already exists and is valued. The ALTC began life as the Carrick Institute, funded by John Howard’s conservative government. Today’s conservatives display a hostility towards universities that makes it difficult to envisage them investing in this area.

Labor’s decision ten years ago was a grievous mistake, but only a Labor government is likely to reinvest in learning and teaching in higher education in the foreseeable future. ALTC worked. We need it, or something very like it, to feature in Labor’s manifesto commitments.

Impact and engagement

A few notes to accompany a panel session at Deakin, organised by CRADLE, 14 October 2019.

Although my publications are reasonably well-cited and I can say that some of my work is taken up by other academics, my impact on policy and practice is quite marginal. There are claims I could make about specific areas of change in curricula or in how teams approach the design of learning environments. But these claims feel patchy to me: important in a specific program or university, but nothing that would count as credible evidence of impact at scale.

However – and this is an example of shiftily switching a practical problem into an academic one – I am very interested in the pathways from research to policy and practice change. So, for example, I’ve been carrying out research on:

  • how university leaders construe the challenges of integrating educational, IT and physical infrastructure planning,
  • how to make educational design experience and design ideas easier to share and re-use,
  • how teams of academics and educational developers collaboratively design for students’ learning, and
  • what counts as ‘actionable knowledge’ in/for the design of programs of professional education.

I’ve also worked with AARE and other organisations, in Australia and elsewhere, on aspects of research policy, including approaches to the evaluation of research quality and impact, and strategies for research capacity-building and engagement with ‘non-academic’ users of research.

At the panel session today, I summarised three ways that researchers in (higher) education tackle the challenges of ‘impact and engagement’. These descriptions are broad-brush and not meant to offend. I’m calling them ‘thoughts and prayers’, ‘branding innovations’ and ‘research-practice partnerships’.

‘Thoughts and prayers’ is the default. A researcher writes up a study, an educational innovation or whatever, publishes a paper in a higher education journal and hopes that someone will read it and be inspired to change what they do.

Much noisier and more visible is the work that goes on when a person or team coins a persuasive term and markets it hard. I will try not to be too cynical about this. Education is prone to fads and fashions, and a set of research-based ideas can be taken up quite readily if they are presented as a discrete and coherent whole. I’m sure we can all think of examples where a pithy phrase transforms into something that can be trademarked, branded and/or sold as a commodity. Epistemic fluency, teaching-as-design, design thinking, evaluative judgement, feedback literacy, visible learning, productive failure, flipped classrooms; even such large and hairy mammoths as PBL.

Only a small proportion of these break through into the mass market. However:

  • those that do make it big tend to set the mould (or expectations, or standards) for what educational impact should look like, and
  • educational practices and educational systems have shown they are capable of radically reinterpreting research-based interventions and actually realising something very different from what was tested in the original research, and
  • what is easy to pick up as a package is easy to drop as a package.

A recent article in the ‘Fairfax’ papers illustrates this, with Carol Dweck’s work on ‘growth mindset’ as the example. The original research was deep, painstaking and insightful. The educational take-up, around the world, has been widespread and enthusiastic. But implementations are many and varied, and some have moved a long way from anything Dweck would recognise. In other words, ‘implementation fidelity’ is far from guaranteed in educational systems, so the connections between research, practice and outcomes can be very tenuous.

This brings me to the third approach, which I’ll subsume under the heading of ‘research-practice partnerships’. There’s an excellent book on this by Bill Penuel and Daniel Gallagher. Rob Ellis and I summarised some of the ideas, customised for higher education, in the second half of our most recent book. The organising theme here is engagement, with impact on practice as one of the benefits – accompanied by a stronger, reciprocal role for practitioners to shape research. The RPP idea has shaped some of our work in setting up the Centre for Research on Learning and Innovation at the University of Sydney, though we have a long way to go yet.

Such partnerships have also influenced a strand in my own research – such that I’ve chosen to research educational practices in which there’s a reasonable chance that research-based knowledge will prove useful. For example, common sense and good evidence suggest that teachers are much better placed to consult research when they are designing for learning (‘upstream’ of a learning activity) than when they are in the middle of a live teaching-learning event. In addition, when the right materials, tools or spaces become available at the right time, they are likely to become part of prevailing practices and have beneficial and sustainable effects. Research-based ideas that take on a material form, such that they can become entangled in – and reshape – existing practices, live a different life from those that sit silently in the literature. Hence, I’ve researched the dynamics of design teams’ working practices and have experimented with rendering research-based insights in readily materialisable forms (such as design patterns).

My final point: I’ve had a close involvement in setting up and/or running four research centres in the last 30 years. I’ve been drawn to this mode of working for a number of reasons. But one of them is a realisation that the intensification of pressures on academic researchers means it’s not sensible to try to be outstanding at all aspects/phases of the research lifecycle. For one person to be energetically forming new ideas for projects, securing funding, recruiting and guiding research teams, writing for academic and practitioner audiences, overseeing a suite of dissemination activities, liaising closely with practitioner communities and policy-makers, making cases for internal resources, and so on – that’s a recipe for burnout and disaster. We can’t all be good at all these things all of the time. Also, some of them really benefit from specialist skills. Hence: if you want to engage in a sustainable way in processes that are likely to improve the impact of your research, you are best advised to work closely with kindred spirits.

See also:

Recent articles on the LSE Impact Blog by John Burgoyne and Toby Green.

The UK REF Impact case studies from Durham on Threshold Concepts and Lancaster on Evaluative research improving policy and practice.

The DETYA report on The Impact of Educational Research – published in 2000, but thorough and full of insights.

Australian educational research – getting your facts right

On 19 March 2013, Alan Tudge MP made a speech criticising the quality of educational research in Australia. The text of the speech is here. It includes the following statements.

“We are spending billions on education research, but it is not having the impact it should. Worse, our education faculties are failing to be engines for ideas at a time when school outcomes have dropped despite a huge increase in public funds for school education.”

“… it is a waste of public money if education research is not of a high standard and is not having impact. Over the last decade $1.7 billion has been spent on education research. It is a sector that has been growing steadily each and every year and now employs almost 3,000 people. If the ARC’s report is indicative of the decade, then we can say that nearly a billion dollars has been spent on below standard work. What would an extra billion dollars have achieved in, say, biotechnology, a research field that is universally at or above world standard? 

… education faculties are not having an impact at a time when high quality, evidence-based research is desperately needed.”

Bernard Lane, then working at The Australian, approached the Australian Council of Deans of Education (ACDE) for comment. I prepared the following notes for ACDE and the Australian Association for Research in Education (AARE).

Since the notes were written, the Australian Bureau of Statistics (ABS) has reversed some of the decisions that made educational research’s footprint unnecessarily hard to assess in the ERA exercises.

The number of free-standing education faculties has continued to diminish.

Notes from 26 March 2013.

Point 1: ERA’s definition of Education isn’t what Alan Tudge means by Education

Alan Tudge appears not to understand that the ERA definition of educational research doesn’t align with how universities are organised internally. We (AARE & ACDE) conducted a survey last year to learn more about what’s behind the ERA figures. 

It turns out that around 40% of the educational research assessed in ERA2010 and ERA2012 was produced by people who do NOT work in departments/faculties of education. 

Much of this research is being conducted by other university academics as part of improving their own approaches to teaching – these people are found in all departments/faculties. Their research is not about schools – it’s about higher education, which is generally judged to be an Australian success story. 

Also, not all the research done in education departments/faculties is labelled Education in ERA. The classification system used in ERA (designed by the ABS) excludes the following from Education:

  • educational psychology (including how people learn),
  • educational policy (including how to design and manage better education systems), and
  • sociology of education (e.g. understanding how social disadvantage affects educational outcomes).

The ABS defined Education in such a way that these key areas are NOT included in the statistics or the ERA results for Education. Some of Australia’s most prolific educational researchers (esp. in psychology of education) find that their work is not classified as Education by the ABS and ERA. 

Point 2: Mr Tudge hasn’t caught up with the fact that there are very few free-standing departments & faculties of Education any more.

Most are mixed into larger social science and/or professional education schools/faculties. His ‘Aunt Sally’ (or the straw man he’s trying to attack) isn’t there any more.

This makes it harder than it used to be to identify someone as being ‘from Education’ – so when Mr Tudge says that ‘education academics are missing’ (from debates in the print media), one wonders how he knows who is from Education and who is not (or even whether that’s a sensible question any more).

Point 3: The international footprint of Australian educational research has put it among the top 3-4 fields of Australian research during the last decade – the period during which Mr Tudge reckons we were wasting money. ARC Annual reports, on several occasions during the last decade, have used the success of education research as an indicator of the international visibility of all the research ARC funds.

Point 4: Education research (on the ERA definition) is underfunded compared to other fields/disciplines. 

ERA2010 data showed the following annual research funding per full-time-equivalent researcher:

  • Education: around $17k
  • Studies in Human Society (sociology etc.): $36k
  • Economics: $44k
  • Biology: $90k
  • Medicine/Health: $152k

Point 5: Education (FoR13) improved between ERA2010 and ERA2012

According to the ARC’s ERA data, 12 universities improved their rating and 3 went down. There’s no room for complacency, but these are encouraging signs of improvement.

Peter Goodyear, Professor of Education, University of Sydney. (Led a joint AARE/ACDE initiative in 2011/12 on building Australian educational research capacity.)