
Rod Beecham on The reductio ad absurdum of Commonwealth education policy

The Commonwealth Education Minister has been reported as saying that she wants tertiary students ‘to make decisions about where they want to study on the basis of robust information about the quality of education provided at each institution rather than on hearsay, inference from entry requirements or prestige’. I think that’s a splendid idea. I am at a loss, therefore, to understand how she imagines she is furthering the cause by introducing a My University website.

The indicators of ‘university quality’ have not, it seems, been finalised but, in relation to teaching quality, mention has been made of completion and attrition rates, the results of satisfaction surveys, and the performance of students in standardised tests. For research quality we have heard noises about journal publications and citations.

The point has been made again and again that teaching quality and research quality cannot be measured directly because they are entirely qualitative activities. Nothing that is essential to them is capable of numerical measurement. The desire to measure them has nevertheless sent people casting about for something they can measure that will substitute for the thing itself. This is why completion rates, Likert-scale student satisfaction questionnaires, standardised testing and volume of publications feature so prominently. In social research such indicators are called ‘proxy variables’.

As every social researcher knows, proxy variables must be handled with great care. If, for example, we decided to measure the intelligence of a group of people by using the number of tertiary qualifications held within the group, we would be immediately and rightly vulnerable to attack on a variety of grounds. What is the relationship between tertiary qualifications and human intelligence? Is intellectual endeavour the only arena in which intelligence can manifest itself? What do we mean by ‘tertiary qualifications’? Is a Bachelor of Laws equivalent to a Bachelor of Medicine? Why? Why not? And so on.

The attraction of proxy variables is that they purport to represent in numbers phenomena that are in themselves unquantifiable. The danger of proxy variables is that they can rapidly replace the phenomena they purport to represent. This is because they are much simpler and easier to understand than a complex, qualitative phenomenon and because, as numbers, they can be produced at will to provide ‘scientific’ evidence in support of a policy decision.
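To make the danger concrete, here is a minimal sketch in Python. The group size, the latent ‘quality’ figures and the noise level are all invented assumptions of mine, not anything drawn from the minister’s proposal. It simulates a quality that cannot be observed directly alongside a noisy proxy for it, then asks how well a league table built from the proxy identifies the genuinely best performers:

```python
import random

random.seed(1)

# A hypothetical illustration: 100 institutions, each with a latent
# "quality" we cannot observe directly, and a measurable proxy (a
# completion rate, say) that tracks quality only loosely. Every
# figure here is invented for the sake of the example.
N = 100
quality = [random.gauss(0, 1) for _ in range(N)]      # the thing itself
proxy = [q + random.gauss(0, 1.5) for q in quality]   # the measurable stand-in

# Rank institutions twice: once by the unobservable quality,
# once by the proxy a website would actually publish.
by_quality = sorted(range(N), key=lambda i: quality[i], reverse=True)
by_proxy = sorted(range(N), key=lambda i: proxy[i], reverse=True)

# How many of the proxy's "top ten" are genuinely in the top ten?
overlap = len(set(by_quality[:10]) & set(by_proxy[:10]))
print(f"Top-10 overlap between true quality and proxy: {overlap} of 10")
# With measurement noise of this size the overlap is usually well
# under ten: the published league table and the real ranking diverge.
```

The numbers are arbitrary, but the structural point is not: the weaker the link between the proxy and the phenomenon, the more the published ranking is an artefact of the proxy rather than a picture of the thing itself.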

What students thought of their teachers, how they performed in standardised tests and whether they completed their courses of study tell us nothing beyond what they thought of their teachers, how they performed in standardised tests and whether they completed their courses of study. The number of journal publications by a given academic and the frequency with which his/her work has been cited tell us nothing beyond the number of his/her publications and the frequency with which they have been cited. To imbue such data with any further significance is immediately to make assumptions, and rather large assumptions.

But let us assume, for the sake of argument, that these data do provide what the minister boldly describes as ‘robust information’ about the quality of a university. What, then, are we measuring, and why? What, in other words, is the nature of a university, considered as an enterprise? This is not an irrelevant question, because the whole purpose of a quality management system is to ensure that the input to a given system is converted to the desired output as efficiently as possible every time.

Quality of Teaching
A trawl through the websites of Australia’s thirty-nine universities reveals a significant level of reticence on the part of the institutions themselves about what they do. The University of Sydney suggests that it ‘creates leaders’. The University of Melbourne purports to sell ‘academic knowledge’, ‘career outcomes’ and ‘lifelong connections’. The University of Queensland asserts that its graduates are ‘in demand’. The University of Tasmania suggests that it will ‘expand your knowledge’ and allow you to ‘discover your place in the world’. The University of Adelaide alleges that it ‘could change your life’. The University of Western Australia asserts that it has ‘the highest quality undergraduates of any university in Australia’.

These claims scarcely begin to illuminate what it is you are buying when you pay your course fees. They do imply, though, that you will be a better person for the experience, as if a university were a sort of ‘character factory’, like the Boy Scouts or the traditional English public school. The claims suggest that a university’s input is its students who, the university asserts, will undergo some kind of transforming experience that will convert them to output (by the time they graduate, we must assume).

Let us assume, further, that this output takes the form of young people trained to contribute effectively to perceived areas of national importance such as medicine, agriculture, engineering and so on. How will the My University website measure the quality of this output? Completion rates tell you nothing beyond what proportion of students completed their courses. Student satisfaction surveys tell you nothing beyond whether the students are enjoying their experience at the time: they can’t tell you whether the students are going to be good doctors, agronomists or structural engineers. Performance in standardised tests means little unless the tests are administered after graduation in discipline-specific areas.

I suggest that the minister needs to think a little harder about what it is she is trying to achieve with a My University website. A young person who wishes to become a doctor, an agronomist or an engineer is hardly going to be in a position to make an informed decision about where s/he wants to study on the basis of completion-rate statistics, student satisfaction surveys and standardised test results.

Quality of Research
Teaching, of course, is only one of the things a university is supposed to do. How do the mooted quality measures relate to the other major activity, research?

The University of Sydney states that its research ‘spans all areas of human endeavour’, is based on ‘truth’ as a ‘core value’, and leads to ‘innovation’. The University of Melbourne presents numbers for the year 2008: ‘produced 117 articles or reviews of impact factor greater than 20 in which collaborative country addresses numbered 267 from 55 countries’. The University of Queensland wants ‘to achieve excellence in research and scholarship, and to make a significant contribution to intellectual, cultural, social, and economic life at a local, national, and international level’. The University of Tasmania mentions its ‘internationally recognised research profile in marine and Antarctic science, agriculture, forestry, food science, aquaculture, geology and geometallurgy, and medical research.’ The University of Adelaide, under the heading ‘Research Achievements’, lists its research income, its Go8 per capita income, its numbers of publications and of higher-degree-by-research completions. The University of Western Australia asserts that an emphasis on research and research training is one of its defining characteristics, indicating that it has ‘determined six strategic research areas and several emerging and seed priorities to provide appropriate focus and direction’ to its activities.

These claims suggest that the nature and meaning of ‘research’ is less important to Australian universities than creating an impression of vigorous research activity. A discernible undercurrent in these web promotions is the notion of commercially applicable research, or ‘knowledge transfer’, as the University of Melbourne calls it. The Vice-Chancellor of the University of Queensland, in an elegantly expressed introduction, uses the term ‘translational research’ as a way of collapsing the traditional distinction between ‘pure’ and ‘applied’. In other words, the ‘research’ undertaken by the universities is designed to attract conditionally released funding.

It will be recalled that journal publications and citations have been suggested as ways in which the quality of this research might be measured. Again, we see proxy variables in action. The quality of someone’s research cannot be measured according to any pre-existing standard because the purpose of research, properly defined, is discovery. The whole history of human advancement was written by people who tried something new or looked in a new way at something apparently familiar. To measure research quality by numbers of peer-reviewed publications is to assume that all publications are of equal importance and that a correspondence exists between research quality and volume. Albert Einstein might have scored well on this criterion in 1905, when he published four papers in Annalen der Physik that were, quite literally, epoch-making—but he wasn’t working in a university at the time. In any case, he would have done much less well in subsequent years, which suggests that excellence will be penalised under this rating system. If you propose energy quanta, a stochastic model of Brownian motion, a special theory of relativity, and the mass–energy equivalence (E = mc²) all in one year, it seems unlikely that you will be able to sustain such a level of output in subsequent years, meaning that the ‘quality’ of your research, as measured by publication volume, has declined.
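To put the absurdity into numbers, here is a toy calculation. The yearly paper counts are invented, loosely shaped around the Einstein anecdote; no real bibliometric data is involved:

```python
# A toy, volume-based "research quality" score. The paper counts
# below are invented for illustration; the point is the shape of
# the curve, not the figures themselves.
papers_per_year = {1905: 4, 1906: 1, 1907: 1, 1908: 1, 1909: 1}

baseline = papers_per_year[1905]  # the annus mirabilis sets the bar
for year in sorted(papers_per_year):
    count = papers_per_year[year]
    score = count / baseline      # "quality" as measured by output volume
    print(f"{year}: {count} paper(s) -> volume score {score:.2f}")

# The metric reports a 75% "collapse in quality" after 1905, even if
# one of those later papers were to redefine physics on its own.
```

On this arithmetic, the more extraordinary your best year, the steeper your apparent decline.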

The other suggested measure of research quality, citation of your work by others, seems vulnerable for similar reasons. We have every reason to be grateful for the discovery of penicillin, but Alexander Fleming’s 1929 paper on the subject in the British Journal of Experimental Pathology was little noticed at the time. His university career might have come to an inglorious end had his ‘performance’ been judged according to the number of citations by his peers.

Implications
The Commonwealth government suggests that the universities educate our future ‘professional’ workforce, create future ‘leaders’, and drive much of our economic and regional ‘success’. The website of the Department of Education, Employment and Workplace Relations says that universities play ‘a key role in the growing knowledge- and innovation-based economic health of Australia’.

These statements, I think, go far towards explaining the vagueness and incoherence of our universities’ own stated reasons for being. With their funding cut and their financial viability increasingly dependent on fee revenue from overseas students, Australian universities must present themselves as places where prospective young ‘professionals’ can increase their brand equity. Teaching, therefore, is to be measured in terms of customer satisfaction, and research is to be measured in terms of productivity—meaning publication and citation volume and, in the national context, commercial applicability.

This is wrong not so much because it has the effect of marginalising and destroying humanities disciplines such as literature, history, philosophy and classics, and pure science disciplines such as physics and chemistry—this destruction can always be justified on the grounds that such disciplines are not ‘useful’—but because it has the effect of separating educational effort completely from the ostensible reasons for which it is undertaken. It is the inevitable consequence of proxy variables.

If analysis of Australia’s economic position indicates that we need more school-teachers, for example, how will completion and attrition rates, student satisfaction surveys and standardised test results assist the Commonwealth government in assessing our universities’ response to this perceived need? If the number of corporate failures suggests that we need more skilled auditors and forensic accountants, how will completion and attrition rates, student satisfaction surveys and standardised test results assist the Commonwealth government in assessing our universities’ response to this perceived need? The answer, of course, is that they won’t assist at all. There is no correlation between what is measured and the ostensible reason for the measurement.

Will the proposed measurement of research output help? Let us suppose that Australia’s economic performance is adversely affected by the outbreak of a new strain of influenza in our major trading partners, leading to trade embargoes to reduce the risk of the infection spreading. Will the Commonwealth government re-direct all research funding into virological research and immune responses? Would it make any difference? Suppose there is a revolution in Chile, causing the base metals operations of BHP Billiton in that country to be suspended, with a flow-on effect on commodity prices and the value of BHP Billiton shares. How will the number of articles contributed by Australian academics to the Journal of Futures Markets or the International Journal of Advanced Manufacturing Technology indicate the quality of our universities’ response to the problem?

The utter incoherence between stated higher education policy and the stated purpose of universities is a consequence of measurement being made into an end in itself, a transferable process indifferent to subject. The purpose of a quality management system, as any manufacturer knows, is to ensure that the input to a given system is converted to the desired output as efficiently as possible every time. The proposed measures of university quality, however, do not even begin to do this. The urge to measure has supplanted the reasons for undertaking measurement. Quality, as a consequence, has lost its meanings, which must always be contextual, and become instead a floating abstraction to be associated with whatever proxy variables are expedient. The reductive fatuities of the My University website are a paradigmatic example of the process, and would simply be funny if they weren’t so likely to increase the sum of human misery.

Rod Beecham is an independent Education Advisor
http://www.rodbeecham.com.au/publications.html
