Performance through the lens of evaluation: How to stretch evaluative thinking with strategic decision-making tools

Academic Article

Evaluation Journal of Australasia
2022, Vol. 22(4) 237–253
© The Author(s) 2022
Published: 1 December 2022
Article reuse guidelines: sagepub.com/journals-permissions
DOI: 10.1177/1035719X221120293
journals.sagepub.com/home/evj
Benjamin Harris and Lyn Alderman
University of Southern Queensland, Toowoomba, QLD, Australia
Abstract
Australian higher education continues to be plagued by decreases in funding and increases in regulation. This is creating a reality within many universities where quality alone is not enough for an academic program to be considered viable and sustainable. Today, universities compete fiercely for students, as increasing or retaining market share is required to remain financially sound. Universities usually make these difficult decisions through an analysis of internal student data as a metric of performance. Factors such as declining student enrolments, high attrition rates, low progression rates and poor student feedback would typically strike university executives as alarming; however, this is often not the full picture. The process can become political rather than grounded in evidence-informed decision-making, because strategic decisions to reduce academic programs may have a direct impact on academic employment. Moreover, these analyses often lack independent evaluation and consideration of the broader environment. This can lead to tensions between faculty and university administration, which may produce political outcomes guided by passionate academic debate rather than strategic, evidence-based decision-making. This theoretical article outlines how an internal evaluation team can contribute to this exercise and stretch evaluative thinking by applying a range of strategic decision-making tools to evaluate academic program performance.
Keywords
strategy, academic program review, program prioritisation, BCG analysis, evaluation, higher education

Corresponding author:
Benjamin Harris, University of Southern Queensland, West Street, Toowoomba, QLD 4350, Australia.
Email: Benjamin.Harris@usq.edu.au
Introduction
The Tertiary Education Quality and Standards Agency (TEQSA) is the federal government agency that acts as the external regulator for the Australian higher education sector, and each self-accrediting university is reviewed on a seven-year cycle. In 2019, an external driver for change at the University of Southern Queensland was a condition placed on its registration by the Tertiary Education Quality and Standards Agency (2018); the main theme of the condition was a lack of consideration of evidence by the academic board to inform proper academic governance of the university's activities. In response, the academic division of the university embarked on a range of strategies, including implementation of a new Academic Plan, 2021–2023, with a subproject being the University of Southern Queensland's Academic Quality Framework, 2019–2022 (University of Southern Queensland, 2019). These strategic initiatives were the catalysts to engender academic debate about what constitutes the quality of the student experience, to guide and embed good practice through academic policies and procedures in learning and teaching, and to reconsider how the current suite of academic programs strategically supports the ambitions and goals of the university.
This article discusses four tools that were adopted by the University of Southern Queensland's Academic Quality Project and that sparked the desire to invest in an evaluation team to provide this reporting. Some of these tools may be known to evaluators, and specifically to educational evaluators. However, the authors argue that while each of these tools can be used independently, using them in combination creates a more robust evaluation of the complete environment in which the academic program operates. The four tools discussed in this article are the academic program review, academic program prioritisation, BCG analysis and Rog's (2012) context-sensitive evaluation. It is argued that such reporting adds an additional objective layer of data that leads to more evidence-based strategic decision-making regarding academic program viability and sustainability. The outcomes of applying the four tools also offer objective data when a university is considering expansion into new disciplines or a move into a new opportunity space.
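Of the four tools, the BCG analysis lends itself most readily to a brief worked illustration. The sketch below is a minimal, hypothetical example only, not the University of Southern Queensland's actual data or implementation: it assumes enrolment growth as a proxy for market growth and a simple relative-market-share figure, and the program names, thresholds and values are invented for illustration.

```python
# Minimal sketch of a BCG-style growth-share classification applied to
# academic programs. All program names and figures are hypothetical.

from dataclasses import dataclass


@dataclass
class Program:
    name: str
    enrolment_growth: float       # year-on-year enrolment growth (%)
    relative_market_share: float  # program enrolments / largest competitor's enrolments


def classify(p: Program, growth_threshold: float = 5.0, share_threshold: float = 1.0) -> str:
    """Place a program in one of the four BCG quadrants."""
    high_growth = p.enrolment_growth >= growth_threshold
    high_share = p.relative_market_share >= share_threshold
    if high_growth and high_share:
        return "Star"
    if high_growth:
        return "Question Mark"
    if high_share:
        return "Cash Cow"
    return "Dog"


programs = [
    Program("Bachelor of Nursing", enrolment_growth=8.2, relative_market_share=1.3),
    Program("Bachelor of Arts", enrolment_growth=-2.5, relative_market_share=0.4),
    Program("Master of Data Science", enrolment_growth=12.0, relative_market_share=0.6),
]

for p in programs:
    print(f"{p.name}: {classify(p)}")
```

In practice the thresholds and the choice of growth and share measures would be set by the institution; the value of the quadrant view lies in placing each program's internal metrics against the external market, which is the gap the combined use of the four tools is intended to close.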
This article forms part of a special issue of the journal and is based on the theoretical Continuous Learning Framework and its four elements of quality: Accountability, Improvement, Performance and Investment (Alderman, 2014; 2022, this issue). The application of the performance element of the Continuous Learning Framework has offered the University of Southern Queensland increased insight and objectivity when reviewing existing and new academic programs.