For some time, there has been a "performance" component in the budget allocation from the Faculty of Engineering to the Schools in the Faculty. UNSW has also been keen of late to push "performance indicators" (PIs) as a means of distributing the funds UNSW receives under the Commonwealth Learning and Teaching Performance Fund. In both cases, the intention presumably is to modify processes and behaviour within the Schools. Do any of the PIs necessarily lead to better student learning outcomes? Doubtful. Even so, there is still a funding imperative to comply. If we are required to "follow the rules", how can we do so at minimal cost to ourselves, and with as little impact as possible on the teaching practices in this School that do actually assist student learning? This brief report describes the factors being considered for PI purposes.
The UNSW Performance Indicators are tied directly to the requirements in the Commonwealth Government's Teaching and Learning Performance Fund. Some have little impact at School level, but others are based on an accumulation of data from the Schools, and presumably the School data will lead to a determination of funding. Some are reflected in the Faculty PIs described below. The indicators are:
A current and recent institutional learning and teaching plan or strategy that is publicly available on the University's website.
Evidence of systematic support for professional development in learning and teaching for sessional and full-time academic staff. The documentation must be publicly available on the University's website.
Evidence of probation and promotion practices and policies which include effectiveness as a teacher. The documentation must be publicly available on the University's website.
Evidence of systematic student evaluation of teaching and subjects that informs probation and promotion decisions for academic positions. The documentation must be publicly available on the University's website.
Evidence that student evaluations of subjects are publicly available on the University's website.
Full details are available at: http://www.ltu.unsw.edu.au/content/userDocs/LTPF2008%20Stage_1_UNSW.pdf
Some comments on the impact of these indicators at School level:
Around accreditation time in 2006, the Faculty asked all Schools to write a teaching and learning plan; our current plan dates from then and is available. The Faculty will no doubt ask for another soon in order to satisfy the "current" requirement.
The University is currently looking at how effectively the Faculties induct sessional academic staff. Engineering has asked each School to produce a report on how it handles induction activities such as tutor training. CSE is perceived to be the best School in the Faculty in terms of how it deals with its tutors.
I'm not sure that this PI has a significant effect on practices in the School. Maybe it affects attitudes to teaching.
UNSW is clearly interested in collecting data on whether student evaluations are carried out. It's less clear how it tracks the flow-on into the probation and promotion system.
Faculty-wide summary statistics for the CATEI evaluations are posted on the UNSW web site (http://www.unsw.edu.au/learning/pve/catei.html). Whether these meet the criteria that "evaluations of subjects" are available is doubtful; on the other hand, the UNSW branch of the NTEU is adamant that such data should never be posted.
The Faculty of Engineering has published a set of PIs for Schools for 2008. They follow five broad areas:
The only one directly relevant to the TC is the Student Experience. I have reproduced below (without permission) the Faculty 2008 PIs for this area.
Be the destination of choice for students with the highest potential irrespective of background
|Rating 1|Rating 2|Rating 3|Rating 4|Rating 5|
|No L&T plan| |L&T plan with minimal goals and no facility or resource development plan| |Clearly implemented and reviewed L&T plan including facility and resource development|
|<70% course outlines meet UNSW requirements including graduate attributes and feedback to students| |80% course outlines meet UNSW requirements including graduate attributes and feedback to students| |All course outlines meet UNSW requirements including graduate attributes and feedback to students|
|<80% exam marks submitted by UNSW deadline| |>90% exam marks submitted by UNSW deadline| |All exam marks submitted by UNSW deadline|
|CEQ in bottom half of GO8+| |CEQ in top 50% of GO8+| |CEQ at top of GO8+|
|<60% students in Good Standing| |>70% students in Good Standing| |>80% students in Good Standing|
|<80% courses reviewed by CATEI each 3 years and have proper reports to HoS and Dean| |90% courses reviewed by CATEI each 3 years and have proper reports to HoS and Dean| |All courses reviewed by CATEI each 3 years and have proper reports to HoS and Dean|
|Student CATEI evaluations average <2.7| |Student CATEI evaluations average 3| |Student CATEI evaluations average >3.3|
|No students have an international experience| |>5% students have an international experience| |>10% students have an international experience|
How does the School rate in each category?
The Learning and Teaching Plan put together for accreditation fits the Rating 3 criteria.
Based on the 08s1 Course Outlines (which were more "compliant" than those for previous sessions), we still have fewer than 70% of Course Outlines addressing both "graduate attributes and feedback to students". Most Course Outlines, with one or two exceptions, follow the UNSW "requirements" well (which I thought were originally described as "guidelines").
In 07s2, more than 90% of exam marks were submitted on time. Only a couple of courses were late.
On carrying out CATEI, we are clearly in the Rating 5 band. Our "student CATEI evaluations", by which the Faculty means the single number for overall satisfaction with the course (a measure whose usefulness is debatable), average 3.09. In fact, for most CATEI questions, the CSE average score is a little above the Faculty average score.
I don't have data on our CEQ or how many students have "an international experience". Neither do I currently have data on academic standing within the School, but this seems to be a measure that is easily manipulated and thus not particularly meaningful.
John Shepherd, March 2008