THE “BALANCED SCORECARD” IN FACILITIES MANAGEMENT
for ‘internal’ management and external benchmarking

Paul Coronel and Anne Evans
Directors, Benchmarking PLUS
Melbourne, August 1999

ACKNOWLEDGEMENT

This paper is the result of collaborative work between a number of people and the support of the AAPPA Board. In particular we wish to acknowledge the input of time and intellectual capital from the following heads of facilities management organisations, convened by Brian Fenn, who was the catalyst for the whole exercise. They played a team leader role in particular aspects of development:

Brian Fenn, Queensland University of Technology
Sam Ragusa, Griffith University
Andrew Frowd, University of Wollongong
Neville Thiele, University of South Australia
David Spedding, Deakin University
Kelvin Crump, Facilities Management Services, TAFE, Queensland
We also acknowledge the input of a number of other facilities managers who participated in a workshop to assist in identifying relevant performance measures for facilities management within a Balanced Scorecard framework.
CONTENTS

Introduction
Some characteristics of top performing organisations
How the ‘balanced scorecard’ fits in
The essential characteristics of the Balanced Scorecard approach
How it can apply to FM in a tertiary education context (for both internal management and benchmarking purposes)
Linkages between organisational levels and functions
Establishing appropriate objectives for FM in the Balanced Scorecard context
Broad structure of a Balanced Scorecard
A sample of the KPIs in a Balanced Scorecard for facilities management
Linkages between KPIs at different levels
Characteristics of KPIs in a Balanced Scorecard - ‘internal’ versus ‘benchmarking’ versions
Conclusion
APPENDIX: Detailed format of Balanced Scorecards for facilities management
INTRODUCTION

Some characteristics of top performing organisations

It is clear from personal experience and from consensus among observers of exceptional organisations that top performing organisations have a number of aspects in common, including the following:
• ‘Customer Focus’: a clear understanding of their customers’ identity and key needs, and an effective means of response at both the strategic and the operational level;
• Leadership: particularly in the sense of defining what things the organisation must ‘get right’ and communicating this clearly throughout the organisation;
• Use of information to manage: a coherent performance reporting structure that helps integrate the actions of managers at different levels in pursuit of their key objectives;
• A restless quest for improvement: they will do what is necessary to identify what is ‘best practice’ and adapt it to fit their own organisation as soon as possible.
HOW THE ‘BALANCED SCORECARD’ FITS IN

The Balanced Scorecard is an approach to setting up performance measurement structures that help an organisation develop some of the characteristics mentioned above. It was first featured in the Harvard Business Review early in 1992 (Kaplan and Norton, ‘The Balanced Scorecard - Measures that Drive Performance’, Harvard Business Review, January-February 1992). Since then it has been used in many countries and industries as the basis of a top-down reporting structure which knits together the desired strategic perspective of an organisation with its management actions at various levels. A smaller number of organisations have also used it as a framework for performance comparisons (“performance benchmarking”) with other organisations. It clearly makes sense to use similar measures for external comparisons to those that one uses to manage inside the organisation.

The essential characteristics of the Balanced Scorecard approach are as follows:
• performance measurement from four perspectives, which ensures the focus is not merely on short term cost / financial performance (see slide 2: the questions posed in each of the four perspectives help convey the essence of what is being measured);
• a review of the key objectives of the organisation. Existing objectives may need to be restated or modified to gain the desired clarity and ‘balance’ between the four perspectives of the Balanced Scorecard approach. (Our experience has shown that this is necessary more often than not, from both clarity and balance viewpoints.);
• linking each of these objectives to between one and three key performance measures, which together are used as the ‘scorecard’ at the top of the organisation (see slide 3: a typical scorecard has between 12 and 24 key performance measures in all; having fewer risks too narrow an outlook, while having more risks confusion and lack of focus);
• ‘cascading’; that is, connecting the key performance measures at the top level with similar measures at other levels in the organisation. These should be of direct relevance to managers in the various jobs at these other levels. Thus ‘cascading’ supports the communication which goes with delegation and accountability. In this way the perspective, and consequently the actions, of managers in different parts of the organisation are more co-ordinated and focussed on achievement in accord with a balanced set of key objectives.
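The structure just described - four perspectives, one to three objectives per perspective, one to three KPIs per objective - can be sketched as a simple data structure. The sketch below is illustrative only; the objective and KPI names are abbreviated examples drawn loosely from this paper, not the full proposed set.

```python
# Illustrative sketch (not from the paper) of the Balanced Scorecard
# structure: four perspectives, each with 1-3 objectives, each objective
# carrying 1-3 KPIs. Names are abbreviated examples only.

scorecard = {
    "Customer": {
        "Achieve highest possible customer satisfaction": [
            "Customer satisfaction index",
            "Complaints per EFTSU per period",
        ],
    },
    "Financial": {
        "Obtain value for money and manage our budget": [
            "Operating cost per EFTSU",
            "Budget over-run as % of budget",
        ],
    },
    "Internal Business": {
        "Work safely": ["LTIFR for FM staff and contractors"],
    },
    "Innovation & Learning": {
        "Develop our people": ["Annual training days per FM staff member"],
    },
}

def check_balance(card):
    """Check the 'balance' rules: 1-3 objectives per perspective,
    1-3 KPIs per objective."""
    for objectives in card.values():
        if not 1 <= len(objectives) <= 3:
            return False
        for kpis in objectives.values():
            if not 1 <= len(kpis) <= 3:
                return False
    return True

def total_kpis(card):
    """Count every KPI across all perspectives and objectives."""
    return sum(len(kpis) for objs in card.values() for kpis in objs.values())
```

A full top-level scorecard built this way would carry 12 to 24 KPIs in total; the abbreviated example above carries only six.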
HOW IT CAN APPLY TO FACILITIES MANAGEMENT IN A TERTIARY EDUCATION CONTEXT (for both internal management and benchmarking purposes)

Linkages between organisational levels and functions

In my job I see a very wide variety of industry sectors and organisations. Two of the most complex, in terms of the services delivered to the end user or ‘customer’, are public hospitals and local government organisations. Facilities management organisations in the tertiary education sector rank up there in my view. For this reason we propose a structure we have used elsewhere: a ‘Top Level’ Balanced Scorecard with supporting Balanced Scorecards for major functions within facilities management. (This is illustrated in slide 4.) The four supporting scorecards shown are indicative; the precise number and focus of the supporting scorecards in any single facilities management organisation will depend on the range and magnitude of functions that are under the FM Manager’s jurisdiction, and to a degree on the way in which that person has divided the responsibility for the various functions between the managers reporting to him/her.

Establishing appropriate objectives for FM in the Balanced Scorecard context

It goes without saying that facilities management supports the fundamental activities of the University. The objectives for FM should therefore be compatible with those of the ‘parent’ organisation.
Balanced Scorecard for ‘Internal’ purposes (for use by individual FM Managers to help run their own organisations)

In this case the facilities management objectives as a total set will be unique and specific to the circumstances of the particular FM organisation and its parent body. Individual objectives may be the same as or similar to those of other FM organisations, but it is highly unlikely that this will be true of the whole set.

Balanced Scorecard for ‘External’ purposes (for use by a number of FM Managers to benchmark quantitative performance measures and qualitative measures / practices between their organisations)

In this case the facilities management objectives must be generic to be able to serve all the participating organisations and have a degree of relevance to each of them. The objectives provide a stimulus for choosing a balanced set of key performance indicators, but their role need go little further.

Broad structure of a Balanced Scorecard

The implications are shown in slides 5 and 6.

Balanced Scorecard for ‘External’ purposes: for benchmarking purposes the Balanced Scorecard has a set of generic objectives and key performance measures (KPIs).

Balanced Scorecard for ‘Internal’ purposes: for ‘internal’ purposes the Balanced Scorecard has a set of objectives specific to the individual FM organisation, and the KPIs are used to measure the level of achievement against those objectives. In addition the Facilities Manager will probably wish to:
• define targets for the managers responsible for each KPI;
• document strategies and tasks by which the objectives will be achieved;
• fix responsibilities among his/her people for each strategy and/or task.
These additional features MAY - but do not have to - be included in the scorecard structure; it is up to the individual manager. (Two of the leading people in our working party, Sam Ragusa and Andrew Frowd, have included some or all of these elements in scorecards for their own organisations. Their approaches differ somewhat from each other, as is to be expected.)
A SAMPLE OF THE KEY PERFORMANCE INDICATORS IN A BALANCED SCORECARD FOR FACILITIES MANAGEMENT

These are set out on the next two pages. Note that it is a sample only; the full version, attached at the end of this paper, is in much greater detail and includes the ‘Top Level’ scorecard and the four supporting scorecards relating to Maintenance, Capital Works, Cleaning and Security.

Note in the sample that there is a mix of quantitative and qualitative measures. Note also that each of the quantitative measures is stated in specific terms. This is critically important when benchmarking, but also for internal purposes. To highlight what I mean, consider the contrast between a KPI stated as ‘satisfied customers’, ‘we will satisfy our customers’, or something similar, and the KPI as stated in the sample below, ‘score on customer satisfaction survey’. The former is more difficult to report on internally and impossible to use for benchmarking. Unfortunately I have seen many instances of the former in my work over recent years.

In addition, in the full version the qualitative measures are derived from a series of questions with multiple choice answers. Each answer has a different ‘score’ so that an overall figure can be calculated, both for the purpose of benchmarking with others and for simple-format reporting of year to year trends for internal purposes.
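The conversion of multiple choice answers into an overall figure can be sketched as follows. The questions, answer options and point values below are assumptions for illustration, not the scheme used in the attached scorecards.

```python
# Illustrative sketch of turning multiple-choice 'qualitative measures'
# into a single overall figure, as described above. The questions and
# the point values assigned to each answer are assumed for the example.

QUESTIONS = {
    "We have Service Charters / SLAs with our client groups":
        {"None": 0, "Some": 1, "Most": 2, "All": 3},
    "We measure our performance at least annually against the SLA":
        {"None": 0, "Some": 1, "Most": 2, "All": 3},
    "Our SLAs are set after consultation with the client":
        {"None": 0, "Some": 1, "Most": 2, "All": 3},
}

def qualitative_score(answers):
    """Overall figure as a percentage of the maximum available points,
    so year-to-year trends and benchmarking comparisons use one number."""
    earned = sum(QUESTIONS[q][a] for q, a in answers.items())
    available = sum(max(opts.values()) for opts in QUESTIONS.values())
    return round(100.0 * earned / available, 1)
```

For example, answering ‘Most’, ‘Some’ and ‘All’ to the three questions earns 2 + 1 + 3 of the 9 available points, giving a single overall figure that can be tracked year to year or compared across participating organisations.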
Sample of the KPIs in a Balanced Scorecard for facilities management
Customer Perspective
Quantitative measures
• Score on customer satisfaction survey
• Number of complaints per period (per EFT customer staff or per EFTSU)
• % compliance with provisions of service level agreements
• LTIFR (for customer staff / students)
• Incident levels per period (for example security incidents, safety incidents)

Qualitative measures / practices
• Alignment of strategy with the parent organisation strategy
• Practices regarding use of SLAs
Financial Perspective
Quantitative measures
• ‘Top Level’ cost ratios (for example operating cost per EFTSU)
• Breakdown of the above by function such as cleaning and security (cost per EFTSU, cost per sq metre)
• Assets employed ratio (for example $’000 asset value installed per EFTSU)
• Budget balance ratio (for example budget variance $ per budget $)
Internal Business Perspective
Quantitative measures
• Lost Time Injury Frequency Rate (for FM staff & contractors)
• Management overheads as % of direct service delivery costs (for example capital works and cleaning services)
• Project time variance as % of original plan time
• Project cost variance as % of original project cost (both relating to capital works)

Qualitative measures / practices
• Asset Management & Maintenance practices
Innovation & Learning Perspective (people, processes, technology)
Quantitative measures
• % budget spent on improved technology
• Annual training days per FM staff member
• % improvement in customer satisfaction index
• % improvement in operating cost per EFTSU

Qualitative measures / practices
• Quality practices & process improvement techniques used
• Incentives / rewards employed
Linkages between KPIs at different levels

The sample just given does not show an important element of choosing KPIs: the value of having linkages between KPIs at ‘top level’ and the same or similar KPIs at other levels of the facilities management organisation. It is one thing to have a set of detailed measures, but it is another to have the ability to take the ‘helicopter view’ whenever necessary and see the linkages between that view and ‘ground level’. Decision making can be more strategic and yet maintain a practical monitoring and implementation connection with strategy.

As a simple example, the degree of ‘customer satisfaction’ with facilities management is an aggregate of the satisfaction with its various elements: maintenance, security and so on. Hence if a Customer Satisfaction Score is derived by means of a customer survey for each of the important elements of facilities management, it can be summarised at the top level as an average or weighted average. Comparisons with other facilities management organisations are facilitated, as are internal management summary reporting, trend analysis, and resource allocation decisions.

Some other KPIs where I believe this approach is worthwhile:
• Complaints per EFTSU per period
• Operating cost per EFTSU
• Trends in each of the above and in Customer Satisfaction Scores
• Budget balance ratio
• Injury rates
• Annual training days per person.
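The aggregation just described - summarising element-level Customer Satisfaction Scores into a single top-level index - can be sketched as follows. The element names, scores and weights are invented for the example; an organisation might weight, for instance, by each element’s share of the operating budget.

```python
# Illustrative sketch of 'cascading' a Customer Satisfaction Score:
# the top-level index is a simple or weighted average of the survey
# scores from each supporting scorecard. Names, scores and weights
# below are invented for the example.

def satisfaction_index(element_scores, weights=None):
    """Average the element scores; if weights are given (e.g. share of
    operating budget), use a weighted average instead."""
    if weights is None:
        weights = {name: 1.0 for name in element_scores}
    total_weight = sum(weights[name] for name in element_scores)
    weighted_sum = sum(score * weights[name]
                       for name, score in element_scores.items())
    return weighted_sum / total_weight

scores = {"Maintenance": 3.8, "Cleaning": 4.2,
          "Security": 3.5, "Capital Works": 4.0}

simple = satisfaction_index(scores)
weighted = satisfaction_index(
    scores, weights={"Maintenance": 0.4, "Cleaning": 0.3,
                     "Security": 0.2, "Capital Works": 0.1})
```

Because the same element scores also sit in the supporting scorecards, the top-level figure stays linked to ‘ground level’: a movement in the index can be traced directly to the element survey that caused it.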
Characteristics of KPIs in a Balanced Scorecard - ‘internal’ versus ‘benchmarking’ versions

There are some important differences, which I would like to point out, between the design of a set of KPIs for collective benchmarking purposes and one for use by an individual facilities management organisation. These are shown in the table over the page.
KPIs for benchmarking vs KPIs for internal use

KPIs for collective benchmarking | KPIs for individual organisations
No limit to the number used | Shouldn’t have too many
Individual KPIs don’t need to be relevant to all participants | All KPIs need to be relevant
Need absolutely specific definition | Not quite so critical
Reasonable approximations are acceptable | Reasonable approximations may be acceptable
Must be structured to accommodate size differences in participating organisations (eg # complaints/EFTSU) | Not needed for internal comparisons (hence could just have # of complaints)
Mentioning targets is inappropriate (unless the targets are being benchmarked) | Setting targets can be beneficial (eg ‘Reduce complaints by 10%’)
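The size-difference point can be illustrated with a short sketch: raw complaint counts suffice for internal comparisons, but benchmarking across institutions of different sizes requires dividing by EFTSU. The institution names and figures below are invented for the example.

```python
# Illustrative sketch of structuring a KPI to accommodate size
# differences between participating organisations. Institution names
# and figures are invented.

institutions = {
    "University A": {"complaints": 480, "eftsu": 12000},
    "University B": {"complaints": 150, "eftsu": 3000},
}

def complaints_per_eftsu(data):
    """Normalise raw complaint counts by institution size (EFTSU)."""
    return {name: d["complaints"] / d["eftsu"] for name, d in data.items()}

rates = complaints_per_eftsu(institutions)
# University B records fewer complaints in absolute terms (150 vs 480)
# but a higher rate per EFTSU (0.05 vs 0.04), so the benchmarking
# ranking reverses once size is taken into account.
```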
CONCLUSION

In this paper I have covered a number of important points in the design and use of the Balanced Scorecard approach, for both collaborative benchmarking between facilities management organisations and management within an individual organisation.

The detailed scorecards which form the remainder of this paper are the result of considerable collaborative work between us and a number of individuals who were acknowledged at the beginning of the paper; most of the people involved are highly accomplished facilities managers. That said, there is no doubt room for further refinement.

Possible pathways for facilities managers

Benchmarking: The scorecards are set out primarily for benchmarking purposes, and it is hoped that they will provide a platform for an addition or adjunct to the excellent AAPPA Benchmarking Survey already in use. If so, the performance indicators and qualitative practices questions will provide a ‘template’ for those who wish to follow this route.

Internal management: Even though the scorecards are set out primarily for benchmarking purposes, I trust that, using the points raised in this paper, facilities managers who are so inclined will be able to adapt the structure and tailor the scorecards for use in their own organisations. In this case each can select which of the performance indicators and customer satisfaction survey questions suit their current circumstances, and supplement them with others if desired. Before doing so they will wish to make the objectives more specific to the circumstances of their own institution, and they may wish to set targets and assign individual responsibilities for ‘delivering’ against those targets and indicators.

Combination approach: Of course it is possible to do both. In this case some or all of the performance indicators used for internal purposes and qualitative practices questions will also be benchmarked externally, thus providing a more informative basis for self evaluation, planning, and target setting.

I thank you for your attention today.
Slide 1
The “Balanced Scorecard” in Facilities Management: for internal and benchmarking purposes
Paul Coronel, Benchmarking PLUS
Wellington, September 1999

Slide 2
The “Balanced Scorecard” Approach: Performance from Four Perspectives
Financial Perspective: “How well do we deliver, financially?”
Customer Service Perspective: “How are we perceived by our customers?”
Internal Services Perspective: “What must we excel at?”
Innovation & Learning Perspective: “Can we continue to improve and create value?”
© Benchmarking PLUS
Slide 3
Overview of Structure of Balanced Scorecard
Each of the four perspectives (Financial, Customer Service, Internal Services, Innovation & Learning) has its own Objectives and KPIs.
1 to 3 objectives per perspective; 1 to 3 KPIs per objective; hence 12 to 24 KPIs total.
© Benchmarking PLUS
Slide 4
University s Obje ctives linkages
High level FM obj ect ives Incorporated into/drive
FM Balanced Scorecards Top level scorecard
Maintenance
Cleaning
Security
Capital Works
© B enc h mar k in g PLUS
Slide 5
Structure of High Level Scorecards for Benchmarking
Each perspective (Financial, Customer Service, Internal Services, Innovation & Learning) has generic Objectives and KPIs.
© Benchmarking PLUS
Slide 6
Structure of High Level Scorecards for Internal Management
Each perspective (Financial, Customer, Internal, Innovation/Learning) has specific Objectives, with KPIs and Strategies.
© Benchmarking PLUS
Balanced Scorecard for management & benchmarking
Prepared for AAPPA Conference, September 1999
DETAILED FORMAT AND STRUCTURE OF BALANCED SCORECARDS FOR FACILITIES MANAGEMENT On the succeeding pages the scorecard structure referred to in the presentation paper is set out in detail.
There are five scorecards in all:
• Top Level scorecard;
• scorecard for Cleaning;
• scorecard for Security;
• scorecard for Maintenance;
• scorecard for Capital Works.
Each contains the four perspectives - customer, financial, internal processes, and innovation and learning. Also each contains qualitative measures/practices as well as quantitative measures.
They are set out primarily with benchmarking in mind. However on the last sheet of this detailed layout there is a page of a scorecard shown in a format suitable for internal management purposes. To facilitate comparison between the alternative layouts, the subject matter (objectives and KPIs) shown in it corresponds closely with the subject matter in the first sheet of the scorecard for Maintenance which appears on page 16 of this section.
When commencing the first year of a co-operative benchmarking exercise it is common to reduce the total number of measures, to make it easy for a greater number of organisations to participate. Even so, participants are not usually expected to enter every single item of data required to complete the set. In succeeding years the measures are refined and extended in the light of the developing experience of the participants.
We trust the material shown in this detailed format will provide a basis for those organisations who wish to extend the scope of their benchmarking to get started.
Page 14 of 36
TOP LEVEL SCORECARD

Customer Perspective (“How do our customers see us?”)

Objective: Achieve highest possible level of customer satisfaction

Performance Indicators, with Comments / Source of data:
• Customer satisfaction index - This assumes there is a customer satisfaction survey for each group of ‘services’ (perhaps as part of feedback from the client regarding performance against SLAs - see ‘Qualitative Measures / Practices’). ‘Services’ can and should be grouped, eg Security, Maintenance, Cleaning, so that they form part of a separate identifiable survey and can be used in the respective scorecard. Each survey should be simple and capable of being scored. (See the example in the paper given in the breakfast session.) To calculate the Customer satisfaction index shown in this (top level) scorecard, an average can be calculated from the score for each survey. In this way it would be linked to all subsidiary scorecards.
• Number of complaints per EFTSU attending - Linked to all relevant subsidiary scorecards. This would be very useful for both internal management and external benchmarking.
• Capital Works time & budget performance index - A summary of some Capital Works Scorecard KPIs.

Objective: Achieve alignment with the University’s direction and with our customers’ needs
• See ‘Qualitative Measures / Practices’ in this section. A numerical rating of answers against these questions is possible for benchmarking purposes. They are also good ‘self assessment’ questions for internal management purposes.

Page 15 of 36
Customer Perspective (Continued)

‘Qualitative Measures / Practices’ (Please circle the answer which most applies)

Planning (representation and alignment)

To what extent do you believe that you have identified the key University planning forums?
Not at all / we have identified some / we have identified most of them / we know all of them

On what proportion of the relevant planning forums is FM invited to sit at the planning table?
None / some / most / all of them

How would you judge the level of involvement of FM in the development of the University’s strategic plan?
No input / some input or involvement but should be asked for more / the FM function is properly represented and listened to

Please comment on the nature of the involvement:
Mostly informal / Mostly formal - we are asked for specific information at certain steps in the process / Fairly even mix of both

Does the FM function have a long term / strategic plan, ie 3 years or more?
No / such a plan has been established for less than 5 years / for 5 years or more

How would you rate the degree of alignment between the FM strategic plan and the University’s strategic plan?
(Eg: key objectives / strategies in the University’s strategic plan are analysed for their implications for FM; the University’s projections and analysis of trends are incorporated in medium to long term FM planning.)
ANSWER: Low / medium / high degree of alignment

Do you receive feedback from senior University management (DVCs and/or PVCs) on FM’s performance in regard to the above aspects?
No input
Level of involvement should be increased / decreased / about right
Level of alignment should be increased / about right.

Page 16 of 36
Customer Perspective (Continued)

Planning (customer input)

Do you have a formal process to gain an understanding of the various customers’ physical resource and service requirements? Yes / No
Are service delivery options developed to address each customer group’s requirements? Yes / No
Are these service delivery options discussed with each customer group? Yes / No
Are Service Charters / SLAs developed or modified as a result? Yes / No
In the case of capital works, are capital works plans modified as a result? Yes / No
Is agreement with the ‘customer’ on the most appropriate delivery option and service level generally achieved? Yes / No

Service Level Agreements / Service Charters

We have Service Charters / SLAs with our client groups
For none of our client groups / Some / Most / All

We measure our performance at least annually against the SLA
For none of the SLAs that are established / Some / Most / All

Our SLAs are set after some form of consultation with the client regarding their key needs
For none of the SLAs that are established / Some / Most / All

When we measure our performance against the SLAs, we use the client’s rating.
For none of the SLAs that are established / Some / Most / All

(Set up appropriate links to the relevant subsidiary scorecards, eg Cleaning, Security, Maintenance, Grounds.)

Page 17 of 36
Financial Perspective (“How do we look to our financial stakeholders?”)

Objective: Obtain value for money and manage our budget

Performance Indicators, with Comments / Source of data:
• Annual net operating cost ($A) per EFTSU attending - ‘Net’ means any revenue (eg hiring of facilities) is deducted from operating costs. Link to subsidiary scorecards - eg Maintenance Index of 1% to 1.5% of ARV, Cleaning cost per sq m or per EFTSU, Security cost per EFTSU attending.
• Assets employed ($A’000) per EFTSU attending - High level measure of the intensity of asset utilisation.
• Budget over-run as % of budget (operating) - Ability to manage the budget. Link to subsidiary scorecards (except Capital Works?).
• Budget over-run as % of budget (capital) - Ditto. Link to Capital Works scorecard.

Objective: Obtain adequate funding for effective facilities management
• A composite of the Maintenance Index plus funding per unit for other services.

‘Qualitative Measures / Practices’

Management information for FM

Hardly any manager believes they have a totally effective management information system. With this in mind can you please comment:

Within reason, our management information is:
Reliable / not reliable
Easily accessible / not
Up to date / takes a long time to arrive
Accurate enough for the purpose / not

The main problem is:
Within the FM function / not
Other departments / not
General / specific systems (please comment below)

Comments: (eg, the specific systems of most concern provide information on the level of backlog maintenance) ………

Page 18 of 36
Internal Process Perspective (“What must we excel at?”)

Objective: Effective Asset Management

Performance Indicators, with Comments / Source of data:
• Net annual value of commercial opportunities realised - Defined as revenue from usage of the Institution’s assets by ‘outside’ parties, less direct costs relating to such usage. ‘Revenue’ includes the commercial value of free or discounted usage to charitable or other community groups.
• Rating on Asset Management (self assessment) - See Maintenance & Capital Works Scorecard. The overall ‘score’ would appear here.

Objective: Anticipate & adopt appropriate technology for FM
• Rating on Asset Management (self assessment) - See ‘Qualitative Measures / Practices’ below.

Qualitative Measures / Practices

Technology

From your own knowledge, how appropriate is the facilities management technology that is used in your area of the University? Please consider and score each of the aspects below:
- cost 0 1 2 3 4
- reliability 0 1 2 3 4
- fitness for my purpose 0 1 2 3 4
- keeps me ‘competitive’ in my field 0 1 2 3 4

0 = I don’t know / can’t give a knowledgeable answer; 1 = Totally Inappropriate; 4 = Totally Appropriate

Comments: ………

Page 19 of 36
Innovation and Learning Perspective (“Can we continue to improve and create value?”)

Objective: Improve key elements of Facilities Management

Performance Indicators, with Comments / Source of data:
• Trend in total annual Operating Cost per EFTSU
• Trend in Facilities Condition Index
• Trend in Customer Satisfaction Index
(In each of the KPIs proposed, ‘trend’ means % movement between each of the last 3 annual figures.)
All link to subsidiary scorecards: Operating Cost to Operating Cost elements, eg Security; FCI to Maintenance or Capital; Customer Satisfaction to each individual scorecard.

Objective: Develop our people
• Annual training days per EFT facilities management person - Links to subsidiary scorecards.
• Annual training cost per EFT security person - Source of data: Financial System; Staff Development Records.

Qualitative Measures / Practices

Knowledge and Skills

Have you identified the knowledge and skills required to optimise the contribution of FM to the University over the next several years?
No / Yes, informally / Yes, a formal analysis has been undertaken in all areas

Have the gaps between existing and required knowledge and skills been identified?
No / Yes, informally / Yes, a formal training needs analysis has been undertaken for all areas

Have you a plan for filling these gaps?
No / Yes, an informal plan / Yes, a formal plan has been developed and is being implemented

Page 20 of 36
SCORECARD FOR CLEANING

Customer Perspective

Objective: Achieve highest possible level of customer satisfaction

Performance Indicators, with Comments / Source of data:
• Customer satisfaction index - The survey should be simple and capable of being scored. See ‘Qualitative Measures / Practices’ below for an example. Summarises up to the Customer Satisfaction Index in the Top Level Scorecard, along with customer satisfaction scores from other scorecards.
• Number of complaints per EFTSU attending - With scores for this measure from other scorecards, summarises up to the Top Level Scorecard.
• Percentage Compliance with Service Level Agreements - SLAs either as part of contract specification or internal agreement with customers. Score provided by means of regular checks of service provided by the Cleaning Supervisor. See also ‘Qualitative Measures / Practices’ below.

Objective: Be Environmentally responsible
• Waste volume per sq m; Waste volume per EFTSU - Waste is garbage which requires general (not special) handling from buildings, and bins along paths, etc (not landscaping waste).

Page 21 of 36
Balanced Scorecard for management & benchmarking
Customer Perspective (Continued) ‘Qualitative Measures / Practices’ Customer Satisfaction Ratings: 1 = not acceptable; 2 = unsatisfactory; 3 = satisfactory; 4 = good; 5 = excellent
Sensitivity and understanding of customer needs Competence and expertise displayed with the advice or service provided. Reliability of service provided. Timeliness / speed of response to service requests Efforts made to solve problems and follow through provided to customers on services delivered.
We have Service Charters / SLAs with our clients
For none of the SLAs that are established / Some / Most / All
None / Some / Most / All
Prepared for AAPPA Conference, September 1999
We measure our performance at least annually against the SLA
For none of the SLAs that are established / Some / Most / All
Service Level Agreements
Our SLAs are set after some form of consultation with the client regarding their key needs
For none of the SLAs that are established / Some / Most / All
)*
+,-
./
012
4
,
3
8
5
67
For none of the
We use the client’s own rating of our performance at least annually against the SLAs.
'
Regular checks of service provided against the provisions of the SLA, are SLAs that are established / Some / Most / All conducted by the Cleaning Supervisor or equivalent position.
(
Waste Volume
(
Whether measured /frequency / used for change process.
_
'
We measure the volume of garbage which requires general (not special) handling from buildings, and bins along paths, etc (ie not landscaping waste).
Financial Perspective

Performance Indicators: Annual net operating cost per EFTSU attending; ditto per square metre.
Comments / Source of data: 'Net' means any revenue is deducted from operating costs. 'Cost' means the total all-up, including for example contracts, staff up to head of FM group, materials and consumables, waste removal, window cleaning, pest control, hazardous waste, recycling, sanitary bin service, grease trap service, landfill charges, and cleaning of curtains and furnishings. (With scores for these measures from other scorecards, summarises up to the Top Level Scorecard.)

Performance Indicator: Budget over-run as % of budget (operating).
Comments / Source of data: Measures the ability to manage the budget.

Objective: Work Safely
Performance Indicator: LTIFR (for Cleaning Staff).
Comments / Source of data: Industry Standard, from OH&S data.

Internal Process Perspective

Objective: Minimise overheads
Performance Indicators: Ratio of cost of Cleaning Management to Total Cleaning Cost; Ratio of Materials Cost to Total Cleaning Cost.
Comments / Source of data: Cleaning management costs include staff and overhead costs for cleaning supervision, as well as a share of corporate staff and overheads up to the head of the FM function. This also applies if cleaning is contracted out. The materials ratio measures materials control, and environmental responsibility to some degree.

Objective: Be environmentally responsible
Comments / Source of data: See above and the Customer Perspective.
Innovation and Learning Perspective

Objective: Improve key elements of Facilities Management
Performance Indicators: Trend in total Operating Cost per EFTSU (also per EFT and per square metre); Trend in Customer Satisfaction Index; Trend in LTIFR.
Comments / Source of data: In each of the KPIs proposed, 'trend' means the % movement between each of the last three annual figures. Perhaps not all three cost ratios are needed - just the 'per EFTSU' ratio. With scores for these measures from other scorecards, the cost and customer satisfaction measures summarise up to the Top Level Scorecard.

'Qualitative Measures / Practices'

Benchmarking of 'Cleaning Standard' - cleaning practice regularity. This simple rating system might need refining or expanding, but could well be a start.

[Table: for each ITEM, the level of service / frequency delivered at the '3 Star', '4 Star' and '5 Star' standards, plus a column for 'Other (state frequency)'. Items: Halls Common Areas vacuumed; Offices vacuumed; Toilets cleaned; High Use Toilets cleaned; Internal Bins emptied; External Bins emptied; Cob webbing internal; Windows cleaned - Internal; Windows cleaned - External. Frequencies range from three times daily, twice daily and daily, through twice weekly, weekly, fortnightly and monthly, to 3 monthly and 12 monthly.]
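The 'trend' definition used throughout these scorecards (the % movement between each of the last three annual figures) is simple arithmetic. A minimal sketch follows; the function name and figures are hypothetical, for illustration only:

```python
def trend(annual_figures):
    """% movement between each pair of consecutive annual figures.

    `annual_figures` is ordered oldest to newest; the scorecards use the
    last three annual figures, which yields two %-movement values.
    """
    return [
        100.0 * (current - previous) / previous
        for previous, current in zip(annual_figures, annual_figures[1:])
    ]

# Hypothetical operating cost per EFTSU over the last three years:
print(trend([400.0, 380.0, 361.0]))  # [-5.0, -5.0], ie a 5% p.a. improvement
```

A negative movement is an improvement for cost measures, while for a satisfaction index the sign is read the other way.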
SCORECARD FOR SECURITY

Customer Perspective

Objective: Achieve highest possible level of customer satisfaction
Performance Indicators:
1. Satisfaction score on response to a simple survey (see 'Qualitative Measures / Practices' on the next page).
2. Number of complaints per EFTSU attending.
Comments / Source of data: Survey of a small representative sample of students and staff. Summarises to the Customer Satisfaction Index in the Top Level Scorecard, along with customer satisfaction scores from other scorecards.

Objective: Make an impact on incident levels
Performance Indicator:
3. Relative incident levels: the number of incidents per 1000 EFTSU on campus, divided by the number of equivalent incidents (eg theft) per 1000 population in the suburbs in the immediate environment of the campus.
Comments / Source of data: Security System; Security Log Book; University Insurance Records; Finance Systems. Summarises to the Top Level Scorecard, along with scores on this measure from other scorecards.

'Qualitative Measures / Practices'

Customer Satisfaction / perceptions (Rating scale: 1 = inadequate; 2 = satisfactory; 3 = more than satisfactory)

Please give us your perceptions of / experience with the following:
• Your personal safety/security within your workplace - eg classroom, laboratory, office.
• Your personal safety/security moving about the campus - walking along pathways; driving or cycling on campus roadways; using stairways, elevators, toilets, public areas.
• Your personal safety/security during evenings and weekends on campus.
• Security of personal belongings - eg money/valuables, books, vehicles, etc.
• Have you experienced a personal safety/security incident this year?
• If yes, was the incident reported to the security service?
• If yes, please give us your perception of the response by the security service.
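The relative incident level above is a ratio of two rates. As a sketch (the function name and figures are hypothetical):

```python
def relative_incident_level(campus_incidents, campus_eftsu,
                            suburb_incidents, suburb_population):
    """Incidents per 1000 EFTSU on campus, divided by equivalent
    incidents (eg theft) per 1000 population in the surrounding suburbs."""
    campus_rate = 1000.0 * campus_incidents / campus_eftsu
    suburb_rate = 1000.0 * suburb_incidents / suburb_population
    return campus_rate / suburb_rate

# Hypothetical figures: 45 campus thefts among 15,000 EFTSU, versus
# 240 thefts among 40,000 residents in the surrounding suburbs.
print(relative_incident_level(45, 15_000, 240, 40_000))  # 0.5, half the suburban rate
```

A value below 1 suggests the campus experiences fewer incidents than its immediate environment; a value above 1 suggests the reverse.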
Financial Perspective

Objective: Effective cost management
Performance Indicators:
1. Annual operating cost per EFTSU attending.
2. Budget over-run as % of budget (operating).
3. Security cost per incident.
Comments / Source of data: 'Cost' means the total all-up, including for example contracts and staff up to head of FM group. For benchmarking purposes, costs are categorised into the following (respondents would complete the section, or sections in the case of multiple campuses, which apply to them):
• Metropolitan Campus: 7 day, 24 hour service $ ; other span of hours $
• City Campus: 7 day, 24 hour service $ ; other span of hours $
(The scores for the first two measures are summarised in the Top Level Scorecard, along with their equivalents from other scorecards.)
The third measure is intended to provide a rough judgement of the degree of the security problem on campus, balanced against the resources devoted to it. The source of the data is the Security Log Book and the Finance System.
Internal Process Perspective

Objective: Work Safely
Performance Indicator: 1. LTIFR (for Security Staff).
Comments / Source of data: Industry Standard, from OH&S data.

Objective: Monitor intensity of security effort
Performance Indicators:
2. Annual number of security/safety escorts per EFTSU attending.
3. Annual number of recorded calls/requests to the security service for assistance, per EFTSU attending.
4. Annual person hours of security service provided per EFTSU attending.
Comments / Source of data: Data from security log books.

Objective: Monitor the balance between various security services
Performance Indicator: 5. Percentage of total annual person hours and operating costs applied to: crime/incident investigation; securing/lockup of buildings and facilities; surveillance/mobile and foot patrols; attendance to personal safety requests; car parking control and regulation; escort duty; response to out-of-hours general inquiries.
Comments / Source of data: Data from analysis of security log books and apportionment of operating costs. Data shown in the three rows above can be used to monitor the level and balance in the mix of security services, to maintain efficiency and effectiveness.

Objective: Monitor the mix of security incidents
Performance Indicator: 6. Number of safety/security incidents (per 1000 EFTSU) on campus, by type of incident: theft of personal property; theft of University property; damage to personal property; damage to University property; injury/assault to student/staff/visitor; harassment of student/staff/visitor.
Comments / Source of data: Data from analysis of: Security System; Security Log Book; University Insurance Records; Finance Systems.
Innovation and Learning Perspective

Objective: Develop our people
Performance Indicators:
1. Annual training days per EFT security person.
2. Annual training cost per EFT security person.
Comments / Source of data: Financial System; Staff Development Records.

Objective: Develop our services
Performance Indicator: 3. % of annual total security budget committed to improvement of services or installation of new services.
Comments / Source of data: Data from annual budget.

Objective: Invest in appropriate technology
Performance Indicator: 4. Cost per EFTSU attending, of access control and other electronic security services.
Comments / Source of data: Data from annual budget.
SCORECARD FOR MAINTENANCE

Customer Perspective

Objective: Provide a safe environment. Performance is measured by an improvement in the Safety Condition Index, defined as:

  SCI = 1 - (Value of safety related maintenance backlog / Total value of safety related maintenance backlog at beginning of period)

1.1 Reduce the backlog of safety related maintenance
Performance Indicator: Reduction in value of safety related maintenance backlog.
Comments / Source of data: Maintenance management system; Finance system.

1.2 Reduce lost time injuries
Performance Indicator: Lost time hours of all staff, divided by total staff hours available.
Comments / Source of data: Covers academic and general staff; staff availability is assumed to be 46 weeks. Source of data: as above, plus workers compensation records.

1.3 Reduce WH&S incidents
Performance Indicator: Reduction in WH&S incidents, defined as:

  (No. of incidents in last period - No. of incidents in present period) / No. of incidents in last period

Comments / Source of data: as above.

Objective: Provide reliable services
2.1 Rapid response to breakdowns
Performance Indicator: % of response times achieved within published targets for various categories.
Comments: For benchmarking purposes, the categories and targets would need to be defined and commonly accepted. Otherwise a measure such as 'average response time between call and return to service' would need to be used.
2.2 Reduce complaints
Performance Indicator: Number of complaints received per semester per EFTSU.
Source of data for 2.1 and 2.2: Maintenance system / Help desk data / Quality system.

Objective: Provide an aesthetically pleasing environment
Performance Indicator: Score on customer satisfaction survey.
Comments: Annual survey conducted at each site. The survey should be site specific, to take into account local issues; some questions may be common across the system. The scoring system should be common, eg 4 = good, 5 = excellent on a 5 point scale.

Objective: Provide equity of service
Performance Indicator: Maintenance $ as % of ARV by building or campus, measured as a rolling 5 year average (eventually) of:

  Maintenance Index = Expenditure on Maintenance / Asset Replacement Value of Asset Category

Comments: A Facility Condition Index also needs to be established and measured simultaneously, to indicate the effectiveness of maintenance. Hence the need to establish a condition-based maintenance plan. See 'Qualitative Measures / Practices' in the Internal Process Perspective.
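The safety measures above are simple ratios. A minimal sketch follows, with hypothetical function names and figures; the SCI form (current backlog value over the backlog value at the beginning of the period) is one reading of the definition above:

```python
def safety_condition_index(backlog_value_now, backlog_value_at_start):
    """SCI = 1 - (current value of safety related maintenance backlog /
    total backlog value at the beginning of the period)."""
    return 1.0 - backlog_value_now / backlog_value_at_start

def whs_incident_reduction(last_period, present_period):
    """(No. of incidents in last period - No. in present period)
    / No. of incidents in last period."""
    return (last_period - present_period) / last_period

# Hypothetical figures: a $2.0m safety backlog reduced to $1.6m,
# and WH&S incidents down from 40 to 36.
print(round(safety_condition_index(1_600_000, 2_000_000), 3))  # 0.2, a 20% reduction
print(round(whs_incident_reduction(40, 36), 3))                # 0.1, a 10% reduction
```

Both indices rise as safety improves, so 'an improvement' means movement towards 1.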
Financial Perspective

Objective: Optimise the maintenance dollar

5.1 Gain adequate maintenance funding
Performance Indicator:

  Maintenance Index = (Maintenance Expenditure / Asset Replacement Value) x 100%

Comments: It is suggested that the Maintenance Index should be between 1 and 1.5% (Source: APPA/AAPPA Research 1980-1998, Dr Frank Bromilow (CSIRO), NCRB, FMA). The Maintenance Index should be assessed in conjunction with movement in the FCI.

5.2 Maintain an adequate value for the Facility Condition Index
Performance Indicator:

  FCI = 1 - (Total Backlog Maintenance / Institution ARV)

Comments: APPA/AAPPA Research 1980-1998 concludes that an FCI of 0.9 or above is an indicator of a manageable backlog.

Source of data for 5.1 and 5.2: Maintenance management system and Asset register.
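The Maintenance Index and FCI, together with their suggested bands (1-1.5% and 0.9 or above), can be sketched as follows; the figures are hypothetical:

```python
def maintenance_index(maintenance_expenditure, asset_replacement_value):
    """Maintenance Index = (maintenance expenditure / ARV) x 100%."""
    return 100.0 * maintenance_expenditure / asset_replacement_value

def facility_condition_index(total_backlog_maintenance, institution_arv):
    """FCI = 1 - (total backlog maintenance / institution ARV)."""
    return 1.0 - total_backlog_maintenance / institution_arv

# Hypothetical figures: $6m annual maintenance spend and a $40m backlog,
# against a $500m asset replacement value.
mi = maintenance_index(6_000_000, 500_000_000)
fci = facility_condition_index(40_000_000, 500_000_000)
print(round(mi, 2), round(fci, 2))  # 1.2 0.92 -> inside the 1-1.5% band; backlog manageable
```

Reading the two together, a funding level inside the suggested band with a falling FCI would still signal that the backlog is growing faster than it is being addressed.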
Internal Process Perspective

Objective: Adequate maintenance planning
Performance Indicator: Corrective maintenance expenditure divided by total maintenance expenditure.
Comments / Source of data: Maintenance management system and Asset register.

Objective: Adequate maintenance systems / practices
Performance Indicator: See 'Qualitative Measures / Practices' below.
Comments: Completion of answers to the questions, and comparison of the scores achieved, provides a basis for benchmarking. A score would need to be set for each possible answer (eg for the first question below: Annually = 4; every 2 years = 3; every 3 years = 2; and so on).

'Qualitative Measures / Practices'

Asset Management - Facilities Condition management
• A Facilities Condition assessment is conducted: Annually / every 2 years / every 3 years / less frequently / never
• Part of the assessment is some form of physical inspection: No / for some assets / for most assets / totally comprehensive
• A Facilities Condition Index is calculated: Yes / no
• A prioritised list of work requirements is developed: Yes / no
• A cost estimate is derived for each item/project in the work requirements: No / for some items / for most items / all items
• A recommended year of action is also identified: No / for some items / for most items / totally comprehensive
• The prioritised list of work requirements is a key to determining backlog maintenance works: Yes / no
• The prioritised list of work requirements helps to determine minor capital works projects: Yes / no
• For backlog maintenance, this results in a costed plan for infrastructure sustainability: No / for 1 year ahead / for 2 years / 3 years / longer term
• This plan has been endorsed by the leadership group at my University / Institute / establishment: Yes / no

Asset Management - Maintenance routines and decisions
• A maintenance management system is in place: Yes / no
• Routines are documented and delivered by the maintenance management system: None / some / most / for all systems
• A Planned Maintenance system is in place: Yes / no
• CM, PM, B&DA are accounted for separately (ie for planned maintenance / corrective / backlog / deferred): Yes / no
• There are documented procedures for evaluating outsourcing decisions: Yes / no
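Scoring the practice questions as suggested (eg Annually = 4; every 2 years = 3; and so on) gives each respondent a total that can be compared across institutions. A sketch follows, with hypothetical score tables for two of the questions:

```python
# Hypothetical score tables for two of the practice questions, following
# the suggested scheme (Annually = 4; every 2 years = 3; and so on).
SCORES = {
    "condition assessment frequency": {
        "annually": 4, "every 2 years": 3, "every 3 years": 2,
        "less frequently": 1, "never": 0,
    },
    "physical inspection": {
        "totally comprehensive": 3, "for most assets": 2,
        "for some assets": 1, "no": 0,
    },
}

def practices_score(answers):
    """Sum a respondent's answer scores, giving a total for benchmarking."""
    return sum(SCORES[question][answer] for question, answer in answers.items())

print(practices_score({
    "condition assessment frequency": "every 2 years",
    "physical inspection": "for most assets",
}))  # 5
```

The actual point values per answer would need to be agreed among participating institutions before totals are compared.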
Innovation and Learning Perspective

Objective: Improve key elements of Facilities Management
Performance Indicators: Trend in total Operating Cost per EFTSU; Trend in Facilities Condition Index; Trend in Customer Satisfaction Index; Trend in lost time injuries (see performance indicator 1.2 in the Customer Perspective).
Comments / Source of data: In each of the KPIs proposed, 'trend' means the % movement between each of the last three annual figures. These do not all necessarily have to be moving in the right direction for the 'result' to be good. For example, rapidly recovering a poor FCI may impact severely on operating cost per EFTSU.

Objective: Improve skills of workforce
Performance Indicator: Annual training days per EFT maintenance person.
SCORECARD FOR CAPITAL WORKS

Customer Perspective

Objective: Achieve highest possible level of customer satisfaction
Performance Indicator: Customer satisfaction index (see indicative survey questions below).
Comments / Source of data: This assumes there is a customer satisfaction survey for each project, and that a 'client' (spokesperson for key stakeholders) has been identified. The survey should be simple and capable of being scored. The score for each survey (calculated as a % of the maximum possible score) can be calculated and summarised, along with the other customer satisfaction scores, to an average score in the 'Top Level Scorecard' for Facilities Management as a whole.

Objective: On time completion
Performance Indicator: Average time overrun as % of planned project duration.
Comments / Source of data: A weighted average figure is calculated to take into account the different sizes of individual projects. Source of data: project budgets and schedules.

Customer Satisfaction survey
• We were consulted at the planning stage about our requirements: Not at all / yes, but it was only partially satisfactory / satisfactory
• We were given realistic expectations of what the completed project would provide: Not at all / to some degree / yes we were
• We were kept adequately informed during the project: Not at all / to some degree / to an appropriate degree
• The completed structure has proven satisfactory: Not at all / in part / yes
• Any associated systems contain appropriate technology: None / some / most / for all systems / don't know
• Operating costs so far have proven to be within our expectations: Yes / no
• Remedial work has been unnecessary or of a very minor nature: Yes / no
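The weighted average time overrun can be computed by weighting each project's % overrun; weighting by project value is an assumption here (the scorecard says only that project size is taken into account), and the function name and figures are hypothetical:

```python
def weighted_time_overrun(projects):
    """Average time overrun as % of planned duration, weighted so that
    larger projects count for more than smaller ones.

    `projects` is a list of (planned_weeks, actual_weeks, project_value).
    """
    total_value = sum(value for _, _, value in projects)
    return sum(
        value * (100.0 * (actual - planned) / planned)
        for planned, actual, value in projects
    ) / total_value

# Hypothetical projects: a $900k project 10% late and a $100k project 50% late.
print(weighted_time_overrun([(40, 44, 900_000), (10, 15, 100_000)]))  # 14.0
```

An unweighted average of the same two projects would report 30%, so the weighting materially changes the picture when project sizes differ widely.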
Financial Perspective

Objective: Manage our budget effectively
Performance Indicators: Budget over-run as % of budget (operating budget); Budget over-run as % of budget (capital budget).
Comments / Source of data: Measures the ability to manage the budget. Both measures are reported to the Top Level Scorecard.

Objective: Project competitively tendered
Performance Indicator: % of capital works by value, competitively tendered.
Comments / Source of data: Assesses the keenness to check costs and approaches with competitors.

Internal Process Perspective: Effective Capital/Asset Management

Objective: Minimise management overheads
Performance Indicator: % management costs to capital works value, ie the % which the capital works group annual operating budget represents of the capital works completed over the same period.

Objective: Optimise project cost performance
Performance Indicator: Budget over-run as % of budget (capital budget), separated into: the sum of projects completed by in-house people; and the sum of projects completed by contractors.
Comments / Source of data: Assesses some of the relative merits of in-house projects and placing projects with contractors.

Objective: Monitor portfolio for economies of scale
Performance Indicator: Average % management costs to capital works value, by size strata of projects: all projects less than $25,000; all projects > $25,000 up to $100,000; all projects > $100,000 up to $1,000,000; all projects > $1,000,000.
Comments / Source of data: Highlights needs and opportunities to review and categorise project management practices for different sized projects.
&'
Balanced Scorecard for management & benchmarking
Comments / Source of data
Prepared for AAPPA Conference, September 1999
Performance Indicators
Innovation and Learning Perspective Objective
Trend in % management costs to capital works value
On a longer term basis, measures the degree to which the capital works function is being effective in managing the capital assets aspects - even though some of the measures may be indirect.
)*
+,-
./
012
4
,
3
8
67
5
Trend in total facilities assets employed per EFTSU
Improve key elements of capital assets management
'
(In each of the KPIs proposed, ‘trend’ means % movement between each of last 3 annual figures)
(
Trend in Customer Satisfaction Index
(
Trend in LTIFR for capital works staff and contractors
Annual training days per EFT capital works person
_
'
Improve skills of workforce
Page 35 of 36
&'
BALANCED SCORECARD FOR INTERNAL MANAGEMENT PURPOSES - MAINTENANCE FUNCTION

This corresponds to the first page of the customer perspective of the scorecard shown for the maintenance function on page 16. The key difference is that this layout provides columns for specific targets ('Strategy / Target'), responsibilities and deadlines ('Actions by whom / by when') to be documented on the same page as the KPIs.

Customer Perspective

Objective 1: TO PROVIDE A SAFE ENVIRONMENT
Performance is measured by an improvement in the Safety Condition Index, defined as:

  SCI = 1 - (Value of safety related maintenance backlog / Total value of safety related maintenance backlog at beginning of period)

Strategy / Targets and KPIs:
1.1 Reduction of the backlog of safety related maintenance by 20% p.a. KPI: reduction in value of safety related maintenance backlog.
1.2 Reduction of lost time injury as % of total hours by 10%. KPI: lost time hours of all staff (academic and general), divided by total staff hours available (46 weeks).
1.3 Reduction of WH&S incidents by 10%. KPI: reduction in WH&S incidents = (No. of incidents in last period - No. of incidents in present period) / No. of incidents in last period.

Objective 2: TO PROVIDE RELIABLE SYSTEMS AND SERVICES
2.1 Ensure all response times are met with a 95% confidence rate. KPI: % of response times achieved within published targets for various categories.
2.2 Reduce the number of complaints received per semester. KPI: number of complaints received per semester per EFTSU.

Objective 3: TO PROVIDE AN AESTHETICALLY PLEASING ENVIRONMENT
3.1 Perform regular Occupancy Evaluations and Condition Audits. KPI: respondents giving a score of 4 or 5 (good or excellent) on a 5 point scale.