'Beyond Bricks and Mortar - Building Quality Clinical Cancer Services' Symposium 2011
Quality Systems Assessment - Professor Clifford Hughes
Chief Executive Officer, Clinical Excellence Commission NSW
Download powerpoint presentation by Professor Clifford Hughes (PDF 3001 KB)
Introduced by Norman Swan:
Welcome back. So let's pursue in some more detail, with a bit more granularity, this issue of quality and safety: how to achieve it, how to take a systemic approach rather than just blaming the bad eggs, and, following that theme of Edwards Deming, looking for continuous quality improvement and reducing variation. And for some reason we've got two surgeons in this next session, so gird your loins. The first of them is Cliff Hughes, who runs the Clinical Excellence Commission in New South Wales. He has a longstanding commitment to safety and quality, is a cardiothoracic surgeon by training, and has pursued that commitment as one of the founders of one of the most innovative group practices in Australia - you wonder why other specialists don't model it, and very few do - which becomes a very safe way of practising in the public and private sectors.
Please welcome, Cliff Hughes.
Prof. Cliff Hughes:
Thanks, Norman. Good to be here. Time is short, so I want to keep moving and I want to talk to you about a new program, a relatively new program in New South Wales that we've developed to help deal with this issue about safer systems and better care. And it really builds quite nicely - builds quite nicely on the concept of bricks and mortar. I'm actually wearing some spirit level cufflinks if you want to have a close look later on, before I fly out.
And it comes down to the concept of what happens as something is being built. And it's not just about the bricks and mortar. There are two measures that are of critical importance on a building site. The first is the tape measure and the second is the spirit level. And they are absolutes. And you can use those to determine a whole lot of other things.
We are, unfortunately, not in a position to have those absolute levels in Health in most instances. But we do have to find a way to get around the problem. If you don't get things true when you build them, what are you going to do to stop it from toppling over with the sorts of headlines which we see here which are all too common in any jurisdiction across Australia?
We were concerned about this, and it was pointed out to us after Bret Walker SC conducted an inquiry into the Campbelltown and Camden Hospitals that there was no real way of assessing how good we were at building systems in our bricks and mortar hospitals or other facilities to determine whether we had a safe system for patient care. We were asked to set up that program.
Now it's very easy for us to think about why, what and who. But we really need to find out how you can go about doing that. This was a totally new concept. We couldn't find anyone talking about it anywhere in Health. So we decided we'd look elsewhere and find out the answers and we went to the so called high reliability industries. Now the high reliability industries are easy to identify. They're the ones that you see on the front page of the newspapers when something goes wrong.
And it's interesting that they share certain hallmarks: they constantly self assess, not only their outcomes but also the systems they have in place. Even those that have had major catastrophic front page news headlines now have systems in place to deal with the problems that lead to trouble.
So we've developed, in conjunction with external consultants, some guiding principles around what we've called a quality systems assessment in Health. You can see the points here and I'm not going to go through them all. Suffice to say that this is not accreditation, and it is not a competitor with accreditation. Rather, it underpins accreditation - in many cases it comes before it - and allows teams to think about what will happen when accreditation is necessary, and it is. It is certainly not a pass/fail exercise. There is no certificate for participating in the quality systems assessment.
So what is its purpose? To identify state wide policy and program gaps and report the results back to the public as well as to the system. And to assess how effectively those policies are implemented at the local level, the local team level. It is not meant to be simply a tool for accreditation in a pass/fail exercise.
So how does it work? On the basis of all the information we receive from our incident information management program, and advice from clinicians, we have developed a five year framework. In the first year of our framework we did a survey of all the clinicians in New South Wales by tier level - I'll come back to that in a moment - to identify what they saw as the gaps in system application in their unit, their facility or their area health service.
Out of that we identified some particularly important gaps and we let the system know that next year, we'd come back and we would specifically evaluate in more detail, those particular quality systems that were or were not in place across the state. We'd do another four or five the following year and another three or four the year after that. And after the five year cycle is complete, come back and do a whole system assessment for each unit, each area health service and each facility. So there are three tiers.
This is the cycle that we go through, and you can see there's an online multilevel self assessment. That is fed back as raw data which is analysed by our own team, and we then go back to the unit with that information and ask them what improvement plans they are going to develop. They do the work. Then we go back five months later and we verify two things: the data that they provided to us, and whether or not they're implementing their improvement plans. This is a very different process from other programs where you're simply feeding stuff into the central office and waiting forever for it to come back.
Some interesting things have come out of our program. The first was the participation. This was voluntary in one sense - it's very hard to mandate things in Health; there's always someone who doesn't want to do it - and it was interesting that the immediate uptake in 2007 was 82% of all the clinical and ward units across the state. There were eight area health services involved in this, around about 260 hospitals. The rest of those numbers relate to clinically based teams or ward areas. You can see that now more than 93% of our units are responding to this program.
First question: why such uptake in a voluntary reporting program? Unusual. We then had to set about making sure we asked the right questions, and we did that in conjunction with the clinicians themselves. We also did it in conjunction with experts who'd looked at the likes of Deming and (inaudible) and so on, and had identified the areas where we knew that systems were routinely failing. And we examined the feeling of the staff about what systems were in place. We were not examining outcomes; we asked them whether they knew what was in place. And here is the cycle of the issues that they identified and asked us to review in the following three years. You can see 2009: clinical handover, communication, the deteriorating patient, medication safety - all pretty basic stuff that staff were concerned about. 2010: open disclosure, teamwork, HAIs. And last year - sorry, this year - sepsis, paediatrics, mental health and delirium. Next year we'll repeat the baseline.
Now, none of those were a surprise to anyone working in the system. So if they were not a surprise, why is it that we hadn't dealt with them before? Well, we suddenly realised that unless you actually analyse the systems that you have in place, you don't know when you fail to provide the care. What we found came down to some very basic statements.
First of all, if you have a look here, you can see that there was high compliance in activities where there are clear policy directives. So our Blood Watch program, where there are clear guidelines - people have those systems working. But where there is a lack of policy, such as mortality review, audit, medical record review and so on, there is very poor compliance. I'll show you some numbers about this in a moment.
So we began to see that the systems we have in place actually do impact on the performance measures and outcomes we wanted to look at. What we do is give back to each unit, each facility and each area health service a graph that looks something like this. You can see that the olive green is what you really want to see: almost always or often. You can see some of the issues we asked about around the infection control process.
At first sight, you think, "Well, we're doing pretty well on that". We can actually see that we're getting right up into the 80% to 90% range for most of these questions. But then when we actually asked whether people were measuring these things, suddenly the reason was obvious: they simply didn't know, they were guessing at their answers. This is one of the issues that we found to be particularly important - and still is - with respect to hand washing.
People talk about compliance but they're not measuring it. This one is around non-policy areas, and again you can see the variations between those who have policies. Results of clinical audits are provided to clinical staff in pretty much most units. Do periodic audits of clinical practice for high risk procedures occur? Yes, they do, in a small group. All deaths in the unit are reviewed - but the high risk ones are ignored.
We then asked the staff what they thought were the risks to patient safety. At a clinical unit level, this is what they told us. They actually didn't tell us that it was a matter of not enough staff or not enough money; they told us it was medication management, clinical management and falls.
They had a different view of the world than we did centrally. This is coming from the staff themselves. When we actually had a look across the three layers, we found we got different answers from an area health service than we did from a clinical manager in a facility, or from a head of a unit or a nursing unit manager.
So here, for instance, when we're looking at medication safety, at the clinical unit level only 25% of respondents performed regular audits. Yet the Area Health Services told us they'd been performing reliably across their facilities. We went and looked at that further in 2009, and 80% of the area health services responded that they had a policy relating to medication safety.
But when we had a look at the units - 35% and 22% - there's a big disconnect between what people thought was happening and what was really happening with respect to clinical systems. We can go on through a whole series. This is the one about unit level responses to medication management, and you can see here with anticoagulation that only 50% of departments had a process for managing anticoagulation - yet we knew that to be one of the most dangerous drugs in the system. We had a look at that with the Between the Flags process. We were able to review our own programs and see just what the problems were. This is the response to open disclosure, which has been in New South Wales now for over five years, and you can see we've actually made some significant strides: at the general level, open disclosure occurs in departments about 95% of the time. We've still got to work on the high level one. But we're now getting some idea about the systems which will ultimately produce different outcomes.
We then began to look at what was happening in newer areas, and the first is teamwork. These are just examples that came out of the QSA. Here is what the staff told us were barriers to teamwork: staffing levels and/or staff mix - so here comes the resource issue; if you're too busy, it's very hard to take the time to build a team. Lack of time or workload - 38% of staff felt that. And so we started to get a picture of what we needed to do at a unit level to bring systems to bear on the problems and drive the change.
Now, in case you think that this is a bureaucratic exercise, this shows you just who it was that was completing the surveys in our program. Nursing staff complete half of them - in many cases on behalf of the unit manager or the medical unit manager. But 16% of our doctors are, in fact, getting involved in this process. Allied health, about 6%. Ambulance and paramedics - interestingly - 12%. And about 6% of management executives are involved in the program.
So it's covering the people who are actually trying to do the work - this is not simply another bureaucratic exercise. Not only that, we don't rely solely on the information we're told. Five months after all the data has been collated, we go back out with a series of teams - and we don't have time to show you all of them - and we verify the data. We have over 60 clinicians from all hospitals who form teams, go out, sit down, look at the data, ask questions and ask to see where the data is collected.
For instance, if you tell us that you do a mortality review, we ask you to show us the minutes of your last three monthly meetings, so we know whether or not it's happening on site. This is the most extraordinary slide, I think, that I've come across related to any form of audit.
You can see there three years' review, and the accuracy of the responses - verified by independent assessors on site looking at the evidence - was 98.6% on two occasions and 97.8% in this year's survey. And look at the numbers of responses that we verified: that's more than 20% verification each year, so over five years every unit will have had a verification process done.
Second question: why are we getting such an accurate response to an audit of systems in our health jurisdiction? I'm not going to go through this in detail, except to say that there's an opportunity for us to identify gaps and provide recommendations. More importantly, there's an opportunity for a unit to give us exemplars of what they've done well, so that we can tell other units just what should happen.
Now we publish all of these, and here are the first four reports; the last one is about to come out, waiting for DG sign-off just at the moment. So let me put these questions to you: why do so many respond? Why such accuracy? Can I suggest to you that it's because this is not a bureaucratic, distant or central exercise - it's clinicians on the ground looking at what they do and what systems are in place, and it provides them with a clinical risk management tool to correct those system gaps before they become issues in our reporting, through our surgical audit, death audit, bureaucratic audit or even a coronial audit.
This is a clinically based risk management program which the system has embraced wholeheartedly, and our assessors are welcomed. But that's not enough; we need to be sophisticated in our approach and make sure that we are actually up to speed with new technology.
And the secret to that is to ask each of the units to demonstrate their improvement plans. Next year, when we go back to see those units who submitted an improvement plan for a gap, they know that we will be asking them: what is your improvement plan delivering, and has it solved the problem? And if it's an exemplar, we want to tell the rest of the state. This is not simply a matter of numbers for numbers' sake; it's numbers to drive the improvement of care.
If you want to talk to us about this program, there are all the addresses; these slides and more will be in your handouts. We'd be happy to talk to you. But I'd suggest to you that if you want to drive a good system further, don't think just about bricks and mortar. Think: are the people and the systems within it true - as true as possible to whatever absolutes we can get?
Spirit levels measure the horizontal but they also measure the vertical. They both need to be straight lines. Thank you very much.
(applause)
Norman Swan:
So, Cliff, while people will come up with improvement plans, often what you find when you feed back data to people is that they start realising things and you get change happening anyway. Have you got anecdotes of change that happened off people's own bat, without actually writing down an improvement plan, because they realised there was obvious stuff going on?
Prof. Cliff Hughes:
Oh yes. Perhaps one of the most dramatic examples: country hospital, geriatric-type ward, mostly female population, cold climate, vinyl floors; farmers' wives are the patients, they knit bed socks; and they were still using alcohol based hand rubs - back rubs - that was a bit of a slip of the tongue, wasn't it - and talcum powder.
Norman Swan:
So a lot of people were skating over floors?
Prof. Cliff Hughes:
Absolutely, especially when they gave the diuretics in the middle of the afternoon. Before we even got there, the nursing unit manager had identified the problem and changed all those things: you couldn't have knitted bed socks, you had to have rubber soled ones. Back rubs went. Diuretic dosing was changed. Bedside handover became the rule rather than the tearoom handover. Seventy-five per cent reduction in their incidence of falls in the space of 12 months.
Two inches away in the next ward, which happened to be a neurology ward, they still had a high rate of falls because the two units hadn't spoken to each other. And when I spoke to the General Manager, he said, I'm not surprised, those two NUMs haven't spoken to each other for 20 years…
Norman Swan:
And neurologists are locked into their steel trap brains.
Prof. Cliff Hughes:
That was your comment.
Norman Swan:
Of interest here, of course, is team based care and working in teams. It gets better outcomes - we know the story - and what you said, I'm sure, rings true for a lot of people here. It's time, it's effort, and the reward at the end is sometimes questionable - the multidisciplinary team conferences, etcetera. How are you approaching this issue at a systemic level?
Prof. Cliff Hughes:
Good point. Out of the learnings from this, a number of our programs arise, and on 7 September we're going to be launching a program called In Safe Hands, which brings all of the teams, including allied health and the other groups, into the fold and says, "In our unit, how can we improve our care?" We know that over the last four years there's been no change in the culture of our doctors, our nurses or our allied health professionals towards multidisciplinary care.
That's the challenge. But the staff have told us that that's the challenge. Now we've got to do it, so we're putting our usual projects, CPI programs around that issue, as a result of the QSA.
Norman Swan:
And just tell us more about this gap: the Area Health Service, on its books - and given that you've got high veracity there, they're telling the truth - has a medication safety program, but down at the clinical level they haven't got a clue that there's one. Either it's an inappropriate medication safety policy or it's badly communicated. Have you dissected what the story is there?
Prof. Cliff Hughes:
It's a bit of both. Having headed up a department for many years, I know I had a shelf full of red folders which were policy documents from New South Wales Health. I don't recall that I ever read them, and that was my mistake - they actually had ways in which we could improve, but they were this thick. What we need are clinical guidelines that are one or two pages long, that actually enable us to develop and implement a policy that works at the coalface.
So we're trying to bring both together. The area health services - or, as it is now, the Department of Health - realise that their communication didn't get beyond the C-suite in the Area Health Service. It wasn't getting to clinical governance, and it certainly wasn't getting to Heads of Department.
Norman Swan:
So you talked about the NUM who changed the vinyl floor situation in Orange or wherever. But what about the situation, which is quite common, where people say, "Look, they've just got stupid rules - the rosters are out of kilter, or there are silly rules about X, Y and Z - and it just militates against us, and we can't get anybody to listen to us at an organisational level to change the system"? That's actually a much tougher thing than getting people to wear rubber socks.
What examples - have you got any examples of that feedback working?
Prof. Cliff Hughes:
That has been one of the challenges about leadership: in Health we have failed to adopt the policies of the big industries to grow young, dynamic, free-thinking leaders. Our clinical leadership program is now in its fourth year, and it's been so successful that the DG has asked us to double the intake for the last two years - 150 coalface projects designed by young, enthusiastic clinicians to solve those seemingly insoluble system problems.
One quick example: one of our spinal injuries units had poor outcomes and huge lengths of stay in hospital, and, like most spinal injury units, would only take people when they were ready to operate. As a result of this program they adopted a no-refusal policy, with immediate assessment and, where appropriate, early intervention. Their bed occupancy went down, they were able to accommodate all their referrals, and they were taking referrals from the other major spinal injuries unit. That came from the coalface. They knew how to solve the problem; we had to let them.
Norman Swan:
Any questions or comments for Cliff? You've said it all. Thanks, Cliff, that was great, thanks.
(applause)