Encouraging Best Practice in Residential Aged Care Program: Final Evaluation Report
12 - Discussion and Conclusions
The EBPRAC program represents the most comprehensive and coordinated approach to implementing evidence-based practice in residential aged care undertaken in Australia, involving 13 projects working with facilities in 108 locations across six states. Previous work has been limited, generally undertaken on a small scale and within short timeframes. Where there has been large-scale investment (e.g. Dementia Essentials Training under the Dementia Initiative) the target audience has been narrowly defined.
Accreditation and best practice
The Australian Government’s quality assurance framework for residential aged care comprises four elements - accreditation, building certification, complaints handling and supporting users’ rights. The Aged Care Standards and Accreditation Agency has responsibility for the accreditation of residential aged care services. It carries out this responsibility by managing the accreditation process using the Accreditation Standards and by assessing, and strategically managing, services working towards accreditation.

There are different approaches to improving the quality of aged care services. One such approach is to focus on strengths and expand them, and the EBPRAC program represents a ‘strength’ that can be expanded upon. As this evaluation has shown, changing practices to meet the standard of the best available evidence is not easy, even with the level of resources available in the EBPRAC program. At a strategic and policy level the question is not ‘what do we want the whole industry to do?’ but ‘what strengths do we want to build on?’ The concept of ‘strengths’ can be framed in a number of ways. For example, facilities already recognised as providing a high standard of care and which are ‘receptive’ to becoming even better could receive assistance to do so.
Another way of thinking about ‘strengths’ is to consider the evidence itself, i.e. in which areas of practice is there good evidence to support practice change? This is the approach taken in the UK National Health Service with the campaign around ‘10 high impact changes’ (NHS Modernisation Agency 2004) and the 5 Million Lives Campaign by the Institute for Healthcare Improvement in the USA, which focused on 12 interventions known to reduce harm in hospital (see the website for the campaign at http://ihi.org/IHI/Programs/Campaign). Rather than trying to be the ‘best’ at everything, the aim is to set priorities based on the evidence about ‘what works’. The recommendations arising from the program evaluation (see Section 13.2) include elements of the ‘dual pyramid approach’ by suggesting links with the current system of accreditation (recommendations 8 and 9) and focusing on changes with the potential for ‘high impact’ (recommendations 15, 16 and 17).
Implementation
Evidence-based practice has significant overlaps with the concepts of change management, quality improvement and the dissemination of innovations. The literature about how to implement evidence is extensive but the findings are often equivocal, even at what is generally accepted to be the highest level of evidence - systematic reviews and reviews of systematic reviews. Much of the literature is not as useful as it might seem at first, primarily due to various methodological issues and lack of details about, for example, implementation.

These findings from the literature are relevant to any consideration of what was done as part of the EBPRAC program and what was achieved. Despite extensive progress reports and final reports it is not entirely clear how extensively changes in practice were implemented. Most of the changes were small-scale, undertaken by many different members of staff, at all hours of the day and night. There is no easy way of ‘measuring’ how often, or how well, residents’ teeth are cleaned or moisturiser applied to their skin to prevent skin tears. In day-to-day practice, with so many one-to-one interactions between residents and staff, there is no way of capturing how residents are spoken to by staff while being showered, assisted with their meals or helped to go back to sleep in the middle of the night. The corollary of this is that without neat and tidy measures of how extensively changes have been implemented it is very difficult to make judgements about why improvements may have taken place.
Implementation was generally consistent with what projects set out to achieve, as described in their original funding submissions and project plans. Some projects did not just implement ‘evidence’ but also added to the available evidence. Changes in project scope usually involved an increase in scope, particularly regarding the development and delivery of education programs. There were some delays but nothing that would be considered unusual in a program of this scale and complexity. No project implemented activities ahead of schedule. It was difficult to assess the ‘stage’ of implementation or degree of implementation although there were indications that in the case of some projects full implementation in all facilities was not achieved.
All projects followed a consortium approach, all except one with stable leadership. Good relationships were established between the consortiums and participating facilities. Some projects with large distances between consortium partners had to spend more time and energy building partnerships than more localised projects. It appeared that all project phases (governance, establishment, implementation and evaluation) were facilitated where consortiums involved project leads with strong pre-established relationships and credibility amongst consortium members and/or with participating facilities. Consortiums involving facilities which were part of large aged care organisations increased the potential for knowledge transfer within those organisations because of the infrastructure and systems in organisations of that scale.
The implementation strategies adopted across the 13 projects were wide-ranging and consistent with what is found in the literature on evidence-based practice. Many strategies were employed to engage and support staff in changing practices, with the most common being education of one form or another, use of local facilitators (champions, link nurses) and the collection of data (either of clinical performance or data on individual residents) which was then fed back to staff to inform future actions. All of these interventions have been shown to be effective, to varying degrees, in other settings. The evidence is not always strong, which is more a reflection of the current ‘state of the science’ regarding how to change practices rather than the methods themselves.
Some elements of what the projects implemented were standardised, particularly with regard to training. However, much of what was implemented was not standardised and varied from facility to facility, in response to local needs. This is entirely appropriate but, again, does make it more difficult to interpret the results achieved by each project. Some projects (e.g. oral health, pain management) adopted more of a ‘top down’ approach by indicating to staff what should be done (here is the evidence, so this is what you should do), while leaving scope for how it might be done. Other projects used more of a ‘bottom up’ approach, where staff decided what they would implement and how they implemented it.
In general, the approach to educating staff focused more on one-to-one and small group learning rather than large-scale, primarily didactic, education. The evidence from the literature supports this approach. The program featured three interventions that have to date had limited application in residential aged care – academic detailing, action research and the Collaborative Methodology. All projects adopted a multi-faceted approach to change, which is recognised as more effective than reliance on single strategies. All projects included a financial incentive for facilities to participate, usually to cover the costs of staff training. The rationale for projects selecting the implementation strategies they used was underpinned by a mix of evidence, previous experience and available expertise.
Residents had little influence on project design and implementation. Projects tended to focus on keeping residents informed rather than seeking their opinion about what should happen. Various approaches were undertaken to achieve this including the use of posters, brochures, newsletters, speaking at resident meetings and media releases to local newspapers. Cognitive difficulties made communication with some residents difficult.
Evaluation
Many activities were undertaken both to change practices and to collect data for an evaluation. Project evaluations used a ‘before and after’ design, i.e. measuring a series of variables before implementation, and then measuring the same variables after implementation. Some projects added elements of process evaluation, to work out what might be going on during implementation. In the absence of any control groups there is a need for caution in interpreting the results of such evaluations. There are many other reasons why improvements may have taken place other than the project itself, particularly given the factors that have been shown in this evaluation to be important influences on implementation (e.g. leadership and management support). The projects did not take place in a vacuum – many other changes took place at the same time which influenced implementation.

The extent of data collection for the project-level evaluations was extensive, with much of the data collection also informing project implementation. Across all projects 18% of project funds were devoted to evaluation, ranging from a low of 9% to a high of 46%. On the whole, project evaluations were consistent with what was proposed in the original funding submissions and evaluation plans. The exceptions are the infection control project, which did not undertake key elements of its evaluation, and the three projects which intended to conduct economic evaluations but ended up not doing so.
Impact
Collectively, the projects had a positive impact on residents, staff and facilities, with considerable variation between individual projects, and within each project. Changes to the care received by residents were diverse. Many of the changes built on work that had been done previously in participating facilities and were relatively small scale and incremental in nature. In part this reflects the focus of the program and the available evidence, but it is also indicative of the capacity of the sector to change. The capacity to change depends on the availability of resources, including the knowledge and skills of staff, and on the nature of daily work. It is also influenced by a wide range of factors largely outside the control of those trying to bring about change, e.g. turnover of facility managers, which had a significant impact on some facilities and some projects.

A useful analogy for the EBPRAC program is that of throwing a rock into a pool of water. The rock (the project) makes a big splash but by the time the effect has ‘rippled out’ to the periphery it is much smaller. The ‘ripple effect’ of implementation results in lots of small changes, rather than a few large changes. The important thing is that the changes keep happening and achieve a cumulative effect.
All projects provided some evidence that practices had changed for the better, although for two projects the changes were minimal. The pain management project had the best evidence of improved practices by being able to show increased adherence to ‘best practice’ guidelines between project commencement and project end. For most projects the scale of practice improvement was difficult to assess.
Generally, resident outcomes were difficult to measure, or the evidence for improvements in outcomes was not particularly strong, but this is not unexpected and is consistent with what is found in the literature. For many people residing in aged care facilities, maintaining health status rather than improving health status may well be a satisfactory outcome. The best evidence that resident outcomes improved came from the three behaviour management projects and from the projects that focused on prevention, which resulted in improved oral health and a reduction in wounds.
Impacts on staff were mixed but generally included improvements in awareness, confidence, knowledge and skills. Staff had improved access to and use of evidence-based resources and tools. There was evidence in some projects of greater collaboration between nursing staff and personal carers, as well as with health and allied health in the planning and provision of care.
Impacts on facilities included improvements to the physical environment, better access to equipment and outside services and improvements in key processes and systems of care.
Each project identified the main outcomes that were to be achieved over the course of the two years. Many of the intended outcomes were expressed in ways that made it difficult to determine whether the outcomes had in fact been achieved, which may have contributed to a lower rate of achievement than if the outcomes had been expressed more precisely. Projects had more success achieving intended outcomes for facilities and staff than for residents.
Key success factors
Much of the program evaluation was ‘framed’ by the evidence from the literature about the ‘key success factors’ for implementing evidence-based practice. The results from the program evaluation indicate that three of these factors were of particular importance – a receptive context for change, the availability of adequate resources and engaging the relevant stakeholders.

An important ‘key’ to successful implementation was leadership - where it comes from is not so much of an issue as long as it comes from somewhere and does not rely on one person. This finding is not only based on the results of the evaluation regarding program implementation (Section 3.5.1) but also the views of the high level stakeholders interviewed for the evaluation (Section 11.5). It is recognised that investments have been made in the past to improve leadership skills within the sector but this finding warrants serious consideration being given to building on that investment (Recommendation 10).
Important though leadership may be, the mixed results for the impact of the program (see Chapter 4) indicate that there are no ‘magic bullets’ for successful implementation of evidence-based practice in residential aged care. Considerable resources were spent educating staff (see Section 5.1). However, education can only take things so far. Knowledge is a necessary pre-condition for change to occur but is insufficient on its own to change behaviour. The projects demonstrated that it was important for staff to be able to ‘see’ the benefits of what they were being asked to do and to understand why changes were necessary. This was more important than simply being told that there was ‘evidence’ to support a change taking place. Many of the changes involved additional work, at least initially, which was difficult to incorporate into a pattern of daily work characterised by ‘busyness’ and lots of routine. These and other lessons from the evaluation have been summarised in a series of ‘principles of practice’ detailed in Section 6.
Program objectives
Table 30 provides a summary of how the objectives of the EBPRAC program were met, drawing on evidence from across the program evaluation. The relevant section(s) of the report containing the evidence to support the conclusions about whether objectives were met or not are included in the third column of the table.

Table 30 Summary of achievement of EBPRAC objectives
| Objective | Evidence to support achievement | Source of evidence in report (Section) | Objective met or not met |
|---|---|---|---|
| Improvements in clinical care | Diverse range of changes made to clinical care but difficult to measure extent of implementation. Outcomes for residents highly variable. | 4.2 | Objective met. |
| Opportunities for aged care clinicians to develop and enhance their knowledge and skills | The program provided extensive opportunities to develop and enhance knowledge and skills, primarily of nursing staff and personal carers. | 4.4 | Objective met. |
| Support staff to access and use the best available evidence in everyday practice | Extensive support (education, facilitation, education resources, financial resources) provided to staff to access and use evidence in everyday practice. | 4.4, 5.1, 5.2 | Objective met. |
| Clearer industry focus on improvements to clinical care | Limited awareness of the EBPRAC program amongst those not directly involved in the program. The program may well provide a clearer focus in the future but this will depend on the extent to which findings and resources are made known to the industry. | 11.2 | Objective not met. |
| Wide dissemination of proven best practice in clinical care | Wide dissemination has occurred, primarily within the program. A lot will depend on the extent to which the resources developed by the projects are widely disseminated. | 8 | Objective met. |
| Develop national clinical or educational resources and evidence summaries that support evidence-based practice in aged care and are able to guide the ongoing development of accreditation standards | Comprehensive educational resources and evidence summaries have been developed. It is too early to assess the extent to which this work will guide the ongoing development of accreditation standards. | 5.2, 5.2.1 | Objective partially met. |
| Build consumer confidence in the aged care facilities involved in EBPRAC | Objective not well incorporated into project activities. Consumer confidence not evaluated by any project. | Not applicable | Objective not met. |
It is concluded that four of the seven objectives were met, one was partially met and two were not met. This is a good result, given that some of the objectives were ambitious and that, for others, it is too early to fully assess the extent to which they have been met.
Ongoing use of evidence
The EBPRAC program has been operating ‘in parallel’ with existing mechanisms for developing evidence. Section 2.2 includes two such examples:
- Publication in the same year (2009) of two guides to implementing evidence-based falls prevention in residential aged care, one published by the Australian Commission on Safety and Quality in Health Care and one published by the EBPRAC Round 1 falls prevention project.
- Development of evidence summaries by the EBPRAC Round 2 wound management project at the same time as the Joanna Briggs Institute, as part of its work maintaining the JBI COnNECT (Clinical Online Network of Evidence for Care and Therapeutics) Aged Care website, continued its own production of evidence summaries, including summaries on wound care.
According to Dearing ‘the state of the science (what researchers collectively know) and the state of the art (what practitioners collectively do) co-exist more or less autonomously, each realm of activity having little effect on the other’ (Dearing 2006, p 5). The EBPRAC program has helped to bridge the gap by bringing researchers and practitioners together. It is important that this momentum be maintained.
Dissemination
Dissemination of project activities was extensive, with over 2,200 dissemination activities estimated to have ‘reached’ over 200,000 people. Dissemination activity was focused at the project level rather than the program level, with the national workshops being a major exception. Feedback indicated that the workshops largely met the workshop aims, assisted in understanding how individual projects fitted within the program, were a worthwhile use of time and a useful way of promoting networking, interaction and the sharing of ideas.

The majority of projects had similar dissemination and/or marketing strategies, with a focus, particularly in the early part of each project, on activities at the local level (project branding/logo, newsletters, engagement of key local stakeholders). Later dissemination strategies included presentations at conferences and the Better Practice seminars run by the Aged Care Standards and Accreditation Agency. The publication of journal articles by project teams has the potential to significantly add to the available literature on how to implement evidence-based practice in residential aged care. It is unclear how nursing staff, personal carers and other staff throughout the industry will be able to readily access information about individual projects or the program more generally.
The program has resulted in the development of a significant volume of materials (education programs, tool kits, evidence summaries) which require some means of dissemination and regular updating. To a certain extent this has happened already with two Round 1 projects making their resources available on the Internet. To facilitate ongoing dissemination there is a need to consider the use of some ‘higher level’ dissemination strategies such as linking the work of the EBPRAC program with the JBI COnNECT Aged Care website, the Aged Care Channel and the education programs of the Aged Care Standards and Accreditation Agency. Such dissemination would benefit from a degree of planning, rather than allowing it to occur in an ad-hoc fashion. These findings are the basis for a series of recommendations regarding dissemination (see Section 13.2, recommendations 12-14).
Important resources for dissemination of the findings from the EBPRAC program are the final reports produced by each project. There is some variability in the way the final reports were compiled. Some are relatively ‘self-contained’, providing a good summary in the main body of the report. Some rely on extensive appendices and some refer to other documents produced during the lifetime of the project.
There is also some variation in the quality of the reports. Some are well-written but would be enhanced by more attention to formatting and appropriate use of tables and figures to illustrate points and summarise data. Some reports include too much detail, which can make it more difficult to understand what happened and what was achieved during the project.
The reports should be made widely available but, before doing so, there would be merit in employing an editor to work with the lead organisations to prepare a suite of final reports that are consistent in style and quality. In most cases readability would be improved by editing the text and focusing on the key issues. The reports could be put together as a monograph series (Recommendation 11).
Sustainability
Sustainability is probably the most challenging aspect of any program. The sustainability tool used during the evaluation measured factors that have been shown to influence sustainability. When the results from using the tool at the beginning and end of each project are compared it shows an increase in the likelihood of project activities being maintained. The areas with the greatest potential for improvement by project end were being able to show that the changes improve efficiency and make jobs easier, that the appropriate infrastructure (staff, equipment, job descriptions, policies, procedures, communication systems) is in place to support the change and that senior leaders are engaged. Sustainability will depend more on factors within each facility (e.g. the presence of leadership and management support), than what was done by each project.

Sustainability of what has taken place so far will not be helped by undertaking more projects. What is required is a more strategic approach that supports the ongoing development and implementation of evidence, at the same time as providing a receptive context for implementation to take place. Of critical importance to that ‘receptivity’ is the availability of people who can provide the necessary leadership. Jeon et al (2010) argued that ‘there is an urgent need for a national strategy that promotes a common approach to aged care leadership and management development, one that is sector-appropriate and congruent with the philosophy of person-centred care now predominant in the sector’ (p 1).
Future options
There is a fundamental contradiction in the EBPRAC program – each project focused on one area of practice when, in reality, each facility needs to focus on multiple areas of practice (even more than the nine clinical areas covered by the program). The program has demonstrated how difficult it can be improving practices in just one area.

The EBPRAC program is a major initiative to improve the use of evidence within residential aged care. Sometimes the evidence is packaged in something like the national palliative care guidelines. More often, the evidence can be more difficult to find, spread across multiple guidelines in multiple locations. The guidelines that exist require updating at regular intervals. The significant volume of educational materials being generated by the program will require some means of dissemination and regular updating. What is being learnt about changing practices will need to be incorporated into the daily life of facilities and the structure of the industry if it is not to be lost as ‘just another program’ that came and went.
One of the challenges for the future of EBPRAC is how to incorporate the dynamic nature of ‘evidence’, across all areas of practice, into ongoing work to maintain and improve evidence-based practice. There is scope for greater coordination to avoid duplication, facilitate consistency in the production of evidence, share knowledge about how best to implement evidence-based practice and link the various resources that are currently available. Existing mechanisms would benefit from the establishment of a central agency, separate from government, with responsibility for supporting the ongoing implementation of evidence-based practice in residential aged care. The roles and responsibilities for such an agency are set out in recommendations 2-7 (Section 13.2).
Much has been learnt from the 13 projects included in the first two rounds of EBPRAC. It would be preferable to invest in making the most of what has been learnt so far before embarking on more projects of a similar nature (Recommendation 15). If any projects are to be funded similar to those in Round 1 and Round 2 it may be better to more closely align those projects to 'real world' conditions, without some of the resources allocated to Round 1 and Round 2 (Recommendation 18).
There is a need for ongoing research into how best to implement evidence-based practice. Despite the considerable volume of work done to date, primarily in acute hospital services, there are still considerable gaps in knowledge about how to implement evidence in an efficient and effective way. Research into the roles and responsibilities of local facilitators would be a good place to start (Recommendation 19).
