Comorbidity treatment service model evaluation

Methodology

Page last updated: August 2009

The Comorbidity treatment service model evaluation project was established in May 2008. It has three components: a literature review; development of an evaluation tool; and a service model evaluation. The tasks involved in the three components are outlined in figure 4 below.

Throughout the project, AIPC project staff received advice on methodology and feedback on draft documents from two experts on mental health and alcohol and drugs, Drs Virginia Lewis and John Pead from the Australian Centre for Posttraumatic Mental Health, University of Melbourne.

Figure 4: Components of the comorbidity treatment service model evaluation project

Text version of figure 4

Component 1 - Literature review

  • Review of literature of key areas in comorbidity treatment service delivery models
  • Gather feedback from key informants to confirm findings of literature review and address any gaps
  • In consultation with DoHA, develop evaluation questions informed by the literature review. Continue to Component 2.

Component 2 - Develop evaluation tool

  • Develop a draft treatment service model evaluation tool based on evaluation questions
  • Pilot the draft evaluation tool with a small number of diverse service types
  • Revise the evaluation tool and seek feedback from pilot participants and DoHA
  • Finalise treatment service model evaluation tool.
  • Deliver mid-project presentation to DoHA.

Component 3 - Service model evaluation

  • In consultation with key informants and DoHA, identify services to be evaluated that are representative of a range of comorbidity treatment models
  • DoHA to invite identified agencies to participate in the evaluation
  • Implement evaluation. Provide ongoing support to recruited agencies
  • Analyse data and prepare draft report, including brief reports on each participating service. Continue to Completion.

Completion

  • Finalise and submit evaluation report, dissemination plan and financial report.

Component 1: Literature review

The literature review was undertaken between May and August 2008. It began with a literature search drawing on a broad range of sources, including local and international peer-reviewed journals and publications from government and non-government agencies. Strategies for finding relevant literature included:
  • Search of computerised databases: CINAHL3; Medline4; EMBASE5; Informit6 (by subjects: Health [which includes DRUG] and Social Sciences); PsycINFO7; and the Cochrane Library (including the Database of Reviews of Effectiveness [DARE]).

  • Review of key journals for latest content not yet included on databases.

  • Search of state, territory and federal government websites (including AIHW) for information relating to comorbidity policy, strategies, programs and reports.

  • Search of various Australian information clearinghouses, networks and databases, including: the Australian Drug Info Clearinghouse, provided by the Australian Drug Foundation (ADF); Australian Drug Information Network (ADIN); Register of Australian Drug and Alcohol Research (RADAR), provided by the Alcohol and other Drug Council of Australia (ADCA) National Resource Centre; and Australian mental health associations/foundations.

  • Search of various international websites including: Drugscope (UK); National Health Service (UK); and Substance Abuse & Mental Health Services Administration (SAMHSA [USA]). A general web search was also conducted through Google.
The parameters of the literature search were as follows:
  1. The years 2000–2008 (unless alerted to a particular pre-2000 study or document through review of post-2000 documents).

  2. English-language literature, with no specific exclusion by country.

  3. Focus on peer-reviewed studies or publications on treatment models or frameworks; service delivery and implementation; and service improvement and management of comorbidity, with some additional contextual information included for background.

    1. Included: Peer-reviewed studies pertaining specifically to co-occurring mental health and substance use issues.

    2. Excluded: Peer-reviewed studies of service delivery or interventions for substance use or mental health problems alone, where comorbidity was not specified as integral to the program under review.

  4. Grey literature and other government or agency documents with relevance to comorbidity.

  5. No specific exclusions were made regarding populations (e.g. youth, prison populations, Indigenous Australians), but there was also no specific focus on any particular population.

On completion of the literature review, the draft report was delivered to DoHA and distributed to 12 key informants for review. The informants were selected jointly by DoHA and the evaluators and are experts in at least one of the following areas: comorbidity; mental health; substance misuse; rural/metropolitan health care settings; and/or service delivery design. Feedback was received from 10 of the key informants, covering issues such as gaps in the research and areas for further research. Informants were also asked to suggest treatment services to be evaluated as part of the third component of the project.

La Trobe University Human Ethics approval had been obtained to interview key informants.

After consultation with key informants, the literature review was finalised and submitted to DoHA.

Component 2: Development of an evaluation tool

Drawing on the literature review and the consultation with key informants, a program logic was developed for the service model evaluation. A program logic is a hypothesised map of cause and effect. It is expressed visually as a map depicting the inputs and activities that have been put in place or undertaken, and the structures and processes that have been developed, so that the intended changes or impacts for clients occur and the goals or outcomes of a program or treatment service can be achieved.
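
As a purely illustrative sketch, the cause-and-effect structure of a program logic can be represented as a set of ordered domains, each with sub-domains from which broad evaluation questions are drafted. The domain and sub-domain names below are hypothetical and do not reproduce the program logic map developed for this project.

  # Illustrative only: a program logic map represented as ordered domains,
  # each with hypothetical sub-domains. The actual map developed for this
  # project is not reproduced here.
  program_logic = {
      "inputs": ["funding", "staffing", "partnerships"],
      "activities": ["screening and assessment", "integrated treatment planning"],
      "structures_and_processes": ["referral pathways", "staff training"],
      "impacts_for_clients": ["improved engagement", "reduced substance use"],
      "outcomes": ["sustained recovery", "improved mental health"],
  }

  # Broad evaluation questions can then be drafted domain by domain,
  # mirroring the approach described in the text.
  for domain, sub_domains in program_logic.items():
      for sub in sub_domains:
          print(f"How does the service address '{sub}' ({domain})?")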

Broad questions were developed from the domains and sub-domains of the program logic map, and these formed the basis for a treatment service model evaluation tool designed to gather information on the impact of service delivery models on treatment outcomes. The tool focused on service structure and on diagnostic and treatment methods; it did not examine the detail of assessment, diagnosis, interventions and client outcomes unless these were related to the type of treatment service model. Further, the intention was not to evaluate the performance of individual treatment services.

The tool took the form of a survey which, due to its length, was divided into two parts and administered online via the SurveyMonkey website (www.surveymonkey.com). The survey was designed to collect quantitative and qualitative data.

The survey was piloted with two treatment services in Melbourne in the presence of a researcher from the AIPC. After small adjustments were made to the tool, the survey was made available online late in December 2008.

Prior to making the survey available online, La Trobe University Human Ethics approval to survey staff of treatment services had been obtained.

Component 3: Service model evaluation

All treatment services that were to be evaluated were perceived as providing a good service. As noted above, the evaluation did not focus on the performance of treatment services, but sought to identify elements of good practice.

Suggestions for treatment services to be included in the evaluation were sought from the 10 key informants and staff of the Comorbidity and Strategic Directions Section, Drug Strategy Branch, DoHA. The list of services to be included was finalised in agreement with DoHA, taking a range of considerations into account. These included different treatment models, different client age groups (i.e. adult, child and adolescent), rural and metropolitan locations, residential and non-residential services, and different states/territories.

The treatment services included in this evaluation do not represent an exhaustive list of services that are perceived to be good services. The list of services suggested for participation was more comprehensive than the 17 services that were evaluated; the scope of the project limited the number of services that could be included.

Initial contact with the services to be evaluated was made through a letter of invitation from the Assistant Secretary of the Drug Strategy Branch, DoHA. One of the invited services declined to participate; the reason provided was that the service's comorbidity model was still evolving. Seventeen treatment services accepted the invitation to participate in the evaluation. They were provided with information about the evaluation and invited to complete an online survey in two parts. The survey was designed so that the two parts could be completed by different staff: Part I required a good general knowledge of the program/service, while Part II included more specific questions and required completion by a clinician or program manager. The surveys were completed by clinicians and/or program managers. With one exception, the same person completed Parts I and II. Two surveys were completed jointly by two staff members.

The survey was first available online on 18 December 2008. The last completed survey was received on 13 February 2009. Sixteen services completed the online survey, and one service requested and completed an electronic version of the survey in the form of a Microsoft Word document.

Copies of Part I and Part II of the survey are included as Appendix 1. While the survey questions are identical to those in the online survey, the skip logic8 in the online version of the survey is not reflected in the version included as Appendix 1.
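
As a minimal illustration of the skip logic referred to in footnote 8, the sketch below shows how a respondent's answer to one question can route them past questions that do not apply. The questions are invented for illustration and are not those of the evaluation survey.

  # Minimal illustration of skip logic: the answer to one question determines
  # which question is asked next. The questions below are invented and do not
  # reproduce the evaluation survey.
  def run_survey():
      responses = {}
      responses["uses_screening_tool"] = input("Does the service use a standardised screening tool? (yes/no) ")
      if responses["uses_screening_tool"].strip().lower() == "yes":
          # Only asked when the previous answer was "yes"; otherwise skipped.
          responses["screening_tool_name"] = input("Which screening tool is used? ")
      responses["service_setting"] = input("Is the service residential or non-residential? ")
      return responses

  if __name__ == "__main__":
      print(run_survey())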

After completion of the online survey, data were received from SurveyMonkey as Excel files and imported into the statistical analysis software SPSS9. Analysis of the data occurred at service level and across services. Due to the small number and diversity of treatment services evaluated, only descriptive statistics were used in the analysis of quantitative data. The qualitative data obtained from the survey provided additional and more in-depth information about the domains of the program logic map.
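
The project's analysis was carried out in SPSS. As a rough sketch of an equivalent workflow, the example below shows how descriptive statistics could be produced from a survey export; the file name and column names are hypothetical and do not correspond to the actual survey data.

  # Sketch only: loading a survey Excel export and producing descriptive
  # statistics. The file and column names are hypothetical; the project's
  # analysis was performed in SPSS.
  import pandas as pd

  responses = pd.read_excel("survey_part1.xlsx")  # hypothetical export file

  # Frequencies for a categorical item, e.g. service setting.
  print(responses["service_setting"].value_counts())

  # Basic descriptive statistics for a numeric item, e.g. number of clinical staff.
  print(responses["clinical_staff_count"].describe())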

In addition to this report and the literature review, individual service reports were prepared for the 17 participating services. The participating treatment services had an opportunity to review their draft individual service reports to ensure their survey responses were accurately described. Further, a dissemination plan for the distribution of the report (in electronic format) to alcohol and other drug (AOD) and mental health (MH) treatment service delivery organisations nationally has been prepared as part of the project.

Footnotes

3 Cumulative Index to Nursing and Allied Health Literature
4 Medline is the US National Library of Medicine's bibliographic database covering the fields of medicine, nursing, dentistry, veterinary medicine, the health care system, and the preclinical sciences
5 Excerpta Medica Database, a biomedical and pharmacological database
6 A database of Australasian scholarly research
7 An abstract database that provides systematic coverage of the psychological literature
8 Skip logic allows custom paths through a survey that respondents follow depending on their response to a particular question
9 Statistical Package for the Social Sciences
