
Evaluation of the NT MOS projects

2. Methodology


It was essential for the success of this project that we adopted a fully collaborative approach with all parties: within both the Australian and Northern Territory Governments; in the relevant communities; and with the MOS Plus Expert Reference Group (ERG) and the Evaluation Advisory Group (EAG). This collaboration was needed to ensure a high level of ownership of the process and the outcomes. At the same time, we were committed to ensuring that our consultancy remained independent and that our consultation and reporting methods stood up to scrutiny.

Approach

The evaluation was designed to include both summative and formative elements, to document lessons learned, and to make recommendations that may be considered in future policy and planning. (Refer to Appendix A: Methodologies for further detail on formative and summative evaluation approaches.)

Our formative evaluation methodology involved qualitative approaches, including site visits to remote areas of the NT and interviews with key stakeholders, as well as quantitative analysis of the program data as it emerged. Through our site visits, the evaluation team engaged 'learning partners' who exchanged information, stories and data, and had an opportunity to express views, raise concerns and build their body of knowledge. Through an early analysis of this data, key issues and potential service barriers or gaps were identified and fed back to those responsible for funding and provision of the MOS Plus services, and to practitioners in the field.

Success Works' approach to formative evaluation is underpinned by Appreciative Inquiry (Cooperrider, 1987). This approach identifies what is working well and the strengths of the program and then engages practitioners in identifying how this good or best practice can be achieved more consistently across the entire service model.

Our approach to summative evaluation determines what has been achieved by the initiative and the overall relevance of those achievements given the research findings. The questions we asked of the data focussed on what difference the initiative has made and what has been learnt of relevance to other services, programs and policies.

Mixed methods

As constructivist evaluators, we assume there are many different ways of interpreting the intent and impact of a program or service. (The constructivist view is that the 'truth' is individually and collectively constructed by the people who experience it.) We therefore gathered together the understandings and experiences of the MOS Projects from a diverse range of perspectives. Triangulating the findings was critical when analysing the data, and qualitative and quantitative data shared equal place within this process.

Success Works therefore combined qualitative and quantitative methodologies in our approach to this evaluation. We used qualitative approaches to gain an initial understanding of the issues of interest and concern for government, non-government and community stakeholders in the MOS Projects' services. This helped us formulate research questions for use in interviews and site visits, and we supplemented this with analysis of other available data sets.

Although quantitative analysis of service data sets is unlikely to directly answer evaluation questions, the use of such data was important for setting the context for our further analysis, in particular by providing information about both program scale and program scope.

Further qualitative investigation allowed us to triangulate this information and to gain a deeper understanding of the context as well as enablers and barriers to effective MOS Plus service establishment and implementation.

Cultural safety

Success Works is committed to cultural safety in our approach to consulting and to reflecting the voices and perspectives of Indigenous communities in our work. Our MOS Projects evaluation team included experienced Indigenous consultants who led the initial engagement and local consultation with Aboriginal organisations and individuals.

Cultural safety in this evaluation required us to ensure the involvement of relevant cultural perspectives at all stages in the evaluation including:
  • evaluation approach: The action learning/formative evaluation approach used for this evaluation is considered to be best practice in Indigenous contexts5 (e.g. Walker, Ballard and Taylor, 2003)

  • evaluation planning: Success Works' evaluation team ensured an understanding of cultural imperatives in our planning for the evaluation and in our approach to site visits. This included seeking permission for access to land through the respective Land Councils, and gaining appropriate Ethics approvals. The project also involved the engagement of an OATSIH-convened EAG, which has Indigenous representation.

    Engagement and communication are significant issues in the remote NT communities targeted by the MOS Plus program. Remote Aboriginal communities in the NT are currently subject to a number of recently introduced outreach services and activities. There is confusion in community about 'who is who' in this 'service space', and outreach services can be viewed, without distinction, as one 'welfare mob' by community members. Therefore, contact with communities was made in advance of our proposed consultations where feasible, via the NT MOS Plus staff. Our approach was designed to be as unobtrusive as possible and to minimise potential confusion in communities, by accompanying the MOS Plus service staff already known in community.

  • data collection: Our experienced Indigenous consultants led the consultations at the local level and used 'yarning', a narrative approach to data collection, supplemented by broad questions to ensure clarity and understanding of the responses.

Ethics committees and land councils

Success Works was committed to ensuring the inclusion of the experiences of communities and families (where appropriate) in our evaluation. To do this we required approval from two Human Research Ethics Committees (HRECs):
  • Central Australia Human Research Ethics Committee (CAHREC), Centre for Remote Health (Flinders University and Charles Darwin University)
  • Human Research Ethics Committee (HREC), NT Department of Health & Families and Menzies School of Health Research
We also received endorsement from the Remote Health Branch of the then NT Department of Health and Families.

Ensuring an ethically and culturally appropriate approach to visiting and liaising within communities also required approval from three Land Councils across multiple Community Councils:
  • Tiwi Islands Council (Pirlangimpi)
  • Northern Land Council (Barunga, Binjari, Manyallaluk, Nauiyu, Maningrida, Millingimbi)
  • Central Land Council (Amoonguna, Ntaria, Elliott, Borroloola, Utopia, Tennant Creek).
The effort to secure Ethics Committee and Land Council approval was extensive.

Visits to community also entailed advising the respective Government Business Manager (GBM) by registering on the Visiting Officer Notification (VON), a measure introduced after the Northern Territory Emergency Response (NTER).

Evaluation scope

Our evaluation was designed to assess the implementation of the MOS Projects model and its impact and outcomes for the target population.

The evaluation was to identify:
  • the extent to which the MOS Projects have met their objectives and outcomes

  • the extent to which the MOS Projects have reached their target population

  • the effectiveness of case-related clinical services to Aboriginal children, in line with good practice clinical processes, cultural safety and in the context of the legislative framework in the Northern Territory

  • the outcomes of non-case-related services in raising awareness of the issues of child abuse and related trauma in remote communities in the Northern Territory

  • the effectiveness of non-case-related services as an intended pathway to case-related clinical service delivery

  • the effectiveness of engagement with, and services to, remote primary health care organisations

  • the extent to which MOS Projects benchmark key elements of good practice in terms of service provision, clinical governance, staffing profiles, and continuous quality improvement

  • the effectiveness of the Mobile Outreach Database (MOD) data system's outcome measures for the MOS Projects' clients

  • the extent to which the MOS Projects have provided an equitable and adequate level of service in relation to need in remote areas of the NT

  • the extent to which the MOS Projects have attained the required geographic spread of services across remote NT

  • the extent to which the quality of local people's relationship with the MOS Projects positively impacts on participation in the services offered, and on outcomes.

Evaluation framework

In consultation with the Department of Health and Ageing and with the EAG, an evaluation framework was developed. It identified the key evaluation questions, based upon the outputs and short term outcomes identified in the project logic and the underlying service assumptions, and informed by the project interactive framework. (The evaluation framework is attached as Appendix B.)

The framework contains the following elements:
  • Project logic is a description of how the MOS Projects work in theory to benefit the target group. It maps the inputs, activities, outputs and short/medium/long term outcomes together in a logical fashion, underpinned by evidence from the literature. This project logic was developed in consultation with both the NT Department of Children and Families (NT DCF) and the Department of Health and Ageing. Importantly, the project logic allowed short term outputs and outcomes to be identified and measured in this evaluation, and theoretical links to the longer-term outcomes of the MOS Projects to be made explicit. It also identified assumptions within the MOS Projects, which were tested in this evaluation.

  • Interactive framework provides another key process for understanding how the range of MOS Projects services work. The framework was developed by Success Works and is a tool that allows the capture and consideration of a range of variables at a range of levels, and the interactions between these variables. In essence it raises the question: "What will influence the outcomes of the MOS Projects, and therefore what do we (as service managers and as evaluators) need to know about?"

  • Evaluation questions were then developed, based on the outputs and short term outcomes identified in the project logic and the underlying assumptions.

Informed by the evaluation questions, our evaluation was designed to assess the implementation of the MOS Projects model and its impact and outcomes for the target population. It identifies the strengths and achievements of the MOS Projects, as well as the challenges and areas where change could improve outcomes for Aboriginal children, young people and their families in remote areas of the NT.

Limitations of the evaluation framework

It is not feasible to assess the efficacy of a therapeutic counselling service for individuals and families without tailored clinical measures (such as before-and-after measures and time-sequenced longitudinal follow-up), particularly in a service which is still relatively new in communities.

A number of the objectives of the MOS Projects to be addressed in this evaluation are long term outcomes of the service. It is not possible to evaluate the medium and long term outcomes without an extensive, longitudinal process. These longer term outcomes will also be impacted by a range of other factors that are out of the control of the MOS Projects, and of the Department of Health and Ageing.

Evaluation phases

Success Works adopted a four-phase methodology for this evaluation. (Refer to Appendix A: Methodologies.)

Phase 1 – Project establishment:

The establishment phase ensured that all parties were clear about mutual expectations and responsibilities. Over the life of the project we revisited the outcomes of this baseline meeting to ensure the project was on track, and that the original perceptions reflected the realities encountered during the following evaluation phases, particularly the consultations with stakeholders. The project methodology and timelines were articulated in the evaluation project plan.

Phase 2 – Development of the evaluation framework:

A project logic, based on the program documentation and influenced by the literature review, was developed in consultation with the Department of Health and Ageing and the NT DCF to inform the development of the interview questions for the consultation process in Phase 3. An evaluation framework, with associated tools and processes, was then developed. It identified the key evaluation questions, based upon the outputs and short term outcomes identified in the project logic and the underlying service assumptions, and informed by the project interactive framework.

This phase also involved preparing applications to the Northern and Central Land Councils and the Tiwi Islands Council, and to the appropriate human research ethics committees and approval bodies:
  • Human Research Ethics Committee (HREC) of the NT Department of Health & Families and Menzies School of Health Research
  • Central Australian Human Research Ethics Committee
  • Remote Health Branch (NT Department of Health and Families).

Phase 3 – Implementation of the evaluation framework:

In this phase we used the evaluation framework to develop the data collection tools and system. We collected both quantitative and qualitative data, to identify the aspects of the MOS Projects that are working well and ways to improve the program into the future.

Quantitative data analysis focused on data available from the MOD, including demographic and aggregate data over time, and the NT DCF MOS Projects quarterly reports. The interrogation of service activity data included: community visits; case-related and non-case-related data; referrals (sources and abuse categories); and client outcomes.

Consultation with over 100 key stakeholders was undertaken to gain diverse perspectives and understanding of the establishment and provision of the MOS Projects. (Refer to Appendix C for a list of interviewees.)

A range of stakeholders were interviewed during the consultation, drawn from: the three levels of government; non-government agencies; outreach service providers; local organisations within regions and communities; and community members and families.

These encompassed individuals in varying roles in urban, regional and community agencies, including both outreach and locally-based services:
  • Government or service provider head office: 12 agencies/divisions
  • Regional agencies: 9 agencies
  • Local/community agencies: 29 agencies
  • Community traditional owners, community leaders: 15.
MOS Projects:
  • MOS Expert Reference Group: 11
  • MOS Plus team members: 13
  • MOS Plus client.
Telephone consultations were undertaken with key stakeholders who were unavailable for face-to-face meetings, including individual interviews with the MOS Plus Expert Reference Group members.

Stakeholder interviews commenced in Darwin in November 2010. These comprised predominantly government, non-government and outreach service providers, and members of the MOS Plus management team.

The selection of communities to be visited by the evaluation team was made on the basis of geographical location, community size and demographics to ensure community diversity in the data collection. As is clarified elsewhere in this report, all communities in remote NT have been visited by MOS Plus. The snapshot of communities visited for the purpose of this evaluation should not be interpreted as having been visited by MOS Plus in response to identified cases of child abuse or neglect.

Initial community consultations had been scheduled in communities in the Katherine and Top End Regions at that time. However, the visits did not go ahead, due initially to delays with the Northern Land Council granting permits and then to the permits being too limited to accommodate last-minute schedule changes. Such changes, arising from weather limiting access to community and/or changes to scheduled visits by the MOS Plus team, are indicative of the service delivery challenges in this area. This resulted in the rescheduling of all community visits across the Top End, Katherine, Barkly and Central Australia Regions for March 2011. The rescheduled visits also experienced late changes, due to Sorry Business in one community and the inability to access other communities because of the exceptionally long wet season and stormy conditions.

Consultations in community with local organisations, community members and leaders were a crucial aspect of the evaluation framework, serving to 'test' the validity of the perspectives gained in consultations with government and outreach service providers and the trends evident in the quantitative data.

Consultations have been conducted in ten communities (refer to image below):
  • Amoonguna
  • Ntaria
  • Ti Tree
  • Tennant Creek
  • Nauiyu (Daly River)
  • Millingimbi
  • Barunga
  • Pirlangimpi
  • Borroloola
  • Binjari.
Members of the Success Works MOS Projects evaluation team also presented at meetings of the MOS Plus ERG. In November 2010, an overview of the methodology and approach to be used in the evaluation was discussed; at the March 2011 and June 2011 meetings, an update and overview of the emerging findings of the evaluation was provided.

Teleconferences with the EAG were also convened at key junctures in the evaluation process: development of the evaluation framework; discussion of the Second Progress Report including key findings following consultations in community; and discussion of the draft final evaluation report including recommendations.

Image. Geographical location of consulted communities

Text version of Image
The locations of the ten communities where consultations for the evaluation were conducted, starting from the north of the map, are:
  • Pirlangimpi (on the north-central coast of an island directly off the NT north coast, opposite Darwin)
  • Millingimbi (on the north-east coast of NT, east of Darwin)
  • Nauiyu or Daly River (south of Darwin)
  • Barunga (east of Katherine, half way between Katherine and the eastern boundary of the NT)
  • Binjari (a short distance south-west of Katherine)
  • Borroloola (south-east of Katherine, close to the eastern boundary of NT)
  • Tennant Creek (central NT)
  • Ti Tree (south-central NT, north-east of Alice Springs)
  • Ntaria (south-west of Alice Springs)
  • Amoonguna (south-east of Alice Springs).

Phase 4 – Analysis and reporting

Reflecting the evaluation approach, ongoing analysis occurred over the course of the evaluation. This phase reflects the final analysis at the conclusion of the evaluation, and the preparation of this report including findings and recommendations for future service delivery.

Evaluation questions

The evaluation questions developed within the evaluation framework were based on the outputs and short term outcomes identified in the project logic and the underlying assumptions. (Refer to Appendix B: Evaluation framework). To inform discussion of the findings in Chapter 3, the evaluation questions are also presented alongside the contract schedule evaluation requirements with which they align, for easy reference. (Refer to Appendix A: Methodologies).

Informed by the evaluation questions, this evaluation assessed the implementation of the MOS Projects service model, and its impact and outcomes for the target population. These findings identify the strengths and achievements of the MOS Projects, as well as the challenges and areas where change could improve outcomes for Aboriginal children, young people and their families in remote areas of the NT.

Diagram. MOS Projects service model


Text version of Diagram

The evaluation framework, evaluation questions, project logic and interactive framework are based on the evaluation objectives. Along with the other elements, the evaluation questions drive the data collection, and inform the findings, analysis and recommendations.

Data

The findings draw upon both quantitative data and qualitative data.

Quantitative data analysis was conducted using data available from the MOD, and the NT DCF MOS Projects quarterly reports to OATSIH, unless otherwise indicated. Data relating to the assessment of underlying trauma recorded at the time of referral to MOS Projects was collated by MOS Plus staff, by reviewing the past case notes for each case. Unless otherwise indicated, the data in this report covers only the period of the evaluation (1 July 2008 - 31 March 2011) and therefore excludes the pilot period of the MOS Projects.

The qualitative data was derived from extensive consultation with over 100 stakeholders, including site visits to ten communities.

Footnotes

5 e.g. Walker, R., Ballard, J. and Taylor, C. (2003) 'Developing paradigms and discourses to establish more appropriate evaluation frameworks and indicators for housing programs', AHURI Final Report No. 29, Western Australia.


