According to the CDC Guidelines, “timeliness reflects the speed between steps in a public health surveillance system”96. The timeliness of the trachoma surveillance system can be evaluated by assessing the different steps in the system (see Figure 3.10). The interval usually considered first is the amount of time between the onset of a health-related event and the reporting of that event to the public health agency responsible for instituting control and prevention measures97. The trachoma surveillance system is also evaluated in terms of the availability of information for control of a health-related event, including immediate control efforts, prevention of continued exposure, and program planning.
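
As a worked illustration of this definition, the timeliness of each step can be expressed as the number of days elapsed between consecutive steps in the system. The following minimal sketch assumes hypothetical dates for four steps in the trachoma surveillance pathway; the step names and dates are illustrative only and are not drawn from the evaluation data.

    from datetime import date

    # Hypothetical dates for successive steps in the surveillance pathway.
    # These values are illustrative assumptions, not evaluation findings.
    steps = [
        ("screening (event identified)", date(2010, 3, 1)),
        ("data entered at clinic", date(2010, 4, 15)),
        ("data received by national unit", date(2010, 11, 30)),
        ("annual report published", date(2011, 3, 1)),
    ]

    # Timeliness, per the CDC definition, is the speed between steps:
    # here, the interval in days between each pair of consecutive steps.
    for (earlier, t0), (later, t1) in zip(steps, steps[1:]):
        print(f"{earlier} -> {later}: {(t1 - t0).days} days")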

4.9.1 Timeliness of reporting to national surveillance system

The time taken from the identification of a child with trachoma to the reporting of that event to the public health service responsible for control is normally very short, because treatment is often part of the screening process. Accordingly, this interval is not considered further, and the evaluation focuses on the time taken to report trachoma screening data to the public health surveillance system, which varies considerably across jurisdictions. The evaluation found that timeliness depends on a number of factors, including: the size of the region; the number of ‘at risk’ communities in a region; the number of trained personnel available for screening; the availability of local community health nurses to assist with follow-up and to treat it as a priority; the relationship the screening program has with schools (the level of assisted coordination); and the ease with which consent is gained from parents.

Given the remote location of most communities, screening personnel are at times required to travel large distances. Furthermore, data are collected on paper, which means they must be entered manually on return to a clinic or population health unit, often a considerable period after collection. A further issue is the collection of data on follow-up treatment for families and communities, which takes time and is often not a priority for local community health nurses. In addition, data collection and entry are not standardised across WA, SA and the NT, nor is the process standardised within jurisdictions, particularly in the NT. This variation complicates and lengthens the data collection process.
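
To make the effect of standardisation concrete, the sketch below shows what a single, jurisdiction-neutral screening record might look like if WA, SA and the NT captured the same fields in the same format. The field names are hypothetical assumptions for illustration; they are not the actual NTSRU data specification.

    import csv
    import io
    from dataclasses import dataclass, asdict, fields

    # A hypothetical, jurisdiction-neutral screening record. The field
    # names are illustrative assumptions, not the NTSRU specification.
    @dataclass
    class ScreeningRecord:
        jurisdiction: str        # e.g. "WA", "SA", "NT"
        community: str
        screening_date: str      # ISO 8601 date, e.g. "2010-03-01"
        children_screened: int
        trachoma_cases: int
        treated_on_spot: bool

    records = [
        ScreeningRecord("WA", "Community A", "2010-03-01", 42, 3, True),
        ScreeningRecord("NT", "Community B", "2010-05-20", 31, 1, False),
    ]

    # With one common format, data from every jurisdiction can be merged
    # directly, avoiding duplicate manual entry.
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=[f.name for f in fields(ScreeningRecord)])
    writer.writeheader()
    for record in records:
        writer.writerow(asdict(record))
    print(out.getvalue())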

WA is considered to have the most timely and efficient screening process. Screening occurs in the same period across the State; children with trachoma are treated on the spot, and their family members and community (if required) are treated shortly afterwards by the local community health nurse. The associated data are generated as part of the screening and treatment processes. In the NT, trachoma screening mainly takes place as part of the HSAK Program, which runs across the year, usually early in the school semester; trachoma screening therefore does not take place in a single period. Some regions conduct their own screening programs (not part of the HSAK Program), which results in inconsistencies in screening and data collection. For regions where screening is integrated, data are collected on the HSAK form used for all school-aged children. These data do not align directly with NTSRU requirements, which means data must be re-entered. In SA, a single person coordinates the eye health program. That person conducts screening over the year, which is limited in its breadth by resource capacity. Data entry is also done by the same person, who is time-poor, thereby compromising the timeliness of national reporting.

Findings: With respect to the timeliness of reporting trachoma prevalence to the national surveillance system, the evaluation team finds:
  • there is considerable variation in the timeliness of reporting, due to logistical and geographic issues associated with isolated communities as well as resource shortages; and
  • timeliness of reporting would be improved by further standardisation of collection and reporting processes and appropriate resourcing.

4.9.2 Timeliness of data for use in program planning

Most surveillance teams commented that meeting the deadline for data submission was always an issue because of resource levels. CERA also commented that the data almost always arrived after the deadline, requiring staff to work through the Christmas/New Year period to analyse the data and prepare the required tables, figures and associated commentary. Once the data are on hand at the NTSRU, they are checked, tables and text are produced, and the material is sent back to jurisdictions for verification. If any re-analysis is required as a result of this validation process, additional time is needed. At present, about nine months are required to assemble data from these varied sources to meet surveillance requirements.

The time taken from screening for trachoma through to producing the annual Trachoma Surveillance Report is approximately 12 months. Based on consultation with a number of stakeholders, this length of time between screening and producing the report is considered excessive. As was the case in 2008, results in the final report became available after, or just before, screening was conducted again in certain jurisdictions. This lag meant that the annual report was not available in time to allow changes to the program to be made on the basis of outcomes from the previous year. Most jurisdictions do, however, keep data at the local level, although the use of these data for evaluation and assessment of the program varies.

Findings: With respect to the use of the reports from the national surveillance system for program planning, the evaluation team finds:
  • the time lag from data collection to publication of the report is too long for the data to be used effectively for program development and planning; and
  • timeliness of reporting would be improved by the use of electronic data collection and a web-based reporting system.

96 CDC (2001) Updated Guidelines for Evaluating Public Health Surveillance Systems. Recommendations and Reports, July 27, 2001, 50(RR13): 1-35.
97 CDC (2001) Updated Guidelines for Evaluating Public Health Surveillance Systems. Recommendations and Reports, July 27, 2001, 50(RR13): 1-35.