Guidance & lessons learnt from the IHLFS.
In Summer 2012, THET and British Council facilitated sharing and learning events for UK Link Coordinators as part of the evaluation of the International Health Links Funding Scheme. The recommendations and good practice guidance have been compiled using the findings from these events. The content is broken down by key points, each accompanied by a lesson or recommendation either for those involved in a health partnership or for the Health Partnership Scheme (the successor to the IHLFS).
One Link working in Uganda found that carrying out a baseline survey helped to refine their project plan by drawing out, and therefore highlighting, the real issues facing the hospital. They also found that, even though there may have been no data or a baseline of zero, just going through the process of discovering this revealed other problems that were important for the partners to be aware of, such as weak record-keeping processes, and highlighted the need for clear staff responsibilities.
Lesson for health partnerships and HPS: There are real benefits to doing a proper baseline measurement, not only for monitoring and evaluation purposes but for developing the partnership element by articulating the problems and agreeing the approach together. Health partnerships funded by the Health Partnership Scheme must state their baseline at the start of a project.
UK Coordinators agreed that project monitoring must be kept as simple as possible. To ensure that data-gathering tools, e.g. logbooks, stand a better chance of yielding results, devise them in partnership. In the majority of cases, Links are required to do their own project monitoring, rather than having an external consultant do it for them, so it is highly important that the tools can be used and understood by those without expertise in monitoring.
One Link was using a rating scale for getting a picture of trainees’ confidence in using new techniques. This tool was well-known by the UK partner but when implemented in Uganda, they found that respondents tended to select only the more extreme ratings at either end of the scale, meaning that the results were influenced by the tool. To address this, the Link had to give extra training in the use of the tool and adapt the language to ensure that it was accurate for the context. This same Link learnt much about the importance of standardising terms for data collection when a new data collector for the questionnaire interpreted the term “serious incident” differently to the previous data collector. This led to problems with consistency of results.
Lesson for health partnerships: M&E plans should be devised by both partners. What does the developing country partner currently do? What might be the implications of introducing a monitoring tool in terms of workload, understanding, and consistency of use? Ensure that terms used in data collection are standardised, not open to interpretation, and sufficient training has been given in their use.
Tip: Add nuance to your training evaluation questions to get more meaningful responses.
A number of Links assess the quality of their training by asking trainees to complete course evaluations. One Link approached the problem of seemingly superficial answers by rephrasing the questions to include probes: “We have looked at the structuring of the questions we ask: ‘Anything we could do better? What did you most enjoy? What did you least enjoy?’ To try and dig a bit deeper.”
Links agreed that clear communication on the purpose of data collection must be in place from the outset of the project. Without clear communication of the reasons for data collection, Links found it was hard to get the necessary level of commitment to do it, making it harder to track results. One Link recommended giving feedback to the partners on monitoring activities and results throughout the project as this helped everyone to understand why the project was being measured in the way it was. For one Link working in Malawi, the differences in understanding of the purpose of data collection meant that partners were not gathering the same types of data for donor reports. For example, the Malawian partner was focused on funding problems whereas the UK partner wanted to include details about how the project was developing as a whole.
Greater understanding of the reasons for collecting data can also bring greater commitment to this activity and Links spoke about the importance of ownership in improving data collection. One radiography project in Uganda spoke about the team’s motivation for good data collection as they had to analyse the data as a team, create graphs to show changes in the number of reject films [a sign of quality], and discuss the results: “I think the more you can get people to own the data, if they're collecting it and using it to improve, that's when you start seeing change and people getting motivated. If the data goes into the system or report, there's no value to them, so the data quality is less of a concern. If they know they're going to analyse it, draw their own graph, discuss it as a team, the data quality shoots up, because they've got to use it. That's a cultural change.”
Guidance for health partnerships: Communicate the reasons for data collection and find ways to feed the data back into the team/department such as graphs on the walls so that positive changes in results are visible, or discussing results with the team at the developing country institution. In terms of the data we get from health partnerships, THET will communicate how we use your data, and continue to work on resource development, in direct response to the lessons health partnerships have learnt.
The clear message from Links is to avoid data collection for its own sake and instead consider what you can feasibly do with the data once you have collected it. Several Links noted that they had reviewed their ambitions for data collection as they were expending a lot of time and energy on generating information that they did not then have the time to analyse and act upon. In terms of more efficient data analysis, one project working in Ghana took on students from their institutions to help with data analysis: a good solution for them and an opportunity for the students to get involved with the Link.
Guidance for HPS: THET will review reporting requirements to ensure that health partnerships are not taking on too much. Health partnerships in turn need to regularly review the volume of data they are gathering versus their capacity to analyse and act on it. Health partnerships need to be clear on who their stakeholders are, what information they require, and what difference it could make to the project if their stakeholders have that information – and then they can decide what data they need to collect.
Tip for health partnerships: Hand scanners help with recording and archiving data while on a trip.
One Link invested in a hand scanner for the UK partner to take on trips to their overseas partner institution. They can quickly record data and save paper records digitally so there’s less risk of losing them.
In speaking about what happens when the doctors leave the endoscopy training sessions, one Link working in Malawi noted, “Where possible, staff go away with a personal development plan, but once they go back to their hospitals it’s quite hard to know if that’s being carried out.” This was echoed by another Link working in Uganda who felt they could not really know if the activities they had planned together with their Ugandan partners continued when the UK team left: “…the difficulty with management when you go back you’re not sure if they’ve done it or not, because of not having a true project lead in the country. You get the impression they’ve done it, but no evidence except walking round yourself and seeing the changes.” These quotes illustrate that maintaining momentum is a key challenge facing the health partnership model where visits by the UK partner are brief and relatively infrequent, and email/phone communication is difficult.
Lesson for HPS: Health partnerships will not bring about change quickly. Rather, change is an incremental process where the partnership has to overcome such difficulties as: barriers in communication; institutional barriers against the implementation of new ways of working; and constraints on the amount of time UK and developing country teams can devote to this work, to name some of the most common issues. Project leadership in the developing country is crucial for both overcoming these barriers and keeping track of progress.
Changes in staff practice, or even patient outcomes, are very hard to attribute directly to the Link’s intervention, given the number of other factors beyond the remit of the project that can influence change. One Link in Uganda reflected that changes in the local community’s attitudes towards the use of violence on the mental health ward may have been as much down to an article run in a local newspaper as their own work in this area. In terms of recording change, a Ugandan Link spoke of their frustrations at struggling to record changes in a formal way such that they can show the connection between their capacity-building activities and improved practice. They recognised that their observations and the conversations they have with students are valid, and positive, indications of success but it is challenging to present this anecdotal evidence in a report: “The things I’ve noticed have just been anecdotal, frustrating, but it doesn’t diminish the impact of it. The students say, ‘I’ve used what you taught me’, means a lot to me but it is difficult to put in a report.”
Guidance for health partnerships: these projects are not the same as academic research; partners cannot do controlled trials, they do not have access to consistent, quality data, and the resource available to gather, analyse and interpret data is very limited. Therefore, the value of observations that may at first appear merely anecdotal must not be underplayed. Find a way to capture what people say, what you can see, and other informal measures of change such as in visit reports, semi-structured interviews, or ad hoc trainee feedback. This data will then help to illustrate and reinforce the quantitative data you are reporting on in your project plan.
Tip for health partnerships: Improve confidence in your results by using multiple data collection methods (‘triangulation’).
This could mean running an audit of hospital records to get a quantitative measure of changing practice and coupling this with staff interviews for qualitative results about changing practice. Diverse methods mean more sources of information for your data. This can bring you closer to being able to see changes in practice than if you use only one method to record change.
One palliative care project in Zambia spoke about the problems that arose when it was discovered that records created and circulated by the ministry of health were in a format that could not be read by the specialist hospital: “There was a situation where there was a lot of data going around and none of it was readable by anybody else. As we worked with the programme we realised that stuff we had taken for granted, wasn’t happening.” Another Link working in Zimbabwe articulated a problem with baseline data that is common to many Links: “If you can have a baseline that’s not just zero, it’s better for demonstrating improvement. We’ve got the same problem with neonatal mortality and morbidity [data availability]; they haven’t been recording it.”
Guidance for HPS: THET’s expectations for evidence of project impact are based on the understanding that data is poor quality or hard to access. The focus for health partnerships is demonstrating change at the Outcome level of their project plans, where indicators should be chosen with context and resource in mind i.e. do your indicators accurately reflect the resource and data you will have at your disposal? Likewise, the baselines can include narrative or the use of proxies where hard data does not exist.
One Link had a strategy of making sure that their Ghanaian partners took ownership of monitoring their progress; it was their responsibility to define their objectives and on a yearly basis they had to be able to show that those objectives had been met before the partner released the next tranche of funding. This approach may not be appropriate for all health partnerships and there are instances where data collection was its own incentive. For instance, in a radiography Link in Uganda, the UK project lead found that trainees in the radiography department were motivated to collect data on their practices as it meant they could see how their skills were improving over time. For the radiography department, the data provided a visual account of the improvement in the team’s skills, which has been a sufficient incentive for the team to keep it up.
Lesson for health partnerships: Data collection has benefits beyond monitoring activities: it can be a tool for greater ownership of the project by the developing country partner and it can provide the basis for motivating trainees to keep working towards greater change in practice.
Tip for health partnerships: Trainees can take charge of tracking their own development.
For a nurse-leadership training project in Malawi, the trainees are given three tasks and asked to keep a diary to show how they address the tasks: “What they also have to do is have a diary. At the end of each [training] session, they are given three tasks related to them as individuals and their communities. They have to say how that session affected their three areas…” This tool puts some of the responsibility for gathering evidence on training effectiveness into the hands of the trainees. These diaries are also a personal record for the nurse trainees to see how they are progressing.
A number of Link members spoke about the tension between needing results on the success of their projects versus the lack of financial and human resource available for it. One Link in Ghana felt there were unfair demands from THET for M&E: “You don’t want us to pay for people [human resource] in our grants. Yet you ask us for information. You will complain about our projects if we don’t give you the evaluation.” Another Link spoke about the anxiety they felt about their partners doing the necessary data gathering coupled with the difficulties of finding funding for the UK Coordinator to do it themselves. While this perception about THET’s demands was not common, the pressure felt to return results was. The requirements for Links to both implement and report do cause a strain when the size of grants means that the majority of funding goes on the practicalities of delivering training such as flights, materials, and workshops.
Guidance for HPS: THET now works closely with grantees to ensure that their project plans are in good shape to report on progress and likewise that our reporting requirements and processes are clear. Using data from the IHLFS, THET has built a theory of change for the Health Partnership Scheme, which illustrates the complexity of bringing about change in health systems and as such, our expectations for health partnerships in terms of the changes they will likely be able to show. The theory of change shows that results at the health outcome level will be very difficult to show but we work with grantees to articulate objectives and measures at the health worker performance and/or service access level. We have created a resource to guide partnerships through the theory of change and project planning.