The partnership between The Kambia Appeal and the Kambia District Health Management Team in Sierra Leone explains its approach to monitoring and evaluation. This practical guide is for health partnerships that want to enhance their approach to monitoring change.

Why is reliable data so important? When asked this question, one of the partners replied “…as a basis for taking appropriate action”, capturing the crux of monitoring and evaluation. Reliable data is required for appropriate decision making regarding programme management and the improvement of interventions, as well as for external stakeholders to develop policy and allocate resources in line with identified needs and priorities.

Health workers at Kambia Government Hospital receive training

Sierra Leone has a severe shortage of skilled health workers. The partnership between The Kambia Appeal and the Kambia District Health Management Team (KDHMT) addresses this shortage by providing training for newly recruited health workers, continuing education for existing staff, and the development of clinical tools and guidelines. These capacity-building and quality-improvement efforts are carried out at Kambia Government Hospital and three Peripheral Health Units (PHUs).

Monitoring and Evaluation of Programme Results

To gather reliable data for decision making, the partners integrated monitoring, evaluation and learning initiatives into the programme design. The use of multiple approaches, gathering both quantitative and qualitative data, increased the reliability of the data.

Guide Contents

1. Monitoring visits led by the UK partner

2. Testing knowledge and skills

3. Observation of practice

4. Self-assessments

5. Auditing records and practice

6. Sharing and learning

7. Expansion of activities following M&E

8. Advice

Monitoring Visits led by the UK Partner

Monitoring visits are led by The Kambia Appeal in consultation with the local partner, the KDHMT. Twice a year, the effectiveness and reception of the programme activities are reviewed through focus group discussions with local health workers and stakeholder consultation meetings with the KDHMT and key members of hospital management.

The focus group discussions serve to gain insight into the health workers’ experiences of the training, mentorship and on-the-job training they have received. They also assess how the knowledge and skills gained in training are applied in daily work. However, the partners felt they were not always given the complete picture; they experienced several barriers to open discussion, including the language barrier and the reluctance of participants to reflect critically on their experiences in the presence of outsiders.

The partners utilise two strategies to address these barriers. Firstly, they involve a local health worker, skilled at eliciting information from others, who conducts the focus group discussions on the partners’ behalf. Because the local health worker is a peer and speaks the local language, the information provided in these focus groups is richer and more dynamic than the data gathered previously.

Secondly, when a skilled local health worker is unavailable, the partners adopt a different approach: the health workers are given the questions and left alone to discuss them among themselves. Afterwards, they present the findings of their discussions to the partners.

Data from monitoring and evaluation can provide evidence of the effectiveness of the programme approach. In one of the focus group discussions a hospital midwife commented:

“I have noticed the training for the VNAs [Volunteer Nursing Aides] has improved the monitoring of vital signs for patients, they are happening more regularly. Before the volunteers were here the vital signs were not taken regularly and not taken well. The practical training (on the wards) has helped to bring about these changes. It helps us to detect deviation early in patients, so that we can help them.”

Testing Knowledge and Skills

Long-term volunteers deliver various teaching modules to local hospital staff. Pre- and post-course evaluations and informal tests are conducted during each course. The information gathered has led to improvements and adjustments to the modules. The pace of the course was adapted to the understanding of the group, and the volunteers realised that repetition and simplification of the content were essential for effective teaching, particularly as the trainees had no previous medical training and limited formal education.

The specialist resource teaching courses and training modules include post-course multiple-choice questions (MCQs) and skills assessments to measure the effectiveness of the training. For example, after completing the foundation nursing module, 92% of the volunteer nursing aides showed an improvement in knowledge. Additionally, the delivery, content and other aspects of the training are evaluated. This revealed a desire for more lessons taught in Krio and for greater involvement of local staff, who would be easily accessible for questions. As a result, the partners increasingly make use of local staff and peers to conduct sessions.

Observation of Practice

Observations are conducted to ensure that the skills and knowledge gained by the health workers are used in practice. In particular, this concerns Volunteer Nursing Aides (VNAs), who have virtually no prior medical knowledge or training but who are essential to the operation of the under-staffed ward. The observations are conducted by ward nurses who regularly work with and rely on the VNAs, and UK volunteers oversee the observation process.

Competency charts based on the training content were developed by UK volunteers, both to provide an incentive to VNAs and to serve as a record of their skills and knowledge in the absence of a formal certificate. The charts set out the key steps for performing certain procedures, such as taking vital signs, and key questions about the outcome and possible causes to inform diagnosis and action. Successful execution has to be observed ten times for the chart to be completed. The hospital matron recognises the importance of the charts as indicators of the VNAs’ skills, which might eventually lead to some VNAs being trained as nurses.

Download a copy of the competency chart, kindly shared by The Kambia Appeal, here.

The following is an example of the evidence provided by observation of a VNA’s practice:

“Following specific teaching on triage and IMCI-based treatments, he [VNA] became much more consistent in his prescribing practices.  His assessment and prioritisation of sick children also improved dramatically as did his handover of sick patients from one shift to the next.” (UK volunteer 2013)

Self-Assessments

All UK volunteers take part in a pre-departure training weekend, which is evaluated through reflections and an evaluation survey. The data is used to improve the training weekends where needed. The UK volunteers also write reflective pieces after their return. The Kambia Appeal selects quotes from these reports and matches them against the NHS Competency Framework to illustrate how the volunteering experience relates to the volunteers’ competencies. In addition, volunteers complete an exit survey rating their experience in relation to the NHS Competency Framework.

Auditing Records and Practice

The UK volunteers often identify areas for quality improvement based on their observations of the challenges health workers face when responding to a certain scenario or set of symptoms. Some of these issues are also recognised by local staff. Guidelines and protocols are developed in consultation with the local staff, including the Kambia District Medical Officer, pharmacists and other actors relevant to a particular protocol. Following approval, each protocol is implemented by training the doctors, nurses and aides, and The Kambia Appeal volunteers audit its implementation during their placements. The work is then handed over to the next long-term volunteers. For example, the audit of drug administration and vital signs was undertaken by a volunteer in 2013 and is now being continued by another volunteer in 2014.

To maintain the value of protocols and guidelines, effective handover to local staff is required, an aspect covered in volunteer pre-departure training weekends.

For a prescription guideline or protocol, such as antibiotic prescribing, the volunteers find monitoring fairly straightforward, as prescription data can easily be audited. However, for other protocols and guidelines that involve treating a set of symptoms through various methods, monitoring and evaluation of records alone is not always sufficient. Volunteers therefore compare data from records with the practice they have observed, or with the observations of doctors or other local staff, such as the hospital matron. If a protocol or guideline is not implemented satisfactorily, this leads to further training or revision.

The following example about blood transfusion guidelines illustrates the use of audits to assess gaps and to demonstrate results following the intervention:

“The aim [of the guidelines] was to eliminate unsafe transfusions by rationalising the prescription of blood, thereby conserving limited supplies of screening materials.  An initial audit of all transfusions, looking particularly at indication and safety parameters, was followed up by new prescribing guidelines and education sessions for lab and clinical staff.  Re-audit was undertaken 3 months later. This was remarkably successful. Total transfusion numbers were almost halved, meaning that screening kits were nearly always available and few unscreened transfusions were given over the 2nd audit period.  Laboratory documentation had also increased dramatically and transfusions were no longer issued without a haemoglobin result (indirect effects). The head of laboratory services was keen from the outset to tighten controls on issuing of blood by lab staff and came up with the new transfusion thresholds. There was also a rolling programme of education to lab and ward staff on all aspects of transfusion safety and positive feedback on the good results.” (The Kambia Appeal 2013)

Sharing and Learning

The partnership shares findings through conferences, presentations and involvement in relevant local forums. The partners also share learning through videos, which have the advantage of supporting narratives with live images. For example, a video about the volunteering journey includes statements and anecdotes about results and achievements, serving as a visual M&E tool, as well as providing details of the volunteers’ experiences, which helps in the recruitment of volunteers. http://www.kambia.org.uk/volunteer/programme

Expansion of Activities following Monitoring and Evaluation

Throughout this case study it is evident that the partners learned from, and acted upon, their monitoring and evaluation findings. Although the partnership is able to collect reasonably reliable data on its programme results from various sources, difficulties have arisen in collecting quantitative baseline data. The partners found that the data on morbidity and mortality was often inconsistent: similar numbers were recorded in successive months, and there were discrepancies between volunteers’ observations and the recorded data. This lack of reliable public health data is an obstacle to gathering evidence of programme results and, at the same time, hinders the developing country partner, the KDHMT, in reaching appropriate decisions on priority setting, policy development and resource allocation.

As a result, the partners are developing a public health element to their work, focusing on building the KDHMT’s capacity to record, collate and analyse public health data. This element commenced recently and illustrates how the partners acted on an identified gap with the aim of contributing to changes in resource allocation and policy development for sustainable health impact.

Advice

  • The observation of practice after training might lose momentum when the UK volunteers change. It is therefore important to allow overlap between volunteer placements so that outgoing volunteers can provide a detailed verbal handover in addition to written handover notes. Further, some returned volunteers keep in touch with the newly placed volunteers to ensure continuity.
  • The competency charts act as both a monitoring tool and an incentive for staff. However, once a chart is completed it loses its value as an incentive driving performance, and the associated practice may decline. It is therefore essential for hospital management to maintain observation of good practice and to motivate staff to continue it; buy-in and recognition from management are required to ensure sustainable change in practice.
  • When developing a visual resource it is important to work with an experienced video producer and to engage local staff and volunteers. This ensures proper planning of the process, for example through the development of a storyboard in advance, as well as timely progress and collection of the right video material.
  • When strengthening data systems, partnerships should be prepared for health indicators to appear to worsen once the system is strengthened, as the data becomes more accurate. This might lead to adjustment of programme interventions to align with the needs identified.
  • One of the main challenges is convincing others of the importance of accurate data collection, because they may not always be able to envision its immediate results. To bridge this gap, the partnership plans to use KDHMT administrators trained in M&E to conduct training for the local health workers at the hospitals. By providing a perspective on how the data is used at district level, these administrators will help ensure that all actors understand the importance, and potential impact, of recording accurate data.