Summary

Range of Evaluations

We know there is loads of CAPA evaluation going on. Details of some of the sources are on our website (www.capa.co.uk) in a slide set. What we know so far is summarised below. If you have done some evaluation we would love to hear from you. Email both of us at steveandann@capa.co.uk.

At the time of writing we are aware of three national evaluations (two in the UK and one in New Zealand), 16 local (UK, New Zealand and Australia) and one in adult mental health (New Zealand). One Children's Learning Disability service has also written up their experience of implementation.

Fifteen studies have data on the impact on waiting times, ten on the views of service users, twelve on the impact on staff (including administrators) and three on views of referrers.

What do they show?

Service users

Services have gathered views through a mix of direct interviews and questionnaires. The overwhelming majority of feedback is extremely positive: families were seen quickly, felt listened to and got new ideas. The one-off nature of Choice appointments made both parents and young people feel more open, and this was especially so for adolescents.

In New Zealand, young people seen in a focus group said they liked the focus on goals not just problems and matching the clinician to their needs. They also liked that they did not have to see the Choice clinician again if they did not like them! They loved the name ‘Choice’.

Impact on staff

Clinicians report increased job satisfaction, improved morale and better team functioning. They work in a more focussed way, feel more relaxed and collaborative in their work with families and find it easier to let go. They love the increased learning through team discussion and the development of skills. However, implementation causes some anxiety, which settles as CAPA becomes embedded.

Administrators prefer CAPA to their old system, find it easier to track clients and feel it is a more efficient way of working.

Two services have noted a reduction in complaints on implementing CAPA.

Quantitative changes

On average the waiting time to be seen was 5-9 months, reducing to 1-6 weeks once CAPA was in place. Between 66% and 90% of those seen in Choice were transferred to Partnership, with waits of around 4 weeks. Some teams find they develop longer waits, but this is due to not implementing full booking to Partnership (Key Component 5). Did Not Attend/no-show rates are around 5-10%.

Referrer feedback

Referrers (mainly GPs) like the short waits and the rapid feedback.

Clinical outcomes

We are only aware of a few teams that have measured clinical outcomes. None had measured outcomes pre-CAPA, so they report only the post-implementation impact. One team in New Zealand found reduced scores on the Child Behaviour Checklist after Partnership.

We are currently undertaking some research on outcomes with the CAMHS Outcome Research Consortium (CORC); results will be posted on www.capa.co.uk.

Negatives

Change is unsettling and many staff found implementation stressful, but this evaporated once CAPA was in place. Stress was especially marked where implementation was felt to be imposed by management.

Some teams struggled with too few Core Partnership slots and felt under pressure to discharge. Some had difficulties managing the variation in demand and developed bottlenecks for some interventions.

Some teams reported that admin staff felt systems were duplicated and that they spent too much time chasing clinicians to enter their Choice and Partnership slots. These issues were subsequently addressed within the team.

Overall, these challenges reinforce the need for good leadership and management of change processes (Key Component 1), and full implementation of all the Key Components, including flexing capacity, extending skills and full booking.