At a glance

The Department for Levelling Up, Housing and Communities (DLUHC) wanted to measure the impacts of locally-tailored activities designed to create stronger, more-integrated communities. With dozens of diverse activities to choose from across five local authorities, IFF worked closely with DLUHC to identify three types of activity that recurred across more than one of the local authority areas. This allowed IFF to survey enough residents across multiple local authorities to demonstrate that the funded activities had made a difference. Working flexibly and creatively, we were able to help minimise disruption to the evaluation caused by the pandemic.

About the client

DLUHC, formerly the Ministry of Housing, Communities and Local Government (MHCLG), is the UK government department that supports communities across the country to thrive, making them great places to live and work.

Challenges and objectives

In 2019, DLUHC launched the Integration Area Programme to trial a new localised approach to social integration in five local authorities (Blackburn with Darwen, Bradford, Peterborough, Walsall and Waltham Forest). Each area developed its own funded activities intended to create stronger, more-integrated communities locally. DLUHC wanted an evaluation to gather evidence of what impact these funded activities were having, and particularly wanted quantitative measures of impact.

The first challenge was that measuring impact usually relies on activities being delivered consistently across every local area, so you can be confident that the thing supposed to lead to the intended changes is the same everywhere. But with the Integration Area Programme, we were dealing with dozens of different approaches and an ethos of local tailoring: it would have been completely inappropriate to insist on one consistent approach to improving social mixing and integration in each place. The outcomes each area was trying to achieve varied too. Sometimes, alongside social mixing and integration, there was a desire to improve employability or mental wellbeing, or to better equip local people to hold local decision-makers to account. This wasn't surprising, as each area was tasked with using the Integration Area Programme to tackle local issues.

The second challenge emerged during the evaluation: COVID-19 and the introduction of social distancing meant that most of the local activities we were evaluating were paused and then resumed at different times, and some changed in nature when they resumed. This disrupted our plans for evaluating the funded activities.

Solution

To make measurement of impacts possible, we and DLUHC independently mapped the dozens of activities and their intended outcomes across the five areas, then compared notes. The result was a focus on three types of activity that recurred, with similar intended outcomes, across more than one of our local authority areas. This mattered because we needed to gather data from enough local people to show that the funded activities had made some sort of difference. The three activity types that recurred were:

  • Community Ambassadors (volunteers supporting community integration goals)
  • Community Conversations (events designed to bring diverse community members together to discuss commonalities and differences among local residents)
  • School Linking among primary and secondary school pupils (in which classes in diverse schools were paired to encourage social mixing and build understanding of pupils from different backgrounds)

These became the focus of the evaluation: combining findings for similar interventions across local authorities maximised the sample sizes available for impact assessment.

When the pandemic disrupted everything, patience was needed: we simply had to wait until the interventions could resume. While we waited, we discussed with DLUHC how the evaluation approach might evolve to work around the disruption. One thing we found was that qualitative research was a godsend: its natural fluidity and flexibility allowed us to understand how the interventions in each area adapted during the pandemic, and what effects this had. Another was that, perhaps surprisingly, it was still possible to evaluate impacts once the pandemic had wreaked havoc with our carefully timed resident surveys, provided we were pragmatic about which data was usable and which was not.

Impact

This flexible approach allowed DLUHC to measure the impacts of the funded activities successfully by combining data across multiple local authorities. They were able to show that the funded activities had made some difference to how community members felt about their ability to change their local area, and to their comfort in talking to people from different backgrounds and with different views.

By adding qualitative interviews and focus groups, they were also able to identify some unexpected benefits to community members, such as new connections and friendships, improved self-esteem and reduced social isolation. The qualitative work also highlighted lessons about how best to deliver activities intended to create stronger, more-integrated communities in future: for example, including some participants with more previous experience of community engagement can benefit the group overall, as the more experienced participants encourage those with less experience by example.

‘Bigger picture’ lessons could also be drawn from the experience of the evaluation about how best to measure the impacts of these kinds of locally-tailored activities in future.