
Do interventions to reduce teenage pregnancy actually work?

July 13th, 2009

Dr Petra

The papers have recently been full of reports about a UK pilot scheme aimed at reducing teen pregnancy, drug use and school exclusion. Most of the coverage has vocally criticised the scheme – reporting it as contributing to a rise in teen pregnancy:

Multi million pound government scheme ‘could have increased teen pregnancies’

£6m drive to cut teen pregnancies sees them DOUBLE

Initiative to cut teen pregnancy had reverse effect, study says

Teen pregnancy project ditched

Is the press coverage fair? Or are we seeing the typical response from the UK media: hitting scaremongering mode as soon as teenage pregnancy is discussed, and inviting readers to oppose sex education and other youth intervention programmes?

Here’s the background to the pilot scheme. The Young People’s Development Programme (YPDP) was run across the UK between April 2004 and March 2007. It was based upon the American Carrera Programme, and was evaluated by a team of researchers from the Institute of Education and the London School of Hygiene and Tropical Medicine. They report on the evaluation in the latest edition of the British Medical Journal.

It’s a very interesting evaluation and well worth reading. Contrary to the sensational headlines, the study has a lot to say about how we go about tackling issues of teen pregnancy, drug use and other problem behaviour in at-risk youth.

The evaluation looked at 2724 young people, male and female, aged 13-15, identified by professionals as ‘at risk’ of teen pregnancy, substance misuse or school exclusion, or as otherwise vulnerable. These young people were enrolled in the Young People’s Development Programme and researchers followed them up via the 27 sites delivering the scheme. The YPDP offered support, education and mentoring, with a focus on sex and drugs education. Those on the YPDP scheme were compared with a group of similar young people engaged with youth services, but where those services had not received specific YPDP funding.

Participants completed questionnaires about their sexual behaviour, contraception use, substance misuse and home life. The scheme was assessed on outcomes such as how many girls got pregnant during the evaluation, weekly cannabis use and monthly drunkenness. If the intervention was successful you would expect to see those on the YPDP scheme doing better than those not getting this input.

The results were interesting – and unexpected. The evaluation indicated that significantly more girls on the YPDP scheme reported a teen pregnancy than in the comparison group. It was not reported whether these pregnancies were terminated. There was no significant difference between boys in the YPDP and comparison groups in reported awareness of having caused a pregnancy. This is an important finding, since boys’ involvement in teen pregnancy is often ignored.

Girls in the YPDP group were less likely to use contraception and more likely than the comparison group to report sexual activity. The impact of the YPDP scheme on school exclusion, truancy and contact with the police appears negligible for both genders.

The evaluation itself was not without its limitations. The intervention and comparison groups were quite different, so it is not clear from the analysis which outcomes were (or were not) directly attributable to the YPDP intervention. Attrition (drop out) was high, partly because participants were a notoriously difficult-to-engage group.

The researchers noted that the delivery of key messages varied across the different sites delivering YPDP, in both content and tone. Importantly, both the young people and the youth workers delivering the YPDP scheme reported how much they appreciated and enjoyed it.

So although the scheme itself did not appear to reduce many of the problems it was designed to tackle, all involved did seem to prefer it to alternative approaches to youth engagement.

Why, then, did these unexpected and somewhat confusing outcomes emerge? The researchers admit they are not completely sure. The report indicates that those in the YPDP scheme were already more vulnerable than those in the comparison group; that some of the education received by the comparison group was of an increasingly high standard; and that many of the girls in the YPDP group already had low aspirations and a perception that teen motherhood was automatically their future.

The study recommends that future interventions be evaluated using randomised controlled trials (RCTs); that those delivering schemes be enthusiastic, well trained and well supported; that schemes run in conjunction with school-based activities, rather than as an alternative to school, to avoid stigmatising young people; and that they tackle issues relating particularly to young women, so the assumption that teen motherhood is all that’s open to them can be realistically challenged.

So is this really as bad as the press have made out? Not quite. The media have implied reckless spending on a useless scheme that caused more teen girls to get pregnant than would have done so if it hadn’t existed.

But the evaluation in the BMJ does not say this.

The media didn’t seem to understand this was a pilot scheme. The point of a pilot is to test something out. Yes, pilots are costly, but it is undoubtedly cheaper to run one and see what works than to push an untested programme straight into practice and pay for something ineffective.

In this case, the evaluation has shown the YPDP scheme did not appear to address key issues for at-risk youth. The researchers suggest possible reasons why, and offer suggestions for future youth interventions. This means we are now in a better position to consider what may work to reduce teen pregnancy, substance misuse and school exclusion.

You could report this in two ways. As the media have done: costly scheme increases teen pregnancy. Or as the evaluation found: a pilot showed the YPDP scheme didn’t deliver, which ensured a costly mistake (in terms of time and youth wellbeing) was not rolled out across the country without prior testing.

Only one version makes for a good headline.

As a pilot scheme this project was time limited. All pilots are set to run for a certain period, during which they are evaluated. If they are successful they’ll be rolled out to a wider target group; if they fail they (hopefully) won’t be implemented. But the press coverage for this scheme implies an expensive mistake that’s been brought to a halt, rather than the testing of a programme that was always going to run for a set period.

The evaluation in the BMJ, contrary to the wider press coverage, does NOT say that youth intervention schemes should be outlawed. What it says is we need to think more carefully about what schemes are delivered, and ensure those that are aimed at vulnerable youth continue to be fully assessed and trialled.

It’s also worth noting that several similar interventions have been attempted in the US, and they also show mixed outcomes. These are neatly summarised in an accompanying editorial in this week’s BMJ by sex education researcher Doug Kirby. He explains how interventions appear to have little impact on young males’ sexual behaviour, and have had mixed results on young women’s use of contraception and avoidance of pregnancy.

Rather than stating such schemes should be scrapped, Kirby draws attention to the one thing the press failed to pick up on: that the influences on at-risk young people are many, and finding ways to tackle them – not to mention evaluating such multicomponent interventions – is very complex. So while you can often measure ‘outcomes’, you can’t always work out why they happened (as in this study).

Often the outcome-led assessment of such schemes, although necessary, may fail to capture some of the more subtle nuances that make schemes work. It’s why more flexible forms of evaluation, such as realist approaches, may be more effective.

Kirby concludes:
“This does not mean that all youth development approaches are ineffective. For example, programmes may be more effective when implemented by charismatic staff, when they facilitate access to reproductive health services, when the staff connect with the teenage participants, or when the staff give a strong clear message about avoiding unprotected sex”

The beauty of evaluations is that they give us information that can help us. Sometimes outcomes may contradict our beliefs, or give us lots of questions to work through. This is a very good example of a thorough evaluation that’s told us a few things we didn’t know about youth education. So while the YPDP scheme wasn’t successful, it’s not a total loss. We can learn a lot from this evaluation, and that will only improve future support for at-risk teens.
