The price we pay: Strengths & weaknesses of the evidence

This blog tackles some of the more common questions about how well DEL's recent experiment comparing positive and negative appeals replicates the giving environment. These include 'How can a donation experiment replicate giving with real money?' and 'How do respondents compare with charities' usual donors, or "warm audiences"?', among others.

Using YouGov's online panel, recent Development Engagement Lab (DEL) research experimentally tested the impact of positive and negative video appeals on donations. We asked respondents to imagine they had £20 to donate to the organisation whose appeal they had just seen. We found no statistically significant difference between the average donation after a positive appeal (£6.04) and after a negative appeal (£6.25). This suggests that organisations that move away from appeals showing extreme need and anchored in pity and guilt pay no penalty in lost income: positive appeals work as well, statistically, as negative ones.
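The 'no statistical difference' finding rests on a standard test for a difference between two group means. As a minimal sketch of that logic, here is Welch's t-test applied to made-up donation amounts (illustrative values only, not DEL's data):

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t-statistic for the difference in means of two samples."""
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

# Hypothetical £ donation amounts (illustrative, not the experiment's data).
positive = [6.0, 5.5, 7.0, 6.5, 5.0, 6.2, 6.8, 5.9]
negative = [6.5, 5.8, 7.2, 6.0, 5.5, 6.4, 7.0, 6.1]

t = welch_t(positive, negative)
print(round(t, 2))  # |t| well below ~1.96: no evidence the means differ
```

With |t| far below the conventional ~1.96 threshold, we would not reject the hypothesis that the two appeals produce the same average donation; DEL's analysis reaches the analogous conclusion on the real data.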

While these findings provide more evidence for a new approach to engaging the public, the online, experimental, hypothetical-donation design is not without limitations. In this blog, we build on the current evidence base by outlining these limitations and showing why there is good reason to be confident in the findings' external validity, that is, the extent to which they replicate the real-world environment in which organisations solicit donations.

The first question is whether a hypothetical donation can stand in for giving with real money.

In an earlier piece of research from DEL's precursor, the Aid Attitudes Tracker (AAT), we gave respondents real money, £10, and invited them to donate some, none, or all of it in response to a campaign appeal from the Jaago Foundation. Respondents were randomly allocated to a baseline, a pity/negative appeal, or an empathy/positive appeal. While this research used images rather than videos, the outcome was the same: no statistically significant difference between the average donations to the positive and negative appeals, which averaged £2.32.

Another limitation of survey-based experimental research is that it samples the general public. While this is normally an ideal base for measuring behaviour, development organisations frequently solicit donations from an in-house database of pre-existing donors or warm audiences, i.e. those who have engaged in other ways (e.g. email sign-up, signing a petition, etc.). This matters because donors are assumed to behave differently from the general public: they give more, and they may react differently to positive and negative appeals.

To address this in our design, we incorporated the 10 actions measured in DEL's audience segmentation. Using these actions, we classify the public into one of six audience groups: Negatively Engaged, Totally Disengaged, Marginally Engaged, Transactionally Engaged, Purposefully Engaged, and Fully Engaged. In brief, the Transactionally Engaged are people who have donated to a development organisation in the past 12 months or have boycotted or purchased goods to support sustainable development causes. The Purposefully Engaged and Fully Engaged are previous donors and/or conscious purchasers, but they may also be members of, or have volunteered for, an organisation, signed a petition, or written to their MP. These latter two groups are not an exact match for the audiences in organisations' databases, but we suggest they are excellent proxies.

Our findings showed that, indeed, those in the more engaged audiences gave more on average: the Fully Engaged gave twice as much as the Marginally Engaged for both the positive and the negative appeal. This stands to reason: those who are more engaged with the issue give more than those who are not. Looking at a key audience for development organisations, the Transactionally Engaged, respondents who received the negative appeal gave £7.91, while those who received the positive appeal gave £7.42. However, within each audience group there was no statistically significant difference between the amounts given in response to positive and negative appeals. In other words, the size of donation varies by audience group, but each group reacts similarly to positive and negative appeals.

Figure 1: Donations by respondents' engagement level

Because organisations spend a great deal of time and energy cultivating their donor base and keeping it engaged, a weakness of the design employed here is that it relies on data from a single point in time rather than taking a longitudinal approach to understanding how exposure to different appeals influences behaviour over time. We cannot rule out different outcomes over the long term from the data here, but the way forward is to design a repeated experiment with the same respondents to observe whether time does indeed make a difference.

In closing, the research does not show, nor do we suggest, that negative appeals should not be used. On the contrary, real-world evidence from organisations' accounts shows that negative appeals work in generating income. Our evidence shows that positive appeals, often assumed to work less well than negative ones, work just as well. There are two effective ways to engage the public in donating to development organisations. These findings suggest there is ample space for organisations to move away from the predominant negative appeals that use extreme need to generate feelings of pity, anger and guilt, and an opportunity to use both types of appeal to engage the public.

The research also shows the positive externalities of positive appeals: respondents who receive a positive appeal are more likely to sign up to an organisation's email list, and these appeals generate a greater sense of efficacy, the feeling of being able to make a difference for people living in poor countries. For organisations deciding how to engage the public, DEL's research shows that reliance on traditional negative appeals may not be serving organisations' best interests.

Written by

David Hudson

Professor of Politics and Development at the University of Birmingham

Jennifer Hudson

Professor of Political Behaviour at University College London (UCL)
