[Header graphic: 'The Ethics of AI-Generated Fundraising Content']

Should You Use AI-Generated Content in Fundraising Appeals?

May 27, 2025 · 4 min read

An AI-generated video made with Sora from a simple 10-word prompt.

What happens when the stories that move us to give are generated by machines?

Hollywood has long used AI-powered special effects to create more powerful moments in big-budget movies. If such tools were available to fundraisers, would you use them?

A few years ago this would have been a hypothetical question. With the launch of numerous AI video tools, however, Hollywood-grade special effects are now within the grasp of charities. Leading the way is Veo 3, which can produce cinematic videos, complete with speech and sound effects, from simple text prompts.

For charities and fundraisers, the appeal is clear: faster production, lower costs, and emotionally powerful content that can move supporters to act.

But with these benefits come big ethical questions, especially when AI-generated content is used to raise money based on emotionally powerful stories.

In this post, we explore the pros and cons of using AI to generate content for fundraising, and offer guidance on how charities can navigate this evolving space responsibly.


✅ The Ethical Opportunities of AI-Generated Fundraising Content

1. Amplifying Voices Without Exploiting Them

Many charities work with individuals who cannot or do not want to share their stories publicly. AI allows the essence of their experiences to be communicated while protecting their privacy and dignity.

For example, a domestic abuse charity might use AI to depict a survivor’s journey in anonymised, fictionalised form—avoiding re-traumatisation or public exposure.

2. Lower Costs, More Mission

Producing high-quality video content is expensive. AI-generated visuals reduce costs dramatically, allowing smaller charities or stretched teams to create compelling appeals without diverting funds from frontline work.

We've always loved the SickKidsVs campaign, which uses cinematic techniques and production values far beyond the traditional budget of most charities. If those techniques are now accessible to all of us, shouldn't we use them?

3. Fostering Empathy Through Simulation

Well-crafted AI videos can recreate situations donors might never otherwise witness, such as walking through a refugee camp or standing inside a rural clinic. This builds on virtual reality, which we know works incredibly well at bringing supporters emotionally closer, as this Amnesty International UK case study shows. Used ethically, these tools can bridge understanding gaps and stir deeper compassion.


⚠️ The Ethical Risks and Challenges

1. Deception and Donor Trust

If AI-generated content is presented as real without disclosure, it can mislead supporters. Donors may feel manipulated if they learn that an emotional story they responded to wasn’t entirely true.

Transparency is critical. Even if the video is emotionally accurate, hiding its synthetic nature risks breaking trust.

2. Erosion of Authenticity

Fundraising has always depended on real people and real experiences. There’s a danger that overuse of AI could make appeals feel too polished, too generic, or emotionally artificial.

3. Misrepresentation and Consent

When AI-generated characters are based on real case studies, it is essential to obtain full consent. Even if names are changed, the feel of a story can still represent someone’s lived experience, and misuse can lead to reputational or legal risks.

4. Bias and Stereotyping

AI tools are trained on large datasets that can include harmful biases. Left unchecked, they might reproduce stereotypical depictions, like "poverty porn" imagery or saviour tropes, which undermine dignity and reinforce negative narratives.


🧩 Five Ethical Guidelines for Charities Using AI in Appeals

  1. Be Transparent: Clearly label AI-generated content and explain why you’re using it.

  2. Get Consent: If stories are inspired by real cases, secure informed, written consent from individuals.

  3. Avoid Overdramatization: Don’t crank up emotional triggers just because you can. Keep storytelling grounded and respectful.

  4. Review for Bias: Always test AI outputs with diverse voices to spot stereotypes or harmful tropes.

  5. Blend Real and Synthetic: Use AI to support, not replace, authentic voices and lived experience.


🔚 Final Thought

Used well, tools like Veo 3 could make storytelling more inclusive, accessible, and scalable for charities. But the same tools also have the power to manipulate, mislead, or reinforce damaging narratives if not used ethically.

Charities must navigate this space with care, because at the heart of every appeal, whether human- or AI-crafted, should be a commitment to truth, dignity, and trust.

Want to discuss how to use AI responsibly in your charity? Check out our free guide and workshop.

*Disclosure: AI was used to generate the video in the post. I used ChatGPT 4o to turn my idea and thoughts into the post, which I then added to and edited. I also used AI to add an SEO text description to this post’s settings.
