When you’re checking out at an online store, it’s increasingly common to get a prompt inviting you to toss in a few bucks to a good cause. Your decision to give (or not) may feel like a reflection of how generous you’re feeling in the moment. Yet how you respond to these “microgiving” requests — including how much you donate — is also influenced by the information and options you see on the screen.
In a series of experiments involving more than 2 million PayPal users, researchers at the Golub Capital Social Impact Lab at Stanford Graduate School of Business found that small adjustments to donation requests had sizable effects on how likely people were to donate and how much they gave. These findings, published in a new report, have the potential to help nonprofits attract millions of new donors and significantly add to their coffers.
“Charitable giving is a really important source of fundraising for organizations, but in the real world, people are impulsive, they procrastinate, they don’t fulfill their goals, they’re not perfectly informed, and those are all things that get in the way,” says Susan Athey, PhD ’95, a professor of economics at Stanford GSB and faculty director of the lab. “Our work shows that there are low-cost ways to potentially reduce friction and raise more money for good causes.”
More Bang for the Buck
The first study was conducted using a feature that lets shoppers donate $1 to charity when they check out using PayPal. As they completed a transaction, a sample of 1.4 million users was randomly selected to see one of six short descriptions of a relatively unknown nonprofit fighting global poverty or no description at all. Some of the one-line statements quantified outcomes (e.g., “$1 provides 2 people with free and reliable access to safe water for a year”), while others stuck to narrative (e.g., “[This organization] provides free access to safe water”). The rest referenced the charity’s rating from a third-party group.
Compared to no information, providing numerical data about impact nearly quadrupled donation rates. Narrative descriptions also boosted giving rates, but sharing numbers was nearly 60% more effective.
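The design here is a standard randomized experiment: assign each user to one arm, observe donation outcomes, and compare rates against the control. A minimal sketch of that analysis, using made-up donation rates (the arm names and numbers are illustrative only; just the ordering mirrors the reported finding that quantified impact beat narrative, which beat no description):

```python
import random

# Illustrative per-arm donation rates (invented for this sketch,
# NOT the study's actual figures).
ARMS = {
    "control (no description)": 0.0010,
    "narrative description":    0.0025,
    "quantified impact":        0.0040,
}

def simulate(n_per_arm, seed=0):
    """Randomly assign n_per_arm users to each arm and return
    the observed donation rate per arm."""
    rng = random.Random(seed)
    results = {}
    for arm, rate in ARMS.items():
        donors = sum(rng.random() < rate for _ in range(n_per_arm))
        results[arm] = donors / n_per_arm
    return results

rates = simulate(200_000)
control = rates["control (no description)"]
for arm, r in rates.items():
    print(f"{arm:26s} rate={r:.4%}  lift vs control={r / control:.2f}x")
```

With a sample this large, the lift of each treatment arm over control is a direct estimate of the causal effect of showing that description, which is the comparison the researchers report.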
“It was surprising in a good way that quantified information about what the charity was doing was most effective, because that is something you could scale up,” Athey says. “A lot of charities have very high-level missions, but it’s hard to visualize where your dollar is going. This concrete analytical statement of what the money is doing seems to also be important.”
References to a third-party rating — either on their own or alongside the other descriptions — didn’t help. In fact, adding “as estimated by a charity rating group” to the sentence quantifying impact made people less likely to donate, on average.
The researchers don’t know exactly why they saw this effect, but Athey speculates that such mentions lacked “bang for the buck” in a short description. “Every word you put in someone’s way has a cost and can make people less likely to complete something,” she says.
The findings mitigate concerns that giving at checkout could encourage impulsive gifts to less effective organizations or only benefit those with name recognition. “We can, in this very short space, communicate information about smaller charities in a way that reassures people about their quality and opens up this channel to a wider range of charitable giving,” Athey says.
The researchers also explored the relationship between demographic characteristics and giving behavior. Overall, they found that women, older people, and higher-income users were more likely to donate at checkout. However, all groups had fairly similar responses to the alternative descriptions presented in the experiment.
A second experiment on PayPal’s platform tackled another concern among nonprofits: If consumers donate a small amount at checkout, will it scratch a temporary itch to be generous but “crowd out” future, potentially larger, donations? The researchers found that being prompted to give in this way didn’t significantly change future generosity among any demographic group. It didn’t affect whether people later donated using PayPal, the average number of donations they made, or the total they donated.
Refining the Ask
In another study, Athey and her team investigated how different default donation amounts, known as ask strings, affected giving on PayPal’s Giving Fund website, which lets users donate to over 450,000 charities. When they clicked on a charity, more than 400,000 users were randomly assigned to see either the existing ask string — $25, $50, $75, $100 — or strings in which the button for $75, a relatively unpopular donation amount, was replaced with $10 or $200.
Both new options made people more likely to choose a default amount rather than writing in a different amount or not donating at all, a sign that these changes simplified giving. Including the $10 button encouraged more people to donate (although it reduced the average donation amount among those who gave), while the $200 option boosted the total money raised — both welcome outcomes for nonprofits.
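The tradeoff described here comes down to simple arithmetic: expected revenue per visitor is the donation rate times the average gift among donors. A sketch with invented numbers (the rates and gift sizes below are hypothetical; only the pattern follows the article, where a low button lifts participation but lowers the average gift, and a high button lifts total revenue):

```python
# Hypothetical ask-string arms: (donation rate, average gift among donors).
# All figures are illustrative, not the study's data.
ARMS = {
    "$25/$50/$75/$100 (baseline)": (0.050, 48.0),
    "low button added ($10)":      (0.065, 38.0),
    "high button added ($200)":    (0.052, 60.0),
}

def revenue_per_visitor(rate, avg_gift):
    """Expected revenue = P(donate) * E[gift | donate]."""
    return rate * avg_gift

for arm, (rate, gift) in ARMS.items():
    print(f"{arm:30s} revenue/visitor = ${revenue_per_visitor(rate, gift):.2f}")
```

Framing the outcomes this way makes the nonprofit's choice concrete: a low button optimizes for donor acquisition, a high button for dollars raised, and the two need not pick the same winner.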
“In addition to making things easier, these amounts can serve as a communication device setting norms of what’s acceptable,” Athey says. “A low button might pull people in and also pull people down to it, making it popular. A high button makes it easier for people who may want to donate more and pulls people up to it.”
The new donation options also increased the likelihood that users would donate again using PayPal — and donate more overall — for up to 18 months after the study began. “Getting people over the hump to donate creates a relationship with them,” Athey says. “They’ve seen that it’s not hard; they’re not turned off by the experience, so it makes them more willing to go into it.”
The study suggests that charities should look at donation amounts that have been popular in the past when creating ask strings. “By choosing the spikes, you’re going to create a better user experience, which is good on its own but also could improve long-term outcomes,” says Kristine Koutout, research director at the Golub Capital Social Impact Lab and a co-author of the papers.
While the researchers couldn’t find a way to entirely eliminate the tradeoff between attracting more users and raising more money, they found that they could get better results across both dimensions by tailoring ask strings to different audiences. Although the researchers evaluated an approach based on a complex machine learning algorithm, organizations can use simpler methods of customization, such as targeting donors in lower income brackets with lower asks. “Whether it’s based on past donation history or where people live, there are many contexts where you could personalize,” Athey says.
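A simple rule-based version of that customization might look like the toy sketch below. The segments, thresholds, and dollar amounts are invented for illustration; the study's actual personalization used a machine-learning model, not rules like these:

```python
# Hypothetical default-amount sets (the study tested different values).
LOW_ASK = (5, 10, 25, 50)
STANDARD_ASK = (25, 50, 100, 200)

def ask_string_for(income_bracket: str):
    """Pick an ask string with a simple income-based rule:
    lower-income donors see lower default amounts."""
    return LOW_ASK if income_bracket == "lower" else STANDARD_ASK

print(ask_string_for("lower"))   # lower asks for lower-income donors
print(ask_string_for("higher"))  # standard asks otherwise
```

The same rule structure could key off past donation history or geography instead of income, which is the kind of low-cost personalization Athey describes.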
Given the size of these randomized experiments and the fact that they took place in a real-world context, Athey is “pretty sure these results are going to generalize.” Future studies could examine more variations in ask strings, charity descriptions, and other factors that might optimize donation behavior. “We’re just scratching the surface,” she says.