GoFundMe and the much scarier stuff behind many of the technologies we use daily

[Image description: Someone in what looks like a mine shaft in the dark, wearing a yellow helmet, staring at a person in the background who seems to be looking at something using a small light. Photo by Mehmet Can Özgümüş / Unsplash]

Hi everyone, this week is Halloween, which is probably my favorite holiday, mainly because I like the easy access to junk food everywhere I go. If you’re on Instagram, check out my “Scary Nonprofit Stories,” including the one about the meeting that began with a never-ending round of introductions.

While we scare ourselves for fun, though, there are some seriously scary things out there that we do need to think about. This week, I want to talk about the awfulness going on in the tech world.

I’m sure you’ve heard of the GoFundMe fiasco, where GFM decided to automatically create pages on its platform for 1.4 million nonprofits, without asking for any of these orgs’ consent, and set the default tip to GFM at 16.5% of each donation. Of course, this has caused outrage across the sector.

GoFundMe has since apologized for its mistake and reversed its decision, making it opt-in for nonprofits to have a page on its site.

While we’re focused on tech malfeasance, though, there are far more awful and evil issues we need to pay attention to and act on, the same way we have with GoFundMe:

Salesforce, which many nonprofits use, has renewed its contract with ICE, the enforcement machine the fascist Trump administration has been using to round up and disappear people. As reported here by Jonathan D. Ryan:

“Salesforce provides the digital infrastructure that allows the state to track, sort, and deport human beings with algorithmic precision. The same tools that let a corporation follow its customers now let ICE follow a child. The same software that helps a company predict a buyer’s behavior can predict a family’s movements. In the name of ‘customer relationship management,’ Salesforce has built the architecture of erasure.”

Facebook, meanwhile, has decided to give itself permission to scan the camera rolls of many of its users and index all their images and videos, both to provide suggestions for posting and to train its AI systems. Without realizing it, you may be allowing it to see all your pictures and videos, including any you take of medical records, intimate moments, and so on. And its policy confirms it will share them with third parties, who are allowed to do whatever they want with these images.

I just checked my phone’s settings, using instructions from this article, and saw that the default was for it to access my images and videos. This is evil enough on its own, but it’s even more concerning given that many of us use our personal phones for work, including taking pictures and videos of clients, including kids. AI specialist Clara Hawking sounds the alarm and recommends taking several precautions, including banning all Meta products (Facebook, Instagram, Threads, WhatsApp, etc.) from all work phones and prohibiting the use of personal phones for work-related activities.

While we’re dealing with that, ChatGPT and AI in general have been exacting a horrific price from people in other countries. I’ve been seeing more of my colleagues use AI engines to create images and videos, including a friend who sent me a fake commercial for a “Vu Le action figure” that comes with a unicorn floaty. Over the past few years, our sector has been increasing its use of AI and discussing the ethical issues, such as the use of AI-generated images, which steal from artists.

But there’s more for us to think about with AI. In this post, Uchechukwu Ajuzieogu summarizes the harrowing toll exacted each time we use AI, including content moderators having to watch sickening, violent videos in order to train AI systems:

“While OpenAI is valued at $157 billion, Fasica processes 700-1,000 horrific videos per shift. That's 7 seconds per decision on whether a murder stays online. Over 200 of her colleagues now have diagnosed PTSD. Here's what's mentally shattering: OpenAI paid contractors $12.50/hour per worker. Workers received $1.50. The middleman captured 88% while workers bore 100% of the trauma.”

Above, he is referring to Fasica Berhane Gebrekidan, who wrote a detailed report on the experience she and other content moderators endured, called “Content moderation: The harrowing, traumatizing job that left many African data workers with mental health issues and drug dependency.” I could only skim it; the horror of the death and violence Fasica describes having to watch as part of her job left me shaken and forced me to stop.

I couldn’t even fully handle reading about it. I can’t imagine thousands of workers being paid $1.50 an hour to watch hundreds of graphically violent videos daily, children being forced to work in mines, and youth inhaling carcinogens while burning circuit boards to retrieve copper. All so that we can have cute little AI-generated videos, or write that annual appeal letter faster.

GoFundMe, Salesforce, Meta, and OpenAI aren’t the only problematic tech companies. It seems everything we use is involved in some sort of inequity, and extracting ourselves from these companies completely is difficult, if not impossible.

Still, as a sector that prides itself on its collective mission of fighting inequity and injustice, we cannot absolve ourselves of the responsibility to grapple with difficult conversations about the tech we use and how we may be complicit in furthering the very harms we seek to counter.

There are people much smarter than I am in this area who can provide more comprehensive recommendations. At the least, I think all of us need to be more informed on these issues and engage in more conversations about them. We also need to act when we can, such as researching alternatives to Salesforce and switching when possible, creating thoughtful policies to protect clients’ privacy and safety, and avoiding creating images and videos with AI.

We also need to raise our voices, organize, and mobilize, the same way people did with the GoFundMe situation. We’ve seen what happens when our sector gets angry enough. Let’s channel that energy toward changing many of these platforms for the better.

--

Vu’s new book will be coming out. Order your copy at Elliott Bay Book Company, Barnes and Noble, or Bookshop. If you’re in the UK, use this version of Bookshop. If you plan to order several copies, use Porchlight for significant bulk discounts.

Net proceeds from sales of the book from now until the end of 2026 will be donated to organizations supporting trans rights, immigrant rights, and/or fighting fascism.