Ethics and Human Rights in Tech With Deb Donig
Few literature professors spend as much time thinking about the future as Deb Donig.

Sure, students who take her classes will become well versed in the classics, but they’ll spend just as much time looking ahead. In particular, Deb’s focus is the intersection of technology and human rights. In addition to teaching at Cal Poly, she’s the host of the Technically Human podcast, a show that features some of the most prominent tech thinkers and goes deep on the singular question: “What does it mean to be human in the age of tech?”

“I have to explain to my students why they’re getting ethics lessons in technology from a professor of English literature,” she told me. “What I say to them is that before we can build anything, we first have to imagine it. So it matters how we imagine. What kind of imagination we have as a world that is a better world is not absolute. It’s not obvious or inevitable. We actually come at that idea of a better world with a lot of passions and blind spots and biases. So we need to interrogate the realm of the imagination.”

Deb’s career path seemed inevitable for most of her life. She has had a passion for the written word since she was young and grew up a voracious reader. She was born and raised in the heart of Silicon Valley. Her mother came to the United States from South Africa and still works as a pediatric ICU doctor. Her father is a lawyer whose family emigrated from Germany during Hitler’s rise to power in the 1930s. Deb studied literature at the University of California, Davis, before earning a master’s degree from the University of Cape Town and a PhD from UCLA, where she also lectured from 2010 to 2018.

She was at UCLA when Donald Trump shocked the political establishment by winning the 2016 presidential election. In its wake, she says, she began thinking more seriously about technology’s impact on human rights and social justice, as people accused social media platforms like Facebook and Twitter of failing to adequately police their services and protect users from disinformation campaigns.

“Facebook had potentially swerved the election in a significant way toward Donald Trump,” she said. “We had to reckon with things like deepfakes and Cambridge Analytica and the idea that we may be living in a post-truth, post-fact environment. So I realized that we couldn’t really talk about human rights or social justice without talking about technological innovation and technological culture.”

In 2017, Deb launched a social justice movement called Tech Stands Up, which urged activists to hold tech leaders accountable for the ways their products and platforms impact the world.

“One thing that tech culture and technological innovation share with human rights is the idea of building a better world,” she said. “But I think much of what was once utopianism in the early days of Silicon Valley has now been taken over by opportunism and financial incentives. The ideas that get funded and ultimately built are no longer tethered to a kind of moralism. Now they’re more tied to financial structures.”

That, Deb says, is where things can start going wrong.

“We need to be very careful in thinking about the innovations we create,” she adds. “We’re responsible for thinking about all the possible implications of our inventions and being protective and thoughtful about how people may use them. Not just in the ways we intended them to be used, but also in ways that are far less intentional and potentially harmful.”

To listen and subscribe to the Before IT Happened podcast, visit