Ethical Tech Starts With Addressing Ethical Debt

Awful people will use technology to do awful things. This is a universal truth that applies to almost any technology that facilitates communication and interaction, no matter how well intentioned it might be. Something as innocuous as Google Drive can be a vector for harassment. As we’ve recently discovered, so can video conference platforms like Zoom. Just in the past few weeks, high school classes in North Carolina and Texas, along with an NAACP meeting in California, were interrupted by racist and misogynist video, images, and text. With remote classes again ramping up all over the country, we can only expect more harm—but how much is Zoom to blame?

Last April, “Zoombombings” hit our university, and a colleague described the disturbing disruption to her online classroom, where trolls exploited Zoom’s weak privacy protections to screen-share pornography and scream racist and sexist slurs. Even obvious precautions, like not posting meeting links publicly, can be defeated by social engineering, such as university students posting links to “come zoom bomb my class” forums. As tech ethics researchers, we were not surprised. Zoom’s CEO apparently was, telling The New York Times, “The risks, the misuse, we never thought about that.”

WIRED OPINION

Casey Fiesler is an assistant professor in information science at the University of Colorado Boulder. She directs the Internet Rules Lab, where she and her students research tech ethics and policy, and ways to make networked technologies more awesome and safe. Natalie Garrett is a PhD student in information science at the University of Colorado Boulder. Her research supports the operationalization of ethics in the tech industry.

Big Tech is all about speed, especially when there is a perceived opportunity, like a pandemic forcing greater reliance on communication technology. But a “move fast and break things” mentality results in limited testing and the deployment of software that isn’t ready. This is such a well-known problem that there’s even a term for it: “technical debt,” the deferred cost of shipping software now and fixing it later, once its bugs become clear.

Debt accrues when these issues are not tackled during the design process. When the bugs are societal harms, however, the result isn’t just bad tech; it’s unethical tech. “We never thought about misuse” is the precursor to another kind of debt: ethical debt.

Zoom’s “awful people” problem isn’t your typical bug, after all. Applying the “we’ll fix bad things after they happen” approach to potential harms, whether individual or societal, means failing to anticipate ethical issues. And the problem with ethical debt is that the metaphorical debt collector comes only after harm has been inflicted. You can’t go back in time and improve privacy features so that unsuspecting marginalized students never heard those racial slurs in the middle of class. You can’t reverse an election after the spread of disinformation has undermined democracy. You can’t undo the interrogation and improper arrest of a Black man falsely accused by biased facial recognition. You can’t make people un-see conspiracy theory videos that a recommendation algorithm shoved in their faces. The harm has already been done.

Technologists can’t see the future, but they can predict and speculate. They know that awful people exist. At this point, they can easily imagine the ones who might intentionally spread conspiracy theories, who might rely on facial recognition as evidence even when told not to, who might try to manipulate elections with disinformation, and who might think it’s fun to terrorize unsuspecting college students and professors. Not all of these harms make splashy headlines; many are micro-instances of individual harm that accumulate over time. As part of the design process, you should be imagining all the ways your technology could be misused. And then you should design to make those misuses more difficult.

Ironically, some of the best people to imagine how technology might be used for harassment are people who are often harassed themselves. This means marginalized and vulnerable people like women and people of color—people who are underrepresented in tech. In a room of these folks, we guarantee you that “random people will jump into Zoom meetings and screen-share pornography” would come up during speculation about misuse. Because many technology-based harms disproportionately impact already marginalized people, these are important voices to include in the design process as part of addressing ethical debt.

Technologists often create “user personas” during the design process to imagine how different types of people might use that technology. If those personas don’t include “user stalking their ex,” “user who wants to traumatize vulnerable people,” and “user who thinks it’s funny to show everyone their genitals,” then you’re missing an important design step. And if your response to this is, “Yes, there are likely to be these kinds of problems, but we’ll fix them after we know what they are,” start keeping an accounting book of your ethical debt.
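To make that concrete, here is a minimal sketch in Python of what treating misuse personas as first-class design artifacts could look like. Everything in it, the Persona record, its fields, the review check, and the example personas, is our own illustrative assumption, not an existing tool or any company’s actual process.

# Illustrative sketch only: "Persona" and review() are hypothetical names,
# not part of any real design-review framework.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Persona:
    name: str
    goal: str
    capabilities: List[str] = field(default_factory=list)
    adversarial: bool = False  # True for misuse/abuse personas

personas = [
    # The kind of persona design teams already write.
    Persona("Remote instructor", "hold a class meeting without disruption"),
    # The kind this article argues they too often skip.
    Persona(
        "Meeting crasher",
        "screen-share shock content to strangers",
        capabilities=["finds publicly posted links", "rejoins after removal"],
        adversarial=True,
    ),
]

def review(personas: List[Persona]) -> None:
    # A design review that considers no adversarial personas is
    # accruing ethical debt; fail loudly rather than silently.
    abusers = [p for p in personas if p.adversarial]
    if not abusers:
        raise ValueError("No adversarial personas considered: ethical debt accruing.")
    for p in abusers:
        print(f"Design question: how do we make it harder for {p.name} to {p.goal}?")

review(personas)

The code itself is trivial by design; the real work is writing honest adversarial entries, and letting each one force a “design to make this misuse harder” question before launch rather than after.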
