{"id":8810,"date":"2021-01-20T13:22:08","date_gmt":"2021-01-20T18:22:08","guid":{"rendered":"https:\/\/www.ceros.com\/inspire\/?p=8810"},"modified":"2021-03-30T10:29:41","modified_gmt":"2021-03-30T14:29:41","slug":"ethical-design-post-truth","status":"publish","type":"post","link":"https:\/\/www.ceros.com\/inspire\/originals\/ethical-design-post-truth\/","title":{"rendered":"The Keys to Ethical Design in a Post-Truth World"},"content":{"rendered":"Reading Time: 5 minutes

By Friday, January 8, over two months after the Presidential election and mere days after a series of seditious events that need no explaining, President Donald Trump had been banned from Twitter.

And Facebook.

And Snapchat. And Twitch. And nearly every other digital platform on which he had a presence.

These moves aren\u2019t all that surprising; in fact, they\u2019d been in the making for some time. Two months earlier, Facebook\u2019s Mark Zuckerberg and Twitter\u2019s Jack Dorsey spoke before the Senate about the proliferation of political misinformation. The hearings highlighted what Americans have known since the election of 2016\u2014content moderation hinges on the difficulty of identifying truth when truth itself has become a partisan issue.

Building trust with ethical design, in the post-truth, pandemic landscape of 2021, is a challenge facing politicians, businesses, marketers, and neighbors alike. It\u2019s also one that the builders of the interfaces through which we interact with the world are particularly well-suited to tackle. Here are some of the ways that user experience experts are thinking about ethical design in 2021 and beyond.

Design for the worst

There\u2019s a common perception that the too-big-to-regulate tech companies are expertly engineered to manipulate their users. Rob Walker addresses this in a celebrated blog post on Medium, \u201cWhy Every CEO Needs to Think Like a Hacker, Stalker, or White Nationalist.\u201d He explains that the great irony of the UX of digital design is that even though we may feel these products were designed to take advantage of us, the truth is that many of these companies have failed to protect themselves from the bad actors who might try to manipulate them. In other words, what we\u2019re really experiencing is negligence. \u201cIt would have been smart,\u201d Walker writes, \u201cto think ahead about how neo-Nazis might use Twitter, how pedophiles might use YouTube, or how a mass murderer might use Facebook Live.\u201d

Walker cites the startup Superhuman, an invitation-only email experience that costs $30 a month, as an example. In June of 2019, the company came under fire for one particular feature that bordered on surveillance: a read status, which could let email senders learn not only whether recipients opened their emails, but when, where, and how many times. Superhuman\u2019s founder Rahul Vohra responded to the backlash by making some changes to the app\u2019s read statuses, and apologized by saying, \u201cWhen we built Superhuman, we focused only on the needs of our customers. We did not consider potential bad actors. I wholeheartedly apologize for not thinking through this more fully.\u201d

Walker argues that this confession, while naive, should be taken at face value\u2014Vohra really didn\u2019t think about bad actors, and his lack of awareness speaks for so many other CEOs who are similarly not thinking about uses of their tools beyond how they intend them to be used.

The solution, as Walker sees it, is to practice what he calls Design for the Worst. That is, designing any tool, any app, with the worst possible scenarios in mind. \u201cImagine,\u201d he writes, \u201ca sort of Black Mirror Department, devoted to nothing but figuring out how the product can be abused\u2014and thus how to minimize malign misuse.\u201d
