{"id":1029,"date":"2023-08-28T11:36:14","date_gmt":"2023-08-28T09:36:14","guid":{"rendered":"https:\/\/site.nord.no\/response\/?p=1029"},"modified":"2023-08-28T11:36:14","modified_gmt":"2023-08-28T09:36:14","slug":"book-review-technically-wrong-sexist-apps-biased-algorithms-and-other-threats-of-toxic-tech","status":"publish","type":"post","link":"https:\/\/site.nord.no\/response\/2023\/08\/28\/book-review-technically-wrong-sexist-apps-biased-algorithms-and-other-threats-of-toxic-tech\/","title":{"rendered":"Book Review: Technically Wrong \u2013 Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech\u00a0"},"content":{"rendered":"\n<p>Why are the \u201cprivate\u201d digital and intelligent assistants that are baked into new technologies and apps, like Apple\u2019s Siri, Windows\u2019s Cortana, or Amazon\u2019s Alexa, all females? On the surface, such an observation may seem incidental. However, if you look more critically, this probably stems from cultural and gender-infused stereotypes where women are assumed to have service roles, such as mothers, nurses, housekeepers, and teachers, among others. This is just one of many important and interesting observations in Sara Wachter-Boettcher\u2019s book, <em>Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech. 
<\/em>The book highlights how and why the technological products that we use every day are filled with stereotypes, biases, discrimination blunders, and other blind spots.&nbsp;<\/p>\n\n\n\n<figure class=\"wp-block-image aligncenter size-full is-resized\"><img fetchpriority=\"high\" decoding=\"async\" src=\"https:\/\/site.nord.no\/response\/wp-content\/uploads\/sites\/56\/2023\/08\/image-7.png\" alt=\"\" class=\"wp-image-1030\" style=\"width:482px;height:724px\" width=\"482\" height=\"724\" \/><\/figure>\n\n\n\n<p class=\"has-text-align-center\"><a href=\"https:\/\/www.akademika.no\/teknologi\/data-og-informasjonsteknologi\/technically-wrong\/9780393356045\" target=\"_blank\" rel=\"noreferrer noopener\">\u00a9<\/a>&nbsp;<\/p>\n\n\n\n<p><em>Technically Wrong <\/em>was published in 2017 and consists of ten chapters: 1) Welcome to the Machine, 2) Culture Misfit, 3) Normal People, 4) Select One, 5) Delighted to Death, 6) Tracked, Tagged and Targeted, 7) Algorithmic Inequity, 8) Built to Break, 9) Meritocracy Now, Meritocracy Forever, and 10) Technically Dangerous. Although it is a relatively short book of 232 pages, its chapters highlight a wide range of issues and biases embedded in today\u2019s technological world. These range, for instance, from stories of social media inadvertently reminding people of heartbreaks and tragedies (e.g., Chapter 1 \u2013 Facebook\u2019s Year in Review feature) and systematic barriers that ethnic and gender minorities face in using technologies (e.g., Chapter 3 \u2013 narrow and binary thinking in the design of personas and defaults), through biased algorithms (e.g., Chapter 7 \u2013 Google Photos\u2019 tagging feature), to the industry\u2019s unproductive hiring practices, homogeneity, and (mis)culture (e.g., Chapter 9 \u2013 Uber\u2019s history of handling sexual harassment complaints).&nbsp;&nbsp;<\/p>\n\n\n\n<p>Through its many examples, the book reinforces how technology remains the domain of privileged (white) men. 
With her creative writing, Wachter-Boettcher tells us what went \u201cwrong\u201d in the development of novel digital technologies, offers alternatives to fix it, and urges social activism: \u201c<em>After all, most of us don\u2019t hate tech. We love it. It\u2019s time we demand that it love us back\u201d <\/em>(p. 200). Most importantly, however, she inspires us to think more critically about the technologies integrated into our everyday lives.&nbsp;<\/p>\n\n\n\n<p>Although one of the book\u2019s strengths is the many practical insights Wachter-Boettcher offers, its geographical reach is narrow and leans towards Silicon Valley companies such as Uber, Twitter, and Facebook. Many of the examples are also drawn from articles in the tech media. Nevertheless, it is a short, thoughtful, and fascinating read. We strongly recommend <em>Technically Wrong<\/em> to everyone interested in the intersection of technology and society.&nbsp;&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Why are the \u201cprivate\u201d digital and intelligent assistants that are baked into new technologies and apps, like Apple\u2019s Siri, Microsoft\u2019s Cortana, or Amazon\u2019s Alexa, all female? On the surface, such an observation may seem incidental. 
However, if you look more critically, this probably stems from cultural and gender-infused stereotypes where women are assumed to have [&hellip;]<\/p>\n","protected":false},"author":91,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[22],"tags":[],"coauthors":[19],"class_list":["post-1029","post","type-post","status-publish","format-standard","hentry","category-blogg-engelsk"],"_links":{"self":[{"href":"https:\/\/site.nord.no\/response\/wp-json\/wp\/v2\/posts\/1029","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/site.nord.no\/response\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/site.nord.no\/response\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/site.nord.no\/response\/wp-json\/wp\/v2\/users\/91"}],"replies":[{"embeddable":true,"href":"https:\/\/site.nord.no\/response\/wp-json\/wp\/v2\/comments?post=1029"}],"version-history":[{"count":1,"href":"https:\/\/site.nord.no\/response\/wp-json\/wp\/v2\/posts\/1029\/revisions"}],"predecessor-version":[{"id":1031,"href":"https:\/\/site.nord.no\/response\/wp-json\/wp\/v2\/posts\/1029\/revisions\/1031"}],"wp:attachment":[{"href":"https:\/\/site.nord.no\/response\/wp-json\/wp\/v2\/media?parent=1029"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/site.nord.no\/response\/wp-json\/wp\/v2\/categories?post=1029"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/site.nord.no\/response\/wp-json\/wp\/v2\/tags?post=1029"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/site.nord.no\/response\/wp-json\/wp\/v2\/coauthors?post=1029"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}