Philip Lelyveld: The World of Entertainment Technology

23 Mar 2023

Q&A: Society wants you to feel ashamed of yourself

... Before her book [Weapons of Math Destruction] came out, says O’Neil, “people didn’t really understand that the algorithms weren’t predicting but classifying … and that this wasn’t a math problem but a political problem. A trust problem.” ...

In The Shame Machine, you argue that shame is a massive structural problem in society. Can you expand on that?


Shame is a potent mechanism for turning a systemic injustice against the targets of that injustice. Someone might say, “This is your fault” (to poor people or people with addictions) or “This is beyond you” (about algorithms), and that label of unworthiness is often enough to get the people targeted by the shame to stop asking questions. As just one example, I talked to Duane Townes, who was put into a reentry program from prison that was essentially a no-end, below-poverty-level manual-labor job done under the eye of armed men who would call his parole officer if he complained or took a bathroom break longer than five minutes. It was humiliating, and he felt he was treated as less than a man. That was the intentional design of the program, though; it was meant to train people to be “good workers.”

It’s like a taser to one’s sense of self: it causes momentary helplessness and the inability to defend one’s rights. ...

After Weapons was published, you started ORCAA, an algorithmic auditing company. What does the company’s work entail?

Algorithmic auditing, at least at my company, is where we ask the question “For whom does this algorithmic system fail?” That could be older applicants in the context of a hiring algorithm, or obese folks when it comes to life insurance policies, or Black borrowers in the context of student loans. We have to define the outcomes that we’re concerned about, the stakeholders that might be harmed, and the notion of what it means to be fair. [We also need to define] the thresholds that determine when an algorithm has crossed the line. ...
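The “for whom does this fail?” question lends itself to a concrete check: pick an outcome, split the algorithm’s decisions by stakeholder group, and compare the groups’ rates against a fairness threshold. The Python sketch below is a minimal illustration of that kind of group-level disparity test; the group labels, the made-up hiring data, and the four-fifths-style threshold are assumptions for the example, not ORCAA’s actual methodology.

```python
# Illustrative audit sketch (not ORCAA's actual process): compare an
# algorithm's positive-outcome rates across stakeholder groups and flag
# any group that falls below a chosen fairness threshold.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group_label, was_selected) pairs."""
    selected = defaultdict(int)
    total = defaultdict(int)
    for group, was_selected in decisions:
        total[group] += 1
        selected[group] += int(was_selected)
    return {group: selected[group] / total[group] for group in total}

def audit(decisions, threshold=0.8):
    """Flag groups whose selection rate is below `threshold` times the
    best-off group's rate (a four-fifths-rule-style check, used here only
    as one example of a fairness threshold)."""
    rates = selection_rates(decisions)
    best = max(rates.values())
    return {group: rate / best for group, rate in rates.items()
            if rate / best < threshold}

# Hypothetical hiring-algorithm output: (age band, passed the screen?)
decisions = ([("under_40", True)] * 60 + [("under_40", False)] * 40 +
             [("40_plus", True)] * 35 + [("40_plus", False)] * 65)

print(audit(decisions))  # {'40_plus': 0.583...} -> flagged for review
```

In practice, the hard part is everything the sketch takes as given: which outcome counts as a failure, which stakeholder groups to compare, and where to set the threshold, which is exactly the definitional work O’Neil describes.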

See the full article here: https://www.technologyreview.com/2022/06/29/1053985/society-shame-book-review/
