Artificial intelligence’s dirty secret
... Laura Preston recently wrote about her experience working as one of these "human fallbacks" for a real-estate chatbot called Brenda. When a customer wanted to speak to someone about an apartment listing, they would be connected with Brenda, who could answer basic questions about the listing or give details on the apartment itself, from the price of rent to the square footage. But many of Brenda's answers came across as stilted, or the system was simply unable to answer more complex questions, so a "human fallback" would step in. Preston and other human workers would take over the conversation and try to help the client, cleaning up stock answers to better address their needs or doing deeper research into housing vouchers and pet policies. According to Preston, employees were trained to use Brenda's "voice" in these interactions in an attempt to make the conversation appear seamless. And the push to robotically answer a deluge of questions came with a serious mental toll: "Months of impersonating Brenda had depleted my emotional resources," wrote Preston. "It occurred to me that I wasn't really training Brenda to think like a human, Brenda was training me to think like a bot, and perhaps that had been the point all along." ...
Instead of improving productivity, automation is often focused on increasing the power that employers have over workers. In his book "Automation and the Future of Work," the economic historian Aaron Benanav explains that companies aren't putting money toward tools that make employees' lives easier, but are pouring money into "technologies allowing for detailed surveillance of those same workers," like computer-monitoring software that tracks employees' keystrokes or Amazon's sophisticated algorithmic management tools that evaluate workers' every movement. ...