“The algorithmic biases feed into social and cognitive biases, like confirmation bias, which in turn feed back into the algorithmic biases. Before, people were watching the evening news on TV or reading the local paper, for example. But the fact that the medium has changed to online social networks, where you shape the sources of information to which you are exposed, now means that you become even more vulnerable,” he said. “Search engines and social media, for example, try to predict what content may be most engaging for someone. Ranking algorithms use popularity as one of the ingredients in their formulas. That means that the more people in your group interact or engage with a piece of fake news, the more likely you are to see it. The social network can act as an amplifier because the people near you have opinions similar to yours, so they are more likely to be tricked by a certain kind of fake news, which means you are more likely to see it, too.”
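To make the mechanism concrete, here is a toy sketch in Python of a popularity-weighted feed ranker in which engagement from your own circle counts extra. The `Post` fields, the `friend_weight` multiplier, and the scoring rule are illustrative assumptions for this sketch, not any platform's actual formula.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    friend_engagements: int   # likes/shares from people in your own network
    global_engagements: int   # likes/shares from everyone else

def rank_score(post: Post, friend_weight: float = 3.0) -> float:
    # Popularity is one "ingredient" in the formula; engagement from
    # your own social circle is weighted more heavily (assumed multiplier).
    return post.global_engagements + friend_weight * post.friend_engagements

feed = [
    Post("local news report", friend_engagements=2, global_engagements=50),
    Post("fake story circulating in your group", friend_engagements=30, global_engagements=40),
]

# Sorted by score, the fake story (130.0) outranks the news report (56.0)
# once in-group engagement is factored in.
for post in sorted(feed, key=rank_score, reverse=True):
    print(f"{rank_score(post):6.1f}  {post.text}")
```

Even with lower global popularity, the in-group weighting is enough to push the fake story to the top of the feed, which is the amplification effect described above.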
For Chollet, though, the sort of danger Facebook poses is unique. “There’s only one company where the product is an opaque algorithmic newsfeed, that has been running large-scale mood/opinion manipulation experiments, that is neck-deep in an election manipulation scandal, that has been shown time and time again to have morally bankrupt leadership. Essentially nothing about the threat described applies to Google. Nor Amazon. Nor Apple. It could apply to Twitter, in principle, but in practice it almost entirely doesn’t,” he said. What seems to clinch it for him is that Facebook is ambitiously pursuing advances in A.I. “What do you use AI…for, when your product is a newsfeed?” he wondered. “Personally, it really scares me. If you work in A.I., please don’t help them. Don’t play their game. Don’t participate in their research ecosystem. Please show some conscience.” [...]
This is worth keeping in mind, though: at the end of the day, “The motivation for Facebook is not to make you a better person—to improve you morally or intellectually—and it’s not even designed to improve your social group,” Simon DeDeo, an assistant professor at Carnegie Mellon University, where he runs the Laboratory for Social Minds, and external faculty at the Santa Fe Institute, told Nautilus. “It’s designed to make money, to show you things you want to see to hopefully induce you to purchase things.”