The Security Risks of Unregulated Google Search
Someday I need to write an essay on the security risks of secret algorithms that become part of our infrastructure. This paper gives one example of that. Could Google tip an election by manipulating what comes up in search results about the candidates?
The study’s participants, selected to resemble the US voting population, viewed the search results for two candidates on a mock search engine called Kadoodle. By front-loading Kadoodle’s results with articles favoring one of the candidates, researcher Robert Epstein shifted enough of his participants’ voter preferences toward the favored candidate to simulate the swing of a close election. But here’s the kicker: in one round of the study, Epstein configured Kadoodle so that it hid the manipulation from 100 percent of the participants.
Turns out that it could. And it wouldn’t even be illegal for Google to do it.
The author thinks that government regulation is the only reasonable solution.
Epstein believes that the mere existence of the power to fix election outcomes, wielded or not, is a threat to democracy, and he asserts that search engines should be regulated accordingly. But regulatory analogies for a many-armed, ever-shifting company like Google are tough to pin down. For those who see search results as a mere passive relaying of information, like a library index or a phone book, there is precedent for regulation. In the past, phone books, which held a monopoly on the flow of certain information to the public, were barred from omitting business listings even when paid to leave them out. In the 1990s, similar reasoning led to the “must carry” rule, which required cable companies to carry certain channels to communities where they were the only providers of those channels.
As I said, I need to write an essay on the broader issue.