This marks a cultural shift. In 2018, DuckDuckGo wrote about the Filter Bubble, explaining how Google hides relevant results based on personalized, biased preconceptions about you.
Even when people searched at the same time, they were shown different sources, even after accounting for location.
Google hit back, implicitly admitting that filter bubbles would be bad, but claiming it wasn't guilty of creating them.
From Google, which manages to entirely ignore the critiques:
Why might two different people searching for the same thing see results that are different? That’s often due to non-personalized reasons: location, language settings, platform & the dynamic nature of search.
But now, search personalization is no longer taboo. Google is attacking DuckDuckGo for not doing it. And other corporations, like Kagi, dream of a day when data acquisition can be done so readily that the filter bubble will only ever tell you what you already believe:
[W]hen you ask your own AI a question like “does God exist?” it will answer it relying on biases you preconfigured… [W]hen you ask it to recommend a good coffee maker - it will know the brands you like, your likely budget and the kind of coffee you usually drink. All this information will be volunteered to the AI by you - similar to how you would volunteer your information to a human assistant - but this time to a much larger extent. And you will also do it without fear…