The issues with Google’s personalised search results are, imo:
- Not only is it not opt-in, but you can’t even opt out of it. Personalised search results should be opt-in and disabled by default.
- The data kept on you is used to sell you ads.
- The data kept on you will be handed over to state entities fairly easily.
Given those three problems, how feasible would it be to self-host a search engine that personalises your results so that what you see is more relevant to you? Issues 1 and 2 go away because, as the self-hoster, you presumably made the decisions around those two things yourself. Issue 3 is at least improved: you can host it off-shore if you’re concerned about your domestic state, and if you’re legally compelled to hand over data, you can make the personal choice about whether to take the hit of refusing, rather than leaving it to a big company that will obviously comply immediately and not attempt to fight it even on legal grounds.
A basic use-case example: say you’re a programmer and you look up `ruby`, you would want the first result to be the programming language’s website rather than the Wikipedia page for the gemstone. You could just make the search query `ruby programming language` on any privacy-respecting search engine, but it’s a bit of a QoL improvement to not have to think about the different ways an ambiguous query like that could be interpreted.
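To make that concrete, here’s a minimal sketch of what client-side personalisation could look like on top of a self-hosted metasearch instance. It assumes a local SearXNG install with its JSON output format enabled; the URL, the interest keywords, and the scoring are placeholders of my own, just enough to nudge “ruby the language” above “ruby the gemstone” without your profile ever leaving your machine.

```python
# Minimal sketch: re-rank results from a self-hosted SearXNG instance
# using a locally stored interest profile. Assumes SearXNG is running at
# localhost:8080 with the JSON format enabled in its settings; the
# INTERESTS set is a stand-in for whatever profile you keep for yourself.
import requests

SEARX_URL = "http://localhost:8080/search"   # hypothetical local instance
INTERESTS = {"programming", "language", "software", "developer", "github"}

def personalised_search(query: str, limit: int = 10):
    # Fetch plain, unpersonalised results from the local instance.
    resp = requests.get(SEARX_URL, params={"q": query, "format": "json"})
    resp.raise_for_status()
    results = resp.json().get("results", [])

    def score(result):
        # Count how many interest terms appear in the title, snippet,
        # or URL; more overlap means a bigger boost.
        text = " ".join([
            result.get("title", ""),
            result.get("content", ""),
            result.get("url", ""),
        ]).lower()
        return sum(term in text for term in INTERESTS)

    # Python's sort is stable, so ties keep the engine's original order.
    return sorted(results, key=score, reverse=True)[:limit]

if __name__ == "__main__":
    for r in personalised_search("ruby"):
        print(r.get("title"), "-", r.get("url"))
```

The point of doing it this way is that the “profile” is just a file on your own box: the upstream engines only ever see the raw query, and the re-ranking happens after the results come back.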
It’s still all tied to one account. They could see, for instance, that the same person searched for “beans”, “onions”, and “rice”, whereas without an account those 3 searches could have come from 3 different people. Of course, a search engine like DDG is only promising not to track you to try to figure out whether those 3 searches came from the same person, but various anti-fingerprinting measures could make it infeasible for DDG to do that. For a paid search engine, you’d have to pay for a new account per search if you didn’t want it tied to any other searches, assuming you don’t trust that Kagi isn’t logging searches (which you shouldn’t, because you shouldn’t rely on trust for any threat model).
Don’t worry, I get where you’re coming from and I most certainly think some people have a use-case for it.