What's infuriating to me about these types of "signals" to the search rankings is that they have little to do with the content that I'm searching for. Google will hide results that I might find useful because the webmaster hasn't kept up with whatever Google decided was today's best practices. How about ranking based on the best source for what I'm looking for?
Google's ability to surface useful results has been thoroughly defeated by SEO spammers. To a lesser extent, the same is true of other search engines (Bing, etc.) though Google is the foremost SEO spam target for obvious reasons. Given that state of things, there is some sense in promoting more user-friendly pages that are thus a bit less likely to be SEO spam.
> there is some sense in promoting more user-friendly pages that are thus a bit less likely to be SEO spam
If there's a metric that will improve ranking, the SEO people are all over it, and have more incentive and resources to optimize for it than normal publishers.
I wonder if Google could combat this by having every Xth search swap page 1 with, say, page 5? Or give users the option to jump straight to a given search result page by default?
That way the SEO-ignorant sites that actually have the info you want, but get pushed out of the way due to SEO spam, will have some chances at traffic.
I have never written a search engine, so this comment is worth about 1 kb.
For some queries I instinctively jump a few pages ahead because I know the first few pages are going to be absolutely filled with SEO spam. The remainder is still not free of spam, but has a higher chance of containing what I want to find.
Nah, this is just unimaginative. The issue is that it is really hard to admit one was wrong and start from scratch. Rules should be simple enough for mortals to read. If they keep adding to the existing formula, we'll end up needing web-lawyers.
If only one website has what I'm looking for, then definitely give me that site. If multiple sites have what I want, then prioritizing by which sites will give it to me fastest sounds like it both helps me with this search and helps with future searches (since it puts incentives towards making sites less slow).
In general, prioritizing speed highly helps small independent sites over large bloated ones: they typically have less JS, fewer round trips on the critical path, etc. Make your site simple enough (ex: https://danluu.com/) and it will automatically be fast.
(Disclosure: I work for Google, speaking only for myself)
Sounds like a solution that fits your needs. The topics you deal with probably fit that idea perfectly.
I would make it more generic. Have users provide their system specs and give them the option to filter out what is or isn't reasonable for them.
I use: 1) a decent desktop, 2) a phone with reasonable specs, 3) a laptop with shit poor specs
a) 400 mbit cable, b) free wifi (crazy slow), c) my ISP provides wifi hotspots that are reasonably fast, d) a prepaid wireless plan where 10 euro equals 1 GB
The shortcomings of each combination are pretty obvious from experience. The laptop (to pick just one) can't reasonably open a Google search result; the duck works just fine, fb messenger works too, and it can download and play HD videos. Most significant, but not all that obvious: it has a qwerty keyboard with which I can write substantial amounts of text. If search results were tailored for this, I could see myself using forums and blogs (with comment sections) over prepaid tethering. Its webcam is unsuitably poor.
Edit: paying 5 cents to view a clean page instead of 1 euro of bandwidth to freeze my client all of a sudden seems a fantastic deal.
That's why we should ditch Google search for alternatives. So far I've been using DDG and am quite happy. Though it needs improvement in its search results, they may be the ones to listen to our needs, unlike Google, who is too big to care.
DDG just gives you Bing's results, while avoiding giving Microsoft information on you. This will necessarily produce worse results, but you might decide that this is worth the tradeoff.
DDG also gives Google results. My thinking is that if it grows in adoption, it might get better than the two. If not, some other search engine is welcome, something better than Google and Bing.
Source? Take any query and it will mostly be a carbon copy of Bing results. I haven't seen any Google results on DDG yet. An engine that did both would certainly be desirable.
I was under the impression that google results are also used. I guess I was wrong about it. Does anyone know why google results are not aggregated as well? Technical or legal reason?
Likely to be monetary reasons. DDG is basically (not entirely accurate but close enough) in the business of reselling Bing Search API and monetizing it through Bing ad network revenue (+some affiliate revenue through Amazon and Ebay).
If you also added Google results, your input costs would essentially double. And if so much of your marketing is based on bashing Google, it would be harder to justify such a move from a branding perspective.
Forgive my ignorance, I really didn't know all of that. I only meant my criticism of Google for the reasons I stated: they have grown so much that they don't care about their users, and their search quality has been deteriorating quite rapidly. In part it's not their fault, as an army of marketers found their way to play the game and push their products up in the results, but it's not only that. Google has been acting more and more like a corporate monopoly; they're not what they were when they started, for sure.
Searx[1] is easy to self-host and there are a bunch of public instances. It's an open source web app that aggregates search results from dozens of provider backends like DDG, Bing, Google, Wikipedia, etc.
Speed has always been a good practice for as long as the web has existed. The specific metric changes, but the goal is the same. It's just that "simple" metrics are very easy to game.
An example of this is how they went from "First Paint" to "First Contentful Paint" to "Largest Contentful Paint". They're all trying to get at the same concept, which is when the page loads, but each iteration gets more precise and accurate. Realistically, as a webdev, if your site loads fast, it shouldn't matter which metric is used.
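The progression from "first" to "largest" paint can be illustrated with a toy function. This is a hypothetical sketch, not the browser's actual implementation: in a real page you'd read these entries via `PerformanceObserver` with the `largest-contentful-paint` entry type, but the reduction below shows the core idea that LCP keys on the biggest element painted, not the earliest.

```javascript
// Given contentful paint entries {size, startTime}, First Paint reports
// the earliest startTime, while LCP reports the render time of the
// largest element — a better proxy for "the page looks loaded".
function largestContentfulPaint(entries) {
  if (entries.length === 0) return null;
  return entries.reduce((best, e) => (e.size > best.size ? e : best));
}

// Made-up entries for illustration: a small text block paints early,
// the big hero image paints a second later.
const paints = [
  { element: "nav-text", size: 1200, startTime: 300 },
  { element: "hero-image", size: 90000, startTime: 1250 },
];
// First Paint would say 300 ms; LCP says 1250 ms.
console.log(largestContentfulPaint(paints).startTime); // → 1250
```

Each refinement narrows the gap between what the metric measures and what a user actually perceives, which is why a genuinely fast site scores well on all of them.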
So glad you were able to articulate what I've been feeling for years. Google promotes well-built websites, which has little to do with good content. It's a (debatable) signal of website quality, which is only tangentially related to content quality.