Okay so the "we" being blamed is marketers, not users.
Though this part worries me: "Google pays attention to how long someone stays on the page after clicking on a search result. They actively look for which result had the “longest click” from users (longest engagement)."
Longest visit is very different from best result.
Is this why every recipe page starts with long prose and makes it a maze to see the recipe?
>Longest visit is very different from best result.
Sure it's different, but is it _very_ different? They have to choose some sort of metric, and a long visit time does seem like one good indicator of a result's quality. Consider that they use other indicators as well, and their business largely benefits from their search being useful. It is in their interest to choose good metrics, and I'm sure they invest a lot of time into it. Why do you doubt this seemingly sensible metric, which Google has a motive to get right?
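For concreteness, the "longest click" idea from the quoted article could be sketched like this. Everything here is a naive illustration of the concept; the field names and aggregation are my own, not anything from Google's actual pipeline:

```python
from collections import defaultdict

def dwell_time_scores(click_log):
    """Mean dwell time per result URL from a click log.

    click_log: iterable of (url, seconds_before_return) pairs.
    A naive stand-in for the 'longest click' signal: results
    that keep users on the page longest score highest.
    """
    totals = defaultdict(lambda: [0.0, 0])
    for url, dwell in click_log:
        totals[url][0] += dwell
        totals[url][1] += 1
    return {url: total / count for url, (total, count) in totals.items()}

log = [("a.com", 5), ("a.com", 7), ("b.com", 120), ("b.com", 90)]
scores = dwell_time_scores(log)
# b.com outranks a.com on this signal alone, even though a.com
# may simply have answered the question faster.
```

Which is exactly the objection: on this signal by itself, a page that answers in five seconds loses to one that takes two minutes.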
Absolutely. Who hasn’t spent time scrolling down a page, past three ads and the filler text padding the page out, trying to find whatever they needed? Most of the time I just need a quick answer to a question, and the best pages take me right there without searching the page.
The motive they have is to increase revenue from advertising. It so happens that “time on page” is a metric that advertisers care a lot about. Ad buyers pay more for space on pages where users spend a lot of time.
I’m not surprised at all that Google chose this as a metric. It’s very different from a metric that emphasises quality for the user.
“The biggest problem facing users of Web search engines today is the quality of the results they get back. While the results are often amusing and expand users’ horizons, they are often frustrating and consume precious time.”
https://snap.stanford.edu/class/cs224w-readings/Brin98Anatom...
Ads are placed regardless of the ranking, so this metric doesn’t affect them. The metric affects the ranking of non-ad placements.
I’m talking about ads in the pages returned by the search, not ads in the search results themselves. This absolutely affects Google's revenue, given that they run a large percentage of ad placement auctions.
Depending on the type of search, time spent on the page might correlate with quality, or it might correlate with how hard the page is to use.
And those are both common enough that yes it's very different.
And I said it worries me, not that I'm confident google is using it wrong.
If I click onto your website, and immediately hit back, that's a sign it wasn't a great website for my needs.
If I click in, and spend a bit of time reading, maybe I even scroll through your archive of previous articles, then that's a good sign, yeah?
What do you consider a better metric for user engagement?
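The back-click heuristic described above can be sketched as a trivial classifier. The 30-second threshold is my own arbitrary choice for illustration, not a known value, and the replies below point out why this heuristic is contested:

```python
def classify_click(dwell_seconds, threshold=30):
    """Naive 'pogo-sticking' heuristic: a quick return to the
    results page counts against the result; a long stay counts
    for it. The threshold is purely illustrative."""
    return "negative" if dwell_seconds < threshold else "positive"

classify_click(3)    # immediate back-click -> "negative"
classify_click(180)  # stayed and read      -> "positive"
```

Note the failure mode: a page that answers the query instantly also produces a quick return, which this heuristic misreads as a bad result.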
> If I click onto your website, and immediately hit back, that's a sign it wasn't a great website for my needs.
Or it's a sign that the information I needed was right there.
> If I click in, and spend a bit of time reading
If I click in and spend time searching for what I wanted, that's a bad sign, yeah?
In fact, content marketing blogs and SEO spam sites have walls of text designed more or less intentionally to make the user lose a lot of time. Add to this the time to close the cookie tracking banner and the multiple ads. Maybe there was a time in the early-2000s internet, when people were just browsing websites for fun, when that metric made sense.
> What do you consider a better metric for user engagement?
User "engagement" is a rotten metric that doesn't represent what most of us actually care about - results and answers, ASAP.
I'm not claiming this would work, but if search engines had a thumbs up/down button next to results, I would use it, provided it informed the ranking algorithm or personalized results for me.
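A minimal sketch of how such explicit feedback might be folded into a ranking. This is entirely hypothetical (the vote weight and score blending are invented for illustration, not any engine's real mechanism):

```python
def rerank(results, votes, weight=0.1):
    """Blend a base relevance score with net thumb votes.

    results: dict of url -> base score (higher = better)
    votes:   dict of url -> (upvotes, downvotes)
    weight:  how much one net vote moves the score (arbitrary)
    """
    def adjusted(url):
        ups, downs = votes.get(url, (0, 0))
        return results[url] + weight * (ups - downs)
    return sorted(results, key=adjusted, reverse=True)

base = {"a.com": 1.0, "b.com": 0.9}
votes = {"b.com": (5, 1)}  # net +4 votes lifts b.com above a.com
rerank(base, votes)        # ['b.com', 'a.com']
```

The obvious catch, which the thread's earlier points about SEO spam suggest, is that any explicit voting signal would immediately attract manipulation by the same parties gaming dwell time.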
* bots included
(2024)
I just spent time yesterday arguing with Google’s search AI because it refused to read what was on a goddamn Wikipedia page—when I gave it a screenshot it even lied about what was there. So yeah, I’m not at fault for this one, even if that was the stupidest conversation I’ve ever had.