Anyone following Google's latest May 2022 Core Update?
Those of you who work mindfully to maximize your search engine traffic are probably aware by now that a new Google Core Update (May 2022) was announced a couple of days ago and will be rolling out over the next couple of weeks.
Scuttlebutt suggests that this latest system-wide makeover of the search algorithm may specifically affect affiliates in an undesirable way. So I'm wondering if anyone has noticed any changes in their G traffic, and what changes you notice over the upcoming weeks. In my case, the day after roll-out started, I noticed a 60% drop in impressions and click-throughs. The next day was better, and I'm curious to see whether things will keep levelling back up. -Dino |
Quote:
What I am seeing with this broad update is more adult keywords becoming mainstream |
Quote:
Cloaking doesn't have to be a blackhat thing that uses redirects if it's done responsibly. The other benefit of this is that if you use the same links across a full network of sites, you only have to change one entry in the database and all of your links, network-wide, will change automagically :) |
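A rough sketch of that database-driven link idea, in Python (illustrative only - the database file, table name, columns, path scheme, and port are assumptions, not anything from the post):

# Minimal bounce-script sketch: every site in the network points its sponsor
# links at /out/<link-id>, and the destination lives in one shared database,
# so changing a single row updates the whole network.
# (links.db, the "links" table, and port 8080 are assumed for illustration.)
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer

DB_PATH = "links.db"  # assumed shared database

class OutHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Requests look like /out/<link-id>; resolve the id against the DB.
        link_id = self.path.rstrip("/").split("/")[-1]
        con = sqlite3.connect(DB_PATH)
        row = con.execute(
            "SELECT destination FROM links WHERE id = ?", (link_id,)
        ).fetchone()
        con.close()
        if row:
            self.send_response(302)                 # redirect to the sponsor
            self.send_header("Location", row[0])
            self.end_headers()
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("", 8080), OutHandler).serve_forever()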
Quote:
Google follows everything. It does not respect any "soft" directives like meta info and robots.txt. You can use htaccess to provide G-only-specific renderings of your pages and links, but not all Google spidering and sniffing is done from domains/IPs known to all. A good portion of my spidering by G is to follow through all my sponsor links, especially including any (remaining) FHG links. I know this because all my links are coded with my own logging wrapper that shows every outbound (that I care to know about) traffic from my site. This also helps me reconcile sponsor traffic stats with my own independent stats.

BTW, I'm sitting at 50% of the G SE impressions and click-thrus I had before the core update began. This is not limited to adult sites. This core update is "not personal", according to Google, and reflects its re-evaluation of who it "allows" to be seen, by the nature of what they are about. Considering that most search engines spam their own or purchased affiliate links at the top of their search results, there are self-interested benefits to scrapping any competition (as Google is well known to do - but no one has the will or power to take them on for antitrust violations).

Google indicated that it will announce when this roll-out is done. In the meantime, I have spent the last 6 months consolidating and limiting most of my outbound affiliate links to as few as possible. |
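A tiny sketch of the reconciliation step described above - tallying outbound clicks per sponsor from the wrapper's log so they can be compared against the sponsor's own stats. The tab-separated log format (timestamp, sponsor, destination per line) is an assumption for illustration, not the poster's actual format:

# Tally outbound clicks per sponsor from a tab-separated click log.
from collections import Counter

def tally_outbound_clicks(log_path):
    counts = Counter()
    with open(log_path, encoding="utf-8") as fh:
        for line in fh:
            parts = line.rstrip("\n").split("\t")
            if len(parts) >= 2:
                counts[parts[1]] += 1   # column 1 = sponsor name (assumed)
    return counts

if __name__ == "__main__":
    for sponsor, clicks in tally_outbound_clicks("outbound.log").most_common():
        print(f"{sponsor}\t{clicks}")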
Quote:
I have never used sponsor links directly - all of them are wrapped in ways you have suggested. "Cloaking" (black hat) would involve not rendering those links when the spider is "looking" at the page. I don't do black hat. |
You can send outgoing traffic to domain.com/out/ and then disallow Googlebot access to that path via a robots.txt file.
WG |
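For reference, a minimal robots.txt along the lines of the post above, assuming the bounce directory really is /out/:

User-agent: Googlebot
Disallow: /out/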
Quote:
My experience has shown this to be the case, for years. |
I can see quite a big drop in traffic at a few of my sites that have direct affiliate links. I have not seen it at the majority of other sites with the same structure so far; probably the change is just around the corner. How is it with WL sites for you, guys? Has anybody experienced any drop? I have not seen any significant change so far.
|
To clarify Googlebot and robots.txt...
At face value, Google does recognize robots.txt and some directives, but there are caveats which it discloses. It officially dropped the NOINDEX directive in 2019. It does claim to support the "Disallow" directive, but... https://developers.google.com/search...obots.txt-file Google: "Warning: Don't use a robots.txt file as a means to hide your web pages from Google search results."

This may be moot in the case of disallowed paths in robots.txt that are not listed by anyone else, such as your outbound affiliate link bounce scripts. But any external references (outside the realm of your robots.txt) to the same bounce URL may cause Googlebot to dig or list deeper. At face value, the suggestions in this thread to disallow paths that reveal a sponsor destination are "correct".

I've seen Googlebot do too many things since its beginning in the nineties that make it impossible for me to take Google at its word. For example, Google notoriously does what I call the "pig and the electric fence" trick. If you try to keep a pig contained with an electric fence, it's important to note that pigs are exceptionally smart and will constantly "test" the fence for a momentary outage. They are so tenacious that pigs will often bust out, where other livestock relies on old memories of what happened when they touched the fence months ago. I have watched Googlebot test/spider for the existence of pages that neither existed on my websites nor were ever linked to from anywhere. The spider would literally concoct URLs with random text and try them.

Anyhow, as far as I'm concerned, only htaccess can keep Googlebot out. |
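For what it's worth, a minimal .htaccess sketch of that "htaccess, not robots.txt" approach, assuming Apache with mod_rewrite and a hypothetical /out/ bounce directory (the path is an assumption, not something from the post):

RewriteEngine On
# Return 403 to anything identifying itself as Googlebot for the bounce path
RewriteCond %{HTTP_USER_AGENT} googlebot [NC]
RewriteRule ^out/ - [F,L]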
Quote:
:thumbsup |
Quote:
Having had WL webcam sites in the past, I always had the impression that "duplicate content" flags were being triggered, since the ultimate webcam destinations were shared by all the white labels and the actual webcam sponsor. I would not expect WL to do much better than other affiliate sites. |
Quote:
WG |
Just re-direct all your traffic to my sites and be done with it. Happy times for all!!
:) |
I'm watching some websites with reciprocal links that aren't being penalized - some are even ranking well in the SERPs.
|
Affiliates have been whining that Google is out to get them since the mid-2000s, and do so even more when a core update is announced.
Everything I've seen with this update indicates that they're refining the product reviews update they did a few months ago, and it could be the first major refinement with CWV signals. Slow sites with shit structure have lost SERPs. |
Given all that's been discussed here thus far, I've decided to use a robots.txt Disallow to limit Googlebot's access to the directories containing the bounce scripts that send traffic off-site to sponsors.
I don't like doing this, as linking to other "authoritative sites" is a good thing for a website - healthy outbound links can be a positive thing. Google claims it respects this, and its testing of my disallowed URLs against my robots.txt shows that Google recognizes this. BUT...

Even though the following works:

User-agent: *
Disallow: /STFOOH/

it will NOT prevent Google's Adbots from following into the disallowed areas to their final destinations. Such Adbots must be explicitly declared as User-agents (and there is more than one). I have decided to use an .htaccess 403 for all User-agents with "adbot" in their name. |
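A minimal .htaccess sketch of that last step, assuming Apache with mod_rewrite. The poster matches on "adbot"; Google's documented ad crawler tokens include AdsBot-Google and Mediapartners-Google, so the exact pattern below is an assumption:

RewriteEngine On
# 403 any user agent that looks like one of Google's ad crawlers
RewriteCond %{HTTP_USER_AGENT} (adsbot-google|mediapartners-google) [NC]
RewriteRule .* - [F,L]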
Affiliates should have gone to API and RSS a decade-plus ago, and the boomer program owners should actually learn how it works and how to make it good.
|
Quote:
Your site's score is atrocious: https://pagespeed.web.dev/report?url...om%2Fhome%2F |
Quote:
I am wondering if Google will rank mobile versions of a site differently from the desktop version of the same site, and present search results according to the ranking of each - depending on what the surfer is searching with. Some sites are oriented more to desktop and not to mobile. Even though "Mobile Usability" may be accepted by Google for such sites, the CWV may be considerably better for the desktop. Do you think a stronger desktop performance might be ignored if the mobile performance is not as good? |
Quote:
I don't really care about my mobile if my desktop can stand on its own. Does Google weight these separately? |
Is there a service or a person here who can make these https://pagespeed.web.dev/
metrics for my cam aggregator site go from 56 to like 80-85? |
Quote:
Quote:
The desktop score really only matters if it's bad. What I mean by that is, if your mobile score is, say, 94 but you make desktop super mega cancer and it's a 58, then you won't fare as well against someone who may also be 94 on mobile but 99 on desktop. You should've started caring about the mobile experience in about 2012. :winkwink: |
Even with CWV as a significant ranking factor, content and relevance still matter, even for "slow" sites.
According to John Mueller's response from February 26th, 2021, to a question about the influence of Core Web Vitals on search results:

-- "RELEVANCE IS STILL, BY FAR, MUCH MORE IMPORTANT. So just because your website is faster with regards to the Core Web Vitals than some competitor's doesn't necessarily mean that, come May, you will jump to position number one in the search results. ...a really fast website might be one that's completely empty. But that's not very useful for users. So it's useful to keep that in mind when it comes to Core Web Vitals." --

In my view, there are many reasons why some websites may score less favourably for CWV - not all of them necessarily being "poor design" or inferior hosting quality. While there are various "current popular templates" (the new responsive cookie-cutter styles), some web designers balance more than just rapid rendering.

There are intentions of distributing the processing burden between servers and client machines to reduce server CPU demands. There are intentions of deploying the highly dynamic components of any given page as a series of separate small payloads. This may result in longer aggregate load times, but nowhere near as much as any adsense-based or nsa/analytics website often involves. There are intentions of incorporating multi-levelled obfuscation through design to help protect the rendered pages from being easily scraped. This might not stop attempts to scrape content-rich sites, but it makes those attempts clearly visible in server logs and forces enough "sniffing" hits that chronic offenders can be firewalled.

And finally, not all websites are in their "final" design/implementation state. While Rapid Application Development may facilitate blasting out massive infrastructures, when something significant like a core update comes along, development priorities may shift to address urgent deficiencies - or not.

The only effort that seems to make sense until Core Update 2022 is done and has settled is to keep creating rich original content. :2 cents: |
Quote:
Google's "reasoning" is based on its own (advertising) self-interests. Bucking that may have measurable consequences - or not. |
Quote:
The demographic that my work is oriented to developing and harvesting is not limited to digital porn buyers. The largest portion of the prospective market I am working towards primarily uses Desktop. |
Quote:
My desktop development takes first priority, and has CSS to responsively trim the "bonus sections" (left/right of the main content). But that does not seem to be enough for "mobile scores" (e.g. a 90 score on desktop / 67 on mobile). If I decide to spend some resources on better mobile presentation, it would be to significantly trim the volume of content that mobile gets, or to create a completely static text version for mobile that links to the desktop version.

I've been looking at Google's spidering behaviour over the past couple of days (it's not Googlebot - different agent/IPs, but Google-owned) and it seems to have picked several core pages; every little while it hits a page as a low-res device and then as a high-res device, repeating this many (many) times for the same page throughout the day.

Also, and I don't know if this is part of the core update, but the actual key phrases I am presently getting impressions for (and higher clicks on) are very good and appropriate. This may be transient, but for now it seems like the quality of my search traffic is better, even though there are fewer impressions and clicks - I'll take that. |
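For reference, the kind of responsive trimming described above usually comes down to a media query like this (the class names and breakpoint are made up for illustration, not taken from the poster's site):

/* Hide the left/right "bonus" columns on narrow viewports */
@media (max-width: 768px) {
  .bonus-left,
  .bonus-right {
    display: none;
  }
}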
Quote:
So yeah, "your view" is quite obviously very wrong. All that text is really just mumbled bullshit to try and defend your bad choices, IMO. But you do you - if you want to make choices that make you less money because of a bunch of excuses, go for it.

Your desktop site is slow as shit, too, mostly because everything seems to be stuck in 2010. Huge JPG files, a huge header file - your payload is massive in general. I'm on a gigabit connection and it still took a bit to load, and you obviously don't have much traffic, so it's not that.

Anyway, not really sure why you're quoting Mueller from over a year ago in a thread about a Core Update that's currently rolling out, when your site does a ton of shit very poorly anyway. You should probably focus on those things rather than writing a thesis' worth of excuses on GFY. |
Quote:
Another way of looking at this is: which web users could possibly prefer their desktop (or a decent tablet) for accessing the internet, and why? I appreciate that mobiles have evolved considerably since my iPhone 4s, but browsing the web sitting at a desktop is a very different experience, as is the person doing so. And that is a demographic. |
Quote:
. |