GoFuckYourself.com - Adult Webmaster Forum (https://gfy.com/index.php)
-   Fucking Around & Business Discussion (https://gfy.com/forumdisplay.php?f=26)
-   -   Tech Anyone following Google's latest May 2022 Core Update? (https://gfy.com/showthread.php?t=1355125)

dcortez 05-27-2022 12:08 PM

Anyone following Google's latest May 2022 Core Update?
 
Those of you who work mindfully to maximize your search engine traffic are probably aware by now that a new Google Core Update (May 2022) was announced a couple of days ago and will be rolling out over the next few weeks.

Scuttlebutt suggests that this latest system-wide makeover in the search algorithm may specifically affect affiliates in an undesirable way.

So I'm wondering if anyone has noticed any changes in their G traffic, and what changes you notice over the coming weeks.

In my case, the day after the start of roll-out, I noticed a 60% drop in impressions and click-throughs.

The next day was better, and I'm curious to see whether things will level back upwards.

-Dino

geirlur 05-28-2022 04:29 AM

Quote:

Originally Posted by dcortez (Post 23005255)
Scuttlebutt suggests that this latest system-wide makeover in the search algorithm may specifically affect affiliates in an undesirable way.

Aren't all adult sites affiliates though?

Klen 05-28-2022 06:45 AM

Quote:

Originally Posted by geirlur (Post 23005435)
Aren't all adult sites affiliates though?

Pretty easy to solve that problem - just don't show any affiliate links to Googlebot.

Holy Damage 05-28-2022 07:35 AM

Quote:

Originally Posted by Klen (Post 23005458)
Pretty easy to solve that problem - just don't show any affiliate links to Googlebot.

Things have changed: Google follows all links, even ones with a nofollow tag or noindex, or that are blocked via robots.txt... it does not index them, but it knows where each link goes.

What I am seeing with this broad update is more adult keywords becoming mainstream.

dcortez 05-28-2022 09:42 AM

Quote:

Originally Posted by geirlur (Post 23005435)
Aren't all adult sites affiliates though?

Most, maybe, but the actual authoritative sponsor sites are not affiliates.

dcortez 05-28-2022 09:42 AM

Quote:

Originally Posted by Klen (Post 23005458)
Pretty easy to solve that problem - just don't show any affiliate links to Googlebot.

Until you get busted for cloaking.

Publisher Bucks 05-28-2022 09:48 AM

Quote:

Originally Posted by dcortez (Post 23005497)
Until you get busted for cloaking.

You don't have to cloak links - just throw them into a database and call them like site.com/?id=porn, or whatever you want to use.

Cloaking doesn't have to be a blackhat thing that uses redirects if it's done responsibly. The other benefit of this is that if you use the same links across a full network of sites, you only have to change one entry in the database and all of your links, network-wide, will change automagically :)
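A minimal sketch of the idea (Flask and SQLite here are just for illustration - the links table and the /?id= parameter are made up, use whatever stack you already run):

# Hypothetical bounce script: site.com/?id=porn -> sponsor URL looked up in a database.
# Assumed schema: links(id TEXT PRIMARY KEY, destination TEXT).
import sqlite3
from flask import Flask, abort, redirect, request

app = Flask(__name__)
DB = "links.db"

@app.route("/")
def bounce():
    link_id = request.args.get("id", "")
    row = sqlite3.connect(DB).execute(
        "SELECT destination FROM links WHERE id = ?", (link_id,)
    ).fetchone()
    if row is None:
        abort(404)
    # Change the destination once in the database and every site on the
    # network that uses /?id=porn follows automatically.
    return redirect(row[0], code=302)

if __name__ == "__main__":
    app.run()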

AlexxxSBG 05-28-2022 09:55 AM

Quote:

Originally Posted by dcortez (Post 23005497)
Until you get busted for cloaking.

:2 cents:

dcortez 05-28-2022 09:58 AM

Quote:

Originally Posted by Holy Damage (Post 23005470)
Things have changed: Google follows all links, even ones with a nofollow tag or noindex, or that are blocked via robots.txt... it does not index them, but it knows where each link goes.

What I am seeing with this broad update is more adult keywords becoming mainstream.

Absolutely correct.

Google follows everything. It does not respect any "soft" directives like meta info and robots.txt.

You can use htaccess to provide G-only-specific renderings of your pages and links, but not all Google spidering and sniffing is done from domains/IPs known to all.

A good portion of Google's spidering on my sites is following through all my sponsor links, especially including any (remaining) FHG links.

I know this because all my links are coded with my own logging wrapper that records every outbound click (that I care to know about) from my site. This also helps me reconcile sponsor traffic stats with my own independent stats.
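A stripped-down sketch of that kind of wrapper (the route name, log format and sponsor map are simplified stand-ins for illustration, not my actual code):

# Hypothetical logging bounce: /go/<link_id> logs the click, then 302s to the sponsor.
import time
from flask import Flask, redirect, request

app = Flask(__name__)
SPONSORS = {"fhg123": "https://sponsor.example/fhg/123"}  # assumed link map

@app.route("/go/<link_id>")
def out(link_id):
    dest = SPONSORS.get(link_id)
    if dest is None:
        return ("unknown link", 404)
    # One line per outbound click: timestamp, link id, user agent, destination.
    # These entries are what get reconciled against the sponsor's own stats.
    with open("outbound.log", "a") as log:
        log.write("%s\t%s\t%s\t%s\n" % (
            time.time(), link_id, request.headers.get("User-Agent", ""), dest))
    return redirect(dest, code=302)

if __name__ == "__main__":
    app.run()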

BTW, I'm sitting at 50% of the G SE impressions and click-thrus I had before the core update began.

This is not limited to adult sites. This core update is "not personal", according to Google, and reflects its re-evaluation of who it "allows" to be seen, by the nature of what they are about.

Considering that most search engines spam their own or purchased affiliate links at the top of their search results, there are self-interested benefits to scrapping any competition (as Google is well known to do - but no one has the will or power to take them on for antitrust violations).

Google indicated that it will announce when this roll-out is done.

In the meantime, I have spent the last 6 months consolidating and limiting most of my outbound affiliate links to as few as possible.

dcortez 05-28-2022 10:05 AM

Quote:

Originally Posted by Publisher Bucks (Post 23005499)
You don't have to cloak links - just throw them into a database and call them like site.com/?id=porn, or whatever you want to use.

Cloaking doesn't have to be a blackhat thing that uses redirects if it's done responsibly. The other benefit of this is that if you use the same links across a full network of sites, you only have to change one entry in the database and all of your links, network-wide, will change automagically :)

I'm not sure how that would help make such links less "affiliate obvious". Google follows every link however it is structured. It determines the destination by resolving all bounces on your end (from translating a db-param URL to an actual sponsor link) all the way through to the destination page.

I have never used sponsor links directly - all of them are wrapped in the ways you have suggested.

"Cloaking" (black hat) would involve not rendering those links when the spider is "looking" at the page. I don't do black hat.

WiredGuy 05-28-2022 10:59 AM

You can send outgoing traffic to domain.com/out/ and then disallow Googlebot access to that path via a robots.txt file.
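In robots.txt terms (sticking with the /out/ path from this example), that's just:

User-agent: Googlebot
Disallow: /out/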
WG

dcortez 05-28-2022 11:32 AM

Quote:

Originally Posted by WiredGuy (Post 23005517)
You can send outgoing traffic to domain.com/out/ and then disallow Googlebot access to that path via a robots.txt file.
WG

You cannot actually "disallow" Googlebot from any path. It can, and does, follow ALL links it finds, regardless of robots.txt settings.

My experience has shown this to be the case, for years.

bohous 05-28-2022 11:37 AM

I can see quite a big drop in traffic at a few of my sites that have direct affiliate links. I have not seen it at the majority of my other sites with the same structure so far - probably the change is just around the corner. How is it with WL sites for you guys? Has anybody experienced any drop? I have not seen any significant change so far.

dcortez 05-28-2022 11:41 AM

Quote:

Originally Posted by bohous (Post 23005530)
I can see quite a big drop in traffic at a few of my sites that have direct affiliate links. I have not seen it at the majority of my other sites with the same structure so far - probably the change is just around the corner. How is it with WL sites for you guys? Has anybody experienced any drop? I have not seen any significant change so far.

What is a "WL site"?

dcortez 05-28-2022 12:04 PM

To clarify Googlebot and robots.txt...

At face value, Google does recognize robots.txt and some directives. But there are caveats which it discloses.

It did officially drop the NOINDEX directive in 2019.

It does claim to support the "disallow" directive, but...

https://developers.google.com/search...obots.txt-file

Google: "Warning: Don't use a robots.txt file as a means to hide your web pages from Google search results."

This may be moot in the case of disallowed paths in robots.txt that are not listed anywhere else, such as your outbound affiliate link bounce scripts. But any external references (outside the realm of your robots.txt) to the same bounce URL may cause Googlebot to dig or list deeper.

At face value, suggestions in this thread to disallow paths that reveal a sponsor destination are "correct".

I've seen Googlebot do too many things since its beginning in the nineties that make it impossible for me to take Google at its word.

For example, Google notoriously does what I call the "pig and the electric fence" trick.

If you try to keep a pig contained with an electric fence, it's important to know that pigs are exceptionally smart and will constantly "test" the fence for a momentary outage. They are so tenacious that pigs often bust out, where other livestock rely on old memories of what happened when they touched the fence months ago.

I have watched Googlebot test/spider for the existence of pages that neither existed on my websites nor were ever linked to from anywhere. The spider would literally concoct URLs with random text and try them.

Anyhow, as far as I'm concerned, only htaccess can keep Googlebot out.
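For illustration, a minimal mod_rewrite sketch of that approach (the /out/ bounce path is a stand-in; note too that user-agent matching can be spoofed, so checking Google's published crawler IP ranges is the stricter test):

# .htaccess: hard 403 for Googlebot anywhere under /out/
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
RewriteRule ^out/ - [F]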

j3rkules 05-28-2022 01:43 PM

Quote:

Originally Posted by dcortez (Post 23005532)
What is a "WL site"?

A whitelabel site.

:thumbsup

dcortez 05-28-2022 02:10 PM

Quote:

Originally Posted by j3rkules (Post 23005566)
A whitelabel site.

:thumbsup

Ah, thanks.

Having had WL webcam sites in the past, I always had the sense that "duplicate content" flags were being triggered, as the ultimate webcam destinations were shared by all the white labels and the actual webcam sponsor.

I would not expect WL to do much better than other affiliate sites.

WiredGuy 05-28-2022 03:07 PM

Quote:

Originally Posted by dcortez (Post 23005527)
You cannot actually "disallow" Googlebot from any path. It can, and does, follow ALL links it finds, regardless of robots.txt settings.

My experience has shown this to be the case, for years.

The Googlebot that does traverse into blocked paths is checking for malware and malicious code rather than indexing. If you prefer, you can use .htaccess to further restrict Googlebot from entering blocked paths (by user agent or IP).
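A sketch of the IP variant for Apache 2.4 (66.249.64.0/19 is one of Google's commonly published crawler ranges - verify against their current list before relying on it):

# Inside the blocked path's .htaccess: let everyone in except Google's crawler range
<RequireAll>
  Require all granted
  Require not ip 66.249.64.0/19
</RequireAll>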
WG

The Porn Nerd 05-28-2022 04:29 PM

Just re-direct all your traffic to my sites and be done with it. Happy times for all!!

:)

itx 05-28-2022 04:45 PM

I've watched some websites with reciprocal links go unpenalized, even ranking well in the SERPs.

dcortez 05-28-2022 04:55 PM

Quote:

Originally Posted by itx (Post 23005617)
I've watched some websites with reciprocal links go unpenalized, even ranking well in the SERPs.

Direct A to B and B to A links? (not triangular links?)

itx 05-28-2022 04:58 PM

Quote:

Originally Posted by dcortez (Post 23005619)
Direct A to B and B to A links? (not triangular links?)

A to B and B to A.

Klen 05-30-2022 08:14 AM

Quote:

Originally Posted by dcortez (Post 23005497)
Until you get busted for cloaking.

Well, it depends how technical you are - I can always counter Google no matter what they do.

redwhiteandblue 05-30-2022 09:07 AM

Quote:

Originally Posted by Publisher Bucks (Post 23005499)
You don't have to cloak links - just throw them into a database and call them like site.com/?id=porn, or whatever you want to use.

Cloaking doesn't have to be a blackhat thing that uses redirects if it's done responsibly. The other benefit of this is that if you use the same links across a full network of sites, you only have to change one entry in the database and all of your links, network-wide, will change automagically :)

You should not use the same affiliate links across a whole network. Doing so will help Google identify the sites as a network and then apply one of its spam filters.

redwhiteandblue 05-30-2022 09:16 AM

Quote:

Originally Posted by dcortez (Post 23005534)
I have watched googlebot test/spider for the existence of pages that neither existed on my websites, nor were ever linked to from anywhere. The spider would literally concoct URLS with random text and try them.

Yes, I see it do that; I suspect it's to find out what your 404 page looks like. It has also, in the past, tried to find "/m/" and "/mobile/" pages that never existed, just in case you made a mobile version of your site and didn't think to link to it anywhere.

Denny 05-30-2022 09:41 AM

Quote:

Originally Posted by redwhiteandblue (Post 23006147)
You should not use the same affiliate links across a whole network. Doing so will help Google identify the sites as a network and then apply one of its spam filters.

But what if I want to promote the same program on a few of my sites that are interlinked? Get multiple affiliate accounts with the program? I guess even if I cloak the links like "mysite.com/go/program", Google is still able to get the actual link and identify my sites by that?

redwhiteandblue 05-30-2022 10:20 AM

Quote:

Originally Posted by Denny (Post 23006163)
But what if I want to promote the same program on a few of my sites that are interlinked? Get multiple affiliate accounts with the program? I guess even if I cloak the links like "mysite.com/go/program", Google is still able to get the actual link and identify my sites by that?

I'm not absolutely certain, but I suspect that as long as the affiliate link is different in some way, Google will see it as a different URL, so using a different campaign should work. In NATS that changes the encrypted link by one character. With CCBill, just sign up again to get a different affiliate ref code.

trevesty 05-30-2022 11:51 AM

Affiliates have been whining that Google is out to get them since the mid-2000s, and they do so even more whenever a core update is announced.

Everything I've seen with this update indicates that they're refining the product reviews update they did a few months ago, and it could be the first major refinement with CWV signals. Slow sites with shit structure have lost SERPs.

dcortez 05-30-2022 12:31 PM

Given all that's been discussed here thus far, I've decided to limit Googlebot's access to the directories that contain my bounce scripts to off-site sponsors, using a robots.txt disallow.

I don't like doing this, as healthy outbound links to other "authoritative sites" are a good thing for a website.

Google claims it respects this, and its testing of my disallowed URLs against my robots.txt shows that Google recognizes this.

BUT...

Even though the following works:

User-agent: *
Disallow: /STFOOH/

This will NOT prevent Google's Adbots from following into the disallowed areas to their final destinations. Such Adbots must be explicitly declared as a User-agent (and there is more than one).

I have decided to use .htaccess 403 for all User-agents with "adbot" in their name.
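For reference, Google's documented ad crawlers are AdsBot-Google and AdsBot-Google-Mobile (note the "s"), and they ignore the * wildcard, so the explicit declarations look like:

User-agent: AdsBot-Google
Disallow: /STFOOH/

User-agent: AdsBot-Google-Mobile
Disallow: /STFOOH/

And a sketch of the .htaccess 403, matching "adsbot" case-insensitively to catch both agents:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} adsbot [NC]
RewriteRule ^STFOOH/ - [F]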

mainstreammix 05-30-2022 01:53 PM

Affiliates should have gone to APIs and RSS a decade+ ago, and the boomer program owners should actually learn how it works and how to make it good.

Freedom6995 05-30-2022 04:36 PM

Quote:

Originally Posted by trevesty (Post 23006193)
it could be the first major refinement with CWV signals. Slow sites with shit structure have lost SERPs.

Some good advice here OP. :2 cents:

Your site's score is atrocious: https://pagespeed.web.dev/report?url...om%2Fhome%2F

TaiGhost 05-30-2022 05:20 PM

Quote:

Originally Posted by dcortez (Post 23005501)
Absolutely correct.

Google follows everything. It does not respect any "soft" directives like meta info and robots.txt.

You can use htaccess to provide G-only-specific renderings of your pages and links, but not all Google spidering and sniffing is done from domains/IPs known to all.

A good portion of Google's spidering on my sites is following through all my sponsor links, especially including any (remaining) FHG links.

I know this because all my links are coded with my own logging wrapper that records every outbound click (that I care to know about) from my site. This also helps me reconcile sponsor traffic stats with my own independent stats.

BTW, I'm sitting at 50% of the G SE impressions and click-thrus I had before the core update began.

This is not limited to adult sites. This core update is "not personal", according to Google, and reflects its re-evaluation of who it "allows" to be seen, by the nature of what they are about.

Considering that most search engines spam their own or purchased affiliate links at the top of their search results, there are self-interested benefits to scrapping any competition (as Google is well known to do - but no one has the will or power to take them on for antitrust violations).

Google indicated that it will announce when this roll-out is done.

In the meantime, I have spent the last 6 months consolidating and limiting most of my outbound affiliate links to as few as possible.

Gone are the days of shotgun blasting. It's fine for us.

dcortez 05-30-2022 06:35 PM

Quote:

Originally Posted by trevesty (Post 23006193)
Everything I've seen with this update indicates that they're refining the product reviews update they did a few months ago, and it could be the first major refinement with CWV signals.

It seems to be so.

I am wondering if Google will rank mobile versions of a site differently from the desktop version of the same site, and present search results according to the ranking of each - depending on what the surfer is searching with.

Some sites are oriented more to desktop and not to mobile. Even though "Mobile Usability" may be accepted by Google for such sites, the CWV may be considerably better for the desktop.

Do you think a stronger desktop performance might be ignored if the mobile performance is not as good?

dcortez 05-30-2022 06:39 PM

Quote:

Originally Posted by Freedom6995 (Post 23006259)
Some good advice here OP. :2 cents:

Your site's score is atrocious: https://pagespeed.web.dev/report?url...om%2Fhome%2F

My desktop performance is 87. My mobile is 64.

I don't really care about my mobile if my desktop can stand on its own. Does Google weight these separately?

redwhiteandblue 05-31-2022 01:17 AM

Quote:

Originally Posted by dcortez (Post 23006273)
Does Google weight these separately?

It has been doing so for several years now.

grzepa 05-31-2022 01:47 AM

Is there a service or a person here who can make these https://pagespeed.web.dev/
metrics for my cam aggregator site go from 56 to like 80-85?

dcortez 05-31-2022 02:18 AM

Quote:

Originally Posted by redwhiteandblue (Post 23006335)
It has been doing so for several years now.

That's good. Then I will focus on my desktop version.

Paul&John 05-31-2022 02:21 AM

Quote:

Originally Posted by grzepa (Post 23006341)
Is there a service or a person here who can make these https://pagespeed.web.dev/
metrics for my cam aggregator site go from 56 to like 80-85?

It says under "Diagnostics" what needs to be done. But feel free to email me and I can take a look.

trevesty 05-31-2022 03:37 AM

Quote:

Originally Posted by dcortez (Post 23006271)
Do you think a stronger desktop performance might be ignored if the mobile performance is not as good?

Google is a mobile-first index. However, sites with a bad Desktop score will also see a negative impact as of a few months ago. This is because some sites were making mobile extremely user-friendly, then making desktop cancer with a side of herpes and AIDS.

Quote:

Originally Posted by dcortez (Post 23006273)
My desktop performance is 87. My mobile is 64.

I don't really care about my mobile if my desktop can stand on its own. Does Google weight these separately?

Sort of, but they're still a mobile-first index. Assuming the same site structure, link profile (etc) on a competitor of yours, the one with the higher mobile score will have the better position on both platforms.

The desktop score really only matters if it's bad. What I mean by that is, if your mobile score is say, 94 but you make Desktop super mega cancer and it's a 58, then you won't fare as well against someone who may also be 94 on mobile, but 99 on desktop.

You should've started caring about mobile experience in about 2012. :winkwink:

dcortez 05-31-2022 03:48 AM

Even with CWV as a significant ranking factor, content and relevance still matter, even for "slow" sites.

According to John Mueller’s response from February 26th, 2021, to a question about the influence of Core Web Vitals on search results:

--
"RELEVANCE IS STILL, BY FAR, MUCH MORE IMPORTANT. So just because your website is faster with regards to the Core Web Vitals than some competitor’s doesn’t necessarily mean that, come May, you will jump to position number one in the search results. ...a really fast website might be one that’s completely empty. But that’s not very useful for users. So it’s useful to keep that in mind when it comes to Core Web Vitals."
--

In my view...

There are many reasons why some websites may score less favourably for CWV - not all reasons necessarily being "poor design" or inferior hosting quality.

While there are various "current popular templates" (the new responsive cookie-cutter styles), some web designers balance more than just rapid rendering.

There are intentions of distributing processing burden between servers and client machines to reduce server CPU demands.

There are intentions of sharing the highly dynamic components of any given page by deploying them as a series of separate small payloads. This may result in longer aggregate load times, but nowhere near as long as what any adsense-based or nsa/analytics website often involves.

There are intentions of incorporating multi-levelled obfuscation through design, to help protect the rendered pages from being easily scraped. This might not stop attempts to scrape content-rich sites, but it makes those attempts clearly visible in server logs, and they take enough "sniffing" hits that chronic offenders can be firewalled.

And finally, not all websites are in their "final" design/implementation state. While Rapid Application Development may facilitate blasting out massive infrastructures, when something significant like a core update comes along, development priorities may shift to address urgent deficiencies - or not.

The only effort that seems to make sense, until the May 2022 Core Update is done and has settled, is to keep creating rich original content.

:2 cents:

dcortez 05-31-2022 03:53 AM

Quote:

Originally Posted by trevesty (Post 23006360)
The desktop score really only matters if it's bad.

Maybe so, and Google generally gets what Google wants, regardless of whether its rules make sense, but some web applications do not belong on mobiles.

Google's "reasoning" is based on it's own (advertising) self-interests.

Bucking that may have measurable consequences - or not.

redwhiteandblue 05-31-2022 04:03 AM

Quote:

Originally Posted by dcortez (Post 23006344)
That's good. Then I will focus on my desktop version.

Then you are probably focusing on about one third of your total audience. Mobile users do actually buy porn, especially iPhone and iPad users.

dcortez 05-31-2022 04:24 AM

Quote:

Originally Posted by redwhiteandblue (Post 23006382)
Then you are probably focusing on about one third of your total audience. Mobile users do actually buy porn, especially iPhone and iPad users.

I agree with you that by focusing on Desktop I would forfeit a significant market portion - if (re)selling digital porn was my long-term objective.

The demographic that my work is oriented to developing and harvesting is not limited to digital porn buyers. The largest portion of the prospective market I am working towards primarily uses Desktop.

Klen 05-31-2022 06:10 AM

Quote:

Originally Posted by trevesty (Post 23006360)
Google is a mobile-first index. However, sites with a bad Desktop score will also see a negative impact as of a few months ago. This is because some sites were making mobile extremely user-friendly, then making desktop cancer with a side of herpes and AIDS.



Sort of, but they're still a mobile-first index. Assuming the same site structure, link profile (etc) on a competitor of yours, the one with the higher mobile score will have the better position on both platforms.

The desktop score really only matters if it's bad. What I mean by that is, if your mobile score is say, 94 but you make Desktop super mega cancer and it's a 58, then you won't fare as well against someone who may also be 94 on mobile, but 99 on desktop.

You should've started caring about mobile experience in about 2012. :winkwink:

Yes, that's the funny thing happening now: before, the key question was how a site looks on mobile, but now, with so much focus on mobile, a lot of sites look like a joke on desktop. For example, the most popular responsive layout leaves huge empty space on the left and right of a desktop screen, and that looks really ugly. Which kind of makes responsive design a failure, and the old approach of separate desktop and mobile versions much better.

dcortez 05-31-2022 10:14 AM

Quote:

Originally Posted by Klen (Post 23006420)
Yes, that's the funny thing happening now: before, the key question was how a site looks on mobile, but now, with so much focus on mobile, a lot of sites look like a joke on desktop. For example, the most popular responsive layout leaves huge empty space on the left and right of a desktop screen, and that looks really ugly. Which kind of makes responsive design a failure, and the old approach of separate desktop and mobile versions much better.

I agree. Forcing desktop design to take a back seat to mobile can trash what was otherwise an excellent use of visual real estate on large-res screens.

My desktop development takes first priority, and has CSS to responsively trim "bonus sections" (left/right of main content).
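For illustration, that trimming is just a media query along these lines (the class names are made up for the example, not my actual markup):

/* Hide the left/right "bonus sections" below a desktop-width threshold */
@media (max-width: 1024px) {
  .bonus-left,
  .bonus-right { display: none; }
}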

But that does not seem to be enough for "mobile scores". (Eg. 90 score on desktop / 67 score on mobile.)

If I decide to spend some resources on better mobile presentation, it would be to significantly trim the volume of content that mobile gets, or create a completely static text version for mobile that links to the Desktop version.

I've been looking at Google's spidering behaviour over the past couple of days (it's not Googlebot - different agents/IPs, but Google-owned), and it seems to have picked several core pages; every little while it hits a page as a low-res device and then as a high-res device, repeating this many (many) times for the same page throughout the day.

Also - and I don't know if this is part of the core update - the actual key phrases I am presently getting impressions (and higher clicks) for are very good and appropriate.

This may be transient, but for now, it seems like the quality of my search traffic is better, even though there are fewer impressions and clicks - I'll take that.

trevesty 06-01-2022 04:22 AM

Quote:

Originally Posted by dcortez (Post 23006366)
Even with CWV as a significant ranking factor, content and relevance still matter, even for "slow" sites.

According to John Mueller’s response from February 26th, 2021, to a question about the influence of Core Web Vitals on search results:

--
"RELEVANCE IS STILL, BY FAR, MUCH MORE IMPORTANT. So just because your website is faster with regards to the Core Web Vitals than some competitor’s doesn’t necessarily mean that, come May, you will jump to position number one in the search results. ...a really fast website might be one that’s completely empty. But that’s not very useful for users. So it’s useful to keep that in mind when it comes to Core Web Vitals."
--

In my view...

There are many reasons why some websites may score less favourably for CWV - not all reasons necessarily being "poor design" or inferior hosting quality.

While there are various "current popular templates" (the new responsive cookie-cutter styles), some web designers balance more than just rapid rendering.

There are intentions of distributing processing burden between servers and client machines to reduce server CPU demands.

There are intentions of sharing the highly dynamic components of any given page by deploying them as a series of separate small payloads. This may result in longer aggregate load times, but nowhere near as long as what any adsense-based or nsa/analytics website often involves.

There are intentions of incorporating multi-levelled obfuscation through design, to help protect the rendered pages from being easily scraped. This might not stop attempts to scrape content-rich sites, but it makes those attempts clearly visible in server logs, and they take enough "sniffing" hits that chronic offenders can be firewalled.

And finally, not all websites are in their "final" design/implementation state. While Rapid Application Development may facilitate blasting out massive infrastructures, when something significant like a core update comes along, development priorities may shift to address urgent deficiencies - or not.

The only effort that seems to make sense, until the May 2022 Core Update is done and has settled, is to keep creating rich original content.

:2 cents:

No offense, but "your view" doesn't appear to be working for you, otherwise your OP wouldn't say what it says. When I check CrushMyVelvent on Ahrefs, you rank VERY poorly for about 20 long tail keywords and that's it.

So yeah, "your view" is quite obviously very wrong. All that text is really just mumbled bullshit to try and defend your bad choices, IMO. But you do you - if you want to make choices that make you less money because of a bunch of excuses, go for it.

Your desktop site is slow as shit, too, mostly because everything seems to be stuck in 2010. Huge JPG files, huge header file - your payload is massive in general. I'm on a gigabit connection and it still took a bit to load and you obviously don't have much traffic, so it's not that.

Anyway, not really sure why you're quoting Mueller from over a year ago in a thread about a Core Update that's currently rolling out and when your site does a ton of shit very poorly anyway. You should probably focus on those things rather than writing a thesis worth of excuses on GFY.

dcortez 06-01-2022 09:57 AM

Quote:

Originally Posted by trevesty (Post 23006739)
No offense, but "your view" doesn't appear to be working for you, otherwise your OP wouldn't say what it says. When I check CrushMyVelvent on Ahrefs, you rank VERY poorly for about 20 long tail keywords and that's it.

So yeah, "your view" is quite obviously very wrong. All that text is really just mumbled bullshit to try and defend your bad choices, IMO. But you do you - if you want to make choices that make you less money because of a bunch of excuses, go for it.

Your desktop site is slow as shit, too, mostly because everything seems to be stuck in 2010. Huge JPG files, huge header file - your payload is massive in general. I'm on a gigabit connection and it still took a bit to load and you obviously don't have much traffic, so it's not that.

Anyway, not really sure why you're quoting Mueller from over a year ago in a thread about a Core Update that's currently rolling out and when your site does a ton of shit very poorly anyway. You should probably focus on those things rather than writing a thesis worth of excuses on GFY.

Thanks. I'll keep that in mind.

fuzebox 06-01-2022 11:42 AM

Quote:

Originally Posted by dcortez (Post 23006393)
I agree with you that by focusing on Desktop I would forfeit a significant market portion - if (re)selling digital porn was my long-term objective.

The demographic that my work is oriented to developing and harvesting is not limited to digital porn buyers. The largest portion of the prospective market I am working towards primarily uses Desktop.

What site could possibly have a desktop demographic in 2022?

dcortez 06-01-2022 12:07 PM

Quote:

Originally Posted by fuzebox (Post 23006867)
What site could possibly have a desktop demographic in 2022?

Lots. Some products require a high resolution presentation.

Another way of looking at this is: "What web users could possibly prefer their desktop (or a decent tablet) for accessing the internet, and why?"

I appreciate that mobiles have evolved considerably since my iPhone 4s, but browsing the web sitting down at a desktop is a very different experience, as is the person doing so.

And that is a demographic.

sarettah 06-01-2022 01:07 PM

Quote:

Originally Posted by fuzebox (Post 23006867)
What site could possibly have a desktop demographic in 2022?

Cam sites.

