GoFuckYourself.com - Adult Webmaster Forum

GoFuckYourself.com - Adult Webmaster Forum (https://gfy.com/index.php)
-   Fucking Around & Business Discussion (https://gfy.com/forumdisplay.php?f=26)
-   -   Are you using custom AI agents for deploying websites? (https://gfy.com/showthread.php?t=1387691)

Mindi 12-09-2025 12:20 PM

Quote:

Originally Posted by 2MuchMark (Post 23417823)
Don't get so mad, Mindi, I am just trying to understand something.

You said here that your site "outranks Trip Advisor for many local searches." But when I search for competitive local terms like:

https://www.google.com/search?q=best...ntford+ontario
https://www.google.com/search?q=wher...t+in+brantford and
https://www.google.com/search?q=top+...ants+brantford

I see TripAdvisor, Yelp, and Google Maps dominating, but I don't see brantfordinsider.com in the top 20 results. Could you share specific queries where your site outranks TripAdvisor? I want to understand what I'm missing here.

I am completely serious when I say I have 1, possibly 2 clients for you. One of them has a dozen or so domains for a dozen or so different products, and wants to create dozens of additional domains and websites for each. Normally Legacy would be my first choice, but the guy is swamped with work until March or something and my client doesn't want to wait that long. If you aren't interested in the business then that's up to you, I guess, but if you can please answer the questions, maybe we can turn the page on things... this is an olive branch...


Take that olive branch and shove it up your fucking ass. You just don't fucking get it do you? I do not care what you are offering. I would not do business with you and that fucking fraud and stalker TheLegacy for ANY AMOUNT OF MONEY.

2MuchMark 12-09-2025 12:27 PM

Quote:

Originally Posted by Mindi (Post 23417826)
Take that olive branch and shove it up your fucking ass. You just don't fucking get it do you? I do not care what you are offering. I would not do business with you and that fucking fraud and stalker TheLegacy for ANY AMOUNT OF MONEY.

I didn't say business with me or with Legacy. I offered to send the clients to you. This was one of their questions. It's a basic SEO question, is it not? Geez... If you don't want to answer for me that's fine, but what about everyone else reading this thread? Is my question wrong?

Mindi 12-09-2025 12:44 PM

Quote:

Originally Posted by 2MuchMark (Post 23417829)
Is my question wrong?

Your EXISTENCE is fucking wrong.

If somebody else wants to ask a question, then THEY can ask it themselves.

jamezon 12-09-2025 01:41 PM

Quote:

Originally Posted by The Porn Nerd (Post 23417572)
Well the only 'data source' is my content on my server. I am not using API from different companies to integrate their content.

I just want to churn out new porn sites with the least amount of effort. :)

in that case you need to get structured data from your old sites and store it somewhere, maybe try n8n or some other scraping solutions for beginners, then you might be inspired to go further

2MuchMark 12-09-2025 03:00 PM

Quote:

Originally Posted by Mindi (Post 23417832)
Your EXISTENCE is fucking wrong.

If somebody else wants to ask a question, then THEY can ask it themselves.

I am asking on their behalf because, like I said, one of them is in mainstream and she's the one asking. Do you want her business? Her questions are valid, aren't they? And besides, don't you think that other people here would want answers to simple questions like that too?

Wtify 12-09-2025 03:02 PM

Quote:

Originally Posted by jamezon (Post 23417773)
what's so magic about this? it's called scraping or stealing

you can pull any scraped content and republish via the wordpress api, for 15 years now, with a couple of lines of code

the question is, is there a demand for that kind of search traffic, and what makes your results better than all those competitors doing the same. there are gazillions of rescrapers out there. good luck with your new rocket science. if you needed ai to figure this out you're a bit late to the party i guess.

I think the point is how easy it is to implement such a structure these days. To be honest, I'm impressed by Claude Code. It's like having a development team at your disposal for $200/month.

2MuchMark 12-09-2025 03:08 PM

I found this:

https://developers.google.com/search...elpful-content

Could you share examples of where your site ranks for discovery/research queries in the competitive space?

2MuchMark 12-09-2025 03:24 PM

Quote:

Originally Posted by jamezon (Post 23417773)
whats so magic about this ? its called scraping or stealing

you can pull any scraped content and republish via wordpress api since 15 years now with a couple of lines of code

the question is, is there a demand for that kind of search traffic and what makes your results better then all those competitors doing the same . there are gazillions of rescrapers out there . good luck with your new rocket science. if you needed ai to figure this out your a bit late to the party i guess.

^^ THIS ^^

The local directory space is indeed crowded with established players and you raised a good question about market saturation.

Mindi: Your automation is cool. Take it from me. I am automating everything so I think my opinion matters a little here. Take my compliment. You deserve it. Honestly.

Now, please show ranking for competitive discovery terms. I am about to hand you a talented, female, mainstream client. This is your last chance to prove your SEO really works.

Mindi 12-09-2025 03:53 PM

Quote:

Originally Posted by 2MuchMark (Post 23417851)
^^ THIS ^^

The local directory space is indeed crowded with established players and you raised a good question about market saturation.

Mindi: Your automation is cool. Take it from me. I am automating everything so I think my opinion matters a little here. Take my compliment. You deserve it. Honestly.

Now, please show ranking for competitive discovery terms. I am about to hand you a talented, female, mainstream client. This is your last chance to prove your SEO really works.

Go fuck yourself. Do you see a FOR SALE sign here? I do not need to prove anything, now do I? I do not care if your client is Elon fucking Musk; if it's attached to you, I will simply pass. I do not need your business or anything from you, Mark. The only thing I would LIKE is for you to finally get the message that I don't fucking like you, so fucking piss off. You and your fucking buddy Legacy have done enough damage. Fucking go away.

Umami 12-09-2025 03:58 PM

2MuchMark is the type of dude that got told no at parties and still continued to grope the girl

Mindi 12-09-2025 04:26 PM

Quote:

Originally Posted by jamezon (Post 23417773)
whats so magic about this ? its called scraping or stealing

you can pull any scraped content and republish via wordpress api since 15 years now with a couple of lines of code

the question is, is there a demand for that kind of search traffic and what makes your results better then all those competitors doing the same . there are gazillions of rescrapers out there . good luck with your new rocket science. if you needed ai to figure this out your a bit late to the party i guess.

Now that it's not 6am and I haven't been coding for 18 hours straight, let me address this one :)

You're confusing scraping with licensed API use. They're not the same thing - legally, technically, or strategically.

Let me break this down since you brought up "scraping or stealing":

SCRAPING vs. LICENSED APIs

Scraping (what you're describing):

❌ Unauthorized copying of copyrighted content
❌ Violates terms of service
❌ Gets you cease & desist letters
❌ Sites detect it and block your IPs
❌ Data goes stale, breaks when site structure changes
❌ Legal liability

Licensed API Use (what I'm doing):

✅ Yelp Fusion API - Official commercial license
✅ Google Places API - Official commercial license
✅ TripAdvisor Content API - Official commercial license
✅ Legal terms of service explicitly grant you rights to display content
✅ Proper attribution required (which protects you)
✅ Data refreshes automatically
✅ No legal risk

The difference:

When you scrape TripAdvisor's website, you're copying copyrighted content without permission. Reviews, photos, descriptions - all copyrighted.

When you use TripAdvisor's Content API, they GIVE YOU A LICENSE to display that content as long as you follow display requirements (attribution, linking back, etc.).

Same data. Completely different legal status.

WHY THIS MATTERS:
1. Scalability

Scraping breaks constantly (site redesigns, anti-bot measures, IP blocks)
APIs are stable, documented, supported
I can deploy 50 cities without worrying about getting shut down

2. Legal Protection

API terms of service = you have a license
Scraping = you're violating copyright and TOS
I sleep fine. Scrapers get sued.

3. Data Quality

APIs provide structured, clean data
Scraping gives you messy HTML you have to parse
My system pulls 500+ listings in minutes with zero errors

4. Business Legitimacy

APIs = you're a legitimate platform partner
Scraping = you're a parasite hoping not to get caught
When I talk to businesses about premium listings, I'm not hiding what I do


THE TECHNICAL REALITY:

What my system does:
1. Yelp Fusion API call → Returns JSON with:
- Business name, address, phone, hours
- Star rating (4.5 stars, 238 reviews)
- Review excerpts with attribution
- Photos with license to display
- All structured, clean, ready to use

2. Google Places API call → Returns JSON with:
- Additional business data
- More photos
- Google Maps integration
- Place IDs for linking

3. AI Processing:
- Generate unique descriptions (not copied from anywhere)
- Create 8-10 SEO-optimized articles
- Build bilingual content (EN/IT, EN/FR, etc.)
- Optimize meta tags, schema markup, internal linking

4. WordPress deployment:
- Automated posting via WP-CLI
- Custom post types for listings
- Taxonomy management
- Image optimization
- Mobile-responsive theme

Result: Complete directory in under 24 hours
This isn't "a couple lines of code to scrape WordPress API."
This is 16,000+ lines of Python across 34 modules handling:

API authentication and rate limiting
Data normalization across multiple sources
Bilingual content generation
SEO optimization
Image processing
Database management
WordPress integration
Error handling and logging


THE BUSINESS MODEL:

This isn't an AdSense arbitrage play.

Here's how it actually works:

Build directory (MontrealInsider.com, TylerInsider.com, etc.)
Rank for local searches ("best restaurants in Brantford," "things to do in Tyler")
Capture business owner attention (they Google themselves, find my listing)
Convert to SEO audits, custom software sales, consulting, other things that I do
Upsell implementation (ongoing SEO services)

The directory is the lead magnet, not the revenue.
I'm not trying to compete with Yelp on ad revenue. I'm using owned traffic assets to generate leads for my core business: SEO and custom software.

Proven results:

Built MontrealInsider.com as a demo
Generated multiple inbound leads within 48 hours
Directories consistently convert at 2-3% to paid audits, SEO services, custom software


WHY "EVERYONE DOING THIS" DOESN'T MATTER:

Most people doing "local directories" are:

Running AdSense farms (low margins, Google penalizes them)
Actually scraping (illegal, get shut down)
Building manually (takes weeks per city, doesn't scale)
Not monetizing properly (just hoping for ad clicks)
Have no backend service (no real business, just hoping to flip the site)

I'm doing this differently:

Legal API usage (can scale to 100+ cities without legal risk)
Automated deployment (24 hours per directory)
Real backend business (SEO services with proven demand)
Asset building (own the traffic, own the customer relationship)


THE "LATE TO THE PARTY" ARGUMENT:

You're right - APIs have existed for 15 years.

Here's what changed:

AI content generation (ChatGPT/Claude) made it economically viable to create unique, quality content at scale
API pricing dropped (Yelp/Google made APIs more accessible)
Local SEO got easier (Google prioritizes helpful local content)
WordPress optimization tools (deployment is faster than ever)

But mainly:

I'm not trying to invent something new. I'm applying 25+ years of affiliate marketing experience (multiple 7-figure properties in the adult industry) to a proven model.

The "magic" isn't the technology - it's the execution:

Speed of deployment (most people take weeks, I take hours)
Business model integration (directory → leads → service revenue)
Legal compliance (APIs not scraping)
Scalability (can deploy 50+ cities)


BOTTOM LINE:

If you think this is "just scraping with WordPress API" - you're missing the entire point.

This is:

Legal API licensing (not scraping)
Automated content generation (not copying)
Lead generation (not AdSense arbitrage)
Real business backend (not hoping to flip domains)

If it's so easy, why aren't you doing it?

I'm not here to convince skeptics. I'm here to build assets and generate leads.

The proof is in the results: I'm getting inbound business inquiries from live demos while everyone else is debating whether it's "rocket science."

Good luck with whatever you're working on. :thumbsup


For business inquiries, my website is WebIgniter.com

Killswitch 12-09-2025 05:09 PM

Quote:

Originally Posted by 2MuchMark (Post 23417851)
^^ THIS ^^

The local directory space is indeed crowded with established players and you raised a good question about market saturation.

Mindi: Your automation is cool. Take it from me. I am automating everything so I think my opinion matters a little here. Take my compliment. You deserve it. Honestly.

Now, please show ranking for competitive discovery terms. I am about to hand you a talented, female, mainstream client. This is your last chance to prove your SEO really works.

Just so you're aware we can see exactly what you're trying to do and it's not cool at all.

Quote:

Originally Posted by Umami (Post 23417859)
2MuchMark is the type of dude that got told no at parties and still continued to grope the girl

:2 cents: :2 cents: :2 cents:

2MuchMark 12-09-2025 05:10 PM

Quote:

Originally Posted by Mindi (Post 23417863)
Now that it's not 6am and I haven't been coding for 18 hours straight, let me address this one :)

[long post trimmed; quoted in full above]

For business inquiries, my website is WebIgniter.com


So first, that is a copy-paste from Claude, an AI, something you blasted Legacy for doing just the other day.

Next, you said:

"These rank very well because I've been doing SEO since 1997".
- Source: Post #1

"So many top 5's. Out of 764 keywords, 150 are ranked top 10 in google"
- Source: Post #22

"It outranks Trip Advisor for many local searches"
- Source: Post #26

Except that it doesn't. See for yourself:

https://www.google.com/search?q=best...s+in+brantford
https://www.google.com/search?q=top+...ntford+ontario
https://www.google.com/search?q=wher...t+in+brantford

So which is it - SEO ranking success or just a demo site for lead generation?

CyberHustler 12-09-2025 05:16 PM

Quote:

Originally Posted by Killswitch (Post 23417869)
Just so you're aware we can see exactly what you're trying to do and it's not cool at all.

You missed the original thread that set this multi-nic madness off 🤣

Mark just getting a little revenge, that's all.

2MuchMark 12-09-2025 05:18 PM

I also found this:

Google's own documentation warns against focusing on position rankings without traffic context: https://support.google.com/webmasters/answer/7576553

Also, Moz's guide to SEO says rankings are not the goal, attracting the right visitors is: https://moz.com/beginners-guide-to-seo

When someone shows Search Console position screenshots but refuses to show the Performance tab with actual clicks and impressions, that's a red flag. Real SEO professionals (like https://robertwarrenseo.com) know the difference between vanity metrics (positions for zero-volume keywords) and business metrics (traffic from competitive discovery terms).

2MuchMark 12-09-2025 05:22 PM

Quote:

Originally Posted by Killswitch (Post 23417869)
Just so you're aware we can see exactly what you're trying to do and it's not cool at all.

Hi Killswitch,

Sorry that you think that way, but it's really not my intention. As mentioned earlier, I have 2 clients looking for some automation like Mindi is describing when it comes to SEO. One of them was approached by a Russian guy who is promising all kinds of things similar to what Mindi is. I am trying to help my client, AND create a bridge of peace between Mindi and me, AND learn some new stuff, AND help out fellow GFY'ers. Could I have done this better? Of course, but at least I didn't fly off the handle multiple times like Mindi did, right?

Peace.

2MuchMark 12-09-2025 05:23 PM

Quote:

Originally Posted by CyberHustler (Post 23417873)
Mark just getting a little revenge, that's all.

WAT?!?! NoooOoo....

https://media1.giphy.com/media/v1.Y2...NTtXzQ/200.gif

Mindi 12-09-2025 05:24 PM

Quote:

Originally Posted by 2MuchMark (Post 23417870)
So first, that is a copy-paste from Claude, an AI, something you blasted Legacy for doing just the other day.

Next, you said:

"These rank very well because I've been doing SEO since 1997".
- Source: Post #1

"So many top 5's. Out of 764 keywords, 150 are ranked top 10 in google"
- Source: post #22

"It outranks Trip Advisor for many local searches"
- Source: Post #26

So which is it - SEO ranking success or just a demo site for lead generation?

It's both you fucking idiot.

Yes I copied it from Claude, because Claude is writing it with me; why wouldn't I ask the source how to answer the question? He answers it way better than I can. There is NOWHERE ELSE to get the information from.

As for the rest...
I will not show YOU anything. :)

Mindi 12-09-2025 05:26 PM

Quote:

Originally Posted by 2MuchMark (Post 23417876)
I am trying to help my client, AND create a bridge of peace between Mindi and me.

I will light that fucking bridge with both you and your client on it. :2 cents:

Pathfinder 12-09-2025 05:34 PM

Quote:

Originally Posted by Mindi (Post 23417879)
I will light that fucking bridge with both you and your client on it. :2 cents:


Mindi 12-09-2025 05:35 PM

100 Ai Agents :thumbsup

I'm happy to discuss this stuff with anyone interested except for that fucking idiot 2MuchMark. If you need a project done, hit me up at WebIgniter.com

2MuchMark 12-09-2025 06:29 PM

Quote:

Originally Posted by Mindi (Post 23417878)
It's both you fucking idiot.

Yes I copied it from claude, because claude is writing it with me, why wouldn't I ask the source how to answer the question? He answers it way better than I can. There is NOWHERE ELSE to get the information from.

As for the rest...
I will not show YOU anything. :)

You're claiming it's "both" - that it ranks well AND is just a lead magnet. But that doesn't make sense with how you responded earlier.

In this post https://gfy.com/23417110-post18.html (Post #18), when asked "How do you monetize the site?", you listed premium listings, ads, email signups, and other traditional directory monetization. You said "how many ways can you monetize a wordpress site?" - not "it's a demo."

In this post https://gfy.com/23417008-post1.html (#1), you said it "makes money and business" - not "generates leads for my services."

You only started calling it a "lead magnet" after I asked about your rankings being for zero-volume branded keywords. If it was always meant as a demo, then why:

1. Show Search Console screenshots claiming "top 5's" and "150 ranked top 10"?
2. Claim it "outranks TripAdvisor for many local searches"?
3. Refuse to show traffic data when asked?
4. Explain traditional monetization methods instead of saying "it's a demo"?

The ranking screenshots show restaurant name searches with zero volume. That's not SEO success for either purpose, Mindi: not for making money, and not for demonstrating your skills to potential clients who know SEO (I have learned a lot from Rob...)

A real SEO demo would show rankings for competitive discovery keywords, or you would have just said from the start "this is a demo of my automation skills, not an SEO case study."


Quote:

Originally Posted by Mindi (Post 23417881)
100 Ai Agents :thumbsup

I'm happy to discuss this stuff with anyone interested except for that fucking idiot 2MuchMark. If you need a project done, hit me up at WebIgniter.com

Here's what I suspect happened. Your first SEO project flopped hard when you tried to use my brand without permission. You then tried again and this time with automation of websites, and you thought you had found a way to boost rankings. After I started asking questions, you decided to ask Claude AI if your project would really work, and it told you NO, it would not, for all of the reasons Google already says it won't, and for all of the reasons most everyone else already knows.

Some advice: Focus on your automation stuff. It's good. Don't try to fool Google, it won't work.

2MuchMark 12-09-2025 06:36 PM

Quote:

Originally Posted by Pathfinder (Post 23417880)

https://gifdb.com/images/high/mary-p...9mgyjvqj0k.gif
\m/ \m/ \m/

Mindi 12-10-2025 12:36 AM

Mark - You're moving goalposts again. You can't even keep your shit straight.

Your new attack: "Your rankings are for branded keywords, that's not impressive SEO"

No shit! It's 4 days old. The test model is barely 3 weeks old. I never claimed to be ranking for "best restaurants in Montreal" yet. That takes months. Don't you fucking know ANYTHING about SEO?

But here's what IS working:

Business owners Google themselves → find my listing → contact me

Multiple inquiries in 3 weeks. Multiple inquiries from this thread alone. MANY MORE from the other places I have posted this same exact thing, but for other cities. I do not just hang out on GFY :winkwink:

Four signed projects (not sharing any details with idiots like you) and several that I've walked away from. I can fucking do that :thumbsup

You said: "You asked Claude if it would work and Claude said NO"

That's a lie. Show me where Claude said it wouldn't work. You can't, because it didn't happen.

The automation works. The business model works. The only thing that's failing is your attempt to discredit it.

You are not interested, move on. You're just trying to attack my business AGAIN. I've never said this was for sale, it is NOT FOR SALE. Now go ahead and call me a fucking scammer again like your employee likes to do. Where have I ever scammed ANYONE here in over 20 fucking years? I've told you many times, I will not do ANY BUSINESS with YOU or anyone I feel has your fucking odor on them. Stop making shit up. Go back to your little queer threesome with Legacy and Scrapper :1orglaugh

While you're sitting here being Legacy and Scrapper's flying monkey, I'm putting out site after site after site. You have no fucking idea how I can scale things :pimp

Mindi 12-10-2025 01:39 AM

One last thing:

Some people on this forum seem more interested in tearing down other people's success than building their own.

I'm not here to argue with trolls. I'm here to build assets and help businesses.

If you want something built - WebIgniter.com

If you want to argue - find someone else.

I've got work to do.

queke 12-10-2025 09:03 AM

Quote:

Originally Posted by Mindi (Post 23417110)
Yes, generated with Nano Banana.

It can be monetized in many ways. I built in a system for business owners to claim their listing. You can turn features on or off (I have everything ON at the moment), but if you offer a premium listing where they pay, then suddenly click-to-call works, their address map works, you can let them log in and upload real photos, offer specials, most things that modern sites like this offer. It's set up to allow featured listings, and it has an ad system built in. I currently don't have any ads loaded, but they load just like a widget.

You can offer email signups for newsletters, you can run dating ads, you can build service directories and monetize those.

I mean, how many ways can you monetize a wordpress site?

I am curious how you verify when a business owner tries to claim their listing? Like, how do you know they're actually the business?

Mindi 12-10-2025 11:30 AM

Quote:

Originally Posted by queke (Post 23417966)
I am curious how you verify when a business owner tries to claim their listing? Like, how do you know they're actually the business?

That's an excellent question. Before, the verification process was just a simple form for them to submit their info, and it would have had to be verified manually. But fuck that.

This morning, I built a new auto approval process. I still have it set to manual admin approval though for testing.

Your timing with this question was impeccable :thumbsup



This is how I plan my agent:

Business Claim Verification System - Brantford Insider

Overview

Build a system where business owners can claim their listings. After verification (phone or email) and admin approval, they receive a "Verified Owner" badge on their listing.

User Flow

Business Owner Flow

1. Owner visits business listing page
2. Clicks "Claim This Business" button
3. Fills out claim form (name, email, phone, role at business)
4. Chooses verification method: Phone or Email
5. Phone: Receives verification code, calls/visits site to enter it
6. Email: Receives magic link, clicks to verify
7. Claim goes to admin queue for final approval
8. Once approved, listing shows "Verified Owner" badge

Admin Flow

1. New claims appear in WP Admin → Business Claims
2. Admin sees: business name, claimant info, verification status
3. Admin can: Approve, Reject (with reason), Request more info
4. Approved claims add verified badge to listing

---
Technical Implementation

Phase 1: Data Structure

New ACF Fields (add to includes/acf-fields.php)

// Business Claim Status Group
- claim_status: select (unclaimed, pending, verified, rejected)
- claim_verified_date: date
- claim_owner_name: text
- claim_owner_email: email
- claim_verification_method: select (phone, email)

New Database Table: wp_business_claims

CREATE TABLE wp_business_claims (
    id BIGINT AUTO_INCREMENT PRIMARY KEY,
    business_id BIGINT NOT NULL,
    claimant_name VARCHAR(255),
    claimant_email VARCHAR(255),
    claimant_phone VARCHAR(50),
    claimant_role VARCHAR(100),
    verification_method ENUM('phone', 'email'),
    verification_code VARCHAR(20),
    verification_code_expires DATETIME,
    email_verified TINYINT DEFAULT 0,
    phone_verified TINYINT DEFAULT 0,
    status ENUM('pending', 'email_sent', 'code_sent', 'verified', 'approved', 'rejected') DEFAULT 'pending',
    rejection_reason TEXT,
    admin_notes TEXT,
    created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
    updated_at DATETIME DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
    INDEX (business_id),
    INDEX (status)
);
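The status ENUM above is effectively a small state machine. Here's a quick sketch of the transitions it implies (Python for illustration only; the plugin itself would enforce this in PHP, and the exact transition set is my assumption from the flow described):

```python
# Allowed transitions for the wp_business_claims.status column.
# Illustrative sketch -- the real plugin would enforce this server-side in PHP.
ALLOWED_TRANSITIONS = {
    "pending":    {"email_sent", "code_sent", "rejected"},  # verification method chosen
    "email_sent": {"verified", "rejected"},                 # magic link clicked
    "code_sent":  {"verified", "rejected"},                 # 6-digit code entered
    "verified":   {"approved", "rejected"},                 # admin decision
    "approved":   set(),                                    # terminal
    "rejected":   set(),                                    # terminal
}

def can_transition(current: str, new: str) -> bool:
    """Return True if a claim may legally move from `current` to `new`."""
    return new in ALLOWED_TRANSITIONS.get(current, set())
```

Guarding every status write with a check like this keeps a claim from jumping straight from pending to approved without verification.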

Phase 2: Backend Infrastructure

New File: includes/claim-verification.php

- cbd_create_claims_table() - Creates DB table on activation
- cbd_generate_verification_code() - 6-digit code generator
- cbd_send_verification_email() - Magic link email
- cbd_verify_email_token() - Validates email links
- cbd_verify_phone_code() - Validates phone codes
- cbd_submit_claim() - AJAX handler for claim form
- cbd_get_claim_status() - Check if business is claimed
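The code-generation and validation helpers above boil down to fairly simple logic. A rough sketch (Python for illustration; the function names mirror the cbd_* helpers but the actual plugin would do this in PHP):

```python
import hmac
import secrets
from datetime import datetime, timedelta, timezone

def generate_verification_code() -> str:
    """6-digit numeric code, zero-padded (what cbd_generate_verification_code would do)."""
    return f"{secrets.randbelow(1_000_000):06d}"

def code_expiry(method: str) -> datetime:
    """Expiry per the spec: 1 hour for phone codes, 24 hours for email links."""
    hours = 1 if method == "phone" else 24
    return datetime.now(timezone.utc) + timedelta(hours=hours)

def verify_code(submitted: str, stored: str, expires: datetime) -> bool:
    """Expiry check plus constant-time comparison to avoid timing leaks."""
    if datetime.now(timezone.utc) > expires:
        return False
    return hmac.compare_digest(submitted, stored)
```

Using a CSPRNG (secrets / random_int in PHP) rather than rand() matters here, since the code is the whole verification.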

New File: includes/claim-admin.php

- Admin menu page: "Business Claims"
- List table showing all claims with filters
- Approve/Reject actions with AJAX
- Email notifications to claimant on status change

Phase 3: Frontend Templates

New Template: templates/page-claim-business.php

- Claim form with fields:
  - Business selector (if coming from a generic page)
  - Your Name
  - Your Email
  - Your Phone
  - Your Role (Owner, Manager, Marketing, etc.)
  - Verification Method (Phone / Email radio)
  - Terms acceptance checkbox
- AJAX submission
- Success/error messaging

Modify: templates/single-business-restaurant.php (and other single templates)

- Add "Claim This Business" button (if unclaimed)
- Add "Verified Owner" badge (if claimed & approved)
- Button links to /claim-business/?business_id=XXX

New Template Section: Verification Code Entry

- Simple form to enter 6-digit code
- Or: Email link lands on verification confirmation page

Phase 4: Email Templates

Verification Email

Subject: Verify your ownership of [Business Name] on Brantford Insider

Hi [Name],

You requested to claim [Business Name] on Brantford Insider.

Click here to verify your email: [MAGIC LINK]

This link expires in 24 hours.

If you didn't request this, ignore this email.

Phone Verification Instructions

Subject: Your verification code for [Business Name]

Hi [Name],

Your verification code is: [6-DIGIT CODE]

Enter this code at: [VERIFICATION URL]

This code expires in 1 hour.

Claim Approved Email

Subject: Your claim for [Business Name] has been approved!

Hi [Name],

Great news! Your claim for [Business Name] on Brantford Insider has been approved.

Your listing now shows a "Verified Owner" badge.

To request changes to your listing, contact us at [email].

---
Files to Create/Modify

NEW FILES

1. custom-business-directory/includes/claim-verification.php - Core claim logic
2. custom-business-directory/includes/claim-admin.php - Admin interface
3. custom-business-directory/templates/page-claim-business.php - Claim form page
4. custom-business-directory/assets/css/claim-form.css - Claim form styles
5. custom-business-directory/assets/js/claim-form.js - AJAX handling

MODIFY FILES

1. custom-business-directory/custom-business-directory.php - Include new files, register activation hook
2. custom-business-directory/includes/acf-fields.php - Add claim status fields
3. custom-business-directory/templates/single-business-restaurant.php - Add claim button & badge
4. custom-business-directory/templates/single-business-service.php - Add claim button & badge
5. custom-business-directory/templates/single-business-retail.php - Add claim button & badge
6. custom-business-directory/templates/single-business.php - Add claim button & badge
7. custom-business-directory/assets/css/style.css - Badge styles

---
Implementation Order

1. Database & ACF Fields - Create table, add fields
2. Claim Form Page - Frontend form that submits claims
3. Email/Phone Verification - Code generation and validation
4. Admin Interface - View and manage claims
5. Badge Display - Show verified badge on listings
6. Claim Button - Add to all single business templates
7. Testing - Full flow testing on Brantford site

---
Security Considerations

- Rate limit claim submissions (1 per business per email per 24h)
- Verification codes expire (1h phone, 24h email)
- Sanitize all inputs
- Nonce verification on all forms
- Email verification tokens are one-time use
- Admin-only approval (no auto-approve)
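The rate-limit rule fits in a few lines. An in-memory sketch (Python, illustrative only; a WordPress build would back this with the claims table or transients instead of a dict):

```python
import time

WINDOW = 24 * 3600  # one claim per business per email per 24 hours

class ClaimRateLimiter:
    """In-memory sketch of the submission rate limit."""

    def __init__(self):
        self._last = {}  # (email, business_id) -> unix timestamp of last claim

    def allow(self, email, business_id, now=None):
        """Return True and record the attempt if this pair is outside the window."""
        now = time.time() if now is None else now
        key = (email.lower(), business_id)
        last = self._last.get(key)
        if last is not None and now - last < WINDOW:
            return False  # still inside the 24h window; reject without recording
        self._last[key] = now
        return True
```

Keying on (email, business) rather than IP means one owner can still claim several listings while a spammer can't hammer one listing.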

---
Badge Design

.verified-owner-badge {
    display: inline-flex;
    align-items: center;
    background: linear-gradient(135deg, #10B981 0%, #059669 100%);
    color: white;
    padding: 6px 12px;
    border-radius: 20px;
    font-size: 13px;
    font-weight: 600;
}

.verified-owner-badge::before {
    content: "✓";
    margin-right: 6px;
}

---
Estimated Components

| Component | Complexity | Priority |
|--------------------------|------------|----------|
| Database table | Low | P0 |
| ACF fields | Low | P0 |
| Claim form page | Medium | P0 |
| Email verification | Medium | P0 |
| Phone code verification | Low | P0 |
| Admin claims list | Medium | P0 |
| Approve/Reject actions | Low | P0 |
| Badge on listings | Low | P1 |
| Claim button on listings | Low | P1 |
| Email templates | Low | P1 |

---
Test Plan

1. Submit claim for unclaimed business
2. Choose email verification → receive email → click link
3. Verify claim shows in admin as "verified, pending approval"
4. Admin approves → badge appears on listing
5. Submit another claim for same business → should show "already claimed"
6. Test phone verification flow
7. Test rejection flow with reason
8. Test expired verification codes


The system is being built right at this very moment.

https://i.imgur.com/ALeGIS7.png

queke 12-10-2025 01:00 PM

Thanks for the writeup. I was curious whether the phone number has to match the one on their Yelp/Google listing, or the email has to match (although they don't usually display emails), so that not just anyone could verify with any phone or email. Like a way to make sure they are who they say they are.

Mindi 12-10-2025 01:16 PM

The verification code confirms they have access to the contact info they provide - but that's just step one. Right now, every claim requires manual admin approval before the verified badge appears.

When a claim comes in, I review it before approving:
- Cross-reference the claimant's info with what's on the listing
- Check their email domain matches the business website
- Call the business directly if anything looks off
- Google the person + business name

So even if someone tried to claim a business with random contact info, they'd get flagged at the admin review stage.

The automated verification just filters out spam and confirms they're a real person - the human review is where I actually verify legitimacy, at least for now.

I'm also looking at adding domain-based email verification (requiring @businessdomain.com emails) as an additional layer.
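That domain-based check is simple enough to sketch (Python, illustrative only; the actual site is PHP/WordPress, and a production version would want real registrable-domain matching via a public-suffix list):

```python
from urllib.parse import urlparse

def email_matches_site(email: str, website_url: str) -> bool:
    """True if the claimant's email domain matches the business website's host.

    Simplified: compares trailing host labels so mail subdomains still pass;
    a real implementation should use a public-suffix list.
    """
    try:
        email_domain = email.rsplit("@", 1)[1].lower()
    except IndexError:
        return False  # not an email address at all
    host = (urlparse(website_url).hostname or "").lower()
    host = host.removeprefix("www.")
    if not host:
        return False
    return email_domain == host or email_domain.endswith("." + host)
```

A gmail.com claimant would fail this check and fall back to the manual review queue.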

Killswitch 12-10-2025 03:42 PM

Quote:

Originally Posted by Mindi (Post 23417991)
...

That seems like a lot of implementation details. Have you explored contextualizing your codebase and then writing more high level things?

For example, I have an AI tool I'm working on that basically just chews through your whole codebase and pulls out all the details it can that are relevant, basically generates a whole readme document that explains everything about the app, architectural decisions, coding patterns, tools used, etc.

Then I write high level epics that are broken down into tasks that look sorta like this, which is a real ticket I have for a task my agent implemented recently:

Code:

## Summary / Purpose

Extend the existing SES webhook ingestion system so it can process additional SES event types using the **same unified ingestion pipeline** already used for current SES events. The system does **not** distinguish between “inbound” and “outbound” events — all SES events flow through a single lifecycle:

1. Parse the JSON payload 
2. Determine the SES event type 
3. Ignore unsupported events 
4. Validate supported events 
5. Normalize supported events into the unified schema 
6. Emit internal events 
7. Respond to SES 

This task only adds new SES event types to that pipeline. No new webhook, no new lifecycle, and no additional architectural concepts are introduced.

---

## Supported SES Event Types to Add

Extend the current allowlist by adding support for the following SES event types:

- `Send`
- `Delivery`
- `Bounce`
- `Complaint`
- `DeliveryDelay`

All other SES event types remain **ignored**, logged at debug level, and acknowledged with success.

---

## Validation Requirements (Supported Event Types Only)

For events in the allowlist above, validation rules must match the existing SES ingestion model:

- JSON must parse correctly.
- `eventType` or `notificationType` must be present.
- A `mail` object must exist and include:
  - `messageId`
  - `timestamp`
  - `source`
  - `destination[]` (non-empty)

- The corresponding SES event-specific object must exist:
  - `Send` → `send`
  - `Delivery` → `delivery`
  - `Bounce` → `bounce`
  - `Complaint` → `complaint`
  - `DeliveryDelay` → `deliveryDelay`

Failures in validation:

- Must be logged at **error** level with a reason and message ID (if available).
- Must return a **non-success** response to SES.
- Must stop all processing and emit no internal events.

Unsupported events do **not** undergo validation.

---

## Normalization Requirements

For supported and validated SES events, normalize into the existing unified internal schema:

### Core Message Fields
- `message_id`
- `from_address`
- `recipients`
- `sent_at` (parsed into a standardized timestamp format)
- `tags` (preserved exactly as SES provides them)

### Event Metadata
- `id` (feedback ID when present)
- `type` (one of: `send`, `delivery`, `bounce`, `complaint`, `delay`)
- `timestamp` (event-specific timestamp, parsed into standardized timestamp format)

### Optional Event-Type Blocks
Depending on `type`, include one of:

- `bounce`
- `complaint`
- `delivery`
- `delay`

Each block includes its expected fields exactly as defined in the normalized schema.

Normalization must be SES-agnostic — no SES field names should leak into the internal event shape.

---

## Internal Event Emission

Emit exactly one internal event for each normalized SES event:

- `type = "send"` → `EmailSent`
- `type = "delivery"` → `EmailDelivered`
- `type = "bounce"` → `EmailBounced`
- `type = "complaint"` → `EmailComplaint`
- `type = "delay"` → `EmailDelayed`

Internal events must include **only** the normalized data — never SES raw JSON.

If emission fails for any reason:

- Log the error.
- Return a **non-success** response to SES.
- Stop further processing.

---

## Logging & Response Behavior

Re-use the same semantics already implemented for SES ingestion:

### Debug Logs
- Unsupported SES event types that are ignored.

### Error Logs
- JSON parsing failures.
- Validation failures for supported events.
- Internal event emission failures.

### Response Rules
Return **success (2xx)** only when:

- JSON parsed successfully.
- All supported events validated successfully.
- All supported events normalized successfully.
- All supported events emitted successfully.
- Unsupported events were ignored without issues.

Return **non-success** otherwise.

---

## Out of Scope

- Any business logic reacting to these events.
- Storage or indexing of normalized event data.
- UI or reporting updates.
- Changes to the inbound SES behavior beyond sharing the unified pipeline.
- SES configuration, SNS topic setup, DNS, or identity management.

Then my AI agent combines those two things into a giant context file and gets to work.

In my experience this has let me stop writing so many implementation details. I'm basically coding in pseudocode, writing more high-level stuff, and letting the agent infer the implementation, patterns, and architectural decisions from my agents.md file that contains all that context.
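For a sense of how compact that pipeline really is, here's a rough end-to-end sketch of the ticket's seven-step lifecycle (Python, hypothetical names, normalized fields trimmed to a handful; the real implementation follows the full ticket above):

```python
import json

# SES event type -> (internal type, internal event name), per the ticket's allowlist.
SUPPORTED = {
    "Send": ("send", "EmailSent"),
    "Delivery": ("delivery", "EmailDelivered"),
    "Bounce": ("bounce", "EmailBounced"),
    "Complaint": ("complaint", "EmailComplaint"),
    "DeliveryDelay": ("delay", "EmailDelayed"),
}

def handle_ses_webhook(raw: str, emit) -> int:
    """Parse -> classify -> ignore/validate -> normalize -> emit -> respond.

    Returns an HTTP status code: 200 on success, 400 on any failure.
    """
    try:
        payload = json.loads(raw)
    except json.JSONDecodeError:
        return 400  # error log: JSON parse failure
    ses_type = payload.get("eventType") or payload.get("notificationType")
    if ses_type not in SUPPORTED:
        return 200  # debug log: unsupported type ignored, acknowledged with success
    internal_type, event_name = SUPPORTED[ses_type]
    mail = payload.get("mail") or {}
    if not all(mail.get(k) for k in ("messageId", "timestamp", "source")) \
            or not mail.get("destination"):
        return 400  # error log: validation failure on a supported event
    # SES nests the event body under a camelCase key, e.g. "deliveryDelay".
    body_key = ses_type[0].lower() + ses_type[1:]
    if body_key not in payload:
        return 400  # error log: missing event-specific object
    normalized = {  # unified, SES-agnostic shape (trimmed for the sketch)
        "message_id": mail["messageId"],
        "from_address": mail["source"],
        "recipients": mail["destination"],
        "type": internal_type,
    }
    try:
        emit(event_name, normalized)  # exactly one internal event per SES event
    except Exception:
        return 400  # error log: emission failure
    return 200
```

The allowlist-first ordering matters: unsupported events short-circuit to a 2xx before any validation runs, exactly as the ticket specifies.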

2MuchMark 12-10-2025 05:05 PM

Quote:

Originally Posted by Killswitch (Post 23418038)
That seems like a lot of implementation details. Have you explored contextualizing your codebase and then writing more high level things?

For example, I have an AI tool I'm working on that basically just chews through your whole codebase and pulls out all the details it can that are relevant, basically generates a whole readme document that explains everything about the app, architectural decisions, coding patterns, tools used, etc.

Then I write high level epics that are broken down into tasks that look sorta like this, which is a real ticket I have for a task my agent implemented recently:

...

N o i c e ... question: Are you just feeding the agent a generated ‘agents.md’ + the ticket, or do you have it dynamically pulling code context each time as well?

Mindi 12-10-2025 06:04 PM

Quote:

Originally Posted by Killswitch (Post 23418038)
That seems like a lot of implementation details. Have you explored contextualizing your codebase and then writing more high level things?

For example, I have an AI tool I'm working on that basically just chews through your whole codebase and pulls out all the details it can that are relevant, basically generates a whole readme document that explains everything about the app, architectural decisions, coding patterns, tools used, etc.

Then I write high level epics that are broken down into tasks that look sorta like this, which is a real ticket I have for a task my agent implemented recently.

I read that and I was like wait... what did he just say?

This is eye opening. I could apply that to my process. I could build 50 with the same pattern.

That's a little overkill. I like it. :thumbsup

Killswitch 12-10-2025 11:26 PM

Quote:

Originally Posted by 2MuchMark (Post 23418055)
N o i c e ... question: Are you just feeding the agent a generated ‘agents.md’ + the ticket, or do you have it dynamically pulling code context each time as well?

The AGENTS.md is generated by the context tool I'm working on. I run it against a git repo; it loads up the git history and source code and starts plowing through it, pulling out as much context as it can to describe how the project is designed: key components, patterns, dependencies, etc. This is generated once and reused for all tickets.

Then I just use Copilot inside VSCode to start a new agent and paste the ticket into the chat. It automatically identifies the AGENTS.md file as a context file and my ticket as the user prompt, and that hierarchy is what makes it all work.

Then after it finishes I review the changes and request stuff in the chat and it will continue making changes until I decide it's good enough, then I close that chat and repeat the process with the next ticket.

Quote:

Originally Posted by Mindi (Post 23418072)
I read that and I was like wait... what did he just say?

This is eye opening. I could apply that to my process. I could build 50 with the same pattern.

That's a little overkill. I like it. :thumbsup

Yeah it's a pretty manual process on my end still but I'm just trying to figure things out and find a flow that I like, and this is what has come of it so far. I figure you can take the idea and apply it to your setup and gain noticeable improvements. Can't wait to hear more about how it changed things for you. :thumbsup

I'll share my tool here on GFY when I've got it ready for public usage.

mopek1 12-11-2025 05:07 AM

How do you make sure the review of the restaurant (or whatever product you build a site for) passes the AI detectors that Google and other SEs are looking out for? I mean the paragraphs of text (article) that make up the review?

Mindi 12-11-2025 05:15 AM

Quote:

Originally Posted by mopek1 (Post 23418162)
How do you make sure the review of the restaurant (or whatever product you build a site for) passes the AI detectors that Google and other SEs are looking out for? I mean the paragraphs of text (article) that make up the review?

I don't even worry about it. Everything in that text is based on factual information that I get from the API. It's no problem.

mopek1 12-11-2025 06:31 AM

Quote:

Originally Posted by Mindi (Post 23418163)
I don't even worry about it. Everything in that text is based on factual information that I get from the API. It's no problem.

Thanks. I guess I'm not sure what the AI detectors are actually looking for. Even though the information is factual, is it possible that Google detects the writing style as AI writing and penalizes you?

Mindi 12-11-2025 07:03 AM

Quote:

Originally Posted by mopek1 (Post 23418167)
Thanks. I guess I'm not sure what the AI detectors are actually looking for. Even though the information is factual, is it possible that Google detects the writing style as AI writing and penalizes you?

It's possible; Google seems to change their mind on a whim, and they do it often. But I haven't had an issue with that yet. All of this matches the sentiment of the sources it links to. Any info pulled from Yelp gets a Yelp attribution to its source. The same thing happens on other sites I've built that pull from Trip Advisor, Google Places, and Yelp.

If it's from Google, it gets labeled as from Google, and the link goes to the Google Places page. Same thing if it's from Trip Advisor. If I can't pull enough restaurants (or any type of business) from one source, I can simply pull from another, or even all of them; my system tracks it all and gives proper attribution, as each platform's TOS says it should.

Google can see it's not a bunch of made up bullshit, in other words. It's not copy and paste, it's all original content.

It all happens at a speed of 2 restaurant listings per minute once it's going.

I'm still laughing at the "this is scraping and stealing" comment someone else posted above. :1orglaugh

fuzebox 12-11-2025 09:20 AM

Quote:

Originally Posted by mopek1 (Post 23418162)
How do you make sure the review of the restaurant (or whatever product you build a site for) passes the AI detectors that Google and other SEs are looking out for? I mean the paragraphs of text (article) that make up the review?

I personally think the fear of AI-generated content and Google is overblown. Most pages that rank for most terms now are slop. "AI detection" tools flag so much original content as AI that the snake is eating its own tail.

The endgame of Google is to serve up its own AI summary and keep you there anyway.

mopek1 12-11-2025 11:08 AM

Quote:

Originally Posted by fuzebox (Post 23418189)
Most pages that rank for most terms now are slop.

When you say 'slop', do you mean after reading one sentence a 6th grader can tell it's AI? Or more covert seeming AI writing?



Quote:

Originally Posted by fuzebox (Post 23418189)
The endgame of Google is to serve up its own AI summary and keep you there anyway.

I thought they wanted you to click on their ads. But it does seem that G always has their AI give answers first.

Mindi 12-11-2025 12:01 PM

Quote:

Originally Posted by mopek1 (Post 23418214)
When you say 'slop', do you mean after reading one sentence a 6th grader can tell it's AI? Or more covert seeming AI writing?

That's exactly how I would define it. :2 cents:


All times are GMT -7. The time now is 10:01 AM.

Powered by vBulletin® Version 3.8.8
Copyright ©2000 - 2025, vBulletin Solutions, Inc.
©2000-, AI Media Network Inc.