Blog · April 28, 2026 · 10 min read
How to Make Your Local Business Visible to ChatGPT, Claude, and Perplexity
Imagine someone in your service area types into ChatGPT: “Who are the best [your category] near me?” And the AI responds with three names — yours leading the list, with a one-paragraph summary of why you're worth calling.
That's not magic. It's the result of a small set of technical and content choices that put you on AI's shortlist for your category. This post walks through the actual steps. No rebuilds, no expensive software. Most local businesses can finish the list in an afternoon — or hand it to a developer for a few billable hours.
Meet the players: ChatGPT, Claude, Perplexity, AI Overviews
Each major AI has its own crawler, its own ranking signals, and its own quirks. Optimizing for one usually optimizes for all — but it helps to know who you're talking to.
| AI | Crawlers | Strength | Quirk |
| --- | --- | --- | --- |
| ChatGPT (OpenAI) | GPTBot, OAI-SearchBot, ChatGPT-User | Best at summarizing well-structured content. Massive reach. | Doesn't run JavaScript. Respects robots.txt strictly. |
| Claude (Anthropic) | ClaudeBot, Claude-User, Claude-SearchBot | Long-context summaries. Tends to favor high-quality, less SEO-spammed sources. | Very strict about robots.txt. No JavaScript. |
| Perplexity | PerplexityBot, Perplexity-User | Real-time web search with prominent citation links shown to users. | Heavily citation-driven. Being cited by other authoritative sources helps. |
| Google AI Overviews | Googlebot + Google-Extended | Integrated with traditional Google search ranking. | Google-Extended controls AI training use specifically. Block one, keep the other. |
Step 1: Make sure they can reach you
If AI crawlers can't fetch your site, nothing else matters. Most invisibility issues start here.
Check your robots.txt
Open yourdomain.com/robots.txt in a browser. If the file doesn't exist or returns a 404, that's fine — crawlers default to allowed.
If the file exists, look for any line saying Disallow: / under an AI user-agent like User-agent: GPTBot or User-agent: ClaudeBot. That line means that AI is blocked from your site entirely. Remove it.
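A blocking entry looks like this (these two lines alone hide your entire site from OpenAI's main crawler):

User-agent: GPTBot
Disallow: /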
The minimum recommended robots.txt for a local business:
User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: Claude-User
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Perplexity-User
Allow: /

User-agent: Google-Extended
Allow: /

Sitemap: https://yourdomain.com/sitemap.xml
Check Cloudflare and security plugins
If your site uses Cloudflare, log in and check the AI bot management settings. Cloudflare added a one-click “block all AI bots” feature in 2024 that many small-business owners enabled by accident.
If you use a security plugin like Wordfence, Sucuri, or iThemes, open it and look for a bot-blocking section. AI crawlers often get caught in default blocklists meant for spammers.
Add an llms.txt file
llms.txt is a proposed standard, introduced in 2024, for telling AI which pages on your site are most important. It's a small Markdown file served at yourdomain.com/llms.txt.
A template for a local business:
# Smith Plumbing

> Family-owned plumbing service in Austin, TX.
> Emergency repair, drain cleaning, and water heater installation.

## Key pages

- [Homepage](https://smithplumbing.com/)
- [Services](https://smithplumbing.com/services/): Full list with pricing
- [Service area](https://smithplumbing.com/areas/): Cities we serve
- [Contact](https://smithplumbing.com/contact/): Hours, phone, online booking
- [FAQ](https://smithplumbing.com/faq/): Common customer questions
llms.txt is not yet required by any AI, but it's free to add, and Anthropic and OpenAI have both signaled interest in supporting it. Adopting it now positions you ahead of the curve.
Step 2: Make sure they can understand you
Schema.org markup is the most underused AI visibility tool for local businesses. It's a small block of JSON-LD added to your site's <head> that tells AI exactly what your business is, in a format it can quote directly.
For local businesses, the most important schemas are:
- LocalBusiness — name, address, phone, hours, services
- Service — each service you offer, with price ranges
- FAQPage — your FAQ section, marked up so AI can lift answers verbatim
- AggregateRating — your average review score and review count
- BreadcrumbList — your site's navigation hierarchy
A bare-minimum LocalBusiness schema looks like this:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Smith Plumbing",
  "image": "https://smithplumbing.com/photo.jpg",
  "telephone": "+1-512-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Austin",
    "addressRegion": "TX",
    "postalCode": "78701",
    "addressCountry": "US"
  },
  "openingHours": "Mo-Fr 08:00-17:00",
  "priceRange": "$$"
}
</script>

Most CMS platforms have plugins or settings that handle this:
- WordPress: Yoast SEO, Rank Math, or Schema Pro plugins
- Shopify: built-in for products; LocalBusiness needs a small custom snippet
- Squarespace / Webflow: paste JSON-LD into the head injection field
- Wix: built-in via the SEO panel — but limited fields, and worth validating output
Whatever you add, validate it at search.google.com/test/rich-results — if Google's validator passes, AI parsers usually do too.
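While you're in there, it's worth adding the FAQPage markup from the list above. Here's a minimal sketch for the same hypothetical Smith Plumbing site; the questions and answers are placeholders, so swap in the exact text from your FAQ page:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "Do you offer emergency plumbing service?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Yes. We answer emergency calls 24/7 across the Austin metro area."
      }
    },
    {
      "@type": "Question",
      "name": "How much does water heater installation cost?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Most installations run $1,200 to $2,500 depending on tank size and venting."
      }
    }
  ]
}
</script>

Keep the markup identical to the visible FAQ text on the page; Google's guidelines require it, and AI parsers are likelier to quote markup that matches what visitors actually see.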
Step 3: Make sure they can read your content
The most common reason a small-business site is invisible to AI is JavaScript-only rendering — the content shows up in the browser, but the underlying HTML the crawler sees is essentially empty.
Here's the test: open your site, right-click, View page source (not Inspect — View Source). Press Cmd+F or Ctrl+F and search for a paragraph from your homepage.
- If the text is there: you're fine. AI crawlers will see it.
- If the text isn't there: your site is JavaScript-only. AI sees nothing.
Visual builder platforms (WordPress, Shopify, Squarespace, Webflow, Wix) almost always render server-side and pass this test. Custom-built React or Vue sites are the typical culprits; the fix is enabling server-side rendering (SSR) or static site generation.
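To make the failure mode concrete, here's roughly what the raw HTML of a JavaScript-only page looks like (the file names are illustrative):

<!DOCTYPE html>
<html>
  <head><title>Smith Plumbing</title></head>
  <body>
    <!-- Everything visitors see is injected here by JavaScript after load. -->
    <!-- A crawler that doesn't run JavaScript sees only this empty div. -->
    <div id="root"></div>
    <script src="/static/bundle.js"></script>
  </body>
</html>

If your view-source looks like this, talk to your developer about SSR before worrying about anything else in this post.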
Step 4: Give them reasons to cite you
Once AI can reach, understand, and read your site, the question becomes: why should it pick you over a competitor? The answer is E-E-A-T — Experience, Expertise, Authoritativeness, and Trust.
What helps:
- Real photos of your business and team (not stock images)
- An About page with named founders, dates, and origin story
- Author bios on blog posts, with links to LinkedIn or other proof
- Customer reviews on your Google Business Profile (AI systems cross-reference it)
- A specific service list — not “we do everything” but “we do X, Y, and Z”
- Original content: case studies, before/afters, customer stories
What hurts:
- Stock photos with no real-business imagery
- Generic copy that's interchangeable with every competitor
- Content scraped or rewritten from other sites
- No team page, no author bios, no real names
Per-platform tweaks
Most of the above is universal. A few platform-specific tactics:
For ChatGPT
Allow ChatGPT-User explicitly in robots.txt — it's separate from GPTBot and used when a real user asks ChatGPT a question that needs live web data. ChatGPT favors well-structured FAQs over long paragraphs.
For Claude
ClaudeBot is the strictest about robots.txt — double-check it's allowed. Claude tends to surface higher-quality, less commercial sources, so original content (case studies, customer stories) matters more than generic SEO articles.
For Perplexity
Perplexity rewards citations from other authoritative sources — industry directories, local press, business associations, your chamber of commerce listing. Allow Perplexity-User explicitly for real-time queries.
For Google AI Overviews
Existing SEO carries over because AI Overviews use the same Googlebot. Google-Extended controls whether your content trains Gemini specifically; you can allow or block it independently of regular Googlebot.
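If you ever decide you want your content kept out of Gemini training while preserving your regular rankings, the robots.txt entry is just two lines:

User-agent: Google-Extended
Disallow: /

Regular Googlebot crawling is unaffected, so your search visibility stays intact.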
The 10-step checklist
Print this and check off as you go:
- Confirm robots.txt does not block GPTBot, ClaudeBot, PerplexityBot, ChatGPT-User, Claude-User, Perplexity-User, or Google-Extended.
- Confirm Cloudflare and your security plugin are not blocking AI bots.
- Add an llms.txt file with key pages.
- Add LocalBusiness JSON-LD to your homepage with full name, address, phone, hours, and services.
- Add FAQPage JSON-LD to your FAQ section.
- View source on your homepage and confirm the visible text appears in the raw HTML.
- Add real photos, real team bios, and named authors.
- Fully fill out your Google Business Profile — AI cross-references it.
- Submit a sitemap.xml to Google Search Console.
- Run a free AI visibility scan to verify everything is wired up.
The bottom line
Most of this work is one-time setup. Once your site is properly configured, AI crawlers re-fetch it on their own schedules — and as their indexes refresh, your visibility compounds.
The local businesses that get this done in 2026 will spend the next decade collecting AI-driven leads while their competitors wonder why their phone stopped ringing.
Run the 10-step checklist on your own site — automatically
Our free scan checks all 10 items above (and 27 more) in under 10 seconds. No signup, no credit card. You'll see exactly which items pass and which need fixing, with copy-paste snippets for each.
Run my free scan →

Written by the team at Kesem Marketing, a digital agency helping small businesses get found in the AI-first era.