AI Visibility Checks

Frequently asked questions

If you have a question we haven't covered here, reply to your report email or book a call at kesemmarketing.com.

What is AI visibility?

AI visibility is how readable, indexable, and citable your website is to AI assistants and AI-powered search — ChatGPT, Claude, Perplexity, Google AI Overviews, and similar. Where traditional SEO optimizes your site for Google's ten blue links, AI visibility optimizes you to be the business AI names when a user asks for a recommendation.

How is this different from regular SEO?

Google shows ten blue links. AI delivers a short answer with a handful of cited sources. The rules AI uses to pick those sources are different — it cares more about structured data (schema.org), whether your content is readable without JavaScript, and whether its crawlers (GPTBot, ClaudeBot, PerplexityBot, etc.) are allowed in your robots.txt. A site can rank #1 on Google and still be invisible to ChatGPT.
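As an illustration (the domain is a placeholder, but the user-agent tokens are the ones each vendor publishes), a robots.txt that explicitly welcomes the major AI crawlers might look like this:

```
# Allow AI assistant crawlers by their published user-agent tokens
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Everyone else follows your normal rules
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

If a rule like `User-agent: GPTBot` followed by `Disallow: /` appears anywhere in your file, ChatGPT's crawler is locked out no matter how good your content is.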

Do I really need this if I already rank well on Google?

Yes. Adobe Analytics reported 693% year-over-year growth in AI-referred traffic to U.S. retail sites during the 2025 holiday season, with AI referrals converting 31% more than other sources. Cloudflare's 2025 Year in Review measured a 15× increase in user-action AI bot traffic. Your Google ranking doesn't translate automatically — AI picks citations using different signals, and most small-business sites haven't adapted yet.

What does your scan actually check?

Every scan runs 40+ checks across crawler access (robots.txt rules for 16 major AI crawlers), AI directives (llms.txt, llms-full.txt), structured data (JSON-LD schema types), metadata (title, description, Open Graph, Twitter Card, canonical, html lang), semantic HTML (H1, landmarks, alt text, HTTPS), discoverability (sitemap.xml), and content rendering (whether your content is in the HTML vs. JavaScript-rendered).
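To make the crawler-access check concrete, here is a minimal sketch using Python's standard urllib.robotparser. The crawler names are real user-agent tokens; everything else (the sample robots.txt, the function name) is a hypothetical illustration, not our scanner's actual code:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks one AI crawler but allows everyone else
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

def blocked_ai_crawlers(robots_txt: str, crawlers: list[str]) -> list[str]:
    """Return the crawler names that may NOT fetch the homepage."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [bot for bot in crawlers
            if not parser.can_fetch(bot, "https://example.com/")]

AI_CRAWLERS = ["GPTBot", "ClaudeBot", "PerplexityBot"]
print(blocked_ai_crawlers(ROBOTS_TXT, AI_CRAWLERS))  # ['GPTBot']
```

A real checker would also fetch the live file and repeat this for all sixteen crawlers, but the core logic is this small.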

How long does a full implementation take?

Most small-business sites can be fully fixed in a handful of hours by a developer, or over a weekend by a motivated owner following our DIY guide. The fixes are additions, not rewrites — a robots.txt file, a schema.org block in your page <head>, a sitemap, an llms.txt — not a site redesign.

Will AI start referring traffic to me immediately?

Not instantly. AI crawlers need to revisit your site (typically within a few days to a few weeks), and AI models update their training on a rolling schedule. What you can control is being ready: once your site is AI-readable, every new crawl improves your citation chances.

Does this work for my platform (WordPress, Shopify, Webflow, Squarespace, Wix)?

Yes. The scanner auto-detects your platform and our Full Report includes platform-specific fix instructions. WordPress users get plugin recommendations (Yoast, Rank Math). Shopify users get theme.liquid file paths. Webflow users get Custom Code panel instructions. Squarespace and Wix users get Code Injection guidance.

What makes the $49 Full Report different from the free scan?

The free scan shows your score and top three issues. The $49 Full Report unlocks every issue found (typically 10–20 on most sites), with copy-paste code snippets for your developer, plain-English step-by-step instructions for you, platform-specific guidance, and a downloadable PDF you can forward or archive.

Is my scan data private?

We only fetch publicly accessible pages on your website — the same way Google or any other search engine would. Scan results are stored so you can revisit your report later, and we capture your email only if you purchase the Full Report. We do not share data with third parties.

What is llms.txt?

llms.txt is an emerging standard (llmstxt.org) — a small Markdown file at your site's root that tells AI models which pages on your site matter most and what your business is about. Think of it as a table of contents written for machines. Our scan checks if you have one, and our Full Report includes a ready-to-use llms.txt template for your site.
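As a sketch of the format (the business, URLs, and page names below are placeholders; the structure follows the llmstxt.org proposal), an llms.txt is just a short Markdown file:

```markdown
# Example Plumbing Co.

> Family-run plumbing company serving Springfield since 1998.
> Emergency repairs, water heater installation, and drain cleaning.

## Key pages

- [Services](https://www.example.com/services): What we do and pricing
- [Service area](https://www.example.com/areas): Neighborhoods we cover
- [Contact](https://www.example.com/contact): Phone, hours, and booking
```

It lives at https://yoursite.com/llms.txt, alongside robots.txt.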

What is schema.org structured data?

Schema.org is a shared vocabulary that websites use to describe themselves to machines via JSON-LD blocks embedded in the page. It tells AI: 'this business is an Organization named X, located at Y, offering Services Z'. Without it, AI has to guess at what you do. With it, AI can cite you confidently by name.
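For example, a minimal JSON-LD block for a hypothetical local business (every name and value here is a placeholder) sits in your page <head> like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Plumbing Co.",
  "url": "https://www.example.com",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Springfield",
    "addressRegion": "IL"
  },
  "description": "Plumbing repairs and installation in Springfield."
}
</script>
```

More specific types (LocalBusiness, Product, Service) tell AI even more, but even this basic block replaces guesswork with facts.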

Do I need to re-run the scan after I make changes?

Yes — the scan is a point-in-time snapshot. After you implement fixes, run a fresh scan to confirm your score improved and to catch anything the first scan couldn't tell you (e.g., whether your newly added schema validates correctly).
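One sanity check you can run yourself in the meantime — a rough sketch only, since dedicated validators do far more — is confirming that each JSON-LD block in your HTML actually parses as JSON (the sample HTML and function name below are hypothetical):

```python
import json
import re

# Hypothetical page snippet containing one JSON-LD block
HTML = """<head>
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Organization", "name": "Example Co."}
</script>
</head>"""

def jsonld_blocks(html: str) -> list[dict]:
    """Extract every JSON-LD script block and parse it; raises if any is malformed."""
    pattern = r'<script type="application/ld\+json">(.*?)</script>'
    return [json.loads(block) for block in re.findall(pattern, html, re.DOTALL)]

blocks = jsonld_blocks(HTML)
print(blocks[0]["@type"])  # Organization
```

A single missing comma or stray quote makes the whole block invisible to machines, which is exactly the kind of thing a fresh scan catches.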

Can Kesem Marketing implement the fixes for me?

Yes. If you'd rather hand off the report than work through it yourself, Kesem Marketing can do the implementation. Every report ends with a link to book a call — no pressure, we'll just see whether you're a fit.

Ready to check your own site?

Free scan, no signup, results in seconds.

Run a free scan →