Last July I was sitting in my truck between two site visits in Markham. I pulled up ChatGPT on my phone and typed "best renovation company Markham." I wanted to see what it said about me. I have run Yellow Pencil for 11 years. We have 340+ completed projects in the GTA. I assumed I would at least be in the list.
ChatGPT named Monarch Renovations first. Then two others. Yellow Pencil was not in the answer at all. I tried four variations of the query. Same result on three of them. On the fourth, we got a one-line mention at the bottom.
That moment is the reason RankingLocal.ai exists. If you have typed your own business into ChatGPT, Perplexity, or Gemini and watched a competitor get named instead, I want to walk you through exactly how I diagnosed it. There are 4 reasons this happens. Each has a different fix. You do not need to panic, and you do not need to hire a $4,000/month agency. You need a checklist.
Reason 1: Your site is harder to crawl than theirs
AI engines pull from web pages. If their crawlers cannot read your pages cleanly, you do not get cited. This is the boring reason, and it is the first thing I check.
The audit takes 15 minutes. Open your homepage and your 3 most important service pages. View the page source (right-click, View Page Source). Search the source for a paragraph of body copy you can see on the rendered page. If the text is not in the raw HTML, your site is rendering that content with JavaScript after the initial page load. Some crawlers handle that. Many do not. GPTBot, PerplexityBot, and ClaudeBot are not as forgiving as Googlebot.
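If you want to run that same check without opening View Source by hand, a few lines of Python will do it. This is a sketch, not a crawler: it fetches the raw HTML exactly as a non-rendering bot would (no JavaScript execution) and checks whether a phrase of visible body copy is actually in the source. The URL and phrase below are placeholders.

```python
import urllib.request

def phrase_in_html(raw_html: str, visible_phrase: str) -> bool:
    """Check whether a phrase you can see on the rendered page
    appears in the raw HTML (case-insensitive)."""
    return visible_phrase.lower() in raw_html.lower()

def fetch_raw_html(url: str) -> str:
    """Fetch the page source without executing JavaScript,
    which is roughly what a non-rendering AI crawler sees."""
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=15) as resp:
        return resp.read().decode("utf-8", errors="replace")

# Usage, against your own site (placeholder URL and phrase):
# raw = fetch_raw_html("https://example.com")
# if not phrase_in_html(raw, "a sentence of body copy you can see on the page"):
#     print("That copy is not in the raw HTML: likely JavaScript-rendered.")
```

Pick a full sentence from the middle of the page, not a heading; headings often survive in empty JavaScript shells while the body copy does not.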
Then check your robots.txt at yoursite.com/robots.txt. I have seen 3 clients this year who were blocking GPTBot and did not know it. A developer added it "for security" two years ago. If you see "User-agent: GPTBot" followed by "Disallow: /" you are invisible to ChatGPT. Delete those lines.
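The robots.txt check can also be scripted with Python's standard-library parser, which applies the same user-agent group matching the crawlers do. A minimal sketch; the bot list below reflects the crawlers named in this article plus Google-Extended, and you can extend it:

```python
from urllib.robotparser import RobotFileParser

AI_BOTS = ["GPTBot", "PerplexityBot", "ClaudeBot", "Google-Extended"]

def blocked_bots(robots_txt: str, path: str = "/") -> list[str]:
    """Return the AI crawlers that this robots.txt blocks from `path`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [bot for bot in AI_BOTS if not parser.can_fetch(bot, path)]

# Usage: paste the contents of yoursite.com/robots.txt into a string
# and call blocked_bots() on it. An empty list means nobody is blocked.
```

If the function returns anything, you have found the "for security" lines your developer added.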
Action plan: render content server-side where possible, unblock the AI crawlers, and make sure your H1 and your first paragraph contain the city name and the service. Monarch's homepage had "Markham renovation contractor" in the first 40 words. Mine did not. That took me 20 minutes to fix.
Reason 2: You have less structured data than they do
Structured data is the JSON-LD block in your page head that tells machines what your page is about. LocalBusiness schema tells AI engines your name, address, phone, service area, and hours. FAQPage schema gives them pre-formatted question-and-answer pairs they can lift directly into a response.
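Here is what a minimal LocalBusiness block looks like, generated with Python so the JSON stays valid. Every value below is an illustrative placeholder; swap in your real name, address, and phone before pasting the output into a script type="application/ld+json" tag in your page head.

```python
import json

# Illustrative placeholder values -- replace with your real details.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Yellow Pencil",
    "telephone": "+1-905-555-0100",  # placeholder number
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Markham",
        "addressRegion": "ON",
        "addressCountry": "CA",
    },
    "areaServed": ["Markham", "Richmond Hill", "Toronto"],
    "openingHours": "Mo-Fr 08:00-18:00",
    "url": "https://example.com",
}

print(json.dumps(local_business, indent=2))
```

Generating the block from a dict instead of hand-editing JSON means a missing comma can never silently invalidate your markup.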
Run your homepage through Google's Rich Results Test. Then run your top competitor's homepage through the same tool. Compare the two. When I did this on Monarch I found they had LocalBusiness, FAQPage with 8 questions, and BreadcrumbList on every service page. I had LocalBusiness only.
The FAQPage schema is the single biggest lever I have found for AI citations. Eight questions, each with a direct 2-sentence answer, each genuinely useful. Not keyword-stuffed garbage. Real questions your customers ask you on the phone.
Action plan: add FAQPage schema to your top 5 pages. Use questions your customers actually ask. "How much does a kitchen renovation cost in Markham?" beats "What are the best renovation services?" every time. The specific question matches the specific AI query.
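The FAQPage block follows the same pattern. The sketch below turns a list of question-and-answer pairs into valid schema; the two Q&A pairs are hypothetical stand-ins, so replace them with the real questions from your phone calls.

```python
import json

# Hypothetical Q&A pairs -- use the questions your customers actually ask.
faqs = [
    ("How much does a kitchen renovation cost in Markham?",
     "Cost depends on scope and finishes, so quotes vary widely. "
     "A site visit gives you a firm number."),
    ("How long does a typical renovation take?",
     "Small projects take weeks and full renovations take months. "
     "We put the schedule in writing before work starts."),
]

faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": q,
            "acceptedAnswer": {"@type": "Answer", "text": a},
        }
        for q, a in faqs
    ],
}

print(json.dumps(faq_schema, indent=2))
```

Keep each answer to the direct 2-sentence format: that is the shape an AI engine can lift straight into a response.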
Reason 3: They have more third-party citations than you do
This is the one that took me 18 months to figure out. Your own website is one signal. What other people say about you is a different, stronger signal. AI engines weight third-party mentions heavily because they are harder to fake than your own marketing copy.
The audit: search "[competitor name] Reddit," "[competitor name] review," and "[competitor name] [city]" on Google. Count what you find. Then do the same for your own business.
When I did this on Monarch I found a 2021 r/Markham thread where someone asked for renovation recommendations. Monarch got 6 upvoted comments naming them. The thread had 40+ comments total and 18k views. I had zero Reddit presence. I had Google reviews, which are great, but Google reviews are gated inside Google's ecosystem. Reddit is open web. AI crawlers love Reddit.
Action plan: find 3 to 5 relevant subreddits. For me it was r/Markham, r/Toronto, r/HomeImprovement, and r/RenovationsCanada. Show up as a human. Answer real questions. Mention your business when it is genuinely the right answer, and disclose that it is yours. Do not spam. One honest comment per week for 3 months will do more than 50 spammy ones. I also got 2 local news mentions through a contractor association press release that cost me $0 and 4 hours of writing.
Reason 4: Your entity is newer or less established
AI engines build a picture of who you are from how long you have been talked about online. If your competitor has a 2014 GoFundMe that mentions their business, a 2016 news article, and a 2019 BBB profile, they have what I call an "entity history." If your earliest mention is your 2022 Google Business Profile, you look new even if you are not.
The audit: Google your business name in quotes with a date filter. Tools menu, Any time, Custom range. Set it to before 2020. What comes up? If the answer is nothing, you have an entity history problem.
Action plan: you cannot fake history, but you can build a timeline. Add a proper About page with founding date, milestones, and years of operation. Add schema with the foundingDate property on your Organization markup. Get listed on industry directories that have been around for a while. Chamber of Commerce, BBB, trade associations. They are unglamorous, but they add dates to your entity graph.
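The foundingDate addition is one property on your Organization markup. A minimal sketch with illustrative values (the date below is a placeholder; use your real founding year):

```python
import json

# Minimal Organization markup with foundingDate -- illustrative values.
organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Yellow Pencil",
    "foundingDate": "2013",  # ISO 8601 date; a year alone is valid
    "url": "https://example.com",
}

print(json.dumps(organization, indent=2))
```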
What happened with Yellow Pencil
I did all 4 audits on my own business over a weekend in August 2024. My site crawled fine. My schema was thin. My third-party presence was almost nothing outside of Google reviews. My entity history was shallow.
The Reddit thread was the biggest single fix. I spent 12 weeks answering questions honestly in r/Markham and r/HomeImprovement. No keyword stuffing. No links. Just useful answers when I had something real to say. By November 2024, searches for "best renovation Markham" in ChatGPT were putting us in the top 3 cited businesses. By February 2025, we were the first-named more than half the time across the 7 engines we track.
18 months of losing to Monarch. 4 months of deliberate work to turn it around.
How to know if it is working
You need to actually measure this. Typing your business into ChatGPT once a week is not a system. Run the same query across ChatGPT, Perplexity, Gemini, Claude, Copilot, Grok, and You.com. Track who gets named, in what order, with what context. Watch the trend over weeks.
That is what we built at /free-tools/ai-visibility/. Free. No credit card. Enter your business and your top 3 competitors, pick 5 queries you care about, and you get a weekly report. It will tell you within 2 weeks whether the fixes you made are moving the needle.
If you have questions about your own diagnostic, email hello@rankinglocal.ai. I read every message myself.