Programmatic SEO Page Generator
The Problem
Your website has one page: 'plumber in Dallas.' Your competitor ranks for every neighborhood because they have 200 pages. This tool scrapes your website, grabs your services and tone, then generates unique, locally-optimized pages for every service × location combo. Paste them into your CMS and dominate long-tail local search.
How It Works
Input: Your website URL plus manually entered services and target locations (cities, neighborhoods).
Process: AI scrapes your site for services, tone, and company info, then generates a unique, human-sounding SEO page for every service × location combination with local landmarks, neighborhood references, and schema markup.
Output: 50-200 unique, locally-optimized service pages ready to copy-paste into your website. Each page targets a specific keyword like 'drain cleaning in Lakewood' with genuine local details.
PRD
# Product Requirements Document
## Recipe 007 — Programmatic SEO Page Generator
### AI Trades Platform
---
Recipe Slug: `programmatic-seo`
Recipe Number: 007
Difficulty: Replit Build (Remixable)
Time Estimate: 4–6 hours
Category: SEO & Marketing
Revenue Impact: $2,000–8,000/mo (long-tail organic traffic → more calls)
Hours Saved: 10+ hrs/wk (vs. manually writing location pages)
Replaces: SEO agency location page creation ($500–2,000/mo) or DIY (never happens)
Reusable Modules Referenced:
- Module 0: UX Philosophy (`modules/ux-philosophy-module.md`) — foundational design principles
- Module 9: Onboarding Wizard (`modules/onboarding-wizard-module.md`) — email capture + first-run setup
- Module 11: Notification / Toast (`modules/notification-toast-module.md`) — toasts, status messages
Integration Docs (include with build):
- OpenAI / Anthropic (`integrations/openai-anthropic.md`) — LLM for page content generation (via Replit AI Integrations)
Remixable Design:
This recipe is designed to be remixed on Replit. Users fork the project and use Replit's built-in LLM integrations — no API keys required. A PostgreSQL database is used for user tracking with RPC functions for secure access.
---
## Table of Contents
1. Recipe Overview
2. Strategy Brief
3. What Was Removed
4. Onboarding & Email Capture
5. User Tracking
6. Website Scraping
7. Service & Location Input
8. Page Generation Engine
9. Output Display & Copy
10. Anti-AI Writing for SEO Pages
11. AI Trades Branding
12. Mobile-First Design
13. Technical Stack
14. Data Model
15. Testing Scenarios
---
1. Recipe Overview
The Programmatic SEO Page Generator creates hyper-local service pages for home service contractors. A plumber in Dallas doesn't need one page — they need 200 pages targeting every neighborhood-service combination ("drain cleaning in Lakewood," "water heater repair in Uptown," "emergency plumber in Oak Cliff").
The workflow is simple:
1. Contractor enters their website URL → system scrapes for services, tone, and company info
2. Contractor manually enters/confirms their services and service locations
3. System generates unique, human-sounding SEO pages for every service × location combination
4. Pages display on screen with a big "Copy" button → contractor pastes into their CMS
No Airtable. No export. No complexity. Just website in, pages out, copy button.
Key Capabilities:
- Scrape contractor website for business info, services, and tone
- Manual entry/editing of services and locations (cities, neighborhoods, zip codes)
- Generate unique pages per service × location combo using Replit LLM integrations
- Anti-AI writing rules so pages don't read like ChatGPT output
- On-screen display with large copy-to-clipboard function
- Database integration: onboarding email capture + user/usage tracking
- AI Trades branding throughout
- Mobile-friendly (generate on desktop, review/copy on phone)
- Remixable on Replit — no API keys needed (uses Replit AI integrations)
---
2. Strategy Brief
The Core Problem & Solution Logic
The Pain: "I know I need more pages on my website but I can't afford $200 per page from an SEO agency, and when I try to write them myself they all sound the same."
The Diagnosis: Content volume problem + uniqueness problem. Google ranks sites with specific, locally-relevant pages for each service area. A plumber with 8 services across 25 neighborhoods needs 200 unique pages. Writing them manually is impossible. Using AI to batch-generate them produces duplicate-sounding content that Google penalizes.
The Product Fix: A generator that understands the contractor's real services, real service areas, and real voice — then produces pages that are genuinely unique per location by incorporating local landmarks, common local issues, neighborhood-specific language, and varied structure. The contractor's only job: enter services + locations, then copy-paste pages into their website.
The Math That Sells It
```
Services: 8 (drain cleaning, water heater, sewer, leak detection, etc.)
Neighborhoods: 25 (Lakewood, Uptown, Oak Cliff, Deep Ellum, etc.)
Pages needed: 200
SEO agency rate: $150–300 per page
Agency cost: $30,000–60,000
Time to write DIY: 400+ hours (2 hrs/page)
This tool: 20 minutes of setup → 200 pages generated
```
Screen/UI Implications
- Onboarding: email capture → welcome screen
- Setup: paste website URL → scrape → confirm services + locations
- Generator: batch generate pages → display in scrollable list
- Output: each page shown with big copy button, page title, preview
---
3. What Was Removed
This recipe is a simplified version. The following have been intentionally removed:
| Removed Feature | Why |
|---|---|
| Airtable frontend | Unnecessary complexity. Output is copy-to-clipboard, not database storage. |
| Airtable backend | No external data store needed. Generated pages are ephemeral — display and copy. |
| CSV/file export | Over-engineered. Contractors copy one page at a time into their CMS. |
| CMS integration | Too many CMS platforms to support. Copy-paste is universal. |
| User authentication | Email capture only (via database). No login/password flow for V1. |
| Page scheduling | Not needed. Generate all at once, copy as needed. |
What remains is the core value: website → services + locations → unique pages → copy.
---
4. Onboarding & Email Capture
Purpose
Capture the contractor's email before they use the tool. This feeds the AI Trades user database for tracking active users, sending tips, and measuring recipe adoption.
Onboarding Flow
```
Screen 1: Welcome + Email Capture
→ Contractor enters email
→ Database creates/resolves user record
→ Stores email in localStorage for session persistence
Screen 2: Enter Website URL
→ Paste website URL
→ System scrapes for business name, services, tone
→ Shows what was found
Screen 3: Confirm Services & Locations
→ Edit scraped services, add/remove
→ Enter cities/neighborhoods manually
→ Confirm and proceed to generation
```
Screen 1: Welcome + Email Capture
```
┌─────────────────────────────────────────────────────────────┐
│ │
│ [⚡] THE AI TRADES │
│ │
│ ────────────────────────────────────────────────────────── │
│ │
│ SEO Page Generator │
│ │
│ Generate 50-200 unique, locally-optimized pages for │
│ your business in minutes. Not hours. │
│ │
│ Enter your email to get started: │
│ │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ mike@smithplumbing.com │ │
│ └──────────────────────────────────────────────────────┘ │
│ │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ Get Started → │ │
│ └──────────────────────────────────────────────────────┘ │
│ │
│ We'll send occasional tips on ranking higher locally. │
│ No spam. Unsubscribe anytime. │
│ │
└─────────────────────────────────────────────────────────────┘
```
Feature: Email Capture
- Visual: Full-screen centered layout. AI Trades logo at top. Large email input. Primary amber CTA button. Subtle disclaimer below.
- Trigger: Contractor enters email and taps "Get Started."
- Condition: Email must be valid format (basic regex check).
- Result: Database call `resolve_or_create_seo_user(email)` creates or finds the user. Email stored in localStorage. User advances to Screen 2. If the database is unreachable, allow the user to proceed anyway (graceful degradation — tool still works, just no tracking).
Validation Rules
| Field | Type | Required | Validation |
|---|---|---|---|
| Email | Text input | Yes | Valid email format. No disposable email domains (optional). |
Error Handling
| Scenario | Behavior |
|---|---|
| Invalid email format | Inline error: "Enter a valid email address" |
| Database unreachable | Silently proceed. Log error to console. Tool still works. |
| Email already exists | Treat as returning user. Load any saved session data if available. |
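The "valid email format (basic regex check)" rule above can be sketched as a small helper. The name `isValidEmail` and the exact pattern are illustrative assumptions, not part of the recipe; the check is intentionally permissive, since the real gate is whether the address receives mail:

```typescript
// Permissive format check: something@something.tld, no whitespace.
// Deliberately loose — a stricter RFC 5322 regex rejects real addresses.
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

function isValidEmail(email: string): boolean {
  return EMAIL_RE.test(email.trim());
}
```

On failure, show the inline error from the table above and keep the user on Screen 1.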
---
5. User Tracking
Architecture
Uses a tenant registry pattern with RPC functions for secure database access. This ensures the tool works after Replit remixes.
What We Track
| Data Point | Purpose |
|---|---|
| User email | Identify the contractor |
| First seen timestamp | Track when they started using the tool |
| Last active timestamp | Track engagement / retention |
| Total pages generated | Usage metric |
| Services entered | Understanding of user base |
| Locations entered | Understanding of market coverage |
| Recipe slug | Identify which AI Trades tool they're using |
Database Client
```typescript
// server/lib/db.ts
import { db } from './database';

export async function resolveOrCreateUser(email: string) {
  try {
    const result = await db.query('SELECT * FROM resolve_or_create_seo_user($1, $2)', [
      email,
      'programmatic-seo',
    ]);
    return result.rows[0] ?? null;
  } catch (error) {
    console.error('User tracking failed:', error instanceof Error ? error.message : error);
    return null; // Graceful degradation — tool still works
  }
}

export async function trackPageGeneration(email: string, count: number) {
  try {
    await db.query('SELECT * FROM track_seo_generation($1, $2)', [email, count]);
  } catch (error) {
    console.error('Tracking failed:', error instanceof Error ? error.message : error);
  }
}
```
RPC Functions (PostgreSQL)
```sql
-- Function: resolve_or_create_seo_user
-- Creates a new user or returns the existing one. Anon role has execute permission only.
CREATE OR REPLACE FUNCTION resolve_or_create_seo_user(
  p_email TEXT,
  p_recipe_slug TEXT DEFAULT 'programmatic-seo'
)
RETURNS JSON
LANGUAGE plpgsql
SECURITY DEFINER
AS $$
DECLARE
  v_user_id UUID;
  v_is_new BOOLEAN := FALSE;
BEGIN
  SELECT id INTO v_user_id
  FROM seo_users
  WHERE email = lower(trim(p_email));

  IF v_user_id IS NULL THEN
    INSERT INTO seo_users (email, recipe_slug, first_seen_at, last_active_at)
    VALUES (lower(trim(p_email)), p_recipe_slug, NOW(), NOW())
    RETURNING id INTO v_user_id;
    v_is_new := TRUE;
  ELSE
    UPDATE seo_users
    SET last_active_at = NOW()
    WHERE id = v_user_id;
  END IF;

  RETURN json_build_object(
    'user_id', v_user_id,
    'is_new', v_is_new
  );
END;
$$;

-- Grant execute to anon role
GRANT EXECUTE ON FUNCTION resolve_or_create_seo_user TO anon;

-- Function: track_seo_generation
CREATE OR REPLACE FUNCTION track_seo_generation(
  p_email TEXT,
  p_pages_generated INTEGER
)
RETURNS VOID
LANGUAGE plpgsql
SECURITY DEFINER
AS $$
BEGIN
  UPDATE seo_users
  SET
    total_pages_generated = total_pages_generated + p_pages_generated,
    last_active_at = NOW()
  WHERE email = lower(trim(p_email));
END;
$$;

GRANT EXECUTE ON FUNCTION track_seo_generation TO anon;
```
---
6. Website Scraping
Purpose
Extract business name, services, tone, and any existing location/service area info from the contractor's website. This pre-fills the service and location inputs so the contractor doesn't start from zero.
Scraping Flow
```
1. Contractor pastes URL (auto-prepend https:// if missing)
2. Server-side fetch of homepage HTML
3. Parse navigation to discover Services, About, Service Area pages
4. Fetch discovered sub-pages (max 5)
5. Strip HTML: remove nav, header, footer, scripts, styles (Cheerio)
6. Send cleaned text to LLM for structured extraction
7. Display results for contractor confirmation
```
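Step 3 of the flow (discovering Services, About, and Service Area sub-pages from the homepage navigation) can be sketched as a pure helper. It assumes the hrefs have already been extracted from the nav (e.g. with Cheerio); `discoverSubPages` and the keyword pattern are illustrative names, not part of the recipe:

```typescript
// Hrefs whose path suggests a services/about/service-area page are worth fetching.
const INTERESTING = /service|about|area|location/i;

// Resolve relative hrefs against the site URL and cap at the max sub-page count.
function discoverSubPages(baseUrl: string, hrefs: string[], max = 5): string[] {
  const out = new Set<string>();
  for (const href of hrefs) {
    if (!INTERESTING.test(href)) continue;
    try {
      out.add(new URL(href, baseUrl).toString());
    } catch {
      // Ignore malformed hrefs (e.g. "javascript:void(0)").
    }
    if (out.size >= max) break;
  }
  return [...out];
}
```

The discovered URLs then go through the same fetch → strip → extract pipeline as the homepage.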
LLM Extraction Prompt
```
Analyze this website text from a home service contractor's website.
Website text:
{{CLEANED_WEBSITE_TEXT}}
Extract the following into JSON:
{
"business_name": "string — the company name",
"trade_type": "string — plumbing, HVAC, electrical, roofing, etc.",
"services": [
{
"name": "string — service name (e.g., 'Drain Cleaning')",
"description": "string — brief description if available"
}
],
"locations_mentioned": {
"cities": ["string — any cities mentioned as served"],
"neighborhoods": ["string — any neighborhoods mentioned"],
"state": "string — state abbreviation"
},
"tone_signals": {
"formality": "number 1-10 — how formal is their copy",
"key_phrases": ["string — phrases they use frequently"],
"personality": "string — one sentence describing their communication style"
},
"tagline": "string — if they have a tagline or hero text"
}
If you can't determine a field, use null. Do not guess or fabricate information.
```
Scraping UI
```
┌─────────────────────────────────────────────────────────────┐
│ [⚡] THE AI TRADES │
│ ────────────────────────────────────────────────────────── │
│ │
│ Step 1 of 2 — Tell us about your business │
│ │
│ What's your website? │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ smithplumbing.com │ │
│ └──────────────────────────────────────────────────────┘ │
│ │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ 🔍 Read My Website │ │
│ └──────────────────────────────────────────────────────┘ │
│ │
│ [ I don't have a website — skip to manual entry → ] │
│ │
└─────────────────────────────────────────────────────────────┘
```
Loading State:
```
┌─────────────────────────────────────────────────────────────┐
│ │
│ 🔍 Reading your website... │
│ │
│ ✅ Found homepage │
│ ✅ Found services page — 6 services │
│ 🔄 Checking for service area info... │
│ │
│ About 15 seconds. │
│ │
└─────────────────────────────────────────────────────────────┘
```
Scrape Failure:
```
┌─────────────────────────────────────────────────────────────┐
│ │
│ ⚠️ Couldn't read your website │
│ │
│ No worries — you can enter everything manually. │
│ It takes about 2 minutes. │
│ │
│ [ Enter Manually → ] │
│ │
└─────────────────────────────────────────────────────────────┘
```
---
7. Service & Location Input
Purpose
The contractor confirms/edits the scraped services and manually enters their target locations. This is the critical input step — the quality of the generated pages depends on accurate services and specific locations.
Service Input
Feature: Service List Editor
- Visual: Editable list of services as pill/chip components. Each has an ✕ to remove. "+ Add Service" button at the bottom. Pre-filled from website scraping if available.
- Trigger: Contractor adds, removes, or edits services.
- Condition: Minimum 1 service required to proceed.
- Result: Services stored in component state. Used as input for page generation.
Pre-populated trade-specific suggestions (shown if no scraping):
```
Plumbing: Drain Cleaning, Water Heater Repair, Water Heater Installation,
Sewer Line Repair, Leak Detection, Garbage Disposal, Faucet Repair,
Toilet Repair, Repiping, Gas Line Repair, Sump Pump, Water Softener
HVAC: AC Repair, AC Installation, Heating Repair, Furnace Installation,
Duct Cleaning, Thermostat Installation, Heat Pump Repair,
Mini Split Installation, Indoor Air Quality, AC Maintenance
Electrical: Panel Upgrade, Outlet Installation, Ceiling Fan Install,
Lighting Installation, Generator Installation, EV Charger Install,
Whole House Rewire, Smoke Detector Install, Surge Protection
Roofing: Roof Repair, Roof Replacement, Roof Inspection, Gutter Installation,
Gutter Cleaning, Skylight Repair, Chimney Repair, Emergency Tarping
```
Location Input
Feature: Location Entry
- Visual: Large text area with one location per line. Helper text: "Enter each city, neighborhood, or area on its own line." Counter shows total locations entered.
- Trigger: Contractor types or pastes locations.
- Condition: Minimum 1 location required to proceed.
- Result: Locations parsed (split by newline, trimmed, deduplicated). Used as input for page generation.
Alternative: Comma-separated input with auto-parsing into individual location chips.
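The parsing rule above (split by newline, trimmed, deduplicated) can be sketched as one helper that also covers the comma-separated alternative. `parseLocations` is an illustrative name; the case-insensitive dedupe is an assumption, not stated in the recipe:

```typescript
// Turn the raw textarea value into a clean location list:
// split on newlines or commas, trim, drop empties, dedupe case-insensitively
// while preserving the contractor's original casing and order.
function parseLocations(raw: string): string[] {
  const seen = new Set<string>();
  const out: string[] = [];
  for (const part of raw.split(/[\r\n,]+/)) {
    const loc = part.trim();
    const key = loc.toLowerCase();
    if (loc && !seen.has(key)) {
      seen.add(key);
      out.push(loc);
    }
  }
  return out;
}
```

The returned array's length drives the "7 locations entered" counter in the UI.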
Confirmation UI
```
┌─────────────────────────────────────────────────────────────┐
│ [⚡] THE AI TRADES │
│ ────────────────────────────────────────────────────────── │
│ │
│ Step 2 of 2 — Your services & locations │
│ │
│ YOUR SERVICES │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ [Drain Cleaning ✕] [Water Heater Repair ✕] │ │
│ │ [Sewer Line Repair ✕] [Leak Detection ✕] │ │
│ │ [Garbage Disposal ✕] [Faucet Repair ✕] │ │
│ │ [Toilet Repair ✕] [Repiping ✕] │ │
│ │ │ │
│ │ + Add Service │ │
│ └──────────────────────────────────────────────────────┘ │
│ │
│ YOUR LOCATIONS │
│ (Enter each city, neighborhood, or area on its own line) │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ Mesa │ │
│ │ Gilbert │ │
│ │ Chandler │ │
│ │ Tempe │ │
│ │ Scottsdale │ │
│ │ Apache Junction │ │
│ │ Queen Creek │ │
│ │ │ │
│ └──────────────────────────────────────────────────────┘ │
│ 7 locations entered │
│ │
│ STATE (for context in generated pages) │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ [ Arizona ▾ ] │ │
│ └──────────────────────────────────────────────────────┘ │
│ │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ 📄 8 services × 7 locations = 56 pages │ │
│ │ Estimated generation time: ~3 minutes │ │
│ └──────────────────────────────────────────────────────┘ │
│ │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ Generate All Pages → │ │
│ └──────────────────────────────────────────────────────┘ │
│ │
└─────────────────────────────────────────────────────────────┘
```
Page Count Preview
Before generation, show the math:
```
{{SERVICE_COUNT}} services × {{LOCATION_COUNT}} locations = {{TOTAL_PAGES}} pages
Estimated generation time: ~{{ESTIMATED_MINUTES}} minutes
```
Estimation: ~3 seconds per page × total pages ÷ concurrency limit.
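That estimation formula, using the ~3 s/page and concurrency limit of 3 stated elsewhere in this PRD, can be sketched as:

```typescript
// Page-count preview shown before generation.
// Defaults come from the recipe's stated numbers (~3 s/page, 3 parallel calls).
function estimateGeneration(
  serviceCount: number,
  locationCount: number,
  secondsPerPage = 3,
  concurrency = 3,
): { totalPages: number; minutes: number } {
  const totalPages = serviceCount * locationCount;
  // Total wall-clock seconds divided by concurrency, rounded up to whole minutes.
  const minutes = Math.ceil((totalPages * secondsPerPage) / concurrency / 60);
  return { totalPages, minutes };
}
```

For the 8 × 7 example above this yields 56 pages; the template strings `{{TOTAL_PAGES}}` and `{{ESTIMATED_MINUTES}}` would be filled from the returned object.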
---
8. Page Generation Engine
Generation Strategy
Each page must be genuinely unique. This is critical for SEO — Google penalizes near-duplicate content. The system achieves uniqueness through:
1. Location-specific details — Reference local landmarks, neighborhoods, common local issues
2. Service-specific depth — Each service page focuses on that one service, not a catalog
3. Varied structure — Randomize section order, heading patterns, and paragraph structure
4. Natural language variation — Different openings, transitions, and CTAs across pages
5. Local problem framing — Each page opens with a locally-relevant problem, not a generic service description
LLM Prompt for Page Generation
```
You are writing a local SEO service page for a {{TRADE_TYPE}} contractor.
Business: {{BUSINESS_NAME}}
Service: {{SERVICE_NAME}}
Location: {{LOCATION_NAME}}, {{STATE}}
Tone: {{TONE_DESCRIPTION}} (formality {{FORMALITY_LEVEL}}/10)
Key phrases they use: {{KEY_PHRASES}}
{{ANTI_AI_SEO_RULES}}
Write a complete service page targeting the keyword "{{SERVICE_NAME}} in {{LOCATION_NAME}}".
The page must include:
1. H1 title — Include service + location naturally. Do NOT use the format
"Service Name in City Name" as the only pattern. Vary it.
Good: "Drain Cleaning in Lakewood, Dallas — Same-Day Service"
Good: "Need a Plumber in Oak Cliff? We Handle Drain Emergencies."
Bad: "Drain Cleaning Services in Dallas, TX" (generic, overused)
2. Opening paragraph (2-3 sentences) — Lead with a LOCAL problem, not a service
description. Reference something specific to {{LOCATION_NAME}}: older homes,
common plumbing issues in that area, weather patterns, infrastructure age.
3. What we do section (3-4 paragraphs) — Describe the specific service, not
a list of all services. Include what the process looks like, how long it takes,
what the homeowner should expect.
4. Why choose us section (2-3 short paragraphs or bullet points) — Specific
differentiators. Not "we have great customer service" — actual things like
"same-day service," "we own our own camera equipment," "licensed since 2008."
5. Service area callout — Mention {{LOCATION_NAME}} and 2-3 nearby areas
naturally in the text. Not a list — woven into sentences.
6. CTA — Natural call to action with phone number placeholder.
7. Schema markup suggestion — At the end, provide a JSON-LD LocalBusiness
schema snippet for the contractor to paste into their page head.
Output format:
- Use markdown headings (H1, H2, H3)
- Keep total length 400-700 words
- Include the target keyword "{{SERVICE_NAME}} in {{LOCATION_NAME}}" 2-3 times
naturally (not stuffed)
CRITICAL: This page must be DIFFERENT from every other page we generate.
Vary your opening, structure, and angle. If you've written about pipe age
for another location, write about water quality for this one. If you opened
with a question for the last page, open with a statement for this one.
```
Generation Execution
```typescript
interface GenerationJob {
  service: string;
  location: string;
  state: string;
  businessName: string;
  tradeType: string;
  tone: {
    formality: number;
    keyPhrases: string[];
    personality: string;
  };
}

interface GeneratedPage {
  service: string;
  location: string;
  title: string;         // Extracted H1
  content: string;       // Full markdown content
  targetKeyword: string; // "{service} in {location}"
  wordCount: number;
  generatedAt: string;
}
```
Concurrency & Rate Limiting
```
Concurrency limit: 3 parallel LLM calls
Delay between batches: 500ms
Progress tracking: "Generating page 12 of 56..."
Estimated time: ~3 seconds per page
If generation fails for a single page:
→ Retry once after 2-second delay
→ If retry fails: mark as failed, continue with remaining pages
→ Show failed pages at end with "Retry Failed" button
```
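The rules above (batches of 3, 500 ms between batches, one retry per failed page) can be sketched as a generic runner. `runWithConcurrency` is an illustrative name; the worker parameter stands in for the per-page LLM call:

```typescript
// Run jobs in fixed-size batches; retry each failure once, then record it
// so the UI can offer a "Retry Failed" button at the end.
async function runWithConcurrency<T, R>(
  items: T[],
  worker: (item: T) => Promise<R>,
  limit = 3,          // parallel LLM calls per the spec
  batchDelayMs = 500, // delay between batches per the spec
): Promise<{ results: R[]; failed: T[] }> {
  const results: R[] = [];
  const failed: T[] = [];
  for (let i = 0; i < items.length; i += limit) {
    const batch = items.slice(i, i + limit);
    const settled = await Promise.allSettled(batch.map(worker));
    for (let j = 0; j < settled.length; j++) {
      const s = settled[j];
      if (s.status === 'fulfilled') {
        results.push(s.value);
      } else {
        // Retry once after a 2-second delay; give up on a second failure.
        try {
          await new Promise((r) => setTimeout(r, 2000));
          results.push(await worker(batch[j]));
        } catch {
          failed.push(batch[j]);
        }
      }
    }
    if (i + limit < items.length) {
      await new Promise((r) => setTimeout(r, batchDelayMs));
    }
  }
  return { results, failed };
}
```

Progress ("Generating page 12 of 56...") would be emitted from inside the loop; it is omitted here to keep the sketch small.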
Progress UI During Generation
```
┌─────────────────────────────────────────────────────────────┐
│ [⚡] THE AI TRADES │
│ ────────────────────────────────────────────────────────── │
│ │
│ Generating your pages... │
│ │
│ ████████████████████░░░░░░░░░░░░░░ 34 of 56 │
│ │
│ Currently writing: │
│ "Water Heater Repair in Chandler, AZ" │
│ │
│ ✅ Completed: 33 pages │
│ ❌ Failed: 1 page (will retry) │
│ │
│ Estimated time remaining: ~1 minute │
│ │
│ [ Cancel Generation ] │
│ │
└─────────────────────────────────────────────────────────────┘
```
---
9. Output Display & Copy
Core Design Principle
The copy button is the product. Everything else serves it. The button must be large, obvious, and satisfying to tap. Feedback must be instant and clear.
Page List View
After generation completes, all pages are displayed in a scrollable list. Each page is a collapsible card.
```
┌─────────────────────────────────────────────────────────────┐
│ [⚡] THE AI TRADES │
│ ────────────────────────────────────────────────────────── │
│ │
│ ✅ 55 of 56 pages generated │
│ │
│ FILTER: [All ▾] [By Service ▾] [By Location ▾] │
│ 🔍 Search pages... │
│ │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ Drain Cleaning in Mesa, AZ │ │
│ │ ★ 523 words • drain-cleaning-mesa-az │ │
│ │ │ │
│ │ ┌──────────────────────────────────────────────┐ │ │
│ │ │ 📋 COPY PAGE CONTENT │ │ │
│ │ └──────────────────────────────────────────────┘ │ │
│ │ │ │
│ │ [ ▼ Preview ] │ │
│ └──────────────────────────────────────────────────────┘ │
│ │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ Drain Cleaning in Gilbert, AZ │ │
│ │ ★ 487 words • drain-cleaning-gilbert-az │ │
│ │ │ │
│ │ ┌──────────────────────────────────────────────┐ │ │
│ │ │ 📋 COPY PAGE CONTENT │ │ │
│ │ └──────────────────────────────────────────────┘ │ │
│ │ │ │
│ │ [ ▼ Preview ] │ │
│ └──────────────────────────────────────────────────────┘ │
│ │
│ ... (scrollable list continues) │
│ │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ 📋 Copy ALL Pages (as one document) │ │
│ └──────────────────────────────────────────────────────┘ │
│ │
└─────────────────────────────────────────────────────────────┘
```
Copy Button Behavior
Feature: Copy Page Content
- Visual: Large amber button spanning 80%+ of card width. Text: "📋 COPY PAGE CONTENT". Min height 48px. Font weight 600.
- Trigger: Contractor taps the copy button.
- Condition: Page content exists and is not empty.
- Result: Full page content (markdown) copied to clipboard. Button text changes to "✅ Copied!" for 2 seconds with green background. Toast notification: "Page copied to clipboard." Counter increments: "12 of 56 pages copied." Database tracking: increment total_pages_generated.
Feature: Copy All Pages
- Visual: Full-width amber button at bottom of page list. "📋 Copy ALL Pages (as one document)."
- Trigger: Contractor taps button.
- Condition: At least 1 page generated.
- Result: All pages concatenated with separator (`---\n\n`) between each. Copied to clipboard. Toast: "All 56 pages copied to clipboard." Warning on very large content: "That's a lot of text — your CMS might need pages added one at a time."
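The copy behavior above can be sketched as two helpers: one pure function that concatenates pages with the `---` separator, and a clipboard wrapper with the `execCommand` fallback mentioned in the Technical Stack. Names are illustrative; the `any` casts keep the browser-only sketch compiling outside a DOM environment:

```typescript
// Join all generated pages into one document, '---' separator between each.
function concatenatePages(pages: { content: string }[]): string {
  return pages.map((p) => p.content).join('\n\n---\n\n');
}

// Browser-side copy: async Clipboard API first, hidden-textarea fallback second.
async function copyToClipboard(text: string): Promise<boolean> {
  const nav: any = (globalThis as any).navigator;
  const doc: any = (globalThis as any).document;
  try {
    await nav.clipboard.writeText(text);
    return true;
  } catch {
    // Fallback for denied clipboard permission or older browsers.
    const ta = doc.createElement('textarea');
    ta.value = text;
    doc.body.appendChild(ta);
    ta.select();
    const ok = doc.execCommand('copy');
    ta.remove();
    return ok;
  }
}
```

On success, the caller flips the button to "✅ Copied!" and fires the toast.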
Page Preview (Expanded)
```
┌──────────────────────────────────────────────────────────┐
│ Drain Cleaning in Mesa, AZ [ ▲ Close ] │
│ ★ 523 words • Suggested slug: drain-cleaning-mesa-az │
│ ──────────────────────────────────────────────────────── │
│ │
│ # Need a Drain Cleared in Mesa? We'll Be There Today. │
│ │
│ If you live in one of the older homes near downtown Mesa │
│ — especially around the Evergreen Historic District — │
│ you've probably dealt with slow drains more than once. │
│ Those original clay pipes from the 60s don't take kindly │
│ to 60 years of hard water buildup... │
│ │
│ ## What We'll Do │
│ │
│ When we show up (same day if you call before 2 PM)... │
│ │
│ (rendered markdown preview continues...) │
│ │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ 📋 COPY PAGE CONTENT │ │
│ └──────────────────────────────────────────────────────┘ │
│ │
│ ┌──────────────────────────────────────────────────────┐ │
│ │ 📋 Copy Schema Markup Only │ │
│ └──────────────────────────────────────────────────────┘ │
│ │
└──────────────────────────────────────────────────────────┘
```
Filtering & Search
| Filter | Options |
|---|---|
| By Service | Dropdown listing all services. Shows only pages for that service. |
| By Location | Dropdown listing all locations. Shows only pages for that location. |
| Search | Full-text search across page titles and content. Debounce 300ms. |
| Status | All / Copied / Not Copied (track which pages the contractor has already copied) |
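The four filters in the table above compose into one predicate. This is a minimal sketch; `PageMeta` and `filterPages` are illustrative names, and the 300 ms debounce would wrap the `query` input upstream:

```typescript
interface PageMeta {
  service: string;
  location: string;
  title: string;
  content: string;
  copied: boolean; // tracks the All / Copied / Not Copied status filter
}

// Apply service, location, copied-status, and full-text filters together.
// Undefined options mean "don't filter on this dimension".
function filterPages(
  pages: PageMeta[],
  opts: { service?: string; location?: string; query?: string; copied?: boolean },
): PageMeta[] {
  const q = opts.query?.toLowerCase();
  return pages.filter(
    (p) =>
      (!opts.service || p.service === opts.service) &&
      (!opts.location || p.location === opts.location) &&
      (opts.copied === undefined || p.copied === opts.copied) &&
      (!q || p.title.toLowerCase().includes(q) || p.content.toLowerCase().includes(q)),
  );
}
```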
---
10. Anti-AI Writing for SEO Pages
SEO-Specific Anti-AI Rules
```
ANTI-AI WRITING RULES FOR SEO PAGES — MANDATORY
These pages will be published on a real business website. They must pass two tests:
1. A human reader should think the business owner wrote it (or a good local copywriter)
2. Google's helpful content system should classify it as "people-first" content
BANNED OPENINGS:
- "Are you looking for [service] in [city]?" — the #1 AI SEO page cliché
- "When it comes to [service]..." — meaningless filler
- "At [Business Name], we understand..." — corporate template
- "Finding a reliable [trade] in [city] can be challenging..." — patronizing
- "Welcome to [Business Name]'s [service] page" — amateur SEO from 2010
BANNED PHRASES:
- "comprehensive solutions"
- "peace of mind"
- "look no further"
- "trusted professionals"
- "state-of-the-art equipment"
- "second to none"
- "your go-to [trade]"
- "whether you need X or Y, we've got you covered"
- "serving the greater [city] area" (be specific about neighborhoods)
- "our team of experienced professionals"
- "we pride ourselves on"
- "no job is too big or too small"
- "customer satisfaction guaranteed"
- "competitive pricing"
BANNED PATTERNS:
- Keyword stuffing: using "[service] in [city]" more than 3 times
- Template structure: every page following identical H1 → intro → services → why us → CTA
- Listicle format: "5 Signs You Need [Service]" — Google has deprioritized these
- FAQ sections with obvious questions: "How much does [service] cost in [city]?"
- Identical CTAs across all pages
- Using "we" to start more than 2 paragraphs
REQUIRED FOR UNIQUENESS:
- Each page MUST have a different opening angle (problem-first, story-first,
seasonal-first, neighborhood-specific-first)
- Each page MUST reference something specific to the location:
- Neighborhood names within the city
- Local landmarks or references
- Common local issues (hard water, old infrastructure, clay soil, etc.)
- Regional weather patterns that affect the service
- Local housing stock characteristics (age, materials, common builders)
- Vary paragraph lengths: mix 1-sentence paragraphs with 3-sentence ones
- Vary heading styles: questions, statements, imperatives
- The contractor's personality/tone should come through
SEO REQUIREMENTS:
- Target keyword in H1, first paragraph, one H2, and naturally 1-2 more times
- H1 must be unique across all generated pages (not just "Service in City")
- Include semantically related terms (LSI keywords) naturally
- Suggest a URL slug: lowercase, hyphenated, no stop words
- Include one LocalBusiness JSON-LD schema snippet at the end
```
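The required "LocalBusiness JSON-LD schema snippet at the end" could be assembled deterministically rather than asking the LLM to write raw JSON. A minimal sketch, assuming a `SchemaInput` shape like the one below (all field names here are illustrative, and the phone number stays a placeholder the contractor replaces):

```typescript
interface SchemaInput {
  businessName: string;
  tradeType: string; // e.g. "Plumber"
  city: string;
  state: string;
  phone: string;     // placeholder until the contractor fills it in
}

// Build the <script type="application/ld+json"> block appended to each page.
function buildLocalBusinessSchema(input: SchemaInput): string {
  const schema = {
    '@context': 'https://schema.org',
    '@type': 'LocalBusiness',
    name: input.businessName,
    description: `${input.tradeType} serving ${input.city}, ${input.state}`,
    telephone: input.phone,
    areaServed: { '@type': 'City', name: input.city },
    address: {
      '@type': 'PostalAddress',
      addressLocality: input.city,
      addressRegion: input.state,
    },
  };
  return `<script type="application/ld+json">\n${JSON.stringify(schema, null, 2)}\n</script>`;
}
```

Generating the schema in code guarantees valid JSON and keeps the anti-AI rules focused on the prose.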
---
11. AI Trades Branding
Brand Implementation
Every screen in the app uses AI Trades branding from `brand.md` and `design-system.md`:
| Element | Specification |
|---|---|
| Logo | Lucide `Zap` icon (20px, `emerald-500`) + "THE AI TRADES" (Inter Bold, 14px, uppercase, `tracking-wide`, `text-slate-900`) |
| Logo placement | Top-left of every screen. Compact. Not oversized. |
| Primary font | Inter (headings + UI) |
| Body font | Source Sans 3 (body text, descriptions) |
| Primary color | Amber `#D97706` (buttons, accents, focus rings) |
| Background | `#FAFAF9` (gray-50) page background, `#FFFFFF` card backgrounds |
| Borders | `1px solid #E7E5E4` (gray-200) |
| Border radius | 6px default |
| Shadows | None on cards (use borders). `shadow-md` on modals/popovers only. |
| Button style | Amber-600 background, white text, 600 weight, 6px radius, 44px min height |
| Icons | Lucide, 20px default, 2px stroke, gray-600 default color |
| Attribution | Footer text: "Built with THE AI TRADES" with Zap icon |
Attribution Footer
Every screen includes:
```
┌─────────────────────────────────────────────────────────────┐
│ [⚡] Built with THE AI TRADES │
└─────────────────────────────────────────────────────────────┘
```
Small, bottom of page, `text-gray-400`, Inter, 12px. Links to AI Trades landing page.
Light Mode Only
Per agency coding standards: NO dark mode. No dark theme, no `prefers-color-scheme`, no `dark:` Tailwind classes. Light mode only.
---
12. Mobile-First Design
Responsive Strategy
| Breakpoint | Layout | Notes |
|---|---|---|
| `< 640px` (Mobile) | Single column. Full-width inputs and cards. Bottom-sticky generate button. | Contractors will set up on desktop but may review/copy on phone. |
| `640px – 1023px` (Tablet) | Single column, wider cards. | |
| `≥ 1024px` (Desktop) | Centered content area (max-width 720px). More generous spacing. | Primary setup experience. |
Mobile Copy Experience
- Copy buttons are full-width on mobile (no hunting for small buttons)
- "Copied!" feedback fills the full button width with green
- Scroll position preserved after copy (user doesn't lose their place)
- Service/location filters in a bottom sheet on mobile (not dropdowns)
Touch Targets
- All buttons: minimum 44px height
- All tappable elements: minimum 44x44px hit area
- Service chips: 36px height minimum, 12px horizontal padding
---
13. Technical Stack
| Component | Technology |
|---|---|
| Frontend | React + TypeScript + Vite + Tailwind CSS |
| UI Components | shadcn/ui |
| Routing | wouter |
| Data Fetching | TanStack React Query v5 |
| Backend | Express.js (TypeScript) |
| AI / LLM | Replit AI Integrations (OpenAI + Anthropic — no API keys needed for remix users) |
| Web Scraping | Cheerio + node-fetch (server-side HTML parsing) |
| User Tracking | PostgreSQL (RPC functions) |
| Clipboard | navigator.clipboard.writeText() with fallback |
| Deployment | Replit |
Replit LLM Integration
Uses Replit's built-in AI integrations so remix users don't need their own API keys:
```typescript
// server/lib/llm.ts
// Uses Replit AI Integrations — available to all remix users
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic();

export async function generatePageContent(prompt: string): Promise<string> {
  const response = await anthropic.messages.create({
    model: 'claude-sonnet-4-20250514',
    max_tokens: 2000,
    messages: [{ role: 'user', content: prompt }],
  });
  return response.content[0].type === 'text' ? response.content[0].text : '';
}
```
### Environment Variables
```
# These are NOT needed for remix users — Replit AI integrations handle LLM access
# Only needed if running outside Replit:
ANTHROPIC_API_KEY= (optional — Replit provides this via integrations)
OPENAI_API_KEY= (optional — Replit provides this via integrations)
```
---
## 14. Data Model
### Migration Script
```sql
-- Table for user tracking
-- This is a SHARED table used by AI Trades to track usage across recipes
CREATE TABLE IF NOT EXISTS seo_users (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  email TEXT NOT NULL UNIQUE,
  recipe_slug TEXT NOT NULL DEFAULT 'programmatic-seo',
  total_pages_generated INTEGER DEFAULT 0,
  services_entered JSONB DEFAULT '[]'::jsonb,
  locations_entered JSONB DEFAULT '[]'::jsonb,
  first_seen_at TIMESTAMPTZ DEFAULT NOW(),
  last_active_at TIMESTAMPTZ DEFAULT NOW()
);

-- Access is only through SECURITY DEFINER RPC functions.
-- RPC: resolve_or_create_seo_user (defined in Section 5)
-- RPC: track_seo_generation (defined in Section 5)

-- Email lookups are already covered by the index backing the UNIQUE constraint,
-- so no separate email index is needed. IF NOT EXISTS keeps the migration
-- idempotent, matching the table definition above.
CREATE INDEX IF NOT EXISTS idx_seo_users_recipe ON seo_users(recipe_slug);
```
### What Is NOT Stored
- Generated page content is not stored in the database. Pages are generated on-the-fly and displayed in the browser. If the user navigates away, they need to regenerate.
- No session persistence beyond email in localStorage.
- No Airtable. No export tables. No saved drafts.
Rationale: Keep it simple. The tool generates pages and the contractor copies them. No data management complexity. If they need to regenerate, they re-enter services/locations (which takes 30 seconds since they already know what to enter).
---
## 15. Testing Scenarios
### Onboarding
| # | Scenario | Expected Result |
|---|---|---|
| T1 | Enter valid email | User record created, proceed to website input |
| T2 | Enter invalid email | Inline error shown, cannot proceed |
| T3 | Database is unreachable | User proceeds anyway, tool works without tracking |
| T4 | Returning user (email exists) | Recognized as returning, proceed normally |
### Website Scraping
| # | Scenario | Expected Result |
|---|---|---|
| T5 | Scrape website with clear services page | Services extracted and pre-populated |
| T6 | Scrape website with no services page | Partial extraction, manual entry prompted |
| T7 | Invalid URL | Friendly error, offer manual entry |
| T8 | Website unreachable | "Couldn't read your site" message, manual entry offered |
| T9 | Skip website (no website) | Manual entry form shown with trade-specific suggestions |
### Service & Location Input
| # | Scenario | Expected Result |
|---|---|---|
| T10 | Add services manually | Services appear as chips/pills |
| T11 | Remove a scraped service | Service removed from list |
| T12 | Enter locations (newline separated) | Each line parsed as a location, count shown |
| T13 | Zero services entered | Cannot proceed, validation error |
| T14 | Zero locations entered | Cannot proceed, validation error |
| T15 | Large input (20 services × 30 locations) | Warning about generation time (~600 pages), proceed allowed |
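The input handling implied by T12, T15, and T32 — newline-split locations, case-insensitive dedupe, and the service × location cross product that determines page count — could look like this sketch (function names are illustrative, not specified by this PRD):

```typescript
// Parse the locations textarea: split on newlines, trim, drop blanks,
// and dedupe case-insensitively while preserving first-entered casing (T12, T32).
export function parseLocations(raw: string): string[] {
  const seen = new Set<string>();
  const out: string[] = [];
  for (const line of raw.split('\n')) {
    const loc = line.trim();
    const key = loc.toLowerCase();
    if (loc && !seen.has(key)) {
      seen.add(key);
      out.push(loc);
    }
  }
  return out;
}

// Every service × location combination becomes one page job;
// the length of this array drives the T15 generation-time warning.
export function buildCombos(services: string[], locations: string[]): Array<[string, string]> {
  return services.flatMap(s => locations.map(l => [s, l] as [string, string]));
}
```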
### Page Generation
| # | Scenario | Expected Result |
|---|---|---|
| T16 | Generate 8 services × 7 locations = 56 pages | All 56 pages generated, progress bar accurate |
| T17 | Single page generation failure | Failed page noted, others continue, retry available |
| T18 | Cancel generation mid-stream | Generation stops, completed pages still available |
| T19 | Generated pages are unique | Spot check: no two pages have identical openings or identical structure |
| T20 | Pages follow anti-AI rules | No banned phrases, no keyword stuffing, varied structure |
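A minimal sketch of the generation loop behavior that T16–T18 describe: per-page failure isolation, cancellation that keeps already-completed pages, and an accurate progress callback. `generatePage` stands in for the real LLM call; all names here are illustrative, not from the PRD.

```typescript
type PageJob = { service: string; location: string };
type PageResult = PageJob & { status: 'done' | 'failed'; content?: string };

export async function generateAll(
  jobs: PageJob[],
  generatePage: (job: PageJob) => Promise<string>,
  opts: {
    isCancelled?: () => boolean;                       // T18: polled before each page
    onProgress?: (done: number, total: number) => void; // T16: drives the progress bar
  } = {},
): Promise<PageResult[]> {
  const results: PageResult[] = [];
  for (const [i, job] of jobs.entries()) {
    if (opts.isCancelled?.()) break; // cancelled mid-stream: completed pages survive
    try {
      const content = await generatePage(job);
      results.push({ ...job, status: 'done', content });
    } catch {
      // T17: one failed page doesn't abort the batch; it stays marked for retry.
      results.push({ ...job, status: 'failed' });
    }
    opts.onProgress?.(i + 1, jobs.length);
  }
  return results;
}
```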
### Copy Functionality
| # | Scenario | Expected Result |
|---|---|---|
| T21 | Copy single page | Content in clipboard, button shows "Copied!", toast shown |
| T22 | Copy all pages | All content concatenated in clipboard, toast shown |
| T23 | Copy schema markup only | Only JSON-LD snippet copied |
| T24 | Copy on mobile | Same behavior, full-width button, haptic if supported |
| T25 | Clipboard API unavailable (older browser) | Fallback: select-all in a textarea for manual copy |
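The clipboard behavior in T21–T25 could be implemented roughly as follows. `copyToClipboard` is a hypothetical helper (not named in this PRD); the textarea + `execCommand('copy')` path is the deprecated-but-widely-supported fallback T25 calls for. The `nav` parameter exists only so the modern path can be exercised outside a browser.

```typescript
export async function copyToClipboard(
  text: string,
  nav: { clipboard?: { writeText(t: string): Promise<void> } } = (globalThis as any).navigator ?? {},
): Promise<boolean> {
  // Modern path: async Clipboard API (requires a secure context).
  if (nav.clipboard?.writeText) {
    try {
      await nav.clipboard.writeText(text);
      return true;
    } catch {
      // Permission denied or unsupported — fall through to the legacy path.
    }
  }
  // Legacy path (T25): hidden textarea + execCommand('copy') for older browsers.
  const doc = (globalThis as any).document;
  if (doc) {
    const ta = doc.createElement('textarea');
    ta.value = text;
    ta.style.position = 'fixed'; // keep the page from scrolling (preserves scroll position)
    doc.body.appendChild(ta);
    ta.select();
    const ok = doc.execCommand('copy');
    doc.body.removeChild(ta);
    return !!ok;
  }
  return false;
}
```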
### Filtering
| # | Scenario | Expected Result |
|---|---|---|
| T26 | Filter by service | Only pages for that service shown |
| T27 | Filter by location | Only pages for that location shown |
| T28 | Search by keyword | Pages matching keyword in title/content shown |
| T29 | Clear filters | All pages shown again |
### Edge Cases
| # | Scenario | Expected Result |
|---|---|---|
| T30 | Very long service name (50+ chars) | Truncated in chip display, full name in page content |
| T31 | Non-English location names | Handled correctly in page content |
| T32 | Duplicate locations entered | Deduplicated automatically |
| T33 | Navigate away and return | Pages are gone, re-generation required (by design) |
| T34 | Browser memory with 500+ pages | Paginate display (show 20 at a time with load more) |
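The pagination rule in T34 can be as simple as slicing the in-memory result list per "load more" click — a sketch with illustrative names, assuming the full page list stays in client memory:

```typescript
// Return only the pages that should currently be rendered: each "load more"
// click increments loadedBatches, revealing the next slice of `pageSize` pages.
export function visibleSlice<T>(all: T[], loadedBatches: number, pageSize = 20): T[] {
  return all.slice(0, Math.max(1, loadedBatches) * pageSize);
}
```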