Your documentation is lying to you right now.
That README last updated 18 months ago? Lies. The architecture diagram from the original sprint? Fantasy. The API docs that say "coming soon"? They're not coming.
AI can fix this — but not the way most people think.
Why Documentation Dies
Documentation has a fatal flaw: it's disconnected from code.
# UserService API
## createUser(data)
Creates a new user with the provided data.
**Parameters:**
- data: UserData object with name, email, password
The moment someone adds a role parameter and forgets to update the docs, this becomes a lie. And they will forget. Always.
The problem isn't lazy developers. It's that documentation lives in a different place than the code it describes.
AI Documentation: The Wrong Way
The tempting approach:
Prompt: "Document this function"
def process_payment(order_id, amount, currency, payment_method,
                    retry_count=3, idempotency_key=None):
    ...
AI output:
def process_payment(order_id, amount, currency, payment_method,
                    retry_count=3, idempotency_key=None):
    """
    Process a payment for an order.

    Args:
        order_id: The unique identifier of the order
        amount: The payment amount
        currency: The currency code (e.g., USD)
        payment_method: The payment method to use
        retry_count: Number of retry attempts (default: 3)
        idempotency_key: Optional key for idempotent requests

    Returns:
        PaymentResult object with transaction details
    """
This is worse than no documentation. Why?
- It restates the obvious (anyone can see amount is an amount)
- It doesn't explain why or when to use this
- It doesn't mention error handling, side effects, or gotchas
- It will go stale just like manually written docs
AI Documentation: The Right Way
Good documentation answers questions code can't:
- When should I call this?
- What happens if it fails?
- What else gets affected?
- Why does this exist?
AI can help, but it needs context.
Pattern 1: Context-Aware Documentation
Instead of "document this function," provide context:
This function is called when:
- User clicks "Pay Now" on checkout
- Subscription auto-renews
- Admin processes refund reversal
Document focusing on: error handling, retry behavior,
and what happens to order state on failure.
Better AI output:
def process_payment(order_id, amount, currency, payment_method,
                    retry_count=3, idempotency_key=None):
    """
    Process payment through Stripe, updating order state on success/failure.

    Called from: CheckoutController, SubscriptionService, AdminRefundHandler

    IMPORTANT: This function modifies order state. On failure after retries,
    order moves to 'payment_failed' state and user receives notification.

    Retry behavior:
    - Retries on network errors and rate limits
    - Does NOT retry on card declined or invalid payment method
    - Uses exponential backoff: 1s, 2s, 4s

    Idempotency:
    - Always provide idempotency_key for user-initiated payments
    - Subscription renewals generate key from subscription_id + period

    Side effects:
    - Updates Order.status and Order.payment_id
    - Sends payment confirmation email on success
    - Triggers inventory reservation on success
    - Logs to audit trail regardless of outcome
    """
This documentation is useful because it explains behavior, not just parameters.
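The deterministic renewal key the docstring mentions can be derived with a hash; a minimal sketch, assuming `subscription_id` and a billing-period string as the inputs (the function name is hypothetical):

```python
import hashlib

def idempotency_key_for_renewal(subscription_id: str, period: str) -> str:
    """Derive a stable idempotency key so retries of the same renewal
    map to the same payment request (sketch; inputs are assumptions)."""
    raw = f"renewal:{subscription_id}:{period}"
    return hashlib.sha256(raw.encode()).hexdigest()

# Same inputs always produce the same key, so a retried renewal
# cannot double-charge; a new billing period yields a new key.
key = idempotency_key_for_renewal("sub_123", "2024-06")
```

Because the key is a pure function of the subscription and period, no extra storage is needed to make retries safe.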
Pattern 2: Generated from Code Graph
The best documentation comes from actual code analysis:
# Our system generates this automatically:
"""
Function: process_payment
File: src/services/payment.py:145
Called by (3 locations):
- CheckoutController.complete_order (src/controllers/checkout.py:89)
- SubscriptionService.renew (src/services/subscription.py:234)
- AdminController.reverse_refund (src/controllers/admin.py:567)
Calls (5 functions):
- StripeClient.charge
- OrderRepository.update_status
- EmailService.send_payment_confirmation
- InventoryService.reserve
- AuditLog.record
Database tables affected:
- orders (status, payment_id, updated_at)
- audit_logs (insert)
- inventory_reservations (insert on success)
"""
This can't go stale because it's derived from the actual call graph.
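A toy version of this extraction can be built with Python's standard `ast` module. This sketch only collects direct call names inside one function, unlike a real cross-file analyzer that resolves imports and methods:

```python
import ast

def function_calls(source: str, func_name: str) -> list[str]:
    """Return names called inside `func_name` (direct calls only; sketch)."""
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef) and node.name == func_name:
            calls = []
            for sub in ast.walk(node):
                if isinstance(sub, ast.Call):
                    if isinstance(sub.func, ast.Name):       # charge(...)
                        calls.append(sub.func.id)
                    elif isinstance(sub.func, ast.Attribute): # repo.update_status(...)
                        calls.append(sub.func.attr)
            return calls
    return []

src = """
def process_payment(order_id):
    charge(order_id)
    repo.update_status(order_id)
"""
print(function_calls(src, "process_payment"))  # ['charge', 'update_status']
```

Run over a whole repository and inverted, the same data gives the "called by" side of the report.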
Pattern 3: Living Architecture Docs
Instead of maintaining architecture diagrams manually:
Auto-generated from code analysis:
Payment Feature (23 files, 89 symbols)
├── Entry Points
│   ├── POST /api/checkout/pay
│   ├── POST /api/subscriptions/:id/renew
│   └── POST /api/admin/refunds/:id/reverse
├── Core Services
│   ├── PaymentService (process, refund, verify)
│   └── StripeClient (charge, refund, webhook)
├── Data Layer
│   ├── OrderRepository
│   └── PaymentRepository
└── Side Effects
    ├── EmailService.send_payment_confirmation
    └── InventoryService.reserve
Generated daily from actual code structure. Always accurate.
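Once the analysis produces a nested structure, rendering the tree is mechanical. A sketch, assuming the analyzer emits nested dicts (the data below is hypothetical):

```python
def render_tree(name, children, prefix=""):
    """Render a nested dict as a box-drawing tree."""
    lines = [name] if prefix == "" else []
    items = list(children.items())
    for i, (label, sub) in enumerate(items):
        last = i == len(items) - 1
        lines.append(prefix + ("└── " if last else "├── ") + label)
        if isinstance(sub, dict) and sub:
            extension = "    " if last else "│   "
            lines.extend(render_tree("", sub, prefix + extension))
    return lines

structure = {
    "Entry Points": {"POST /api/checkout/pay": {}},
    "Core Services": {"PaymentService": {}},
}
print("\n".join(render_tree("Payment Feature", structure)))
```

Regenerating this on a schedule (or in CI) keeps the diagram in lockstep with the code.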
Comment Best Practices with AI
Don't Comment What, Comment Why
Bad (AI default):
# Increment counter
counter += 1
Good (AI with context):
# Track retry attempts for circuit breaker pattern
# After 5 failures, we stop trying for 30 seconds
counter += 1
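The comment gestures at a circuit breaker; here is a minimal sketch of the pattern it describes, using the thresholds named in the comment (5 failures, 30-second pause):

```python
import time

class CircuitBreaker:
    """Stop calling a failing service after 5 failures; retry after 30s (sketch)."""
    def __init__(self, threshold=5, cooldown=30.0):
        self.threshold = threshold
        self.cooldown = cooldown
        self.failures = 0
        self.opened_at = None

    def allow(self) -> bool:
        if self.opened_at is None:
            return True
        if time.monotonic() - self.opened_at >= self.cooldown:
            self.opened_at = None  # half-open: let one probe through
            self.failures = 0
            return True
        return False

    def record_failure(self):
        self.failures += 1  # the counter the comment is tracking
        if self.failures >= self.threshold:
            self.opened_at = time.monotonic()

    def record_success(self):
        self.failures = 0
        self.opened_at = None
```

The point of the original comment is exactly this context: `counter += 1` only makes sense once you know it feeds a breaker like this.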
Don't Comment Obvious Code
# Bad: AI loves to add these
user = get_user(id) # Get user by ID
# Good: Only comment non-obvious behavior
user = get_user(id) # Returns cached user if accessed within 5 minutes
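The "cached within 5 minutes" behavior that makes the second comment worth writing is a TTL cache; a minimal sketch, with the `fetch` callable standing in for the real lookup:

```python
import time

_cache: dict[int, tuple[float, dict]] = {}
TTL = 300  # 5 minutes, per the comment above

def get_user(user_id: int, fetch=lambda uid: {"id": uid}) -> dict:
    """Return the cached user if fetched within the last 5 minutes (sketch)."""
    now = time.monotonic()
    hit = _cache.get(user_id)
    if hit and now - hit[0] < TTL:
        return hit[1]          # cache hit: same object, no lookup
    user = fetch(user_id)
    _cache[user_id] = (now, user)
    return user
```

A caller who doesn't know about the TTL may be surprised by stale data, which is precisely why the comment earns its place.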
Do Comment Business Logic
# Good: Explains business rule, not code
# Orders over $10k require manager approval per SOX compliance
# See: compliance/policies/large-orders.md
if order.total > 10000:
    require_manager_approval(order)
The Glue Approach
We built documentation generation into our platform:
- Auto-generated call graphs — Who calls what, always current
- Feature documentation — What features exist, derived from code clusters
- API documentation — Endpoints, parameters, extracted from actual routes
- Change documentation — What changed in each PR, auto-summarized
// Our MCP tools include:
const docs = await generateFeatureDocumentation(featureId);
const apiDocs = await generateApiDocumentation(workspaceId);
const callGraph = await getSymbolCallGraph(symbolName);
Documentation that can't go stale because it's generated from code reality.
Practical Recommendations
- Don't generate docstrings for everything — Most code is self-explanatory
- Do document non-obvious behavior — Side effects, error handling, business rules
- Generate architecture docs from code — Use tools that analyze actual structure
- Link docs to code — Documentation should reference file:line locations
- Review AI docs for accuracy — AI can hallucinate; verify against actual behavior
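Linking docs to file:line locations (point 4) can be automated; a sketch using Python's `inspect` module to recover a symbol's location:

```python
import inspect

def code_location(obj) -> str:
    """Return 'path/to/file.py:line' for a function or class (sketch)."""
    file = inspect.getsourcefile(obj)
    _, line = inspect.getsourcelines(obj)
    return f"{file}:{line}"

# Example: locate a standard-library function
print(code_location(inspect.signature))
```

Emitting these references at doc-generation time means every claim in the docs points at the code that backs it.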
The Goal
Documentation should answer: "How does this system work?"
AI can help write words. But the real value is AI that understands the system and generates documentation from that understanding.
Words derived from code structure > Words written about code structure.
That's the difference between documentation that lies and documentation that's always true.