# verifyCodeExamples
`async function verifyCodeExamples(outputPath: string, allElements: APIElement[], client: LLMClient, primarySourcePath: string, options: VerifyOptions): Promise<void>`
Use `verifyCodeExamples` to automatically test every generated code example in your docs and re-generate any that fail, so your documentation ships with examples that actually run.

Call this after `skrypt generate` completes — it's the quality gate that catches hallucinated imports, wrong method signatures, and runtime errors before your users do. If an example fails execution, it loops back through the LLM to produce a corrected version rather than leaving broken code in your docs.
| Name | Type | Required | Description |
|---|---|---|---|
| `outputPath` | `string` | Yes | Directory where your generated `.mdx` files live — the same path you passed to `-o` during generation. |
| `allElements` | `APIElement[]` | Yes | The full list of scanned API elements from your codebase. Used to give the LLM context when re-generating a failing example. |
| `client` | `LLMClient` | Yes | Configured LLM client (e.g. OpenAI, Anthropic) that handles re-generation requests when an example fails verification. |
| `primarySourcePath` | `string` | Yes | Absolute path to the source directory that was scanned — helps the verifier resolve imports when executing examples. |
| `options` | `VerifyOptions` | Yes | Controls verification behavior: maximum retry attempts per example, which languages to verify, and whether to fail fast on the first broken example. |
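As a concrete illustration, a `VerifyOptions` object with the fields described in the table might look like this (field names follow the table above; treat the exact shape as an assumption that may vary by version):

```javascript
// Hypothetical VerifyOptions object, mirroring the fields in the table above.
const options = {
  maxRetries: 2,             // re-generate a failing example at most twice
  languages: ["typescript"], // only verify TypeScript examples
  failFast: true,            // stop at the first example that can't be fixed
};

console.log(Object.keys(options).join(", ")); // maxRetries, languages, failFast
```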
Returns `Promise<void>`. Resolves when all examples have passed or exhausted their retry budget. Rejects if a fatal error (e.g. LLM client failure, unwritable output directory) prevents verification from completing — individual example failures are handled internally via the retry loop.
Heads up:
- Examples that still fail after exhausting `options.maxRetries` are logged as warnings but do not cause the promise to reject — check stdout for a summary of unresolved failures before publishing.
- Verification actually executes code, so run this in an isolated environment (CI sandbox, Docker) if your source library has side effects like database writes or outbound HTTP calls.
Example:
    const path = require("path");

    // --- Inline type stubs (do not import from autodocs) ---
    /**
     * @typedef {{ name: string, signature: string, docstring: string, filePath: string }} APIElement
     * @typedef {{ chat: (messages: any[]) => Promise<string> }} LLMClient
     * @typedef {{ maxRetries: number, languages: string[], failFast: boolean }} VerifyOptions
     */

    // Mock LLMClient — replace with real OpenAI/Anthropic client in production
    const mockLLMClient = {
      chat: async (messages) => {
        console.log(`[LLM] Re-generating example for failing code...`);
        // Simulate returning a corrected code example
        return "console.log(createUser({ id: 'usr_9k2x', email: 'ada@example.com' }));";
      },
    };

    // Mock verifyCodeExamples — replace with the real function from your autodocs build
    async function verifyCodeExamples(outputPath, allElements, client, primarySourcePath, options) {
      console.log(`Verifying examples in: ${outputPath}`);
      console.log(`Checking ${allElements.length} API element(s) across languages: ${options.languages.join(", ")}`);
      for (const element of allElements) {
        let attempts = 0;
        let passed = false;
        while (attempts <= options.maxRetries && !passed) {
          try {
            // Simulate executing the example (real impl runs the file in a subprocess)
            if (attempts === 0) throw new Error("ReferenceError: createUser is not defined");
            passed = true;
            console.log(`✓ ${element.name} — example verified`);
          } catch (err) {
            attempts++;
            if (attempts > options.maxRetries) {
              console.warn(`⚠ ${element.name} — failed after ${options.maxRetries} retries: ${err.message}`);
              if (options.failFast) throw err;
              break;
            }
            console.log(`✗ ${element.name} attempt ${attempts} failed: ${err.message}. Re-generating...`);
            await client.chat([{ role: "user", content: `Fix this example for ${element.name}` }]);
          }
        }
      }
    }

    // --- Usage ---
    const outputPath = path.resolve("./content/docs");
    const primarySourcePath = path.resolve("../my-project/src");

    /** @type {APIElement[]} */
    const allElements = [
      {
        name: "createUser",
        signature: "function createUser(options: UserOptions): User",
        docstring: "Creates a new user and returns the user object.",
        filePath: "../my-project/src/users.ts",
      },
    ];

    /** @type {VerifyOptions} */
    const verifyOptions = {
      maxRetries: 3,
      languages: ["typescript", "python"],
      failFast: false,
    };

    (async () => {
      try {
        await verifyCodeExamples(
          outputPath,
          allElements,
          mockLLMClient,
          primarySourcePath,
          verifyOptions
        );
        console.log("Verification complete — docs are ready to publish.");
      } catch (err) {
        console.error("Verification encountered a fatal error:", err.message);
        process.exit(1);
      }
    })();

    // Expected output:
    // Verifying examples in: /your/project/content/docs
    // Checking 1 API element(s) across languages: typescript, python
    // ✗ createUser attempt 1 failed: ReferenceError: createUser is not defined. Re-generating...
    // [LLM] Re-generating example for failing code...
    // ✓ createUser — example verified
    // Verification complete — docs are ready to publish.