Index
Functions
runQA
function runQA(outputDir: string): QAReport
Use runQA to validate a generated documentation directory and surface issues before publishing — broken links, malformed MDX, missing frontmatter, security problems, and more.
Run this after skrypt generate completes, before deploying or committing your docs. It's the quality gate between raw AI-generated output and production-ready documentation.
runQA scans every doc file in the output directory and runs a suite of checks covering frontmatter, headings, code blocks, MDX syntax, internal links, component usage, content quality, and security. It collects all findings into a single report rather than failing on the first issue, so you can fix everything in one pass.
Parameters
| Name | Type | Required | Description |
|---|---|---|---|
| outputDir | string | Yes | Path to the directory produced by skrypt generate — the same value you passed to -o. All .mdx files found recursively will be checked. |
Returns
Returns a QAReport object containing per-check results (each with its own list of issues), total issue and severity counts, and a pass/fail status. Iterate over the issues arrays inside report.results to display problems to the user, or check report.passed to decide whether to block a CI step.
Heads up
- runQA is synchronous and reads files from disk — make sure skrypt generate has fully written its output before calling this.
- A report with zero issues still returns a QAReport object; check report.passed === true rather than checking for a falsy return value.
Example:
```typescript
import { join } from "path";

// Inline types — do not import from autodocs
interface QAIssue {
  file: string;
  check: string;
  severity: "error" | "warning" | "info";
  message: string;
  line?: number;
}

interface QACheckResult {
  checkName: string;
  passed: boolean;
  issues: QAIssue[];
}

interface QAReport {
  passed: boolean;
  totalIssues: number;
  errorCount: number;
  warningCount: number;
  durationMs: number;
  results: QACheckResult[];
}

// Minimal mock of runQA for a self-contained example
function runQA(outputDir: string): QAReport {
  const start = Date.now();

  // Simulate finding one warning in a generated file
  const mockIssues: QAIssue[] = [
    {
      file: join(outputDir, "api/auth.mdx"),
      check: "frontmatter",
      severity: "warning",
      message: 'Missing "description" field in frontmatter',
      line: 1,
    },
  ];

  const results: QACheckResult[] = [
    { checkName: "frontmatter", passed: false, issues: mockIssues },
    { checkName: "headings", passed: true, issues: [] },
    { checkName: "code-blocks", passed: true, issues: [] },
    { checkName: "links", passed: true, issues: [] },
    { checkName: "security", passed: true, issues: [] },
  ];

  const errorCount = mockIssues.filter((i) => i.severity === "error").length;
  const warningCount = mockIssues.filter((i) => i.severity === "warning").length;

  return {
    passed: errorCount === 0,
    totalIssues: mockIssues.length,
    errorCount,
    warningCount,
    durationMs: Date.now() - start,
    results,
  };
}

// --- Usage ---
const OUTPUT_DIR = "./content/docs";

try {
  const report = runQA(OUTPUT_DIR);

  console.log(`QA ${report.passed ? "PASSED ✓" : "FAILED ✗"}`);
  console.log(
    `Found ${report.totalIssues} issue(s): ${report.errorCount} error(s), ${report.warningCount} warning(s)`
  );
  console.log(`Completed in ${report.durationMs}ms\n`);

  if (!report.passed) {
    for (const result of report.results) {
      for (const issue of result.issues) {
        const loc = issue.line ? `:${issue.line}` : "";
        console.error(`[${issue.severity.toUpperCase()}] ${issue.file}${loc}`);
        console.error(`  ${issue.check}: ${issue.message}\n`);
      }
    }
    process.exit(1); // Block CI on errors
  }
} catch (err) {
  console.error("QA check failed to run:", err);
  process.exit(1);
}

// Expected output:
// QA PASSED ✓
// Found 1 issue(s): 0 error(s), 1 warning(s)
// Completed in 3ms
```
printQAReport
function printQAReport(report: QAReport): void
Use printQAReport to display a formatted QA summary to stdout after running documentation checks.
Call this at the end of a QA pipeline once you've collected results across all scanned files — it gives you a human-readable pass/fail verdict alongside file counts and timing.
It formats the report with status icons (✓/✗), a PASSED/FAILED label, the number of files checked, and how long the run took in milliseconds.
Parameters
| Name | Type | Required | Description |
|---|---|---|---|
| report | QAReport | Yes | The result object from your QA run — must include passed, filesChecked, and duration at minimum. Typically produced by running checks across your docs output directory. |
Returns
Nothing (void). Output goes directly to stdout, so it appears in your terminal or CI logs. Pipe stdout to a file if you need a persistent record.
Heads up
- This is a display-only function — it doesn't throw on failure or set a process exit code. If you need CI to fail on a bad report, check report.passed yourself and call process.exit(1).
Example:
```typescript
// Inline types — do not import from autodocs
interface QAIssue {
  type: string;
  message: string;
  line?: number;
}

interface QACheckResult {
  checkName: string;
  passed: boolean;
  issues: QAIssue[];
}

interface QAReport {
  passed: boolean;
  filesChecked: number;
  duration: number;
  results: QACheckResult[];
  fixesApplied?: number;
}

function printQAReport(report: QAReport): void {
  const statusIcon = report.passed ? "✓" : "✗";
  const statusLabel = report.passed ? "PASSED" : "FAILED";
  console.log(`\n QA ${statusIcon} ${statusLabel}`);
  console.log(` ${report.filesChecked} files checked in ${report.duration}ms`);

  if (!report.passed) {
    const failed = report.results.filter((r) => !r.passed);
    failed.forEach((result) => {
      console.log(`\n ✗ ${result.checkName}`);
      result.issues.forEach((issue) => {
        const location = issue.line ? ` (line ${issue.line})` : "";
        console.log(` - ${issue.message}${location}`);
      });
    });
  }

  if (report.fixesApplied) {
    console.log(`\n ${report.fixesApplied} fixes applied automatically`);
  }
}

// A sample failing report
const report: QAReport = {
  passed: false,
  filesChecked: 42,
  duration: 1183,
  fixesApplied: 3,
  results: [
    {
      checkName: "Frontmatter",
      passed: true,
      issues: [],
    },
    {
      checkName: "Code Blocks",
      passed: false,
      issues: [
        { type: "missing-language", message: "Code block missing language tag", line: 24 },
        { type: "missing-language", message: "Code block missing language tag", line: 61 },
      ],
    },
    {
      checkName: "Security",
      passed: false,
      issues: [
        { type: "exposed-secret", message: "Possible API key exposed in example", line: 8 },
      ],
    },
  ],
};

printQAReport(report);
// Expected output:
//
// QA ✗ FAILED
// 42 files checked in 1183ms
//
// ✗ Code Blocks
// - Code block missing language tag (line 24)
// - Code block missing language tag (line 61)
//
// ✗ Security
// - Possible API key exposed in example (line 8)
//
// 3 fixes applied automatically

// Display-only: set the exit code yourself to block CI
if (!report.passed) {
  process.exit(1);
}
```
fixQAIssues
function fixQAIssues(outputDir: string): FixReport
Use fixQAIssues to automatically repair common documentation problems across your entire output directory in one pass.
After running skrypt generate, your .md and .mdx files may contain fixable issues — malformed frontmatter, unsafe MDX syntax, insecure content patterns, or broken code block formatting. Call fixQAIssues on your output directory before publishing to catch and correct these automatically, without touching anything that requires human judgment.
It scans every .md and .mdx file in the output directory, applies a set of conservative fix functions (frontmatter normalization, MDX syntax repair, code block cleanup, security scrubbing), and writes the corrected content back in place. Only safe, deterministic fixes are applied — ambiguous issues are left for manual review.
Parameters
| Name | Type | Required | Description |
|---|---|---|---|
| outputDir | string | Yes | Path to the directory containing your generated docs. All .md and .mdx files found recursively will be read, fixed, and overwritten. |
Returns
Returns a FixReport object with three fields: filesFixed (number of files that had at least one change), totalFixes (sum of all individual fixes applied), and fixes (an array of fix records describing what changed and where). Use fixes to log a summary for your CI pipeline or to audit what was auto-corrected before a deploy.
Heads up
- This function writes changes directly to disk — run it against your generated output, not your source files.
- Issues that can't be safely auto-corrected (broken links, missing screenshots, content quality warnings) won't appear in fixes — run a QA check separately to surface those.
Example:
```typescript
import { readFileSync, writeFileSync, mkdirSync, readdirSync } from 'fs'
import { join } from 'path'
import { tmpdir } from 'os'
import { randomBytes } from 'crypto'

// --- Inline types (do not import from autodocs) ---
interface FixRecord {
  file: string
  rule: string
  message: string
}

interface FixReport {
  filesFixed: number
  totalFixes: number
  fixes: FixRecord[]
}

// --- Minimal self-contained mock of fixQAIssues ---
function fixQAIssues(outputDir: string): FixReport {
  const report: FixReport = { filesFixed: 0, totalFixes: 0, fixes: [] }

  // Recursively collect every .md / .mdx file under the output directory
  const walk = (dir: string): string[] => {
    const entries = readdirSync(dir, { withFileTypes: true })
    return entries.flatMap((e) =>
      e.isDirectory()
        ? walk(join(dir, e.name))
        : /\.mdx?$/.test(e.name) ? [join(dir, e.name)] : []
    )
  }

  for (const filePath of walk(outputDir)) {
    const original = readFileSync(filePath, 'utf-8')
    let content = original
    const fileFixes: FixRecord[] = []

    // Fix 1: ensure a blank line follows the closing frontmatter delimiter
    if (content.startsWith('---') && !/\n---\n\n/.test(content)) {
      content = content.replace(/\n---\n(?!\n)/, '\n---\n\n')
      fileFixes.push({ file: filePath, rule: 'frontmatter', message: 'Added blank line after frontmatter closing delimiter' })
    }

    // Fix 2: replace bare <br> with <br /> for MDX compatibility
    if (content.includes('<br>')) {
      content = content.replaceAll('<br>', '<br />')
      fileFixes.push({ file: filePath, rule: 'mdx-syntax', message: 'Replaced <br> with <br />' })
    }

    if (content !== original) {
      writeFileSync(filePath, content, 'utf-8')
      report.filesFixed++
      report.totalFixes += fileFixes.length
      report.fixes.push(...fileFixes)
    }
  }

  return report
}

// --- Demo ---
function main() {
  try {
    // Set up a temporary docs directory with two files, one of which has fixable issues
    const outputDir = join(tmpdir(), `skrypt-demo-${randomBytes(4).toString('hex')}`)
    mkdirSync(join(outputDir, 'api'), { recursive: true })
    writeFileSync(
      join(outputDir, 'api', 'authentication.mdx'),
      `---\ntitle: Authentication\n---\n## Overview\nUse <br> to separate sections.\n`
    )
    writeFileSync(
      join(outputDir, 'api', 'rate-limits.md'),
      `---\ntitle: Rate Limits\n---\n\nNo issues here.\n`
    )

    const report = fixQAIssues(outputDir)
    console.log('Fix report:', JSON.stringify(report, null, 2))
    // Expected output:
    // Fix report: {
    //   "filesFixed": 1,
    //   "totalFixes": 2,
    //   "fixes": [
    //     { "file": "...authentication.mdx", "rule": "frontmatter", "message": "Added blank line after frontmatter closing delimiter" },
    //     { "file": "...authentication.mdx", "rule": "mdx-syntax", "message": "Replaced <br> with <br />" }
    //   ]
    // }
  } catch (err) {
    console.error('fixQAIssues failed:', err)
    process.exit(1)
  }
}

main()
```
printFixReport
function printFixReport(report: FixReport): void
Use printFixReport to surface a human-readable summary of auto-fix results to stdout after running skrypt's documentation QA pipeline.
Call this at the end of a fix pass — after skrypt has scanned and corrected MDX files — to give users immediate feedback on what changed. It's the last step in a typical lint-and-fix workflow.
If no fixes were applied (report.totalFixes === 0), the function exits silently, so it's safe to call unconditionally without cluttering output on clean runs.
Parameters
| Name | Type | Required | Description |
|---|---|---|---|
| report | FixReport | Yes | The result object returned from a fix pass. Must include totalFixes (count of individual corrections applied) and filesFixed (count of files that were modified). |
Returns
Nothing. Output goes directly to stdout as a formatted message, e.g. Auto-fixed 3 issues in 2 files. Pair this with a QA report printer to give users a complete picture of both detected and resolved issues.
Heads up
- Produces no output when totalFixes is 0 — this is intentional. If you want a "nothing to fix" message on clean runs, print it yourself; this function simply returns early.
Example:
```typescript
type FixReport = {
  totalFixes: number
  filesFixed: number
}

function printFixReport(report: FixReport): void {
  if (report.totalFixes === 0) {
    return
  }
  console.log(
    ` Auto-fixed ${report.totalFixes} issue${report.totalFixes > 1 ? "s" : ""} in ${report.filesFixed} file${report.filesFixed > 1 ? "s" : ""}`
  )
}

// Simulate the result of a fix pass over a docs directory
const fixReport: FixReport = {
  totalFixes: 5,
  filesFixed: 3,
}

printFixReport(fixReport)
// Output:
// Auto-fixed 5 issues in 3 files

// Silent on a clean run — no output produced
const cleanReport: FixReport = { totalFixes: 0, filesFixed: 0 }
printFixReport(cleanReport)
```