AnthropicClient
Classes
AnthropicClient
class AnthropicClient implements LLMClient
Use AnthropicClient to send completion requests to Anthropic's Claude models within skrypt's documentation generation pipeline.
Reach for this when you need to swap in Claude as the AI backend — for example, if you prefer Claude's output quality for technical writing or are already using Anthropic's API elsewhere. It implements the shared LLMClient interface, so it's a drop-in replacement for any other provider client.
AnthropicClient wraps the official Anthropic SDK and handles retries automatically, backing off and retrying failed requests up to maxRetries times before throwing.
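The retry behavior can be sketched as follows. This is an illustrative assumption, not skrypt's actual implementation: the helper name `completeWithRetries`, the 250ms base delay, and the doubling schedule are all hypothetical, with `sendOnce` standing in for the real SDK call.

```typescript
// Hypothetical sketch of a retry loop with exponential backoff:
// retry a failed request up to maxRetries times before rethrowing.
async function completeWithRetries<T>(
  sendOnce: () => Promise<T>,
  maxRetries: number = 3,
  baseDelayMs: number = 250
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await sendOnce();
    } catch (err) {
      lastError = err;
      if (attempt === maxRetries) break;
      // Exponential backoff: 250ms, 500ms, 1000ms, ...
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError;
}

// Demo: fails twice, succeeds on the third attempt.
let calls = 0;
completeWithRetries(async () => {
  calls++;
  if (calls < 3) throw new Error("transient API error");
  return "ok";
}, 3, 1).then((result) => {
  console.log(`Succeeded after ${calls} attempts: ${result}`);
  // Succeeded after 3 attempts: ok
});
```

With `maxRetries: 3`, a request is attempted at most four times (the initial call plus three retries) before the last error propagates to the caller.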
Constructor Parameters
| Name | Type | Required | Description |
|---|---|---|---|
| `config.apiKey` | `string` | Yes | Your Anthropic API key — find it at console.anthropic.com/settings/keys |
| `config.model` | `string` | Yes | The Claude model to use for completions, e.g. `"claude-3-5-sonnet-20241022"` |
| `config.maxRetries` | `number` | No | How many times to retry a failed request before throwing. Defaults to 3 |
Returns
An AnthropicClient instance that satisfies the LLMClient interface. Pass it to skrypt's generator config wherever an LLMClient is expected — the generator calls .complete() on it internally to produce descriptions, parameter docs, and code examples.
Heads up:
- The `provider` property is set to `"anthropic"` and is read by skrypt to tag generated docs with their source model — don't override it.
- If `apiKey` is omitted or empty, the client initializes without throwing, but any completion call will fail. Validate your key before passing it in.
Example:
// Inline types matching the LLMClient interface shape
const LLMProvider = { anthropic: "anthropic" };
// Mock implementation mirroring AnthropicClient's behavior
class AnthropicClient {
provider = "anthropic";
#model;
#maxRetries;
#apiKey;
  constructor(config) {
    // Mirrors the real client: initializes without throwing even if apiKey is missing
    this.#model = config.model;
    this.#maxRetries = config.maxRetries ?? 3;
    this.#apiKey = config.apiKey ?? "";
  }
async complete(request) {
// In production, this calls the Anthropic SDK with this.#model
// Simulating a successful completion response
return {
content: `Documents the \`${request.symbol}\` function: ${request.prompt.slice(0, 60)}...`,
model: this.#model,
usage: { inputTokens: 312, outputTokens: 128 },
};
}
}
// Simulates how skrypt's generator consumes an LLMClient
async function generateDocs(client, symbols) {
const results = [];
for (const symbol of symbols) {
const response = await client.complete({
symbol,
prompt: `Write concise API documentation for the ${symbol} function, including parameters and return value.`,
});
results.push({ symbol, doc: response.content });
}
return results;
}
async function main() {
const client = new AnthropicClient({
apiKey: "sk-ant-api03-xK9mP2nQrL8vW4jY6hT1cB5dF0eA3gN7iU2oR9sM-xxxxxxxxxxx",
model: "claude-3-5-sonnet-20241022",
maxRetries: 3,
});
console.log("Provider:", client.provider); // "anthropic"
try {
const docs = await generateDocs(client, ["createUser", "deleteSession"]);
console.log("Generated docs:", JSON.stringify(docs, null, 2));
// Expected output:
// Provider: anthropic
// Generated docs: [
// { symbol: "createUser", doc: "Documents the `createUser` function: Write concise API documentation..." },
// { symbol: "deleteSession", doc: "Documents the `deleteSession` function: Write concise API documentation..." }
// ]
} catch (err) {
console.error("Completion failed:", err.message);
}
}
main();
Methods
constructor
constructor(config: LLMClientConfig)
Use AnthropicClient to connect skrypt's documentation generator to Anthropic's Claude models for AI-powered doc and code example generation.
Instantiate this when you need to swap in Anthropic as the LLM provider — for example, if you're configuring skrypt programmatically via a config file or building a custom generation pipeline instead of using the CLI's --provider flag.
The constructor initializes the underlying Anthropic SDK client and pins the model and retry behavior for all subsequent completion requests made during a skrypt generate run. If no maxRetries is provided, it defaults to 3.
Parameters
| Name | Type | Required | Description |
|---|---|---|---|
| `config.apiKey` | `string` | Yes | Your Anthropic API key — get one at console.anthropic.com/settings/keys |
| `config.model` | `string` | Yes | The Claude model to use for generation, e.g. `"claude-3-5-sonnet-20241022"` — controls quality, speed, and cost of generated docs |
| `config.maxRetries` | `number` | No | How many times to retry a failed API call before throwing. Defaults to 3 — increase for flaky network environments |
Returns
Returns an AnthropicClient instance ready to issue completion requests. Pass this instance to your skrypt generation config so the scanner uses it when generating descriptions and code examples for each extracted API signature.
Heads up
- The `apiKey` falls back to an empty string if omitted rather than throwing immediately — the error surfaces later when the first completion request fires. Set `ANTHROPIC_API_KEY` in your environment or pass it explicitly to catch misconfigurations early.
- Model availability and context limits vary by Claude version. If you're generating docs for a large codebase, prefer a model with a larger context window (e.g. `claude-3-5-sonnet-20241022`) to avoid truncation mid-generation.
Example:
// Inline types to keep this self-contained
interface LLMClientConfig {
apiKey: string;
model: string;
maxRetries?: number;
}
// Minimal mock of the Anthropic SDK
class Anthropic {
apiKey: string;
constructor(config: { apiKey: string }) {
this.apiKey = config.apiKey;
}
}
// Inline AnthropicClient implementation
class AnthropicClient {
private client: Anthropic;
private model: string;
private maxRetries: number;
private apiKey: string;
constructor(config: LLMClientConfig) {
this.model = config.model;
this.maxRetries = config.maxRetries ?? 3;
this.apiKey = config.apiKey || "";
this.client = new Anthropic({ apiKey: this.apiKey });
}
getInfo() {
return {
model: this.model,
maxRetries: this.maxRetries,
apiKeyPrefix: this.apiKey.slice(0, 12) + "...",
};
}
}
async function main() {
try {
const client = new AnthropicClient({
apiKey: "sk-ant-api03-xK9mP2nQrL8vW4jY",
model: "claude-3-5-sonnet-20241022",
maxRetries: 5,
});
console.log("AnthropicClient initialized:", client.getInfo());
// => AnthropicClient initialized: {
// model: 'claude-3-5-sonnet-20241022',
// maxRetries: 5,
  // apiKeyPrefix: 'sk-ant-api03...'
// }
} catch (error) {
console.error("Failed to initialize AnthropicClient:", error);
}
}
main();
isConfigured
isConfigured(): boolean
Use isConfigured() to verify that an AnthropicClient instance has a valid API key before attempting to generate documentation.
Call this before invoking complete() or triggering any skrypt generate pipeline that depends on Anthropic — it lets you fail fast with a clear error rather than discovering a missing key mid-generation.
The check confirms the underlying API key is neither an empty string nor the "not-set" placeholder that skrypt uses when no key has been provided. It does not make a network call or validate the key against Anthropic's API.
Returns: true if the client holds a usable API key, false otherwise. Use the result to gate your generation logic or surface a helpful configuration error to the user.
Heads up:
- A `true` result means the key is present, not that it's valid — a malformed or revoked key will still pass this check and only fail when `complete()` is called.
- If you're reading the API key from an environment variable, ensure the variable is loaded (e.g. via `dotenv`) before constructing the client, or `isConfigured()` will return `false` even when the key exists.
Example:
const PLACEHOLDER = "not-set";
class AnthropicClient {
private apiKey: string;
private model: string;
constructor(config: { apiKey?: string; model?: string }) {
this.apiKey = config.apiKey ?? PLACEHOLDER;
this.model = config.model ?? "claude-3-5-sonnet-20241022";
}
isConfigured(): boolean {
return this.apiKey !== "" && this.apiKey !== PLACEHOLDER;
}
}
async function generateDocs(sourceDir: string) {
const client = new AnthropicClient({
apiKey: process.env.ANTHROPIC_API_KEY,
model: "claude-3-5-sonnet-20241022",
});
if (!client.isConfigured()) {
throw new Error(
"Anthropic API key is missing. Set ANTHROPIC_API_KEY in your environment " +
"or pass it explicitly to AnthropicClient. " +
"Get your key at console.anthropic.com/settings/keys"
);
}
console.log(`Client ready — scanning ${sourceDir} for documentation...`);
// proceed with generation
}
try {
// Simulate a missing key
delete process.env.ANTHROPIC_API_KEY;
await generateDocs("./src");
} catch (err) {
console.log((err as Error).message);
// Output:
// Anthropic API key is missing. Set ANTHROPIC_API_KEY in your environment
// or pass it explicitly to AnthropicClient.
// Get your key at console.anthropic.com/settings/keys
}
// Now with a real key
process.env.ANTHROPIC_API_KEY = "sk-ant-api03-r8Kx2mNpQ9vLwYjT...";
const configuredClient = new AnthropicClient({ apiKey: process.env.ANTHROPIC_API_KEY });
console.log(configuredClient.isConfigured()); // true
complete
async complete(request: CompletionRequest): Promise<CompletionResponse>
Use complete to send a conversation to Claude and get a generated text response back.
Reach for this when you need to prompt Claude with a series of messages — including an optional system prompt — and receive a completion. It's the core method you'll call on AnthropicClient after constructing it with your API key and model preferences.
The method automatically separates any system-role message from the rest of the conversation before forwarding to the Anthropic API, so you can pass a unified message array without restructuring it yourself. If you don't specify a model on the request, it falls back to the model set during client construction.
Parameters
| Name | Type | Required | Description |
|---|---|---|---|
| `request` | `CompletionRequest` | Yes | The completion request to send to Claude. |
| `request.messages` | `Message[]` | Yes | Ordered conversation history. Include a `{ role: 'system', content: '...' }` entry anywhere in the array to set Claude's behavior — it will be extracted automatically. |
| `request.model` | `string` | No | Claude model to use (e.g. `"claude-3-5-sonnet-20241022"`). Overrides the client-level default when provided. |
Returns
Returns a Promise<CompletionResponse> containing Claude's reply. Use response.content to get the generated text and pass it back into messages as an assistant entry to continue a multi-turn conversation.
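The multi-turn pattern looks like this. A minimal sketch: `StubClient` is a hypothetical stand-in that fakes `complete()` so the flow runs without network access; only the append-the-reply-as-assistant step reflects the documented usage.

```typescript
interface Message {
  role: "system" | "user" | "assistant";
  content: string;
}

// Stub standing in for AnthropicClient: returns a canned reply
// so the multi-turn flow is runnable offline.
class StubClient {
  async complete(request: { messages: Message[] }): Promise<{ content: string }> {
    const users = request.messages.filter((m) => m.role === "user");
    const lastUser = users[users.length - 1];
    return { content: `Echo: ${lastUser?.content}` };
  }
}

async function multiTurn() {
  const client = new StubClient();
  const messages: Message[] = [
    { role: "system", content: "You are a concise assistant." },
    { role: "user", content: "Name one TypeScript feature." },
  ];
  const first = await client.complete({ messages });
  // Feed the reply back as an assistant turn, then ask a follow-up.
  messages.push({ role: "assistant", content: first.content });
  messages.push({ role: "user", content: "Give an example of it." });
  const second = await client.complete({ messages });
  console.log(second.content); // Echo: Give an example of it.
}
multiTurn();
```

Because the full `messages` array is resent on every call, the conversation history (including the system entry) travels with each request rather than being stored server-side.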
Heads up
- Calling `complete` on a client where `isConfigured()` returns `false` (missing or placeholder API key) will throw — check configuration before making requests in production.
- Only the first `system` message is extracted; if you include multiple system-role entries, the extras are treated as conversation turns, which Claude may handle unexpectedly.
Example:
import Anthropic from "@anthropic-ai/sdk";
// Inline types to keep example self-contained
interface Message {
role: "system" | "user" | "assistant";
content: string;
}
interface CompletionRequest {
messages: Message[];
model?: string;
}
interface CompletionResponse {
content: string;
model: string;
usage: { inputTokens: number; outputTokens: number };
}
// Minimal AnthropicClient implementation matching the real one's behavior
class AnthropicClient {
private client: Anthropic;
private model: string;
private apiKey: string;
constructor(config: { apiKey: string; model?: string }) {
this.apiKey = config.apiKey;
this.model = config.model ?? "claude-3-5-sonnet-20241022";
this.client = new Anthropic({ apiKey: this.apiKey });
}
isConfigured(): boolean {
return this.apiKey !== "" && this.apiKey !== "not-set";
}
async complete(request: CompletionRequest): Promise<CompletionResponse> {
const model = request.model || this.model;
const systemMessage = request.messages.find((m) => m.role === "system");
    // Only the first system message is extracted; any extras pass through as turns
    const conversationMessages = request.messages.filter(
      (m) => m !== systemMessage
    );
const response = await this.client.messages.create({
model,
max_tokens: 1024,
system: systemMessage?.content,
messages: conversationMessages.map((m) => ({
role: m.role as "user" | "assistant",
content: m.content,
})),
});
const content =
response.content[0].type === "text" ? response.content[0].text : "";
return {
content,
model: response.model,
usage: {
inputTokens: response.usage.input_tokens,
outputTokens: response.usage.output_tokens,
},
};
}
}
async function main() {
const anthropicClient = new AnthropicClient({
apiKey: process.env.ANTHROPIC_API_KEY ?? "not-set",
model: "claude-3-5-sonnet-20241022",
});
if (!anthropicClient.isConfigured()) {
throw new Error("ANTHROPIC_API_KEY is not set");
}
try {
const response = await anthropicClient.complete({
messages: [
{
role: "system",
content:
"You are a concise technical assistant. Reply in one sentence.",
},
{
role: "user",
content: "What does a TypeScript interface do?",
},
],
});
console.log("Reply:", response.content);
// Reply: A TypeScript interface defines the shape of an object,
// specifying which properties and methods it must have.
console.log("Model:", response.model);
console.log("Tokens used:", response.usage);
} catch (error) {
console.error("Completion failed:", error);
}
}
main();