Node.js Integration
AI Guardian works with the openai npm package and requires no additional dependencies. Change apiKey and baseURL; everything else stays the same.
Installation
```bash
npm install openai
```

Basic Usage
```ts
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "aig_YOUR_API_KEY",
  baseURL: "http://localhost:8000/api/v1/proxy",
});

const response = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "What is the capital of France?" },
  ],
});

console.log(response.choices[0].message.content);
```

Error Handling
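When the proxy blocks or queues a request, the handler further down reads a few fields off the error body. The interface and helper here are illustrative, not part of any published SDK; the field names (`code`, `risk_score`, `review_item_id`) are taken from the responses that handler checks — a sketch:

```typescript
// Illustrative shape of an AI Guardian error body (an assumption, not a
// published type): `code` discriminates the outcome, and the remaining
// fields appear only for their respective codes.
interface GuardianErrorBody {
  code: "request_blocked" | "queued_for_review" | string;
  risk_score?: number;
  review_item_id?: string;
}

// Hypothetical helper that turns an error body into a user-facing message,
// mirroring the branches in the handler shown in this section.
function describeGuardianError(body: GuardianErrorBody): string {
  if (body.code === "request_blocked") {
    return `[BLOCKED] Risk score: ${body.risk_score}. Blocked by AI Guardian.`;
  }
  if (body.code === "queued_for_review") {
    return `[QUEUED] Review ID: ${body.review_item_id}. Pending human review.`;
  }
  return "[ERROR] Unrecognized AI Guardian error code.";
}
```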
Catch APIError from the openai package and check the code field on the error body:
```ts
import OpenAI, { APIError } from "openai";

const client = new OpenAI({
  apiKey: process.env.AI_GUARDIAN_API_KEY,
  baseURL: process.env.AI_GUARDIAN_BASE_URL,
});

async function safeComplete(userMessage: string): Promise<string> {
  try {
    const response = await client.chat.completions.create({
      model: "gpt-4o",
      messages: [{ role: "user", content: userMessage }],
    });
    return response.choices[0].message.content ?? "";
  } catch (error) {
    if (error instanceof APIError) {
      const body = error.error as Record<string, unknown>;
      const code = (body?.code as string) ?? "unknown";
      if (code === "request_blocked") {
        return `[BLOCKED] Risk score: ${body.risk_score}. Blocked by AI Guardian.`;
      }
      if (code === "queued_for_review") {
        return `[QUEUED] Review ID: ${body.review_item_id}. Pending human review.`;
      }
    }
    throw error;
  }
}
```

Vercel AI SDK
Using the Vercel AI SDK? Use createOpenAI with a custom baseURL:
```ts
import { createOpenAI } from "@ai-sdk/openai";

const guardian = createOpenAI({
  apiKey: process.env.AI_GUARDIAN_API_KEY,
  baseURL: process.env.AI_GUARDIAN_BASE_URL,
});

// Use like any other Vercel AI SDK provider
const model = guardian("gpt-4o");
```

Environment Variables
Store credentials in .env.local (Next.js) or .env:
```bash
# .env.local
AI_GUARDIAN_API_KEY=aig_YOUR_API_KEY
AI_GUARDIAN_BASE_URL=http://localhost:8000/api/v1/proxy
```
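With those variables set, the client options can be assembled with a small guard so that a missing variable fails fast instead of surfacing later as an opaque authentication error. Both helpers below are hypothetical, not part of any SDK — a sketch:

```typescript
// Hypothetical helper: read a required variable from process.env,
// throwing immediately if it is absent or empty.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Assemble the two options that route traffic through AI Guardian.
function loadGuardianConfig(): { apiKey: string; baseURL: string } {
  return {
    apiKey: requireEnv("AI_GUARDIAN_API_KEY"),
    baseURL: requireEnv("AI_GUARDIAN_BASE_URL"),
  };
}
```

The result can be passed straight to the constructors shown earlier, e.g. `new OpenAI(loadGuardianConfig())` or `createOpenAI(loadGuardianConfig())`.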