OpenAI SDK
The official OpenAI SDK is the broadest and lowest-friction integration path for Sentinel.
Recommended configuration
Use your Sentinel API key as the SDK's apiKey and point the SDK's
baseURL at Sentinel's OpenAI-compatible endpoint:
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.SP_API_KEY!,
  baseURL: `${process.env.SENTINEL_BASE_URL!}/v1`,
});
Why it is the default
The OpenAI SDK is the default recommendation because:
- base URL override is supported
- Sentinel auth matches the SDK’s Bearer model naturally
- the OpenAI-compatible lane provides the broadest stable surface
For most integrations, this is the simplest path to production.
Broadly supported surfaces
Within the OpenAI-compatible lane, broadly supported surfaces include:
- chat completions
- responses
- embeddings
- images
- transcriptions
- speech
- moderations
- models
Caveats
Treat the following as separate compatibility checks rather than implied by base OpenAI-compatible support:
- file workflows
- batch workflows
- realtime workflows
- vector store workflows
- assistant-style surfaces
Response retrieval and similar control-plane-like extensions should also be treated as separate support questions.
Not broadly recommended yet
The following should not be treated as part of the default broad support posture unless explicitly verified in your environment:
- advanced control-plane-like extensions beyond create-style inference paths
- realtime or assistant-style workflows