Docs

Quick start for a routed model API.

Start with the OpenAI-compatible surface, then add alias routing, usage visibility, and deployment controls as the workload matures.

Lower integration friction: get a first working request in under a minute.

The first integration path stays intentionally short: create an organization, issue a scoped key, point the SDK at the SCX base URL, and choose a stable alias.

Getting started

01

Create an organization inside the SCX Console.

02

Generate an API key scoped to exactly what you want to expose.

03

Point an OpenAI-compatible SDK to the SCX base URL.

04

Start with one stable alias before layering fallback and dedicated routes.

Quick start

import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.stellarcomputerx.com/v1",
  apiKey: process.env.SCX_API_KEY,
});

const completion = await client.chat.completions.create({
  model: "scx/qwen2.5-72b-instruct",
  messages: [{ role: "user", content: "Explain rollout posture." }],
});

console.log(completion.choices[0].message.content);

Integration notes

The docs should teach the operating model, not just the request shape.

01

Start with one alias before introducing fallback policy.

02

Keep the platform base URL stable across environments and rotate keys separately.

03

Use the console as the first place to inspect usage, routing posture, and credit consumption.
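When fallback does become necessary, it can start as a thin client-side loop before any platform-level policy is introduced. The sketch below is an illustration only: `completeWithFallback` and the `scx/fallback-model` alias are hypothetical names, and `attempt` stands in for a real SDK call such as `client.chat.completions.create`.

```typescript
// Hypothetical helper: try a list of SCX aliases in order until one succeeds.
// `attempt` stands in for a real SDK call (e.g. client.chat.completions.create).
type Attempt = (model: string) => Promise<string>;

async function completeWithFallback(
  models: string[],
  attempt: Attempt,
): Promise<string> {
  let lastError: unknown;
  for (const model of models) {
    try {
      return await attempt(model); // first success wins
    } catch (err) {
      lastError = err; // remember the failure, move to the next alias
    }
  }
  throw lastError ?? new Error("no models configured");
}

// Stubbed demo: the primary alias fails, the fallback alias answers.
// "scx/fallback-model" is a placeholder, not a real SCX alias.
const demo: Attempt = async (model) => {
  if (model === "scx/qwen2.5-72b-instruct") throw new Error("alias unavailable");
  return `answered by ${model}`;
};

completeWithFallback(
  ["scx/qwen2.5-72b-instruct", "scx/fallback-model"],
  demo,
).then((out) => console.log(out)); // prints "answered by scx/fallback-model"
```

Keeping the alias list as data makes the later move to platform-managed fallback a configuration change rather than a code change.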
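The "stable base URL, rotated keys" note can be enforced in code by pinning the URL as a constant and reading only the key from the environment. This is a sketch under one assumption: `SCX_API_KEY` is injected per environment (dev, staging, prod) by whatever secret manager you use; `scxConfig` is a hypothetical helper, not part of any SDK.

```typescript
// One stable base URL shared by every environment; only the key varies.
const SCX_BASE_URL = "https://api.stellarcomputerx.com/v1";

// Hypothetical helper: build client options from the current environment.
// Fails fast if the key is missing, so a misconfigured deploy surfaces early.
function scxConfig(
  env: Record<string, string | undefined> = process.env,
): { baseURL: string; apiKey: string } {
  const apiKey = env.SCX_API_KEY;
  if (!apiKey) throw new Error("SCX_API_KEY is not set for this environment");
  return { baseURL: SCX_BASE_URL, apiKey };
}

// Usage: new OpenAI(scxConfig()) — rotating a key touches the secret store,
// never the code or the base URL.
```

Because the URL never changes, key rotation and environment promotion become independent operations.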