Framework Integrations
ClawPipe integrates with popular AI frameworks. Use it as a drop-in replacement for OpenAI or plug it into LangChain, LlamaIndex, and the Vercel AI SDK.
OpenAI Drop-in Replacement
The OpenAICompat export exposes an OpenAI-compatible interface. Swap it in for your OpenAI client and your existing calls pick up all pipeline optimizations automatically.
import { OpenAICompat } from 'clawpipe-ai';

// Drop-in replacement for OpenAI client
const openai = new OpenAICompat({
  apiKey: 'cp_xxx',
  projectId: 'my-app',
});

// Same API as OpenAI SDK
const completion = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [
    { role: 'system', content: 'You are helpful.' },
    { role: 'user', content: 'Explain recursion' },
  ],
  max_tokens: 2000,
});

console.log(completion.choices[0].message.content);
LangChain
Use ClawPipe as a custom LLM provider in LangChain. Point LangChain's OpenAI integration at the ClawPipe gateway.
from langchain_openai import ChatOpenAI

# Point LangChain at the ClawPipe gateway
llm = ChatOpenAI(
    base_url="https://api.clawpipe.ai/v1",
    api_key="cp_xxx",
    model="gpt-4o",
    default_headers={"X-Project-Id": "my-app"},
)

# Use exactly like a normal LangChain chat model
response = llm.invoke("Explain recursion")
print(response.content)

# Works with chains, agents, RAG, etc.
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a {role}."),
    ("human", "{input}"),
])

chain = prompt | llm
result = chain.invoke({"role": "tutor", "input": "What is a monad?"})
LlamaIndex
Configure LlamaIndex to use the ClawPipe gateway as its OpenAI-compatible backend.
from llama_index.llms.openai import OpenAI

# Use ClawPipe as the LLM backend
llm = OpenAI(
    api_base="https://api.clawpipe.ai/v1",
    api_key="cp_xxx",
    model="gpt-4o",
    default_headers={"X-Project-Id": "my-app"},
)

# Use with any LlamaIndex component
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine(llm=llm)
response = query_engine.query("What does this codebase do?")
Vercel AI SDK
Use ClawPipe with the Vercel AI SDK by configuring the OpenAI provider with the ClawPipe gateway URL.
import { createOpenAI } from '@ai-sdk/openai';
import { generateText, streamText } from 'ai';

const clawpipe = createOpenAI({
  baseURL: 'https://api.clawpipe.ai/v1',
  apiKey: 'cp_xxx',
  headers: { 'X-Project-Id': 'my-app' },
});

// Generate text
const { text } = await generateText({
  model: clawpipe('gpt-4o'),
  prompt: 'Explain recursion',
});

// Stream text
const result = await streamText({
  model: clawpipe('gpt-4o'),
  prompt: 'Write a haiku about code',
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
Gateway as Proxy
Any OpenAI-compatible client can use ClawPipe by pointing its base URL at the gateway, as the LangChain, LlamaIndex, and Vercel AI SDK examples above do. You can also call the gateway's native endpoints directly over HTTP from any language or framework.
# cURL example
curl -X POST https://api.clawpipe.ai/v1/prompt \
  -H "Authorization: Bearer cp_xxx" \
  -H "X-Project-Id: my-app" \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "Explain recursion",
    "provider": "openai",
    "model": "gpt-4o",
    "maxTokens": 2000
  }'
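The same request can be issued from any HTTP client, with no SDK involved. As a minimal sketch, here is the cURL call above reproduced with only the Python standard library; the endpoint, headers, and body fields are copied from that example rather than independently verified:

```python
import json
import urllib.request

# Build the same request the cURL example sends to the native prompt endpoint
payload = {
    "prompt": "Explain recursion",
    "provider": "openai",
    "model": "gpt-4o",
    "maxTokens": 2000,
}

req = urllib.request.Request(
    "https://api.clawpipe.ai/v1/prompt",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer cp_xxx",
        "X-Project-Id": "my-app",
        "Content-Type": "application/json",
    },
    method="POST",
)

# Sending it requires a valid API key:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

Constructing the request performs no network I/O, so you can build and inspect it before sending.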