API Documentation

Complete guide to integrating Sloop's LLM routing API into your applications.

Base URL: https://api.sloop.infloat.co/v1

Authentication

All API requests require an API key. Include it in the Authorization header:

bash
Authorization: Bearer sk-sloop-your-api-key

Get your API key: Sign up and visit your dashboard to generate an API key.
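
If you are not using an SDK, you can set the header yourself. A minimal sketch with the requests library, assuming the standard OpenAI-compatible /chat/completions endpoint under the base URL above:

python
import requests

API_KEY = "sk-sloop-your-api-key"  # replace with the key from your dashboard

response = requests.post(
    "https://api.sloop.infloat.co/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {API_KEY}",  # required on every request
        "Content-Type": "application/json",
    },
    json={
        "model": "auto",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)

print(response.json())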

Quick Integration

Sloop is a drop-in replacement for OpenAI's API. Simply update your base URL and API key:

python
import openai

client = openai.OpenAI(
    api_key="sk-sloop-your-api-key",
    base_url="https://api.sloop.infloat.co/v1"
)

response = client.chat.completions.create(
    model="auto",  # Let Sloop choose the best model
    messages=[{"role": "user", "content": "Hello!"}]
)

Making Your First Request

Here's a complete example of making a chat completion request with Sloop:

python
import openai

# Initialize with your Sloop API key
client = openai.OpenAI(
    api_key="sk-sloop-your-api-key",
    base_url="https://api.sloop.infloat.co/v1"
)

# Make a request - let Sloop choose the best model
try:
    response = client.chat.completions.create(
        model="auto",  # Sloop intelligently routes to optimal model
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Explain quantum computing in simple terms."}
        ],
        max_tokens=150,
        temperature=0.7
    )
    
    print(response.choices[0].message.content)
    
except Exception as e:
    print(f"Error: {e}")

What Sloop Handles

  • Provider selection and routing
  • Automatic failover and retries
  • Cost optimization across providers
  • Rate limiting and load balancing

What You Control

  • Request parameters and settings
  • Your application logic and data
  • Usage monitoring and analytics
  • Routing configuration in dashboard

Error Handling

Sloop returns OpenAI-compatible error responses. Here's how to handle them properly:

python
import openai
from openai import OpenAIError

client = openai.OpenAI(
    api_key="sk-sloop-your-api-key",
    base_url="https://api.sloop.infloat.co/v1"
)

try:
    response = client.chat.completions.create(
        model="auto",  # Let Sloop choose the best model
        messages=[{"role": "user", "content": "Hello!"}]
    )
except openai.AuthenticationError:
    print("Invalid API key")
except openai.RateLimitError:
    print("Rate limit exceeded, please try again later")
except openai.APIConnectionError:
    print("Network error, please check your connection")
except OpenAIError as e:
    print(f"API error: {e}")

Tip: Sloop includes automatic retries and failover, so transient errors are often handled automatically without reaching your application.
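
If you still want a client-side safety net on top of Sloop's built-in retries, a simple exponential backoff around rate-limit errors is usually enough. A minimal sketch reusing the client from the example above; the delay values are illustrative, not Sloop recommendations:

python
import time
import openai

def chat_with_backoff(client, messages, retries=3):
    """Retry on rate limits with exponential backoff; re-raise other errors."""
    for attempt in range(retries):
        try:
            return client.chat.completions.create(model="auto", messages=messages)
        except openai.RateLimitError:
            if attempt == retries - 1:
                raise
            time.sleep(2 ** attempt)  # wait 1s, 2s, 4s, ...

response = chat_with_backoff(client, [{"role": "user", "content": "Hello!"}])
print(response.choices[0].message.content)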

LangChain Integration

Sloop works seamlessly with LangChain. Simply configure it as a custom OpenAI endpoint:

python
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage, SystemMessage

# Initialize LangChain with Sloop
llm = ChatOpenAI(
    model="auto",  # Let Sloop choose the best model
    openai_api_key="sk-sloop-your-api-key",
    openai_api_base="https://api.sloop.infloat.co/v1",
    temperature=0.7
)

# Use with LangChain as normal
messages = [
    SystemMessage(content="You are a helpful AI assistant."),
    HumanMessage(content="What are the benefits of using LLM routing?")
]

response = llm.invoke(messages)
print(response.content)

# Works with chains, agents, and all LangChain features
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

memory = ConversationBufferMemory()
conversation = ConversationChain(
    llm=llm,
    memory=memory,
    verbose=True
)

result = conversation.predict(input="Hello, I'm interested in AI routing")
print(result)

LangChain Benefits with Sloop

  • Automatic routing: Sloop chooses the best model for each request in your chain
  • Cost optimization: Reduce costs while maintaining quality across complex workflows
  • Reliability: Built-in failover ensures your chains keep running
  • Full compatibility: Works with agents, tools, and all LangChain features (see the tool-calling sketch after this list)
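
For example, tool calling goes through the standard LangChain interface. A minimal sketch, assuming Sloop's auto routing selects a tool-capable model; the get_word_count tool is hypothetical:

python
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI

@tool
def get_word_count(text: str) -> int:
    """Count the words in a piece of text."""
    return len(text.split())

llm = ChatOpenAI(
    model="auto",
    openai_api_key="sk-sloop-your-api-key",
    openai_api_base="https://api.sloop.infloat.co/v1",
)

# Bind the tool and let the routed model decide when to call it
llm_with_tools = llm.bind_tools([get_word_count])
response = llm_with_tools.invoke("How many words are in 'LLM routing keeps costs down'?")
print(response.tool_calls)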

Next Steps

Ready to start building? Here's what to do next:

1. Get Your API Key

   Sign up for a free account and generate your API key from the dashboard.

   Create Account →

2. Update Your Code

   Replace your OpenAI base URL with Sloop's. No other changes needed.

   See Integration Guide →

3. Monitor Usage

   Track your API usage, costs, and performance in the dashboard.

   View Dashboard →

Rate Limits

  • Free Tier: 10K tokens/month
  • Pro Tier: 1M tokens/month
  • Requests per minute: 60 RPM
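
To stay under the per-minute limit when sending many requests, a basic client-side throttle helps. A minimal sketch, assuming the 60 RPM figure above applies to your tier:

python
import time

REQUESTS_PER_MINUTE = 60
MIN_INTERVAL = 60.0 / REQUESTS_PER_MINUTE  # one request per second

_last_request = 0.0

def throttled(make_request):
    """Space out calls so they never exceed the per-minute request limit."""
    global _last_request
    wait = MIN_INTERVAL - (time.time() - _last_request)
    if wait > 0:
        time.sleep(wait)
    _last_request = time.time()
    return make_request()

# Usage: throttled(lambda: client.chat.completions.create(
#     model="auto", messages=[{"role": "user", "content": "Hello!"}]))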