Get the most out of Generative AI with Opper

Opper is a unified API that makes it easy for developers to build model-independent, maintainable, and successful generative AI features, agents, and apps.

Opper dashboard

To get going with Opper, we recommend having a look at our get started guide. For a more in-depth overview of Opper's concepts, check out our overview.

Here are some of the benefits:

Write clean, structured, model-independent LLM calls

With our SDKs for Python and TypeScript, you can interact with models through structured input and output schemas. This allows you to reason about LLM calls just like any other Python or TypeScript function call.
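To make this concrete, here is a sketch of a structured call. The client name, `call` signature, and parameter names below are assumptions for illustration (check the SDK reference for the actual surface), and a stdlib dataclass stands in for the schema to keep the sketch self-contained:

```python
import os
from dataclasses import dataclass

# Output schema for the task. A stdlib dataclass keeps the sketch
# self-contained; in practice you would use the schema type the SDK expects.
@dataclass
class ReviewSummary:
    summary: str
    sentiment: str  # e.g. "positive", "neutral" or "negative"

def summarize_review(review: str) -> ReviewSummary:
    """Sketch of a structured Opper call. The client and parameter
    names here are assumptions -- see the SDK docs for the real API."""
    from opperai import Opper  # requires the Opper SDK to be installed

    opper = Opper(http_bearer=os.environ["OPPER_API_KEY"])
    return opper.call(
        name="summarize_review",
        instructions="Summarize the review and classify its sentiment.",
        input=review,
        output_schema=ReviewSummary,
    )

# The schema is an ordinary typed value you can construct and pass around:
summary = ReviewSummary(summary="Great stay, friendly staff.", sentiment="positive")
```

Because the call takes and returns typed values, it composes like any other function in your codebase.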

Read more in our get started guide, or read about the benefits of schema-based prompting in our blog post Introduction to Schema Based Prompting: Structured inputs for Predictable outputs.

Pick from frontier, open-source and multi-modal models

We have over 50 models in the platform. Because calls are structured, you can often switch models without any changes to client code or prompts. We manage downstream API keys and provider integrations so you don't have to. You can also plug in your custom models or your own providers.
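Since the task definition stays the same, trying another model can be as small a change as swapping an identifier. A hedged sketch (the model names, the `model` parameter, and the client surface are illustrative assumptions; see Supported models and the SDK docs for what actually exists):

```python
import os

# Illustrative model identifiers -- the actual list is in Supported models.
CANDIDATE_MODELS = [
    "openai/gpt-4o",
    "anthropic/claude-3.5-sonnet",
    "mistral/mistral-large",
]

def translate_to_swedish(text: str, model: str) -> str:
    """Same task, different model: only the identifier changes.
    Client and parameter names are assumptions -- see the SDK docs."""
    from opperai import Opper  # requires the Opper SDK

    opper = Opper(http_bearer=os.environ["OPPER_API_KEY"])
    result = opper.call(
        name="translate_to_swedish",
        instructions="Translate the text to Swedish.",
        input=text,
        model=model,
    )
    return result.message
```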

Read more: Supported models.

Build datasets and test prompts and models per task

We streamline the process of building so-called golden datasets. Once you have tasks that execute well, add them to datasets and use them to test alternative prompts and models, or use dataset entries as examples in prompts.
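Conceptually, a golden dataset is a set of input/expected-output pairs you can score candidates against. This stdlib-only sketch shows the idea with a canned lookup standing in for a model (the actual dataset storage and test runs happen through the Opper platform):

```python
from dataclasses import dataclass

@dataclass
class GoldenExample:
    input: str
    expected_output: str

# A tiny golden dataset built from tasks known to execute well.
golden = [
    GoldenExample("2 + 2", "4"),
    GoldenExample("10 / 2", "5"),
]

def score(candidate, dataset) -> float:
    """Fraction of dataset entries the candidate answers exactly."""
    hits = sum(1 for ex in dataset if candidate(ex.input) == ex.expected_output)
    return hits / len(dataset)

# A canned lookup stands in for a real prompt/model combination:
canned = {"2 + 2": "4", "10 / 2": "5"}
accuracy = score(canned.get, golden)  # 1.0: every entry matches
```

Swapping in a different prompt or model and re-scoring against the same golden set is exactly the comparison the platform streamlines.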

Read more: Datasets.

Integrate your own data and information

Our SDKs also support uploading documents, PDFs, or structured objects and retrieving them in your LLM calls. This allows you to build knowledge-powered LLM features very quickly while we take care of strategies for chunking, reranking, and more.
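Opper handles the chunking and reranking strategies for you; to make concrete what chunking means, here is a naive fixed-size chunker with overlap (purely illustrative, not Opper's actual strategy):

```python
def chunk(text: str, size: int = 200, overlap: int = 40) -> list[str]:
    """Split text into fixed-size chunks where consecutive chunks share
    `overlap` characters, so passages are not cut off at a hard boundary."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

chunks = chunk("x" * 500)
# 500 chars with size=200/overlap=40 yields 3 chunks of 200, 200 and 180 chars.
```

Real indexing pipelines weigh chunk size, overlap, and reranking against retrieval quality, which is the tuning Opper takes off your hands.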

Read more: Indexes.

Perform evaluations and capture runtime feedback

Our SDKs support uploading metrics and other types of feedback and attaching them to tracing spans. This lets you see what works (and what doesn't) once your feature is live with users, and take appropriate action to correct things.
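For instance, a thumbs-up from a user can be recorded as a metric on the span that produced the response. The client and method names in this sketch are assumptions (check the SDK reference); the shape of the metric is the point:

```python
import os
from dataclasses import dataclass

@dataclass
class Metric:
    dimension: str      # what is being measured, e.g. "user_thumbs_up"
    value: float        # the measurement itself
    comment: str = ""   # optional free-text context

def record_feedback(span_id: str, metric: Metric) -> None:
    """Sketch: attach runtime feedback to the tracing span that produced
    a response. Method and parameter names are assumptions -- see the docs."""
    from opperai import Opper  # requires the Opper SDK

    opper = Opper(http_bearer=os.environ["OPPER_API_KEY"])
    opper.span_metrics.create_metric(
        span_id=span_id,
        dimension=metric.dimension,
        value=metric.value,
        comment=metric.comment,
    )

thumbs_up = Metric(dimension="user_thumbs_up", value=1.0)
```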

Read more: Feedback.

Optimize calls with examples

With our SDKs and in our UI, we make it as easy as possible to add examples that show a call what success looks like. This lets you spend less time iterating on elaborate prompts and more time showing calls the output you expect, adding to your examples over time.
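A sketch of steering a call with examples rather than elaborate prompt prose. The `examples` parameter name and shape are assumptions here (see the SDK docs); the idea is that each example pairs an input with the output you expect for it:

```python
# Each example pairs an input with the output you expect for it.
examples = [
    {"input": "The food was amazing!", "output": {"sentiment": "positive"}},
    {"input": "Never coming back.", "output": {"sentiment": "negative"}},
]

def classify_sentiment(text: str):
    """Sketch of an example-steered call. Client and parameter names
    are assumptions -- see the SDK docs for the real API."""
    import os
    from opperai import Opper  # requires the Opper SDK

    opper = Opper(http_bearer=os.environ["OPPER_API_KEY"])
    return opper.call(
        name="classify_sentiment",
        instructions="Classify the sentiment of the text.",
        input=text,
        examples=examples,
    )
```

Growing this list over time, for instance from well-performing dataset entries, is often more effective than rewriting instructions.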

Read more: Examples.

Stay in the loop with detailed logging and tracing

All calls through the platform, whether indexing, retrieval, or calls to models, are automatically logged and traced so you can debug things easily. You can even trace non-AI methods and operations to get a complete understanding of your app.
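To illustrate what tracing a non-AI operation buys you, here is a stdlib-only stand-in for a tracing decorator (Opper's own decorator and span reporting will differ; see the Tracing docs): it wraps any function in a span that records its name and duration.

```python
import functools
import time

def traced(fn):
    """Stand-in for a tracing decorator: wraps any function, AI or not,
    in a span recording its name and duration. A real tracer would
    report the span to the platform instead of printing it."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            duration_ms = (time.perf_counter() - start) * 1000
            print(f"span name={fn.__name__} duration_ms={duration_ms:.1f}")
    return wrapper

@traced
def lookup_order(order_id: str) -> dict:
    # An ordinary, non-AI operation that still shows up in the trace.
    return {"order_id": order_id, "status": "shipped"}

order = lookup_order("A-123")
```

Mixing AI and non-AI spans in one trace is what gives you the complete picture of a request's path through your app.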

Read more: Tracing.

Cost analytics

With our analytics API, you can slice costs per project, model, or custom tags and get a complete understanding of how tokens are being spent.

Go compliant by default

We go to great lengths to ensure you are compliant with GDPR and other regulations by default. We take pride in making it a no-brainer for you to add Opper as a data processor in your own terms of service.

Read more: Compliance.

Let's build!

Get started with Opper today by signing up and following our get started guide.