Calls

What is a Call?

A Call is a call to an LLM to perform a generation. With the Opper SDKs, calls can be configured to use a specific model, prompt, and schema. Note that all calls are automatically logged as traces in Opper.

Behind the scenes, the Opper API constructs prompts for the model so that it generates outputs that optimally complete the call. Prompts are always fully visible in the trace view.

Make a Call
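A minimal call passes a name and an input and returns the model's response. The sketch below assumes the Python SDK's `Opper` client, an `OPPER_API_KEY` set in the environment, and a `call` method that accepts `name` and `input`; exact parameter names and the return shape may differ between SDK versions.

```python
# Minimal sketch of a call (client and parameter names assumed; check your SDK version).
from opperai import Opper

opper = Opper()  # assumes OPPER_API_KEY is set in the environment

response = opper.call(
    name="answer-question",
    input="What is the capital of Sweden?",
)

print(response)  # the generated answer; the call is also logged as a trace in Opper
```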

Opper also supports multimodal inputs and outputs. For examples of this, check out the Examples folder in the Python SDK and Node SDK respectively.

Call with Instructions

Instructions are useful for telling the model how to respond to a given input. We typically recommend keeping them short and concise and letting the schemas be expressive.
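As a sketch (same assumptions about the SDK surface as above), a short instruction can steer the tone and format of the response:

```python
# Sketch: steering the response with a short instruction (SDK surface assumed).
from opperai import Opper

opper = Opper()

response = opper.call(
    name="summarize-feedback",
    instructions="Summarize the customer feedback in one neutral sentence.",
    input="The checkout flow kept timing out and support never replied to my emails.",
)

print(response)
```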

Use Schemas

Calls are preferably defined to return a specific JSON schema. While this is JSON under the hood, we recommend using Pydantic for Python and Zod for TypeScript to make building accurate schemas easier:
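For example, a Pydantic model can describe the structured output you expect. The sketch below assumes the call accepts the schema through an `output_type` parameter (the name is an assumption; consult your SDK version):

```python
# Sketch: structured output via a Pydantic schema.
# The schema parameter name (`output_type`) is an assumption.
from opperai import Opper
from pydantic import BaseModel


class RoomBooking(BaseModel):
    guest_name: str
    nights: int
    breakfast_included: bool


opper = Opper()

booking = opper.call(
    name="extract-booking",
    instructions="Extract the booking details from the email.",
    input="Hi, I'd like a room for 3 nights for Anna Lindqvist, breakfast included.",
    output_type=RoomBooking,
)

print(booking)
```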

Specifying Model

On any call you can specify which model to use. Opper supports a wide range of models and is happy to add more. If no model is specified, Opper defaults to azure/gpt-4o. To specify a different model:
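As a sketch, passing a model identifier on the call overrides the default; the `model` parameter name and the identifier shown are assumptions for illustration:

```python
# Sketch: overriding the default model on a single call (parameter name assumed).
from opperai import Opper

opper = Opper()

response = opper.call(
    name="translate",
    instructions="Translate the input to French.",
    input="Good morning!",
    model="anthropic/claude-3.5-sonnet",  # illustrative identifier; see the Models page
)

print(response)
```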

See Models for more information.

Specifying Examples

Providing examples with a call is a great way to show what you want outputs to look like for given inputs. This helps the model reason about the task without you having to refine the prompt. The number of examples is limited to 10.
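As a sketch, examples can be passed as input/output pairs on the call; the `examples` parameter name and the pair format are assumptions for illustration:

```python
# Sketch: guiding the output with a few input/output examples (at most 10).
# The `examples` parameter name and pair format are assumptions.
from opperai import Opper

opper = Opper()

response = opper.call(
    name="classify-sentiment",
    instructions="Classify the sentiment of the input as positive, neutral or negative.",
    input="The delivery was late but the support team sorted it out quickly.",
    examples=[
        {"input": "Absolutely loved the product!", "output": "positive"},
        {"input": "It arrived on time, nothing special.", "output": "neutral"},
        {"input": "Broken on arrival and no refund yet.", "output": "negative"},
    ],
)

print(response)
```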

For a more extensive example of using examples by populating a dataset and automatically retrieving them in the call, see Guiding output with examples.

Read more