Tracing
What is Tracing?
Tracing is essentially logging, but with related or subsequent log entries associated with each other. This drastically simplifies debugging and understanding complex interactions. It is especially relevant for LLM applications, as they are often implemented as a series of steps and calls.
Calls create Spans
Opper automatically traces all function calls and index retrievals. Our traces page offers a detailed view of each trace, making it simple to understand complex flows made up of many LLM calls.
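Below is a minimal sketch of a traced call, assuming the Python SDK's Opper client and its opper.call helper; the exact signatures may differ, so check the SDK's examples folder for the authoritative usage.

```python
# A minimal sketch, assuming the `Opper` client and `opper.call(...)` helper
# from the Python SDK; check the SDK examples for the exact API.
from opperai import Opper

opper = Opper()  # assumes OPPER_API_KEY is set in the environment

# Each call is traced automatically and shows up as a span
# in the traces view in the portal.
response = opper.call(
    name="summarize_text",
    instructions="Summarize the given text in one sentence.",
    input="Tracing links related LLM calls so complex flows are easy to debug.",
)
print(response)
```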
Connect Spans into a Trace
To place spans in the same trace, you need to create a hierarchy. Here is how to make spans children of a higher-level span:
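The sketch below creates a parent span for a workflow and attaches two calls to it. The span-creation helper (opper.spans.create) and the parent_span_id argument are assumptions about the Python SDK's API; treat them as placeholders and refer to the SDK examples for the exact names.

```python
# A minimal sketch, assuming a span-creation helper (`opper.spans.create`)
# and a `parent_span_id` argument on `opper.call` -- both names are
# assumptions; check the SDK examples for the exact API.
from opperai import Opper

opper = Opper()

# Create a top-level span that represents the whole workflow.
workflow_span = opper.spans.create(name="answer_question")

# Calls that reference the parent span's id become its children,
# so they all end up in the same trace.
draft = opper.call(
    name="draft_answer",
    instructions="Draft an answer to the question.",
    input="What is tracing?",
    parent_span_id=workflow_span.id,
)

polished = opper.call(
    name="polish_answer",
    instructions="Rewrite the draft to be concise and clear.",
    input=str(draft),
    parent_span_id=workflow_span.id,
)
```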
Adding metrics to a Span
You can attach metrics to a span and visualize them in our UI. This is very useful for creating a feedback loop from your users or applications back to you.
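Here is a sketch of recording a user-feedback metric against a span. The method name (opper.span_metrics.create_metric), its parameters, and the span-creation helper are assumptions about the Python SDK; see the SDK examples for the exact calls.

```python
# A minimal sketch of attaching a metric to a span. The method name
# (`opper.span_metrics.create_metric`) and its parameters are assumptions;
# check the SDK examples for the exact API.
from opperai import Opper

opper = Opper()

# Parent span for the workflow whose outcome we want to score.
span = opper.spans.create(name="answer_question")

# ... run the calls that belong to this span ...

# Record feedback (e.g. a thumbs-up from the user) against the span so it
# can be visualized alongside the trace in the UI.
opper.span_metrics.create_metric(
    span_id=span.id,
    dimension="user_feedback",
    value=1.0,
    comment="User marked the answer as helpful",
)
```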
Investigate Traces
Traces show up in the portal at https://platform.opper.ai/traces, where you can see all traces for your project, including token usage, evaluations, etc.
Viewing a trace
Click on a trace to view all of its calls. Under each call is a generation where you can see the prompt, the model used, etc.
More information
See more extensive examples in our Python and Node.js SDKs, both of which include an examples folder with runnable code.