Processing text
LLMs are very capable text processors. You can use them for a wide variety of text-related tasks. Below are some common use cases and patterns.
Reasoning
Making a model reason through its response is a powerful technique for improving the quality of its output. By adding a thoughts
field as the first field in the output schema, you make the model first express how it thinks about forming the response. This technique is called chain of thought.
from opperai import AsyncOpper
from pydantic import BaseModel
import asyncio

opper = AsyncOpper()

class Response(BaseModel):
    thoughts: str
    response: str

async def main():
    result, _ = await opper.call(
        name="train_speed",
        instructions="Calculate the average speed of a train and provide reasoning.",
        input="If a train travels 120 km in 2 hours, what is its average speed in km/h?",
        output_type=Response
    )

    print(result)

# Run the async function
asyncio.run(main())

# Response(
#     thoughts="The train travels 120 km in 2 hours, so its average speed is 120 km / 2 hours = 60 km/h",
#     response="The average speed of the train is 60 km/h"
# )
import Client from "opperai";

// Initialize the client
const client = new Client();

const outputSchema = {
  $schema: "https://json-schema.org/draft/2020-12/schema",
  type: "object",
  properties: {
    thoughts: {
      description: "Reasoning process for calculating the speed",
      type: "string",
    },
    response: {
      description: "The calculated average speed in km/h",
      type: "number",
    },
  },
  required: ["thoughts", "response"],
};

async function main() {
  try {
    const { json_payload } = await client.call({
      name: "node-sdk/call/train-speed",
      instructions: "Calculate the average speed of a train and provide reasoning.",
      input: "If a train travels 120 km in 2 hours, what is its average speed in km/h?",
      output_schema: outputSchema,
    });
    console.log("JSON response: ", json_payload);
  } catch (error) {
    console.error("Failed to send message:", error);
  }
}

main();

// Example output:
// JSON response: {
//   thoughts: "To calculate the average speed, we need to divide the total distance by the total time. The train travels 120 km in 2 hours.",
//   response: 60
// }
Planning
You can further improve reasoning by also adding a steps field to the output schema. This makes the model spell out the steps it needs to take to solve the problem before answering, which improves the final answer.
from opperai import AsyncOpper
from pydantic import BaseModel
import asyncio

opper = AsyncOpper()

class Output(BaseModel):
    thoughts: str
    steps: list[str]
    final_answer: str

async def main():
    response, _ = await opper.call(
        name="get_ingredients",
        instructions="Given a dish, provide an ingredient list",
        input="Bouillabaisse",
        output_type=Output
    )

    print(response)

# Run the async function
asyncio.run(main())

# Output(
#     thoughts="The user is asking for the ingredients to make Bouillabaisse, a traditional French soup.",
#     steps=[
#         "1. Start with a base of onions, carrots, and celery.",
#         "2. Add garlic, tomatoes, and herbs like thyme, bay leaf, and parsley.",
#         "3. Simmer for 1 hour.",
#         "4. Add the fish, shellfish, and potatoes.",
#         "5. Simmer until the potatoes are tender."
#     ],
#     final_answer="The ingredients for making Bouillabaisse are onions, carrots, celery, garlic, tomatoes, herbs, fish, shellfish, potatoes."
# )
import Client from "opperai";

// Initialize the client
const client = new Client();

const outputSchema = {
  $schema: "https://json-schema.org/draft/2020-12/schema",
  type: "object",
  properties: {
    thoughts: {
      description: "Reasoning process for making Bouillabaisse",
      type: "string",
    },
    steps: {
      description: "Step-by-step guide on how to make Bouillabaisse",
      type: "array",
      items: {
        type: "string",
      },
    },
    final_answer: {
      description: "Final answer on how to make Bouillabaisse",
      type: "string",
    },
  },
  required: ["thoughts", "steps", "final_answer"],
};

async function main() {
  try {
    const { json_payload } = await client.call({
      name: "get_ingredients",
      instructions: "Given a dish, provide an ingredient list",
      input: "Bouillabaisse",
      output_schema: outputSchema,
    });
    console.log("JSON response: ", json_payload);
  } catch (error) {
    console.error("Failed to send message:", error);
  }
}

main();

// Example output:
// JSON response: {
//   thoughts: "To make Bouillabaisse, we need to gather the ingredients and follow a recipe.",
//   steps: [
//     "1. Start with a base of onions, carrots, and celery.",
//     "2. Add garlic, tomatoes, and herbs like thyme, bay leaf, and parsley.",
//     "3. Simmer for 1 hour.",
//     "4. Add the fish, shellfish, and potatoes.",
//     "5. Simmer until the potatoes are tender."
//   ],
//   final_answer: "The ingredients for making Bouillabaisse are onions, carrots, celery, garlic, tomatoes, herbs, fish, shellfish, potatoes."
// }
Extracting information
You can use LLMs to extract structured data from text and images. This is useful for making these entities programmable.
Here is a simple example where we extract a structured Room object from a text string.
from opperai import AsyncOpper
from pydantic import BaseModel
import asyncio

aopper = AsyncOpper()

class Room(BaseModel):
    beds: int
    seaview: bool
    description: str

async def extract_room(input: str) -> Room:
    response, _ = await aopper.call(
        name="extractRoom",
        instructions="Extract room details from the given text",
        input=input,
        output_type=Room
    )
    return response

async def main():
    result = await extract_room("Room at Grand Hotel with 2 beds and a view to the sea")
    print(result)

# Run the async function
asyncio.run(main())

# Output:
# Room(beds=2, seaview=True, description="Room at Grand Hotel with 2 beds and a view to the sea")
import Client from "opperai";

// Initialize the client
const client = new Client();

const outputSchema = {
  $schema: "https://json-schema.org/draft/2020-12/schema",
  type: "object",
  properties: {
    beds: {
      description: "Number of beds in the room",
      type: "number",
    },
    seaview: {
      description: "Whether the room has a seaview",
      type: "boolean",
    },
    description: {
      description: "Description of the room",
      type: "string",
    },
  },
  required: ["beds", "seaview", "description"],
};

async function main() {
  try {
    const { json_payload } = await client.call({
      name: "node-sdk/call/extract-room",
      instructions: "Extract room details from the given text",
      input: "Room at Grand Hotel with 2 beds and a view to the sea",
      output_schema: outputSchema,
    });
    console.log("JSON response: ", json_payload);
  } catch (error) {
    console.error("Failed to send message:", error);
  }
}

main();

// Example output:
// JSON response: {
//   beds: 2,
//   seaview: true,
//   description: "Room at Grand Hotel with 2 beds and a view to the sea"
// }
Selecting from options
You can use LLMs to make decisions, perform selections, and reason about them. This can be useful for building things like recommendation systems and next-step suggestions.
Here is an example that recommends an additional item for a user to buy, based on the current cart and past purchase history.
from opperai import AsyncOpper
from pydantic import BaseModel
import asyncio

aopper = AsyncOpper()

class RecommendedItem(BaseModel):
    thoughts: str  # Improves reasoning
    item: str

async def recommend_additional_item(cart: list, purchase_history: list) -> RecommendedItem:
    response, _ = await aopper.call(
        name="recommend_additional_item",
        instructions="Your task is to complete a shopping cart so that the owner can make a dish out of the cart.",
        input={"cart": cart, "purchase_history": purchase_history},
        output_type=RecommendedItem
    )
    return response

# Options to choose from
purchase_history = [
    "milk",
    "pasta",
    "cream",
    "cheese",
    "bacon",
    "tomatoes",
    "potatoes",
    "water",
    "milk",
    "chicken",
    "beef",
    "fish",
    "vegetables",
    "fruit",
    "spices",
    "oil",
    "butter",
    "rice",
    "noodles",
    "flour",
    "sugar",
    "syrup",
    "spice",
    "cochenille",
]

# Current shopping cart
cart = [
    "pasta",
    "cream",
    "cheese",
]

async def main():
    result = await recommend_additional_item(cart, purchase_history)
    print(result)

# Run the async function
asyncio.run(main())
# Output: RecommendedItem(
#     thoughts="With pasta, cream, and cheese in the cart, the owner appears to be aiming to make a pasta dish, possibly Alfredo pasta which commonly uses these ingredients. To complete this meal, a protein such as chicken would complement the dish well, adding flavor and substance.",
#     item="chicken"
# )
import Client from "opperai";

// Initialize the client
const client = new Client();

const outputSchema = {
  $schema: "https://json-schema.org/draft/2020-12/schema",
  type: "object",
  properties: {
    thoughts: {
      description: "Reasoning process for recommending an additional item",
      type: "string",
    },
    item: {
      description: "Recommended item to add to the cart",
      type: "string",
    },
  },
  required: ["thoughts", "item"],
};

async function main() {
  try {
    const cart = ["pasta", "cream", "cheese"];

    const purchase_history = [
      "milk",
      "pasta",
      "cream",
      "cheese",
      "bacon",
      "tomatoes",
      "potatoes",
      "water",
      "milk",
      "chicken",
      "beef",
      "fish",
      "vegetables",
      "fruit",
      "spices",
      "oil",
      "butter",
      "rice",
      "noodles",
      "flour",
      "sugar",
      "syrup",
      "spice",
      "cochenille",
    ];

    const { json_payload } = await client.call({
      name: "node-sdk/call/recommend-additional-item",
      instructions:
        "Your task is to complete a shopping cart so that the owner can make a dish out of the cart.",
      input: { cart: cart, purchase_history: purchase_history },
      output_schema: outputSchema,
    });
    console.log("JSON response: ", json_payload);
  } catch (error) {
    console.error("Failed to send message:", error);
  }
}

main();

// Example output:
// JSON response: {
//   thoughts: "With pasta, cream, and cheese in the cart, the owner appears to be aiming to make a pasta dish, possibly Alfredo pasta which commonly uses these ingredients. To complete this meal, a protein such as chicken would complement the dish well, adding flavor and substance.",
//   item: "chicken"
// }
Performing classification
You can use LLMs to classify text and images. This can be useful to solve various categorization problems.
Here is an example that classifies an input text, treated as a support request, into one of four categories: Bug, Feature Request, Question, or Unknown.
from opperai import AsyncOpper
from pydantic import BaseModel
from typing import Literal
import asyncio

class Classification(BaseModel):
    thoughts: str  # improves reasoning
    category: Literal['Bug', 'Feature Request', 'Question', 'Unknown']
    confidence: float

opper = AsyncOpper()

async def classify_text(input: str) -> Classification:
    """Classify the text"""
    result, _ = await opper.call(
        name="classify_text",
        instructions="Classify the input text into one of the categories: Bug, Feature Request, Question, or Unknown",
        input=input,
        output_type=Classification
    )
    return result

# Example usage
async def main():
    print(await classify_text("I encountered an error when trying to save the file."))
    # Output: Classification(
    #     thoughts="This appears to be a bug report as the user is describing an error.",
    #     category='Bug',
    #     confidence=0.95
    # )

    print(await classify_text("Can you add a dark mode feature?"))
    # Output: Classification(
    #     thoughts="This is clearly a request for a new feature.",
    #     category='Feature Request',
    #     confidence=0.95
    # )

    print(await classify_text("How do I reset my password?"))
    # Output: Classification(
    #     thoughts="This is a typical user question about account management.",
    #     category='Question',
    #     confidence=0.95
    # )

asyncio.run(main())
import Client from "opperai";

// Initialize the client
const client = new Client();

const outputSchema = {
  $schema: "https://json-schema.org/draft/2020-12/schema",
  type: "object",
  properties: {
    thoughts: {
      description: "Reasoning process for classification",
      type: "string",
    },
    category: {
      description: "Category of the text",
      type: "string",
      enum: ["Bug", "Feature Request", "Question", "Unknown"],
    },
    confidence: {
      description: "Confidence level of the classification",
      type: "number",
    },
  },
  required: ["thoughts", "category", "confidence"],
};

async function main() {
  try {
    const { json_payload } = await client.call({
      name: "node-sdk/call/classify-text",
      instructions: "Classify the input text into one of the categories: Bug, Feature Request, Question, or Unknown",
      input: "I encountered an error when trying to save the file.",
      output_schema: outputSchema,
    });
    console.log("JSON response: ", json_payload);
  } catch (error) {
    console.error("Failed to send message:", error);
  }
}

main();

// Example output:
// JSON response: {
//   thoughts: "This appears to be a bug report as the user is describing an error.",
//   category: "Bug",
//   confidence: 0.95
// }