openai[minor],docs[minor]: Add parallel tool calls call option and docs #5717

Merged 3 commits on Jun 11, 2024
13 changes: 13 additions & 0 deletions docs/core_docs/docs/integrations/chat/openai.mdx
@@ -138,3 +138,16 @@ import OpenAIStreamTokens from "@examples/models/chat/integration_openai_stream_
:::tip
See the LangSmith trace [here](https://smith.langchain.com/public/66bf7377-cc69-4676-91b6-25929a05e8b7/r)
:::

### Disabling parallel tool calls

If you have multiple tools bound to the model but want only a single tool to be called per model turn, you can pass the `parallel_tool_calls` call option to toggle this behavior.
By default, `parallel_tool_calls` is `true`, meaning the model may call several tools in a single response.

import OpenAIParallelToolCallsTokens from "@examples/models/chat/integration_openai_parallel_tool_calls.ts";

<CodeBlock language="typescript">{OpenAIParallelToolCallsTokens}</CodeBlock>

:::tip
See the LangSmith trace for the first invocation [here](https://smith.langchain.com/public/68f2ff13-6331-47d8-a8c0-d1745788e84e/r) and the second invocation [here](https://smith.langchain.com/public/6c2fff29-9470-486a-8715-805fda631024/r)
:::
85 changes: 85 additions & 0 deletions examples/src/models/chat/integration_openai_parallel_tool_calls.ts
@@ -0,0 +1,85 @@
import { ChatOpenAI } from "@langchain/openai";
import { z } from "zod";
import { zodToJsonSchema } from "zod-to-json-schema";

const model = new ChatOpenAI({
  temperature: 0,
  model: "gpt-4o",
});

// Define your tools
const calculatorSchema = z
  .object({
    operation: z.enum(["add", "subtract", "multiply", "divide"]),
    number1: z.number(),
    number2: z.number(),
  })
  .describe("A tool to perform basic arithmetic operations");
const weatherSchema = z
  .object({
    city: z.string().describe("The city to get the weather for"),
  })
  .describe("A tool to get the weather in a city");

// Bind tools to the model
const modelWithTools = model.bindTools([
  {
    type: "function",
    function: {
      name: "calculator",
      description: calculatorSchema.description,
      parameters: zodToJsonSchema(calculatorSchema),
    },
  },
  {
    type: "function",
    function: {
      name: "weather",
      description: weatherSchema.description,
      parameters: zodToJsonSchema(weatherSchema),
    },
  },
]);

// Invoke the model with `parallel_tool_calls` set to `true`
const response = await modelWithTools.invoke(
  ["What is the weather in san francisco and what is 23716 times 27342?"],
  {
    parallel_tool_calls: true,
  }
);
console.log(response.tool_calls);
// We can see it called two tools
/*
[
  {
    name: 'weather',
    args: { city: 'san francisco' },
    id: 'call_c1KymEIix7mdlFtgLSnTXmDc'
  },
  {
    name: 'calculator',
    args: { operation: 'multiply', number1: 23716, number2: 27342 },
    id: 'call_ANLYclAmXQ4TwUCLXakbPr3Z'
  }
]
*/

// Invoke the model with `parallel_tool_calls` set to `false`
const response2 = await modelWithTools.invoke(
  ["What is the weather in san francisco and what is 23716 times 27342?"],
  {
    parallel_tool_calls: false,
  }
);
console.log(response2.tool_calls);
// We can see it called only one tool
/*
[
  {
    name: 'weather',
    args: { city: 'san francisco' },
    id: 'call_Rk34XffawJjgZ2BCK9E4CwlT'
  }
]
*/
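For reference, the JSON Schema that `zodToJsonSchema` serializes from the calculator Zod schema above looks roughly like the following. This is a hand-written sketch of the structure (field order and `$schema` metadata omitted), not the serializer's exact output; it is what the OpenAI API ultimately receives as the function's `parameters`.

```typescript
// Approximate JSON Schema for the calculator tool (illustrative sketch).
const calculatorJsonSchema = {
  type: "object",
  properties: {
    operation: {
      type: "string",
      enum: ["add", "subtract", "multiply", "divide"],
    },
    number1: { type: "number" },
    number2: { type: "number" },
  },
  required: ["operation", "number1", "number2"],
  additionalProperties: false,
  description: "A tool to perform basic arithmetic operations",
};

console.log(calculatorJsonSchema.required.join(","));
// → "operation,number1,number2"
```

Because the schema marks all three fields as required and constrains `operation` to four enum values, the model's generated tool-call arguments are steered toward well-formed calculator inputs.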
2 changes: 1 addition & 1 deletion libs/langchain-openai/package.json
@@ -37,7 +37,7 @@
"dependencies": {
Hey there! I noticed the change in the "openai" dependency version from "^4.41.1" to "^4.49.1" in the package.json file. This comment is to flag the dependency change for the maintainers to review, as it may impact the project's dependencies. Great work on the PR!

"@langchain/core": ">=0.2.5 <0.3.0",
"js-tiktoken": "^1.0.12",
"openai": "^4.41.1",
"openai": "^4.49.1",
"zod": "^3.22.4",
"zod-to-json-schema": "^3.22.3"
},
18 changes: 17 additions & 1 deletion libs/langchain-openai/src/chat_models.ts
@@ -253,7 +253,22 @@ export interface ChatOpenAICallOptions
  promptIndex?: number;
  response_format?: { type: "json_object" };
  seed?: number;
  stream_options?: { include_usage: boolean };
  /**
   * Additional options to pass to streamed completions.
   */
  stream_options?: {
    /**
     * Whether or not to include token usage in the stream.
     * If set to `true`, this will include an additional
     * chunk at the end of the stream with the token usage.
     */
    include_usage: boolean;
  };
  /**
   * Whether or not to restrict the ability to
   * call multiple tools in one response.
   */
  parallel_tool_calls?: boolean;
}

/**
@@ -557,6 +572,7 @@ export class ChatOpenAI<
...(options?.stream_options !== undefined
? { stream_options: options.stream_options }
: {}),
parallel_tool_calls: options?.parallel_tool_calls,
...this.modelKwargs,
};
return params;
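The parameter-assembly change above can be sketched in isolation. This is a simplified illustration, not the library's actual helper: `buildParams` and the trimmed-down option set are hypothetical names; the point is that `stream_options` is conditionally spread in only when defined, while `parallel_tool_calls` is passed through unconditionally (an `undefined` value is simply dropped by the OpenAI client when serializing the request).

```typescript
// Illustrative sketch of how call options flow into the request params.
interface SketchCallOptions {
  stream_options?: { include_usage: boolean };
  parallel_tool_calls?: boolean;
}

function buildParams(options?: SketchCallOptions): Record<string, unknown> {
  return {
    model: "gpt-4o",
    // Only include stream_options when the caller actually set it.
    ...(options?.stream_options !== undefined
      ? { stream_options: options.stream_options }
      : {}),
    // Passed through even when undefined; undefined fields are omitted
    // when the request body is serialized.
    parallel_tool_calls: options?.parallel_tool_calls,
  };
}

console.log(JSON.stringify(buildParams({ parallel_tool_calls: false })));
// → {"model":"gpt-4o","parallel_tool_calls":false}
```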
@@ -274,3 +274,49 @@ test("withStructuredOutput includeRaw true", async () => {
    JSON.parse(raw.additional_kwargs.tool_calls?.[0].function.arguments ?? "")
  ).toBe(true);
});

test("parallelToolCalls param", async () => {
  const calculatorSchema = z
    .object({
      operation: z.enum(["add", "subtract", "multiply", "divide"]),
      number1: z.number(),
      number2: z.number(),
    })
    .describe("A tool to perform basic arithmetic operations");
  const weatherSchema = z
    .object({
      city: z.string().describe("The city to get the weather for"),
    })
    .describe("A tool to get the weather in a city");

  const model = new ChatOpenAI({
    model: "gpt-4o",
    temperature: 0,
  }).bindTools([
    {
      type: "function",
      function: {
        name: "calculator",
        description: calculatorSchema.description,
        parameters: zodToJsonSchema(calculatorSchema),
      },
    },
    {
      type: "function",
      function: {
        name: "weather",
        description: weatherSchema.description,
        parameters: zodToJsonSchema(weatherSchema),
      },
    },
  ]);

  const response = await model.invoke(
    ["What is the weather in san francisco and what is 23716 times 27342?"],
    {
      parallel_tool_calls: false,
    }
  );
  console.log(response.tool_calls);
  expect(response.tool_calls?.length).toBe(1);
});
20 changes: 19 additions & 1 deletion yarn.lock
@@ -10066,7 +10066,7 @@ __metadata:
    jest: ^29.5.0
    jest-environment-node: ^29.6.4
    js-tiktoken: ^1.0.12
    openai: ^4.41.1
    openai: ^4.49.1
    prettier: ^2.8.3
    release-it: ^15.10.1
    rimraf: ^5.0.1
@@ -30529,6 +30529,24 @@ __metadata:
languageName: node
linkType: hard

"openai@npm:^4.49.1":
  version: 4.49.1
  resolution: "openai@npm:4.49.1"
  dependencies:
    "@types/node": ^18.11.18
    "@types/node-fetch": ^2.6.4
    abort-controller: ^3.0.0
    agentkeepalive: ^4.2.1
    form-data-encoder: 1.7.2
    formdata-node: ^4.3.2
    node-fetch: ^2.6.7
    web-streams-polyfill: ^3.2.1
  bin:
    openai: bin/cli
  checksum: b9bc845d25412d6b6ad827fb1363a4029935d8eb85a8708e55f5cf2852a0551b8720c8099edcbb0a2c2ab2be2d8f652a97061d9898b908e29d9bb2f727304b6e
  languageName: node
  linkType: hard

"openapi-types@npm:^12.1.3":
  version: 12.1.3
  resolution: "openapi-types@npm:12.1.3"