5 changes: 5 additions & 0 deletions .changeset/expose-abort-usage.md
@@ -0,0 +1,5 @@
---
'ai': patch
---

feat(ai): expose usage metrics in onAbort callback
6 changes: 5 additions & 1 deletion content/docs/03-ai-sdk-core/50-error-handling.mdx
@@ -123,7 +123,11 @@ for await (const textPart of textStream) {

The `onAbort` callback receives:

- `steps`: An array of all completed steps before the abort
- `steps`: All completed steps before the abort
- `usage`: Token usage for the current (aborted) step, or `undefined` if not yet received
- `totalUsage`: Aggregated usage across all completed steps plus the current step, or `undefined` if no usage data was received
- `inputMessages`: The input messages for the current step, including prior step responses
- `partialText`: The text being streamed at abort time, or `undefined` if no text was generated
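The payload described above can be summarized in a small, testable helper. This is a minimal sketch, not part of this PR: the `LanguageModelUsage` field name `totalTokens` and the `summarizeAbort` helper are assumptions for illustration.

```typescript
// Hypothetical shapes mirroring the onAbort payload described above.
// `totalTokens` is an assumed field name on the usage object.
type Usage = { totalTokens?: number };

type AbortInfo = {
  steps: unknown[];
  usage?: Usage;
  totalUsage?: Usage;
  partialText?: string;
};

// Summarize what was produced before the abort. Every optional field may
// be undefined if the provider had not yet streamed the corresponding data.
function summarizeAbort({ steps, usage, totalUsage, partialText }: AbortInfo): string {
  return [
    `completed steps: ${steps.length}`,
    `current step tokens: ${usage?.totalTokens ?? 'not received'}`,
    `total tokens: ${totalUsage?.totalTokens ?? 'not received'}`,
    `partial text length: ${partialText?.length ?? 0}`,
  ].join('\n');
}
```

A handler like this keeps the abort path side-effect free, which makes the undefined cases easy to exercise in tests.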

You can also handle abort events directly in the stream:

24 changes: 24 additions & 0 deletions content/docs/07-reference/01-ai-sdk-core/02-stream-text.mdx
@@ -1799,6 +1799,30 @@ To see `streamText` in action, check out [these examples](#examples).
type: 'Array<StepResult>',
description: 'Details for all previously finished steps.',
},
{
name: 'usage',
type: 'LanguageModelUsage | undefined',
description:
'Token usage for the current (aborted) step. undefined if the provider had not yet sent usage data before the abort.',
},
{
name: 'totalUsage',
type: 'LanguageModelUsage | undefined',
description:
'Aggregated token usage across all completed steps plus the current step. undefined if no usage data was received before the abort.',
},
{
name: 'inputMessages',
type: 'ModelMessage[]',
description:
'The input messages for the current step, including initial messages and all prior step responses. Empty array if abort fires before the first step starts.',
},
{
name: 'partialText',
type: 'string | undefined',
description:
'The text being streamed in the current step at abort time. undefined if no text had been generated. Resets at each step boundary.',
},
],
},
],
39 changes: 35 additions & 4 deletions examples/ai-functions/src/stream-text/openai/abort.ts
@@ -1,13 +1,44 @@
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';
import { stepCountIs, streamText, tool } from 'ai';
import { z } from 'zod';
import { run } from '../../lib/run';

const tools = {
add: tool({
description: 'Add two numbers',
inputSchema: z.object({ a: z.number(), b: z.number() }),
execute: async ({ a, b }) => ({ result: a + b }),
}),
multiply: tool({
description: 'Multiply two numbers',
inputSchema: z.object({ a: z.number(), b: z.number() }),
execute: async ({ a, b }) => ({ result: a * b }),
}),
};

run(async () => {
let stepCount = 0;

try {
const { textStream } = streamText({
model: openai('gpt-3.5-turbo'),
prompt: 'Write a short story about a robot learning to love:\n\n',
abortSignal: AbortSignal.timeout(3000),
model: openai('gpt-4o-mini'),
tools,
stopWhen: stepCountIs(5),
prompt:
'First add 7 and 5. Then multiply the result by 6. Then write a very long detailed essay (at least 500 words) about the number you got.',
abortSignal: AbortSignal.timeout(10000),
onStepFinish(step) {
stepCount++;
console.log(`\n[Step ${stepCount} finished]`);
console.log(' Step usage:', step.usage);
},
onAbort({ usage, totalUsage, inputMessages, partialText }) {
console.log('\n\nStream aborted mid-generation.');
console.log('Current step usage:', usage);
console.log('Total usage:', totalUsage);
console.log('Input messages:', inputMessages);
console.log('Partial text:', partialText);
},
});

for await (const textPart of textStream) {