Commit f71402c — docs(js): Add section on tracing at high concurrency in serverless envs (#611)

jacoblee93 authored Jan 2, 2025

## Rate limits at high concurrency

By default, the LangSmith client batches tracing operations as your traced runs execute, sending a new batch every few milliseconds.

This works well in most situations, but if your traced function is long-running and you run it at very high concurrency,
you may also hit rate limits on overall request count.

If you are seeing rate limit errors for this reason, you can try setting `manualFlushMode: true` on your client like this:

```ts
import { Client } from "langsmith";
import { traceable } from "langsmith/traceable";

const langsmithClient = new Client({
manualFlushMode: true,
});

const myTracedFunc = traceable(
async () => {
// Your logic here...
},
{ client: langsmithClient }
);
```

Then manually call `langsmithClient.flush()` before your serverless function closes, like this:

```ts
try {
await myTracedFunc();
} finally {
await langsmithClient.flush();
}
```

Note that with manual flush mode enabled, runs will not appear in the LangSmith UI until you call `.flush()`.
