# LangChain | Sentry for JavaScript

This integration works in the Node.js, Cloudflare Workers, Vercel Edge Functions, and browser runtimes. It requires SDK version `10.22.0` or higher.

*Import name: `Sentry.langChainIntegration`*

The `langChainIntegration` instruments LangChain by automatically wrapping its operations, capturing spans for AI agent interactions with configurable input and output recording.
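In server runtimes such as Node.js, the integration can be enabled in `Sentry.init`. A minimal sketch (the DSN is a placeholder, and `tracesSampleRate` is shown because spans require tracing to be enabled):

```javascript
import * as Sentry from "@sentry/node";

Sentry.init({
  dsn: "https://examplePublicKey@o0.ingest.sentry.io/0", // placeholder DSN
  tracesSampleRate: 1.0, // enable tracing so LangChain spans are captured
  integrations: [
    Sentry.langChainIntegration({
      recordInputs: true, // optional: record prompts/messages
      recordOutputs: true, // optional: record responses
    }),
  ],
});
```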

For browser applications, you need to manually instrument LangChain operations using the `createLangChainCallbackHandler` helper:

```javascript
import * as Sentry from "@sentry/browser";
import { ChatAnthropic } from "@langchain/anthropic";

// Create a LangChain callback handler
const callbackHandler = Sentry.createLangChainCallbackHandler({
  recordInputs: true, // Optional: record input prompts/messages
  recordOutputs: true, // Optional: record output responses
});

// Use with chat models
const model = new ChatAnthropic({
  model: "claude-3-5-sonnet-20241022",
  apiKey: process.env.ANTHROPIC_API_KEY,
});

await model.invoke("Tell me a joke", {
  callbacks: [callbackHandler],
});
```

## Options

### `recordInputs`

*Type: `boolean`*

Records inputs to LangChain operations (such as prompts and messages).

Defaults to `true` if `sendDefaultPii` is `true`; otherwise defaults to `false`.

```javascript
Sentry.init({
  integrations: [Sentry.langChainIntegration({ recordInputs: true })],
});
```

### `recordOutputs`

*Type: `boolean`*

Records outputs from LangChain operations (such as generated text and responses).

Defaults to `true` if `sendDefaultPii` is `true`; otherwise defaults to `false`.

```javascript
Sentry.init({
  integrations: [Sentry.langChainIntegration({ recordOutputs: true })],
});
```
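Rather than setting each option individually, both can be enabled at once via `sendDefaultPii`, since they default to its value. A sketch of this shorthand (the DSN is a placeholder):

```javascript
import * as Sentry from "@sentry/node";

Sentry.init({
  dsn: "https://examplePublicKey@o0.ingest.sentry.io/0", // placeholder DSN
  sendDefaultPii: true, // recordInputs and recordOutputs then default to true
  integrations: [Sentry.langChainIntegration()],
});
```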

## Configuration

By default, this integration adds tracing support for LangChain operations, including:

* **Chat model invocations** (`gen_ai.request`) - Captures spans for chat model calls
* **LLM invocations** (`gen_ai.pipeline`) - Captures spans for LLM pipeline executions
* **Chain executions** (`gen_ai.invoke_agent`) - Captures spans for chain invocations
* **Tool executions** (`gen_ai.execute_tool`) - Captures spans for tool calls

### Supported Runnables

The integration automatically instruments the following LangChain runnable methods:

* `invoke()` - Single execution
* `stream()` - Streaming execution
* `batch()` - Batch execution
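With automatic instrumentation enabled, calls to these methods produce spans without any extra wiring. A usage sketch, assuming `@langchain/anthropic` is installed and `ANTHROPIC_API_KEY` is set in the environment:

```javascript
import { ChatAnthropic } from "@langchain/anthropic";

const model = new ChatAnthropic({
  model: "claude-3-5-sonnet-20241022",
  apiKey: process.env.ANTHROPIC_API_KEY,
});

// invoke(): single execution, one span per call
const reply = await model.invoke("Summarize LangChain in one sentence.");

// stream(): streaming execution; the span covers the full stream
for await (const chunk of await model.stream("Count to three.")) {
  process.stdout.write(String(chunk.content));
}

// batch(): batch execution over multiple inputs
const replies = await model.batch(["Hello", "Goodbye"]);
```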

### Supported Providers

The automatic instrumentation supports the following LangChain provider packages:

* `@langchain/anthropic`
* `@langchain/openai`
* `@langchain/google-genai`
* `@langchain/mistralai`
* `@langchain/google-vertexai`
* `@langchain/groq`

## Supported Versions

* `langchain`: `>=0.1.0 <2.0.0`
