Mastra is a TypeScript framework for building AI-powered applications and agents. Using Mastra’s LangSmith exporter, you can send traces from your Mastra agents and workflows to LangSmith for debugging, evaluation, and observability. This guide shows you how to integrate Mastra with LangSmith using Mastra’s AI tracing system.

Installation

Install Mastra and the LangSmith exporter:
npm install @mastra/core @mastra/langsmith @mastra/observability @mastra/libsql

Setup

  1. Set your LangSmith API key and (optionally) a LangSmith project name:
    export LANGSMITH_API_KEY=<your_langsmith_api_key>
    export LANGCHAIN_PROJECT=<your_project_name> # optional
    
    If LANGCHAIN_PROJECT is not set, traces will be sent to the default project.
  2. If you plan to use OpenAI models, also ensure you have an OpenAI API key available at runtime:
    export OPENAI_API_KEY=<your_openai_api_key>
    
  3. In your project directory, create the following project structure and files:
    src/
        mastra.ts
        agent.ts
        index.ts
    
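The environment variables from the steps above can be sanity-checked at startup. The following is a minimal sketch in plain Node.js (`missingEnv` is a hypothetical helper for illustration, not part of the Mastra API) that warns if any expected variable is unset:

```typescript
// Minimal sketch: warn at startup if expected environment variables are unset.
// `missingEnv` is a hypothetical helper for illustration, not part of Mastra.
function missingEnv(
  required: string[],
  env: Record<string, string | undefined> = process.env,
): string[] {
  // A variable counts as missing if it is unset or only whitespace.
  return required.filter((name) => !env[name] || env[name]!.trim() === "");
}

const missing = missingEnv(["LANGSMITH_API_KEY", "OPENAI_API_KEY"]);
if (missing.length > 0) {
  console.warn(`Missing environment variables: ${missing.join(", ")}`);
}
```

Calling a check like this at the top of index.ts surfaces a forgotten export step immediately, rather than later as an authentication error from the exporter or model provider.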

Configure Mastra with the LangSmith exporter

Mastra tracing is configured directly on the Mastra constructor. Add the following to a mastra.ts file:
import { Mastra } from "@mastra/core";
import { LibSQLStore } from "@mastra/libsql";
import { LangSmithExporter } from "@mastra/langsmith";

import { echoAgent } from "./agent";

export const mastra = new Mastra({
  agents: { echoAgent },

  storage: new LibSQLStore({
    url: "file:./mastra.db",
  }),

  observability: {
    configs: {
      langsmith: {
        serviceName: "mastra-langsmith-demo",
        exporters: [
          new LangSmithExporter({
            apiKey: process.env.LANGSMITH_API_KEY,
          }),
        ],
      },
    },
  },

  // Disable deprecated telemetry system
  telemetry: {
    enabled: false,
  },
});
  • Storage is required for tracing (even when exporting traces externally).
  • The LangSmith exporter reads credentials from environment variables.
  • The deprecated telemetry system is disabled to avoid warnings.
  • No separate instrumentation file is required when running Mastra outside of the Mastra server. For more details, refer to the Mastra docs.

Define an agent

For the broadest compatibility, use string-based model identifiers in the provider/model format. Add the following code to an agent.ts file:
import { Agent } from "@mastra/core/agent";

export const echoAgent = new Agent({
  name: "echoAgent",
  instructions: "You are a helpful assistant.",
  model: "openai/gpt-4o-mini",
});
Mastra will automatically route the model call using your configured API keys and capture traces for each invocation.
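The provider/model string convention used above can be illustrated with a small parser: the provider name comes before the first slash and the model identifier after it. (`parseModelId` is a hypothetical helper for illustration only; Mastra resolves these identifiers internally.)

```typescript
// Hypothetical helper illustrating the "provider/model" string convention.
// Not a Mastra API; Mastra resolves these identifiers internally.
function parseModelId(id: string): { provider: string; model: string } {
  const slash = id.indexOf("/");
  if (slash === -1) {
    throw new Error(`Expected "provider/model", got "${id}"`);
  }
  return { provider: id.slice(0, slash), model: id.slice(slash + 1) };
}

console.log(parseModelId("openai/gpt-4o-mini"));
// → { provider: 'openai', model: 'gpt-4o-mini' }
```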

Run the agent

  1. Add the following to an index.ts file:
    import { mastra } from "./mastra";
    
    async function main() {
      const agent = mastra.getAgent("echoAgent");
      const result = await agent.generate("Say hello and explain what Mastra is.");
      console.log(result.text);
    }
    
    main();
    
  2. Run your application:
    npx ts-node src/index.ts
    

View traces in LangSmith

After running the agent:
  1. Open the LangSmith UI.
  2. Select your project (the value of LANGCHAIN_PROJECT, or the default project if it was not set).
  3. Locate the trace corresponding to echoAgent.generate.
You’ll be able to inspect:
  • Model inputs and outputs
  • Agent execution steps
  • Timing and error information
