`contents/docs/llm-analytics/installation/opencode.mdx` (130 additions):

---
title: OpenCode LLM analytics installation
platformIconName: IconOpenCode
showStepsToc: true
tableOfContents: [
{
url: 'prerequisites',
value: 'Prerequisites',
depth: 1,
},
{
url: 'install-the-plugin',
value: 'Install the plugin',
depth: 1,
},
{
url: 'configure-posthog',
value: 'Configure PostHog',
depth: 1,
},
{
url: 'start-opencode',
value: 'Start OpenCode',
depth: 1,
},
{
url: 'verify-traces-and-generations',
value: 'Verify traces and generations',
depth: 1,
},
{
url: 'configuration-options',
value: 'Configuration options',
depth: 1,
},
]
---

[OpenCode](https://opencode.ai) is an open-source coding agent that runs in your terminal. The `opencode-posthog` plugin captures LLM generations, tool executions, and conversation traces as `$ai_generation`, `$ai_span`, and `$ai_trace` events and sends them to PostHog.

## Prerequisites

You need:

- [OpenCode](https://opencode.ai/docs/) installed
- A [PostHog account](https://us.posthog.com/signup) with a project API key

## Install the plugin

Add `opencode-posthog` to your `opencode.json`:

```json
{
"$schema": "https://opencode.ai/config.json",
"plugin": ["opencode-posthog"]
}
```

The package installs automatically at startup and is cached in `~/.cache/opencode/node_modules/`.

## Configure PostHog

Set environment variables with your PostHog project API key and host. You can find these in your [PostHog project settings](https://us.posthog.com/settings/project).

```bash
export POSTHOG_API_KEY="<ph_project_api_key>"
export POSTHOG_HOST="<ph_client_api_host>"
```

All other plugin configuration is also done with environment variables. If `POSTHOG_API_KEY` is not set, the plugin is a no-op.

> **Tip:** You can add these environment variables to your shell profile (for example, `~/.zshrc` or `~/.bashrc`) so they persist across sessions.
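
Concretely, one way to persist them for zsh (the placeholder values are illustrative — substitute your real key and host):

```bash
# Append the exports to your zsh profile; use ~/.bashrc for bash.
PROFILE="$HOME/.zshrc"
{
  echo 'export POSTHOG_API_KEY="<ph_project_api_key>"'
  echo 'export POSTHOG_HOST="https://us.i.posthog.com"'
} >> "$PROFILE"
```

Open a new terminal (or run `source "$PROFILE"`) for the change to take effect.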

## Start OpenCode

Start OpenCode as normal:

```bash
opencode
```

The plugin initializes automatically and starts capturing LLM calls, tool executions, and completed prompt traces once OpenCode is running.

## Verify traces and generations

After running a few prompts through OpenCode:

1. Go to the [LLM analytics](https://us.posthog.com/llm-analytics) tab in PostHog.
2. You should see traces and generations appearing within a few minutes.

## Configuration options

All configuration is done via environment variables:

| Variable | Default | Description |
|---|---|---|
| `POSTHOG_API_KEY` | _(required)_ | Your PostHog project API key |
| `POSTHOG_HOST` | `https://us.i.posthog.com` | PostHog instance URL |
| `POSTHOG_PRIVACY_MODE` | `false` | Redact all LLM input/output content when `true` |
| `POSTHOG_ENABLED` | `true` | Set to `false` to disable the plugin |
| `POSTHOG_DISTINCT_ID` | machine hostname | The `distinct_id` for all events |
| `POSTHOG_PROJECT_NAME` | cwd basename | Project name in all events |
| `POSTHOG_TAGS` | _(none)_ | Custom tags: `key1:val1,key2:val2` |
| `POSTHOG_MAX_ATTRIBUTE_LENGTH` | `12000` | Max length for serialized tool input/output |
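
For example, a fuller configuration might look like this (all values below are illustrative):

```bash
export POSTHOG_API_KEY="<ph_project_api_key>"
export POSTHOG_HOST="https://eu.i.posthog.com"   # e.g. EU Cloud instead of the US default
export POSTHOG_PRIVACY_MODE="true"               # capture metadata only, no prompt/response content
export POSTHOG_DISTINCT_ID="ci-runner-01"        # override the hostname default
export POSTHOG_PROJECT_NAME="checkout-service"   # override the cwd basename
export POSTHOG_TAGS="team:platform,env:staging"  # key:value pairs, comma-separated
```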

### What gets captured

The plugin captures three types of events:

- **`$ai_generation`**: Every LLM call, including model, provider, token usage, cost, stop reason, and message content in [OpenAI chat format](/docs/llm-analytics/generations).
- **`$ai_span`**: Each tool execution, including tool name, duration, input and output state, and any error information ([learn more](/docs/llm-analytics/spans)).
- **`$ai_trace`**: Completed user prompts with total latency, accumulated token counts, and final input and output state ([learn more](/docs/llm-analytics/traces)).
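
As an illustration, a captured `$ai_generation` event carries properties along these lines (a hypothetical sketch — the property names follow PostHog's LLM analytics conventions, but the exact set the plugin sends is defined by the plugin, not this example):

```json
{
  "event": "$ai_generation",
  "properties": {
    "$ai_model": "claude-sonnet-4",
    "$ai_provider": "anthropic",
    "$ai_input_tokens": 1200,
    "$ai_output_tokens": 350,
    "$ai_input": [{ "role": "user", "content": "..." }],
    "$ai_output_choices": [{ "role": "assistant", "content": "..." }]
  }
}
```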

### Privacy mode

When `POSTHOG_PRIVACY_MODE=true`, content fields such as `$ai_input`, `$ai_output_choices`, `$ai_input_state`, and `$ai_output_state` are set to `null`. Token counts, costs, latency, and model metadata are still captured.

Sensitive keys in tool inputs and outputs, including `api_key`, `token`, `secret`, `password`, `authorization`, `credential`, and `private_key`, are automatically redacted regardless of whether privacy mode is enabled.
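
The redaction described above can be sketched as a recursive walk that replaces any value whose key matches the sensitive list (the function name and the `[REDACTED]` placeholder are illustrative, not the plugin's actual implementation):

```typescript
// Keys from the sensitive list above; matching is case-insensitive and
// substring-based, so e.g. "Authorization" and "openai_api_key" both match.
const SENSITIVE_KEYS = [
    "api_key", "token", "secret", "password",
    "authorization", "credential", "private_key",
]

function redactSensitive(value: unknown): unknown {
    if (Array.isArray(value)) {
        return value.map(redactSensitive)
    }
    if (value !== null && typeof value === "object") {
        return Object.fromEntries(
            Object.entries(value).map(([key, v]) =>
                SENSITIVE_KEYS.some((s) => key.toLowerCase().includes(s))
                    ? [key, "[REDACTED]"]
                    : [key, redactSensitive(v)]
            )
        )
    }
    return value
}
```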

## Next steps

Now that you're capturing AI conversations, use the resources below to see what else you can do with LLM analytics in PostHog.

| Resource | Description |
|----------|-------------|
| [Basics](/docs/llm-analytics/basics) | Learn the basics of how LLM calls become events in PostHog. |
| [Generations](/docs/llm-analytics/generations) | Read about the `$ai_generation` event and its properties. |
| [Traces](/docs/llm-analytics/traces) | Explore the trace hierarchy and how to use it to debug LLM calls. |
| [Spans](/docs/llm-analytics/spans) | Review spans and their role in representing individual operations. |
| [Analyze LLM performance](/docs/llm-analytics/dashboard) | Learn how to create dashboards to analyze LLM performance. |
`src/components/OSIcons/Icons.tsx` (12 additions) — adds an `IconOpenCode` icon between the existing `IconOpenClaw` and `IconOpenAI` icons:

```tsx
export const IconOpenCode = (props: IconProps) => (
    <BaseIcon viewBox="0 0 300 300" width="100%" height="100%" {...props}>
        <path
            fillRule="evenodd"
            clipRule="evenodd"
            d="M210 60H90V240H210V60ZM270 300H30V0H270V300Z"
            className="fill-[#211E1E] dark:fill-[#F1ECEC]"
        />
        <path d="M210 240H90V120H210V240Z" className="fill-[#CFCECD] dark:fill-[#4B4646]" />
    </BaseIcon>
)
```

`src/navs/index.js` (5 additions) — registers the new page in `docsMenu`, between the Pi and Manual capture entries:

```js
{
    name: 'OpenCode',
    url: '/docs/llm-analytics/installation/opencode',
    icon: 'IconOpenCode',
},
```