removing all LLMA beta labels from docs (#12557)

* removing all beta labels from docs

* chore(llma): rename LLMA team

---------

Co-authored-by: Radu Raicea <radu@raicea.com>
Author: Edwin Lim
Date: 2025-08-27 04:18:51 -04:00
Committed by: GitHub
Parent: 3a70755d28
Commit: 07ec386126
20 changed files with 25 additions and 51 deletions

View File

@@ -1,5 +1,5 @@
---
-title: LLM analytics dashboard (beta)
+title: LLM analytics dashboard
availability:
free: full
selfServe: full

View File

@@ -6,8 +6,6 @@ showStepsToc: true
import LLMsSDKsCallout from './_snippets/llms-sdks-callout.mdx'
import VerifyLLMEventsStep from './_snippets/verify-llm-events-step.mdx'
-LLM analytics is currently considered in `beta`. To access it, enable the [feature preview](https://app.posthog.com/settings/user-feature-previews#llm-observability) in your PostHog account.
<Steps>
<Step title="Install the PostHog SDK" badge="required">

View File

@@ -6,8 +6,6 @@ showStepsToc: true
import LLMsSDKsCallout from './_snippets/llms-sdks-callout.mdx'
import VerifyLLMEventsStep from './_snippets/verify-llm-events-step.mdx'
-LLM analytics is currently considered in `beta`. To access it, enable the [feature preview](https://app.posthog.com/settings/user-feature-previews#llm-observability) in your PostHog account.
<Steps>
<Step title="Install the PostHog SDK" badge="required">

View File

@@ -6,8 +6,6 @@ showStepsToc: true
import LLMsSDKsCallout from './_snippets/llms-sdks-callout.mdx'
import VerifyLLMEventsStep from './_snippets/verify-llm-events-step.mdx'
-LLM analytics is currently considered in `beta`. To access it, enable the [feature preview](https://app.posthog.com/settings/user-feature-previews#llm-observability) in your PostHog account.
<Steps>
<Step title="Install the PostHog SDK" badge="required">

View File

@@ -6,8 +6,6 @@ showStepsToc: true
import LLMsSDKsCallout from './_snippets/llms-sdks-callout.mdx'
import VerifyLLMEventsStep from './_snippets/verify-llm-events-step.mdx'
-LLM analytics is currently considered in `beta`. To access it, enable the [feature preview](https://app.posthog.com/settings/user-feature-previews#llm-observability) in your PostHog account.
<Steps>
<Step title="Install the PostHog SDK" badge="required">

View File

@@ -6,8 +6,6 @@ showStepsToc: true
import LLMsSDKsCallout from './_snippets/llms-sdks-callout.mdx'
import VerifyLLMEventsStep from './_snippets/verify-llm-events-step.mdx'
-LLM analytics is currently considered in `beta`. To access it, enable the [feature preview](https://app.posthog.com/settings/user-feature-previews#llm-observability) in your PostHog account.
<Steps>
<Step title="Install the PostHog SDK" badge="required">

View File

@@ -1,5 +1,5 @@
---
-title: LLM analytics integrations (beta)
+title: LLM analytics integrations
availability:
free: full
selfServe: full

View File

@@ -6,8 +6,6 @@ availability:
enterprise: full
---
-> This integration is currently an [opt-in public beta](/docs/getting-started/enable-betas). This means it's not yet a perfect experience, but we'd love to know your thoughts. Please [share your feedback](http://us.posthog.com/home#supportModal) and [follow our roadmap](https://github.com/PostHog/posthog/issues/18547).
You can integrate with [Keywords AI](https://www.keywordsai.co) and bring data into PostHog for analysis. Additionally, we offer a dashboard template to help you quickly get insights into your LLM product.
## How to install the integration

View File

@@ -6,8 +6,6 @@ availability:
enterprise: full
---
-> This integration is currently an [opt-in public beta](/docs/getting-started/enable-betas). This means it's not yet a perfect experience, but we'd love to know your thoughts. Please [share your feedback](http://us.posthog.com/home#supportModal) and [follow our roadmap](https://github.com/PostHog/posthog/issues/18547).
You can integrate with [Traceloop](https://www.traceloop.com/) and bring data into PostHog for analysis. Additionally, we offer a dashboard template to help you quickly get insights into your LLM product.
## How to install the integration

View File

@@ -30,12 +30,6 @@ The first step is to install a PostHog SDK to capture conversations, requests, a
<LLMsInstallationPlatforms />
-### Beta 🚧
-LLM analytics is currently considered in `beta`. To access it, enable the [feature preview](https://app.posthog.com/settings/user-feature-previews#llm-observability) in your PostHog account.
-We are keen to gather as much feedback as possible so if you try this out please let us know. You can email [peter@posthog.com](mailto:peter@posthog.com) and [radu@posthog.com](mailto:radu@posthog.com), send feedback via the [in-app support panel](https://us.posthog.com#panel=support%3Afeedback%3Aexperiments%3Alow), or use one of our other [support options](/docs/support-options).
<CallToAction type="primary" to="/docs/llm-analytics/installation">
Install PostHog SDK
</CallToAction>
@@ -130,17 +124,23 @@ PostHog's SDK wrappers handle all the heavy lifting. Use your LLM provider as no
<QuestLogItem
title="Use for free"
-subtitle="Open beta"
+subtitle="Free 100k events/mo"
icon="IconPiggyBank"
>
-LLM analytics is currently in beta. Events are currently priced the same as regular PostHog data events, which comes with a generous free tier and transparent usage-based pricing.
+PostHog LLM analytics is designed to be cost-effective with a generous free tier and transparent usage-based pricing. Since we don't charge per seat, more than 90% of companies use PostHog for free.
-No credit card required to start. To access LLM analytics, enable the [feature preview](https://app.posthog.com/settings/user-feature-previews#llm-observability) in your PostHog account.
+### TL;DR
---
+- No credit card required to start
+- First 100K LLM events per month are free with 30-day retention
+- Above 100k we have usage-based pricing starting at $0.00006/event with discounts as volume increases
+- Set billing limits to avoid surprise charges
+- See our [pricing page](/pricing) for more up-to-date details
That's it! You're ready to start integrating.
---
That's it! You're ready to start integrating.
<CallToAction type="primary" to="/docs/llm-analytics/installation">
Install LLM analytics
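As a quick illustration of the usage-based pricing added in the TL;DR bullets above, here is a minimal sketch of what those figures work out to before volume discounts (the numbers are taken from the bullets; the function name is purely illustrative, and current rates live on the pricing page):

```python
# Rough monthly cost estimate using the quoted figures: first 100k LLM events
# free, then $0.00006/event (before volume discounts and billing limits).
FREE_EVENTS = 100_000
PRICE_PER_EVENT_USD = 0.00006

def estimated_monthly_cost(llm_events: int) -> float:
    billable = max(0, llm_events - FREE_EVENTS)
    return billable * PRICE_PER_EVENT_USD

print(estimated_monthly_cost(500_000))  # 400,000 billable events -> 24.0 USD
```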

View File

@@ -39,4 +39,4 @@ This goes for everything at PostHog but in AI features specifically, we use
- With OpenAI and Anthropic libraries in Python, use LLM analytics wrappers in `posthoganalytics.ai.openai` & `posthoganalytics.ai.anthropic`
- With LangChain, use the LLM analytics callback handler in `posthoganalytics.ai.langchain.callbacks`
-This will give you and the organization full visibility into your feature see the [LLM analytics dashboard](https://us.posthog.com/project/2/llm-observability). Feel free to leave feedback with [#team-llm-observability](https://posthog.slack.com/archives/C087XQ7K9K7).
+This will give you and the organization full visibility into your feature see the [LLM analytics dashboard](https://us.posthog.com/project/2/llm-analytics). Feel free to leave feedback with [#team-llm-analytics](https://posthog.slack.com/archives/C087XQ7K9K7).
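For reference, a minimal sketch of the wrapper pattern named in the bullets above, assuming the internal `posthoganalytics` package mirrors the public `posthog.ai.openai` client (the `OpenAI` class name, the module-level `api_key` setup, and the `posthog_client` argument are assumptions, not taken from this diff):

```python
# Hedged sketch: route OpenAI calls through the PostHog LLM analytics wrapper.
import os

import posthoganalytics
from posthoganalytics.ai.openai import OpenAI  # wrapper module named in the handbook bullet

# Configure the analytics client (assumed module-level setup).
posthoganalytics.api_key = os.environ["POSTHOG_API_KEY"]

# The wrapped client is used exactly like the vanilla OpenAI SDK; the wrapper
# captures model, token usage, latency, and cost for the LLM analytics dashboard.
client = OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],
    posthog_client=posthoganalytics,  # assumed argument for forwarding events
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize this support ticket."}],
)
print(response.choices[0].message.content)
```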

View File

@@ -45,7 +45,7 @@ Here is a overview that shows which of our PMs currently works with which team:
**Teams with no PM currently**
- [CRM](/teams/crm)
-- [LLM Observability](/teams/llm-observability)
+- [LLM Analytics](/teams/llm-analytics)
- [Max AI](/teams/max-ai)
- [Revenue Analytics](/teams/revenue-analytics)

View File

@@ -54,7 +54,7 @@ How much is each LLM call costing you? Is the user getting the result they want
Or, the one piece of information Diego is curious about: “What happened in the LLM that influenced a customer? If products once spread via word of mouth — we will start seeing them spread through word of... next token prediction?"
-It's quite possible that a founder might not know to ask these questions, let alone where to look to answer them, and it's why staying alert matters. Towards this, PostHog has just launched a new beta product in this category, introducing [LLM analytics](/docs/llm-analytics) to help you follow more closely what's happening with your LLM calls.
+It's quite possible that a founder might not know to ask these questions, let alone where to look to answer them, and it's why staying alert matters. Towards this, PostHog has just launched a new product in this category, introducing [LLM analytics](/docs/llm-analytics) to help you follow more closely what's happening with your LLM calls.
It's an exciting area that has us paying attention alongside Diego and other AI-focused founders.

View File

@@ -1,5 +1,5 @@
---
-title: LLM Observability
+title: LLM Analytics
sidebar: Handbook
showTitle: true
hideAnchor: false

View File

@@ -1,12 +1,12 @@
### Q3 2025 objectives
-#### Goal 1: General Availability Launch of LLM Observability
+#### Goal 1: General Availability Launch of LLM Analytics
-*Description*: Make LLM Observability a fully supported PostHog product available to all customers. This will help teams reliably understand, monitor, and optimize their AI applications at scale.
+*Description*: Make LLM Analytics a fully supported PostHog product available to all customers. This will help teams reliably understand, monitor, and optimize their AI applications at scale.
*What we will ship*:
- Comprehensive, easy-to-follow documentation covering all major workflows and SDKs
-- Integrated into the onboarding experience to help AI companies discover and set up LLM Observability quickly
+- Integrated into the onboarding experience to help AI companies discover and set up LLM Analytics quickly
#### Goal 2: Evals

View File

@@ -177,7 +177,7 @@ Now, when we run `npm run dev` again and submit an input, we should see a respon
## 3. Viewing generations in PostHog
-Once you generate a few responses, go to PostHog and enable the [LLM analytics feature preview](https://app.posthog.com/settings/user-feature-previews#llm-observability). Once enabled, go to the LLM analytics tab to get an overview of traces, users, costs, and more.
+Once you generate a few responses, go to PostHog's [LLM analytics tab](https://app.posthog.com/llm-analytics) to get an overview of traces, users, costs, and more.
<ProductScreenshot
imageLight="https://res.cloudinary.com/dmukukwp6/image/upload/Clean_Shot_2025_02_05_at_10_00_08_2x_a4773a9cd5.png"

View File

@@ -188,7 +188,7 @@ Now, when we run `npm run dev` again and submit an input, we should see a respon
## 3. Viewing generations in PostHog
-Once you generate a few responses, go to PostHog and enable the [LLM analytics feature preview](https://app.posthog.com/settings/user-feature-previews#llm-observability). Once enabled, go to the LLM observability tab to get an overview of traces, users, costs, and more.
+Once you generate a few responses, go to PostHog's [LLM analytics tab](https://app.posthog.com/llm-analytics) to get an overview of traces, users, costs, and more.
<ProductScreenshot
imageLight="https://res.cloudinary.com/dmukukwp6/image/upload/Clean_Shot_2025_02_14_at_18_28_10_2x_242d0e7bf5.png"

View File

@@ -171,7 +171,7 @@ Now, when we run `npm run dev` again and submit an input, we should see a respon
## 3. Viewing generations in PostHog
-Once you generate a few responses, go to PostHog and enable the [LLM analytics feature preview](https://app.posthog.com/settings/user-feature-previews#llm-observability). Once enabled, go to the LLM observability tab to get an overview of traces, users, costs, and more.
+Once you generate a few responses, go to PostHog's [LLM analytics tab](https://app.posthog.com/llm-analytics) to get an overview of traces, users, costs, and more.
<ProductScreenshot
imageLight="https://res.cloudinary.com/dmukukwp6/image/upload/Clean_Shot_2025_01_23_at_10_58_04_2x_a87f97d692.png"
@@ -180,7 +180,7 @@ Once you generate a few responses, go to PostHog and enable the [LLM analytics f
classes="rounded"
/>
-You can also go into more detail by clicking on the [generations tab](https://us.posthog.com/llm-observability/generations). This shows each generation as well as model, cost, token usage, latency, and more. You can even see the conversation input and output.
+You can also go into more detail by clicking on the [generations tab](https://us.posthog.com/llm-analytics/generations). This shows each generation as well as model, cost, token usage, latency, and more. You can even see the conversation input and output.
<ProductScreenshot
imageLight="https://res.cloudinary.com/dmukukwp6/image/upload/Clean_Shot_2025_01_23_at_11_05_47_2x_31ac89084d.png"

View File

@@ -169,7 +169,7 @@ Now, when you run `npm run dev` again, you can choose your model, enter your mes
## 3. Viewing generations in PostHog
-After generating a few responses with different models, go to PostHog and enable the [LLM observability feature preview](https://app.posthog.com/settings/user-feature-previews#llm-observability). Once enabled, you can access the [LLM analytics dashboard](https://app.posthog.com/llm-analytics) to see:
+After generating a few responses with different models, go to PostHog to access the [LLM analytics dashboard](https://app.posthog.com/llm-analytics) to see:
- Overview of all AI interactions
- Cost breakdowns by model

View File

@@ -3083,10 +3083,6 @@ export const docsMenu = {
url: 'https://posthog.com/docs/llm-analytics',
icon: 'IconAIText',
color: 'yellow',
-badge: {
-title: 'Beta',
-className: 'uppercase !bg-blue/10 !text-blue !dark:text-white !dark:bg-blue/50',
-},
},
{
name: 'Autocapture',
@@ -4200,17 +4196,9 @@ export const docsMenu = {
colorDark: '[#C170E8]',
icon: 'IconAI',
description: 'Insights for building your AI and LLM products',
-badge: {
-title: 'Beta',
-className: 'uppercase !bg-blue/10 !text-blue !dark:text-white !dark:bg-blue/50',
-},
children: [
{
name: 'LLM analytics',
-badge: {
-title: 'Beta',
-className: 'uppercase !bg-blue/10 !text-blue !dark:text-white !dark:bg-blue/50',
-},
},
{
name: 'Overview',