threadline
The persistent context layer for AI agents.
Every agent forgets you. Threadline remembers.
Trusted by developers building the next generation of AI products.
< 50ms context retrieval
Privacy-first by design
Works with any LLM
your-app.com
Your Agent
user interaction
user →
I prefer concise answers, I'm a backend engineer.
prompts + history
context layer
Threadline
threadline.to
tone: "concise"
role: "engineer"
tools: ["cursor", "vercel"]
enriched context
custom agent
Your Agent/Product
already knows
bot →
Got it — keeping it brief.
one context object · every agent · no repeated conversations
Your coding assistant, your support agent, your onboarding flow — every session starts from zero.
Your users repeat themselves. You keep rebuilding the same memory layer.
Privacy-first by design
User context is isolated, never shared across developers, and fully deletable on request.
One context, every agent
A persistent, structured profile that travels with your users across every agent you build.
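The "structured profile" can be pictured as a small typed object. A minimal sketch in TypeScript, assuming field names like those in the hero diagram (`tone`, `role`, `tools`); the real Threadline schema may differ:

```typescript
// Hypothetical shape of a Threadline context object.
// Field names mirror the hero example; the actual SDK schema may differ.
interface UserContext {
  tone?: string      // e.g. "concise"
  role?: string      // e.g. "engineer"
  tools?: string[]   // e.g. ["cursor", "vercel"]
}

// Merging new facts into an existing profile keeps it current
// without discarding what earlier sessions learned.
function mergeContext(existing: UserContext, update: UserContext): UserContext {
  return {
    ...existing,
    ...update,
    tools: [...new Set([...(existing.tools ?? []), ...(update.tools ?? [])])],
  }
}

const profile = mergeContext(
  { tone: "concise", tools: ["cursor"] },
  { role: "engineer", tools: ["vercel"] }
)
// profile: { tone: "concise", role: "engineer", tools: ["cursor", "vercel"] }
```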
A few lines of code
Drop in the SDK and get context-aware agents in minutes, not weeks of infra work.
Free
$0
up to 10,000 calls/month
Get started immediately
Builder
$49/mo
up to 250,000 calls/month
Growing products
Scale
$299/mo
up to 2,000,000 calls/month
Production scale
Enterprise
Custom
Unlimited
Contact us
threadline.ts
import { Threadline } from "threadline-sdk"
import OpenAI from "openai"

const tl = new Threadline({ apiKey: process.env.THREADLINE_KEY! })
const openai = new OpenAI()
// Before your AI call — inject user context into the prompt
const prompt = await tl.inject(userId, basePrompt)
const response = await openai.chat.completions.create({
model: "gpt-4o",
messages: [{ role: "system", content: prompt }]
})
// After your AI call — update the user's context
await tl.update({ userId, userMessage, agentResponse })
That's it. Your agent now remembers every user, across every conversation.
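Conceptually, "injecting" context means serializing the stored profile into the system prompt before the model call. A self-contained illustration of what that step might look like, not the actual `tl.inject` implementation, which fetches the profile by `userId` server-side:

```typescript
// Hypothetical: render a stored context object into a system-prompt prefix.
type Context = Record<string, string | string[]>

function injectContext(context: Context, basePrompt: string): string {
  const facts = Object.entries(context)
    .map(([key, value]) => `- ${key}: ${Array.isArray(value) ? value.join(", ") : value}`)
    .join("\n")
  return `Known about this user:\n${facts}\n\n${basePrompt}`
}

const prompt = injectContext(
  { tone: "concise", role: "engineer" },
  "You are a helpful coding assistant."
)
// The model now sees the user's preferences before the first message.
```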