
The Problem

LLMs are stateless. Every conversation starts from scratch. Your AI assistant doesn’t remember:
  • User preferences (“I prefer concise answers”)
  • Past context (“We discussed this project last week”)
  • Personal details (“I’m a Python developer working at a startup”)
This makes AI interactions feel impersonal and repetitive.

The Solution

GetProfile is a drop-in LLM proxy that automatically:
  • Captures conversations between users and your AI
  • Extracts structured traits and memories using LLM analysis
  • Injects relevant context into every prompt
  • Updates user profiles continuously in the background
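The "Injects" step can be sketched as a pure function that turns stored traits into a system message prepended to the outgoing prompt. The `Trait`/`Profile` shapes, the `injectContext` name, and the prompt wording below are illustrative assumptions, not GetProfile's actual internals:

```typescript
// Illustrative sketch of the "Injects" step: stored traits become a system
// message prepended to the outgoing prompt. The shapes and prompt wording
// here are assumptions for illustration, not GetProfile internals.
type Trait = { value: string; confidence: number };
type Profile = Record<string, Trait>;
type Message = { role: "system" | "user" | "assistant"; content: string };

function injectContext(profile: Profile, messages: Message[]): Message[] {
  // Flatten the profile into a compact "key: value" summary.
  const facts = Object.entries(profile)
    .map(([key, trait]) => `${key}: ${trait.value}`)
    .join("; ");
  const system: Message = {
    role: "system",
    content: `Known user context: ${facts}`,
  };
  return [system, ...messages];
}

const example = injectContext(
  { name: { value: "Alex", confidence: 0.95 } },
  [{ role: "user", content: "How should I refactor this?" }]
);
// example[0] is the injected system message; the user turn is unchanged.
```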

Multiple Integration Options

  • LLM Proxy — change your OpenAI base URL for automatic memory injection
  • JavaScript SDK — programmatic access from Node.js/TypeScript
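This page doesn't document the SDK's API surface, so the following is a purely hypothetical sketch of what programmatic access could look like. The interface and method names (`getProfile`, `appendMessages`) are invented for illustration, with an in-memory stand-in so the call pattern runs without a server:

```typescript
// HYPOTHETICAL sketch: the real GetProfile SDK surface is not documented on
// this page, so these method names (getProfile, appendMessages) are invented
// for illustration only. Consult the SDK reference for the actual API.
interface ProfileClient {
  getProfile(userId: string): Promise<Record<string, unknown>>;
  appendMessages(
    userId: string,
    messages: { role: string; content: string }[]
  ): Promise<void>;
}

// In-memory stand-in so the call pattern is runnable without a server.
class InMemoryProfileClient implements ProfileClient {
  private store = new Map<string, { role: string; content: string }[]>();

  async getProfile(userId: string): Promise<Record<string, unknown>> {
    return { message_count: (this.store.get(userId) ?? []).length };
  }

  async appendMessages(
    userId: string,
    messages: { role: string; content: string }[]
  ): Promise<void> {
    this.store.set(userId, (this.store.get(userId) ?? []).concat(messages));
  }
}
```

With the real SDK, the client would talk to your GetProfile instance instead of a local `Map`.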

Proxy Integration (Automatic)

Just change your OpenAI base URL. That’s it.
// Before: Stateless AI
const client = new OpenAI({ apiKey: "sk-..." });

// After: AI with memory
const client = new OpenAI({
  apiKey: "gp_...", // Your GetProfile API key
  baseURL: "https://api.yourserver.com/v1", // Or your self-hosted instance
  defaultHeaders: {
    "X-GetProfile-Id": userId, // Your app's user ID
    "X-Upstream-Key": "sk-...", // Your OpenAI key
  },
});

// Same API, now with persistent memory
const response = await client.chat.completions.create({
  model: "gpt-5",
  messages: [{ role: "user", content: "How should I refactor this?" }],
});

Key Features

Unlike generic memory solutions that store blobs of text, GetProfile extracts typed traits with confidence scores:
{
  "name": { "value": "Alex", "confidence": 0.95 },
  "expertise_level": { "value": "advanced", "confidence": 0.8 },
  "communication_style": { "value": "technical", "confidence": 0.7 }
}
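The profile shape above maps naturally onto TypeScript types. As an illustration of why confidence scores matter, here is a small helper that keeps only traits above a threshold; the 0.75 cutoff is an arbitrary example value, not a GetProfile default:

```typescript
// The trait/confidence shape shown above, as TypeScript types, plus a helper
// that keeps only traits the extractor is reasonably sure about. The 0.75
// threshold is an arbitrary example value, not a GetProfile default.
type ExtractedTrait = { value: string; confidence: number };
type TraitProfile = Record<string, ExtractedTrait>;

function confidentTraits(profile: TraitProfile, threshold = 0.75): TraitProfile {
  return Object.fromEntries(
    Object.entries(profile).filter(([, trait]) => trait.confidence >= threshold)
  );
}

const traits: TraitProfile = {
  name: { value: "Alex", confidence: 0.95 },
  expertise_level: { value: "advanced", confidence: 0.8 },
  communication_style: { value: "technical", confidence: 0.7 },
};
// confidentTraits(traits) keeps name and expertise_level; the 0.7
// communication_style entry falls below the 0.75 cutoff.
```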
  • OpenAI-compatible proxy — works with any OpenAI SDK
  • No code changes — just update your base URL
  • Streaming support — full SSE streaming passthrough
  • Configurable — define what matters for your app with JSON configuration files
  • Apache 2.0 licensed — use it anywhere
  • Self-host with Docker — your data stays with you
  • Transparent — audit the code, understand what’s happening
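Streaming passthrough means the proxy forwards OpenAI's server-sent-event chunks unchanged. As a sketch of what travels over the wire, here is a minimal parser for the standard `data: {...}\n\n` SSE framing, including OpenAI's `[DONE]` terminator; this illustrates the format, not GetProfile's actual proxy code:

```typescript
// Minimal parser for the OpenAI-style "data: {...}\n\n" SSE stream format.
// An illustrative sketch of what streaming passthrough forwards, not
// GetProfile's actual proxy code.
function parseSSE(buffer: string): string[] {
  return buffer
    .split("\n\n")                                // events end with a blank line
    .flatMap((event) => event.split("\n"))
    .filter((line) => line.startsWith("data: "))
    .map((line) => line.slice("data: ".length))
    .filter((payload) => payload !== "[DONE]");   // OpenAI's end-of-stream marker
}

const raw =
  'data: {"choices":[{"delta":{"content":"Hel"}}]}\n\n' +
  'data: {"choices":[{"delta":{"content":"lo"}}]}\n\n' +
  'data: [DONE]\n\n';
const chunks = parseSSE(raw).map((p) => JSON.parse(p).choices[0].delta.content);
// chunks.join("") reassembles the streamed text "Hello".
```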

What’s Next?