The Problem
LLMs are stateless. Every conversation starts from scratch. Your AI assistant doesn’t remember:
- User preferences (“I prefer concise answers”)
- Past context (“We discussed this project last week”)
- Personal details (“I’m a Python developer working at a startup”)
The Solution
GetProfile is a drop-in LLM proxy that automatically:
- Captures conversations between users and your AI
- Extracts structured traits and memories using LLM analysis
- Injects relevant context into every prompt
- Updates user profiles continuously in the background
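The capture → extract → inject loop can be sketched as below. All function names and the trait shape are assumptions for illustration, not GetProfile's actual API:

```python
# Illustrative sketch of the capture -> extract -> inject loop.
# Every name and data shape here is an assumption, not GetProfile's real API.

def extract_traits(conversation: list[dict]) -> dict:
    """Stand-in for the LLM-based trait extraction step."""
    traits = {}
    for msg in conversation:
        if msg["role"] == "user" and "I prefer" in msg["content"]:
            traits["preference"] = msg["content"]
    return traits

def inject_context(profile: dict, prompt: str) -> list[dict]:
    """Prepend known traits to the next prompt as a system message."""
    memory = "; ".join(f"{k}: {v}" for k, v in profile.items())
    return [
        {"role": "system", "content": f"Known about user: {memory}"},
        {"role": "user", "content": prompt},
    ]

# Captured conversation feeds the profile; the profile feeds the next prompt.
profile = extract_traits([{"role": "user", "content": "I prefer concise answers"}])
messages = inject_context(profile, "Summarize this article.")
```

In the real proxy, extraction happens in the background with an LLM rather than string matching; the sketch only shows the data flow.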
Multiple Integration Options
- LLM Proxy: change your OpenAI base URL for automatic memory injection
- JavaScript SDK: programmatic access from Node.js/TypeScript
Proxy Integration (Automatic)
Just change your OpenAI base URL. That’s it.

Key Features
Structured User Profiles
Unlike generic memory solutions that store blobs of text, GetProfile extracts typed traits with confidence scores:
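For instance, a profile entry might look like the following sketch (field names are illustrative, not GetProfile's documented schema):

```python
from dataclasses import dataclass

@dataclass
class Trait:
    name: str
    value: str
    kind: str          # e.g. "preference", "skill", "fact" (assumed categories)
    confidence: float  # 0.0-1.0, how sure the extractor is

profile = [
    Trait("response_style", "concise", "preference", 0.92),
    Trait("occupation", "Python developer at a startup", "fact", 0.87),
]

# Because traits are typed and scored, low-confidence ones
# can be filtered out before injection.
trusted = [t for t in profile if t.confidence >= 0.9]
```

Typed traits with confidence scores make the profile queryable and filterable, which a raw text blob is not.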
Zero-Friction Integration
- OpenAI-compatible proxy — works with any OpenAI SDK
- No code changes — just update your base URL
- Streaming support — full SSE streaming passthrough
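Concretely, a request that previously targeted OpenAI just points at your GetProfile deployment instead; headers and payload stay OpenAI-shaped. The URL below is a placeholder, and passing the user via the standard `user` field is an assumption:

```python
import json

# Hypothetical proxy endpoint; substitute your own GetProfile deployment URL.
GETPROFILE_BASE_URL = "https://getprofile.example.com/v1"

def build_chat_request(base_url: str, api_key: str, user_id: str, messages: list) -> dict:
    """Build an OpenAI-compatible chat completion request.

    The only change from calling OpenAI directly is the base URL;
    the headers and body are the usual OpenAI wire format.
    """
    return {
        "url": f"{base_url}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": "gpt-4o-mini",
            "user": user_id,  # assumed: lets the proxy attribute memory to a user
            "messages": messages,
        }),
    }

req = build_chat_request(
    GETPROFILE_BASE_URL, "sk-...", "user-123",
    [{"role": "user", "content": "Hi!"}],
)
```

Because the wire format is unchanged, any OpenAI SDK that accepts a custom base URL works the same way.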
Customizable Trait Schema
Define what matters for your app with JSON configuration files.
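A trait-schema file might look something like this sketch (the keys shown are illustrative, not the documented configuration format):

```json
{
  "traits": {
    "response_style": { "type": "string", "description": "How the user likes answers phrased" },
    "occupation":     { "type": "string", "description": "The user's job or role" },
    "expertise":      { "type": "array",  "description": "Technologies the user works with" }
  }
}
```

The extractor then only looks for traits your application actually cares about.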
Open Source & Self-Hostable
- Apache 2.0 licensed — use it anywhere
- Self-host with Docker — your data stays with you
- Transparent — audit the code, understand what’s happening
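A self-hosted setup could be as small as a single container. The image name, port, and environment variables below are assumptions; check the repository for the real values:

```yaml
# Hypothetical compose file; consult the GetProfile repo for actual image/env names.
services:
  getprofile:
    image: getprofile/getprofile:latest   # assumed image name
    ports:
      - "8080:8080"                       # assumed proxy port
    environment:
      OPENAI_API_KEY: ${OPENAI_API_KEY}   # credentials for the upstream LLM
```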