Add perfect memory to ChatGPT in 2 minutes
Sign in to the Haiven dashboard to generate your unique API key. This connects your Custom GPT to your memory database.
Dashboard → Settings → API Keys → "Generate New Key"
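If you want to sanity-check the key before wiring up the GPT, you can call the memory endpoint directly. Below is a minimal sketch in Python using the requests library; the URL, path, header name, and request fields come from the action schema in Step 2, the key value is a placeholder, and the response is printed as-is because its exact shape isn't documented in this guide:

import requests

HAIVEN_API_KEY = "YOUR_HAIVEN_API_KEY"  # placeholder: paste the key from your dashboard

# Same request the Custom GPT action will send (see the schema in Step 2).
response = requests.post(
    "https://api.safehaiven.com/api/memory/query",
    headers={"X-API-Key": HAIVEN_API_KEY},
    json={"query": "What is my role?", "limit": 5},
    timeout=10,
)
response.raise_for_status()
print(response.json())  # printed raw; the response format isn't specified in this guide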
Go to ChatGPT and create a new Custom GPT; this is where you'll wire in the memory integration. You'll need a ChatGPT Plus or Team plan.
In the GPT Builder, go to "Configure" → "Actions" → "Create new action" and paste this schema:
{ "openapi": "3.1.0", "info": { "title": "Haiven Memory API", "version": "1.0.0" }, "servers": [ { "url": "https://api.safehaiven.com" } ], "paths": { "/api/memory/query": { "post": { "operationId": "queryMemories", "summary": "Search user memories", "requestBody": { "required": true, "content": { "application/json": { "schema": { "type": "object", "properties": { "query": { "type": "string" }, "limit": { "type": "integer", "default": 5 } } } } } }, "responses": { "200": { "description": "Relevant memories" } } } } }, "components": { "securitySchemes": { "ApiKeyAuth": { "type": "apiKey", "in": "header", "name": "X-API-Key" } } }, "security": [ { "ApiKeyAuth": [] } ] }
In "Instructions", tell the GPT how to use memories. Copy and paste this:
You are ChatGPT with perfect memory, powered by Haiven. Before answering any user question, ALWAYS:
1. Call queryMemories with the user's question
2. Review returned memories for relevant context
3. Integrate that context naturally into your response

Never mention that you're querying memories - just use them seamlessly.

Examples of what to remember:
- User's role, company, projects
- Their preferences and working style
- Past decisions and reasoning
- Technical stack and tools they use
- Goals and challenges they've shared

Always prioritize recent memories over old ones when there's a conflict.
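For a sense of what these instructions make the GPT do on every turn, here is an illustrative sketch of the same three-step flow in Python. The Custom GPT runs this automatically through the action; the query_memories helper is hypothetical, and the code makes no assumption about the structure of the returned memories beyond printing them:

import requests

HAIVEN_API_KEY = "YOUR_HAIVEN_API_KEY"  # placeholder

def query_memories(question, limit=5):
    # Step 1: search the memory database for context relevant to the question.
    resp = requests.post(
        "https://api.safehaiven.com/api/memory/query",
        headers={"X-API-Key": HAIVEN_API_KEY},
        json={"query": question, "limit": limit},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

question = "What's my role?"
memories = query_memories(question)

# Steps 2-3: the GPT reviews the returned memories and weaves them into its answer.
# Here we just render them next to the question to show what it has to work with.
context = "\n".join(str(m) for m in memories) if isinstance(memories, list) else str(memories)
print("Question:", question)
print("Context the GPT would draw on:\n", context)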
Test your Custom GPT by asking it something personal. If it recalls context from previous conversations, you're done! Click "Save" and choose "Only me" for privacy.
You: "What's my role?"
GPT: "You're a product manager at [YourCompany], working on..."