Local Setup
Running C.O.R.E Locally
Prerequisites
- Docker
- OpenAI API Key
Get started
Copy Environment Variables
Copy the example environment file to .env:
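For example, assuming the repository ships its example file as .env.example:

```bash
# Copy the example environment file, then open it and add your OpenAI API key
cp .env.example .env
```

The exact variable name for the OpenAI API key is listed in the example file itself.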
Start the Application
Use Docker Compose to start all required services:
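A minimal sketch, assuming the Compose file sits at the repository root:

```bash
# Start all services in the background (use docker-compose if you have the legacy CLI)
docker compose up -d
```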
Access the App
Once the containers are running, open your browser and go to http://localhost:3000.
Login with Magic Link
- Choose the “Magic Link” login option.
- Enter your email.
- Copy the magic link from terminal logs and open it in your browser.
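If the containers are running in the background, the logs can be followed with Docker Compose; which service prints the link depends on the Compose setup:

```bash
# Follow the logs from all services and look for the magic link
docker compose logs -f
```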
Create Your Private Space & Ingest Data
- In the dashboard, go to the ingest section.
- Type a message, e.g., “I love playing badminton”, and click “Add”.
- Your memory is queued for processing; you can monitor its status in the server logs.
- Once processing is complete, nodes will be added to your private knowledge graph and visible in the dashboard.
- You can later choose to connect this memory to other tools or keep it private.
Search Your Memory
Use the dashboard’s search feature to query your ingested data within your private space.
Connecting to the API
You can also interact with C.O.R.E. programmatically via its APIs.
Generate an API Key
In the dashboard, navigate to the API section and generate a new API key.
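For the examples below, it helps to keep the key in an environment variable; CORE_API_KEY is just a name used in this guide, not something the app requires:

```bash
export CORE_API_KEY="<your-api-key>"
```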
API Endpoints
Use your API key to authenticate requests to the following endpoints:
- Ingest API: POST /ingest
- Search API: POST /search
See below for example request bodies and details.
Ingest API
- Endpoint: /ingest
- Method: POST
- Authentication: Bearer token (API key)
- Body Example: see the request sketch after this list.
- Behavior:
- Each ingestion is queued per user for processing in their private space.
- The system automatically creates and links graph nodes.
- You can monitor the status in the logs or dashboard.
- You can later connect this memory to other tools as you wish.
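A request sketch using curl, assuming the API is served from the same local address as the app; the JSON field names are illustrative assumptions, so check the API reference for the exact schema:

```bash
# Hypothetical body - "episodeBody" and "source" are illustrative field names
curl -X POST http://localhost:3000/ingest \
  -H "Authorization: Bearer $CORE_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"episodeBody": "I love playing badminton", "source": "my-app"}'
```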
Search API
- Endpoint: /search
- Method: POST
- Authentication: Bearer token (API key)
- Body Example: see the request sketch after this list.
- Behavior: returns relevant text matches scoped to your private memory space.
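A matching sketch for search, with the same assumptions about the local base URL and an illustrative query field:

```bash
# Hypothetical body - "query" is an illustrative field name
curl -X POST http://localhost:3000/search \
  -H "Authorization: Bearer $CORE_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"query": "badminton"}'
```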