You can self-host CORE on your own infrastructure using Docker. The following instructions use Docker Compose to spin up a CORE instance. Make sure to read the self-hosting overview first.
Warning: As self-hosted deployments tend to have unique requirements and configurations, we don’t provide specific advice for securing your deployment, scaling up, or improving reliability. This guide alone is unlikely to result in a production-ready deployment; security, scaling, and reliability concerns are not fully addressed here. Should the burden ever get too much, we’d be happy to see you on CORE Cloud, where we handle these concerns for you.
Requirements
These are the minimum requirements for running CORE.
Prerequisites
To run CORE, you will need:
- Docker 20.10.0+
- Docker Compose 2.20.0+
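Before deploying, it can help to confirm that the installed versions meet these minimums. A small sketch, assuming a GNU userland where `sort -V` is available; `version_at_least` is a hypothetical helper, not part of CORE:

```shell
#!/bin/sh
# Returns success if version $1 is at least version $2,
# using GNU sort's version-aware ordering.
version_at_least() {
  [ "$(printf '%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# Example: check reported versions against the minimums above.
# In practice, extract these from `docker --version` and
# `docker compose version` output.
version_at_least "24.0.7" "20.10.0" && echo "Docker version OK"
version_at_least "2.23.1" "2.20.0"  && echo "Docker Compose version OK"
```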
System Requirements
- 4+ vCPU
- 8+ GB RAM
- 20+ GB Storage
Deployment Options
CORE offers multiple deployment approaches depending on your needs.
Quick Deploy with Railway
For a one-click deployment experience, use Railway.
Manual Docker Deployment
Prerequisites: Before starting any deployment, ensure you have your `OPENAI_API_KEY` ready. This is required for AI functionality in CORE. You must add your `OPENAI_API_KEY` to the `core/hosting/docker/.env` file before starting the services.
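As a sketch, the key can be appended to that env file from a shell. The path comes from this guide; `sk-...` is a placeholder, not a working key:

```shell
# Append your OpenAI API key to the Docker env file referenced above.
# Replace sk-... with your real key before starting the services.
echo 'OPENAI_API_KEY=sk-...' >> core/hosting/docker/.env
```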
Combined Setup
For self-deployment:
1. Clone the core repository
2. Start the services
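The two steps above can be sketched as follows. The repository URL and the compose file location are assumptions; substitute the values from the official CORE documentation:

```shell
# 1. Clone the CORE repository (URL assumed; use the actual repo URL).
git clone https://github.com/RedPlanetHQ/core.git
cd core

# 2. Start the services in the background with Docker Compose
#    (compose file path assumed; adjust to your checkout's layout).
docker compose -f hosting/docker/docker-compose.yml up -d

# Check that the containers came up.
docker compose -f hosting/docker/docker-compose.yml ps
```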
Next Steps
Once deployed, you can:
- Configure your AI providers (OpenAI, Anthropic, etc.)
- Set up integrations (Slack, GitHub, Gmail)
- Start building your memory graph
- Explore the CORE API and SDK
