Quick Start
Get IngestIQ running in 5 minutes
Prerequisites
Before you begin, ensure you have the following installed:
- Node.js 18+
- Docker & Docker Compose
You'll also need API keys for:
- OpenAI - For embeddings
- Google AI - For document parsing
Step 1: Clone the Repository
git clone https://github.com/avesta-hq/ingestiq-backend.git
cd ingestiq-backend
Step 2: Configure Environment
Copy the example environment file:
cp .env.example .env
Edit .env and set your API keys:
# Required API Keys
OPENAI_API_KEY=sk-your-openai-key
GOOGLE_API_KEY=your-google-api-key
# JWT Secrets (generate secure random strings)
JWT_SECRET=your-super-secret-jwt-key
JWT_REFRESH_SECRET=your-super-secret-refresh-key
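One way to produce the secure random strings mentioned above is with openssl; this is just a sketch, and any cryptographically secure generator works:

```shell
# Generate two independent 32-byte secrets as hex strings,
# printed as ready-to-paste .env lines.
echo "JWT_SECRET=$(openssl rand -hex 32)"
echo "JWT_REFRESH_SECRET=$(openssl rand -hex 32)"
```

Hex avoids characters that need quoting in .env files; base64 (`openssl rand -base64 32`) works too if you quote the value.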
Never commit your .env file to version control. It contains sensitive API keys.
Step 3: Start Services with Docker
Launch all services (PostgreSQL, Redis, NATS, MinIO):
docker compose up -d
This starts:
- PostgreSQL 16 with pgvector extension (ports 5432, 5433)
- Redis for job queuing (port 6379)
- NATS with JetStream for messaging (port 4222)
- MinIO for S3-compatible storage (port 9000)
- Gotenberg for document conversion (port 3001)
Step 4: Install Dependencies & Run Migrations
# Install Node.js dependencies
npm install
# Run database migrations
npm run db:migrate
# Seed initial data (AI models, connector types)
npm run db:seed
Step 5: Start the Server
npm run dev
IngestIQ is now running at http://localhost:3000
Step 6: Verify Installation
Check the health endpoint:
curl http://localhost:3000/api/health
Expected response:
{
  "status": "ok",
  "timestamp": "2024-01-28T12:00:00.000Z"
}
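If you want to verify the endpoint from a script rather than by eye, here is a minimal sketch; check_health is a hypothetical helper, and it assumes the response shape shown above:

```shell
# check_health: exit 0 only when the endpoint responds with the "ok" status field.
# -f makes curl fail on HTTP error codes; -s suppresses progress output.
check_health() {
  curl -fs "$1" | grep -q '"status":"ok"'
}

# Usage against the dev server from Step 5:
# check_health http://localhost:3000/api/health && echo "IngestIQ is up"
```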
Step 7: Explore the API
Open the Swagger documentation in your browser:
http://localhost:3000/api/docs
What's Next?
- Create a Knowledge Base - organize your documents into searchable collections
- Set Up a Pipeline - configure automated data ingestion workflows
- Upload Documents - start ingesting PDFs, CSVs, and more
- Search Your Data - query your knowledge base with semantic search
Troubleshooting
If services fail to start, ensure Docker is running and has enough memory allocated (minimum 4GB recommended), then check the logs:
docker compose logs
If migrations fail with a connection error, ensure PostgreSQL is fully started before running them:
docker compose logs vectordb
# Wait for "database system is ready to accept connections"
npm run db:migrate
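If migrations keep racing the database startup, a small retry wrapper can automate the wait. This is a sketch: retry is a hypothetical helper, the vectordb service name comes from the logs command above, and the -U postgres user is an assumption about your compose configuration (pg_isready ships with the Postgres image):

```shell
# retry: run a command up to N times, sleeping 1s between attempts.
retry() {
  attempts=$1; shift
  i=1
  while [ "$i" -le "$attempts" ]; do
    "$@" && return 0                       # success: stop retrying
    [ "$i" -lt "$attempts" ] && sleep 1    # wait before the next attempt
    i=$((i + 1))
  done
  return 1
}

# Usage: wait up to ~30s for Postgres, then migrate.
# retry 30 docker compose exec -T vectordb pg_isready -U postgres && npm run db:migrate
```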
If embedding or document-parsing requests fail, verify your API keys are correct and have the proper permissions:
- OpenAI: Ensure you have credits and API access enabled
- Google: Enable the Generative AI API in your Google Cloud project
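To test the OpenAI key independently of IngestIQ, you can hit the public models endpoint. A sketch: check_openai_key is a hypothetical helper, and the positive check requires network access:

```shell
# check_openai_key: exit 0 if the key can list models, nonzero if the key is
# missing or rejected. Uses the standard OpenAI REST endpoint.
check_openai_key() {
  if [ -z "${OPENAI_API_KEY:-}" ]; then
    echo "OPENAI_API_KEY is not set" >&2
    return 1
  fi
  curl -fs https://api.openai.com/v1/models \
    -H "Authorization: Bearer $OPENAI_API_KEY" >/dev/null
}

# Usage:
# check_openai_key && echo "OpenAI key looks valid"
```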