Self-Hosting — Coming Q2 2026
The open source repository is not yet public. These instructions are a preview of what will be available after our pilot program. Want to use Aelira now? Join the pilot program for immediate cloud-hosted access.
Quick Start Guide
Get Aelira running locally and scan your first document in under 5 minutes.
What You'll Set Up
This quick start guide walks you through setting up a complete self-hosted Aelira accessibility compliance platform on your local machine using Docker Compose. By the end of this guide, you'll have:
- PostgreSQL database - Stores scan results, user accounts, and compliance reports
- Redis cache - High-performance caching for faster API responses
- Ollama AI server - Self-hosted language models (Llama 3.2, Qwen 2.5 Coder, Moondream2)
- FastAPI backend - REST API for document processing and accessibility scanning
- React dashboard - Web UI for uploading files and viewing compliance reports
All AI processing runs completely on your infrastructure - no data ever leaves your network. Perfect for universities that need FERPA compliance or organizations with strict data privacy requirements.
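The five services above map to a Docker Compose stack along these lines. This is an illustrative sketch only: the actual docker-compose.yml ships with the repository, and the service names, images, and build paths shown here are assumptions.

```yaml
# Sketch of the stack layout -- names and images are assumptions
# until the repository is public
services:
  postgres:
    image: postgres:16
    ports: ["5432:5432"]
  redis:
    image: redis:7
    ports: ["6379:6379"]
  ollama:
    image: ollama/ollama
    ports: ["11434:11434"]
  backend:
    build: ./backend
    ports: ["8000:8000"]
    depends_on: [postgres, redis, ollama]
  dashboard:
    build: ./dashboard
    ports: ["5173:5173"]
    depends_on: [backend]
```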
Why Self-Host Aelira?
Complete Data Privacy
Your educational materials, student documents, and course content never leave your servers. Unlike cloud-hosted solutions that send data to OpenAI or Google, Aelira processes everything locally using self-hosted Ollama models.
FERPA Compliance
Federal law requires universities to protect student education records. Self-hosting keeps student data off third-party AI services entirely, which supports your FERPA obligations, and audit logs can demonstrate that data never left your network.
No Usage Limits
Process unlimited documents without per-page fees or API rate limits. Cloud services charge per scan - self-hosting means your only cost is infrastructure. Process 10 files or 10,000 files for the same price.
Full Customization
Open source (MIT license) means you can modify anything. Adjust WCAG thresholds, add custom checks, integrate with your LMS, or white-label the entire platform. Your infrastructure, your rules.
Prerequisites
- Docker and Docker Compose installed
- Git for cloning the repository
- 8GB RAM minimum (16GB recommended for AI models)
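You can confirm the tooling is in place before cloning. Nothing in this small check is Aelira-specific; it just reports whether each required command is on your PATH.

```shell
# Report whether each required tool is installed
check() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found"
  else
    echo "$1: MISSING"
  fi
}

check docker
check git
```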
Clone the Repository
# Repository not yet public (coming Q2 2026)
git clone https://github.com/aelira-ai/aelira.git
# Navigate to the backend directory
cd aelira/backend
Configure Environment
# Copy the example environment file
cp .env.example .env
# Edit .env with your settings (optional for local dev)
# The defaults work for local development
For production deployments, see the Self-Hosting Guide for detailed configuration options.
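Until the repository is public, the real variable names live in .env.example. A typical .env for a stack like this might look like the following, where every key is an illustrative assumption rather than Aelira's actual configuration:

```shell
# Illustrative values only -- these key names are assumptions;
# check .env.example for the real ones
DATABASE_URL=postgresql://aelira:aelira@localhost:5432/aelira
REDIS_URL=redis://localhost:6379/0
OLLAMA_HOST=http://localhost:11434
```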
Start the Services
# Start all services with Docker Compose
docker-compose up -d
# This starts:
# - PostgreSQL database (port 5432)
# - Redis cache (port 6379)
# - Ollama AI server (port 11434)
# - Backend API (port 8000)
# - Dashboard UI (port 5173)
Note: The first startup downloads AI models (~4GB). This may take 5-10 minutes depending on your connection.
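Rather than refreshing the browser by hand during that first startup, a small loop can wait until a service answers. This is a sketch that assumes only the port mappings listed above:

```shell
# Poll a URL until it responds or the retry budget runs out
wait_for() {
  url=$1
  tries=${2:-30}
  i=0
  while [ "$i" -lt "$tries" ]; do
    if curl -sf -o /dev/null "$url"; then
      echo "up: $url"
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  echo "timed out: $url"
  return 1
}

# e.g. wait_for http://localhost:8000/docs
```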
Access the Dashboard
# Open your browser to:
http://localhost:5173
# API is available at:
http://localhost:8000
# API documentation:
http://localhost:8000/docs
Run Your First Scan
From the dashboard, you can:
- Upload documents - PDFs, PowerPoint, Word, Excel files
- Scan websites - Enter a URL to scan for WCAG issues
- Bulk upload - Process multiple files at once
- View issues - See detailed accessibility violations with fixes
Prefer the Command Line?
Use the CLI for bulk processing and automation:
# Scan a single file
curl -X POST http://localhost:8000/api/v1/scan/document \
-F "[email protected]"
# Scan a website
curl -X POST http://localhost:8000/api/v1/scan/website \
-H "Content-Type: application/json" \
-d '{"url": "https://example.com"}'
Common Issues
Port already in use
If port 5432 or 8000 is already in use, edit docker-compose.yml to change the port mappings.
# Check what's using port 8000
lsof -i :8000
# Kill the process if needed
kill -9 <PID>
Ollama models not loading
Check that Ollama is running: docker logs aelira-ollama. Models download on first use and require ~4GB disk space.
# Check Ollama container status
docker ps | grep ollama
# View Ollama logs
docker logs -f aelira-ollama
# Manually pull models
docker exec aelira-ollama ollama pull llama3.2:3b
Out of memory
AI models require significant RAM. Llama 3.2 3B needs ~4GB, Qwen 2.5 Coder 7B needs ~8GB. Increase Docker memory limits or use smaller models. See memory configuration.
# Check Docker memory usage
docker stats
# Increase Docker Desktop memory:
# Settings → Resources → Memory → 16GB
Database connection errors
PostgreSQL may take 10-20 seconds to fully start. Wait for "database system is ready to accept connections" in logs.
# Check PostgreSQL logs
docker logs aelira-postgres
# Wait for ready message
docker logs -f aelira-postgres | grep "ready to accept"
Dashboard not loading
Frontend build may take 30-60 seconds on first startup. Check Vite dev server logs for compilation progress.
# Check dashboard container logs
docker logs -f aelira-dashboard
# Rebuild if needed
docker-compose restart dashboard
What You Can Do Now
1. Process Your First Document
Upload a PDF, PowerPoint, or Word document from your course materials. Aelira will:
- Run OCR on scanned pages (if needed)
- Check for proper structure tags (headings, lists, tables)
- Detect images missing alt text and generate descriptions with AI
- Flag color contrast violations
- Generate a compliance report with WCAG 2.1 Level AA violations
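The same results come back as JSON from the API, so a report saved from a scan can be skimmed from the command line. The field names below are an assumption for illustration only; the real schema is documented in the API docs at /docs.

```shell
# Hypothetical report shape -- the "violations"/"rule" field
# names are assumptions, not a documented schema
report='{"violations":[{"rule":"image-alt","level":"AA"},{"rule":"color-contrast","level":"AA"}]}'

# Pull out just the rule identifiers
printf '%s\n' "$report" | grep -o '"rule":"[^"]*"'
```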
2. Scan a Course Website
Enter your Canvas course URL or department website. Aelira uses axe-core and Playwright to:
- Crawl all pages (respecting robots.txt)
- Check for keyboard navigation issues
- Detect missing ARIA labels
- Test focus order and screen reader compatibility
- Generate working code fixes for HTML/CSS violations
3. Bulk Process a Directory
Use the CLI or API to process entire course directories:
# Process all PDFs in a directory
curl -X POST http://localhost:8000/api/v1/batch/scan \
-F "[email protected]" \
-F "[email protected]" \
-F "[email protected]"
# Returns compliance reports for all files
4. Integrate with Your LMS
Connect Aelira to Canvas, Blackboard, or Google Workspace to automatically scan course files as faculty upload them. See integrations documentation.