Install Ollama and pull your model(s)
CVEasy AI uses the CVEasy AI Engine (powered by Ollama) for local AI inference. Install it, then pull the recommended model for your tier:
# Install Ollama (macOS)
brew install ollama
# Start the Ollama server
ollama serve
# Single optimized model (~5 GB)
ollama pull cveasy-lite
# Primary engine (~10 GB)
ollama pull cveasy-pro
# Code engine (~5 GB)
ollama pull cveasy-coder
CVEasy AI auto-detects available models on startup. Pro users can configure model routing in Settings to assign specific models to remediation, chat, reports, code generation, and analysis tasks.
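For example, a Pro routing setup in Settings might look like the following (the task names follow the list above and the model names are the ones pulled in this step; the exact field labels are illustrative):

```
# In Settings → Model Routing (Pro)
Remediation: cveasy-pro
Chat: cveasy-lite
Reports: cveasy-pro
Code generation: cveasy-coder
Analysis: cveasy-pro
```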
Download and run CVEasy AI
CVEasy AI is distributed as a release tarball. Download the latest release for your platform, then extract it and start the server:
# Extract your installation package
tar -xzf cveasy-ai-*.tar.gz
cd cveasy-ai
# Install dependencies
bun install
# Start the server
bun run start
CVEasy AI starts on http://localhost:3001. Open it in your browser.
Configure your company profile
Open Settings and set your industry vertical and compliance frameworks. This calibrates the TRIS score to your environment. A healthcare org and a retail org have different patch priorities for the same CVE.
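Conceptually, vertical calibration means the same base severity gets weighted by how exposed that asset class is in your industry. The sketch below is purely illustrative — the weights, asset classes, and formula are invented for this example and are not CVEasy AI's actual TRIS computation:

```typescript
// Hypothetical illustration: the same CVE ranks differently per vertical.
type Vertical = "healthcare" | "retail";

// Invented exposure weights by industry and asset class.
const exposureWeight: Record<Vertical, Record<string, number>> = {
  healthcare: { "medical-device": 2.0, "pos-terminal": 0.5 },
  retail:     { "medical-device": 0.2, "pos-terminal": 2.0 },
};

function priority(cvss: number, assetClass: string, vertical: Vertical): number {
  // Unknown asset classes fall back to a neutral weight of 1.0.
  return cvss * (exposureWeight[vertical][assetClass] ?? 1.0);
}

// CVSS 7.0 on a POS terminal: urgent for retail, much less so for healthcare.
console.log(priority(7.0, "pos-terminal", "retail"));     // 14
console.log(priority(7.0, "pos-terminal", "healthcare")); // 3.5
```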
Ingest your first CVEs
There are several ways to get vulnerability data into CVEasy AI. The quickest is to search by CVE ID (e.g. CVE-2024-1234) or keyword; CVEasy AI pulls live data from NIST NVD and enriches it automatically.
Review your triage queue
Open the CVE Triage Queue. Every ingested CVE has been automatically scored and ranked. Findings are sorted by TRIS score. KEV-listed CVEs are pinned at the top regardless of CVSS.
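The ranking described above can be sketched as follows. This is an illustrative model, not CVEasy AI's actual code; the field names (`tris`, `kev`) are assumptions:

```typescript
interface Finding {
  id: string;
  tris: number; // TRIS score; higher means more urgent (assumed scale)
  kev: boolean; // true if listed in CISA's Known Exploited Vulnerabilities catalog
}

// KEV-listed CVEs are pinned first; within each group, sort by descending TRIS.
function rankQueue(findings: Finding[]): Finding[] {
  return [...findings].sort((a, b) => {
    if (a.kev !== b.kev) return a.kev ? -1 : 1;
    return b.tris - a.tris;
  });
}

const queue = rankQueue([
  { id: "CVE-2024-0001", tris: 91, kev: false },
  { id: "CVE-2023-9999", tris: 40, kev: true },
  { id: "CVE-2024-0002", tris: 75, kev: false },
]);
console.log(queue.map(f => f.id));
// [ "CVE-2023-9999", "CVE-2024-0001", "CVE-2024-0002" ]
```

Note that the KEV entry outranks both higher-TRIS findings, matching the pinning behavior described above.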
Click any CVE to open the detail view and generate an AI remediation guide. The AI runs locally via Ollama. No data leaves your machine.
(Optional) Switch to a cloud AI provider
Ollama works great for most deployments. If you want higher-quality analysis or don't have hardware to run a local model, you can switch AI providers in Settings:
# In Settings → AI Provider
Provider: OpenAI / Azure OpenAI
API Key: (stored encrypted, never leaves your server)
Model: gpt-4o / etc.
API keys are encrypted at rest with AES-256-GCM. They are never exposed via the API or sent to any third party other than the AI provider you select.
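Encryption at rest with AES-256-GCM can be sketched as below. This is a minimal illustration using Node's built-in `crypto` module, not CVEasy AI's actual implementation; real key management would derive or load the key rather than hold a random one in memory:

```typescript
import { randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

// AES-256-GCM parameters: 32-byte key, 12-byte IV, 16-byte auth tag.
const key = randomBytes(32); // illustrative only; real code would load a managed key

function encrypt(plaintext: string): Buffer {
  const iv = randomBytes(12); // fresh IV per encryption
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  // Store IV and auth tag alongside the ciphertext so decryption is self-contained.
  return Buffer.concat([iv, cipher.getAuthTag(), ciphertext]);
}

function decrypt(blob: Buffer): string {
  const iv = blob.subarray(0, 12);
  const tag = blob.subarray(12, 28);
  const ciphertext = blob.subarray(28);
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag); // GCM verifies integrity; decryption throws on tampering
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString("utf8");
}

const blob = encrypt("sk-example-api-key");
console.log(decrypt(blob)); // sk-example-api-key
```

GCM's authentication tag means a tampered ciphertext fails to decrypt rather than silently producing garbage, which is why it is a common choice for secrets at rest.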