Production-ready. On-premises. No vendor lock-in.
RAGops.dev is currently under construction. It will offer practical, production-ready deployments of internal AI knowledge systems using Large Language Models (LLMs).
If this sounds interesting, you're already the right audience.
RAGops helps organizations ask questions of their own knowledge. The system answers from your internal sources — all inside your infrastructure.
I design and deliver validated reference deployments, not experimental demos.
Crawling and indexing of internal sources with clean chunking and embeddings
Production-ready setup optimized for your scale and query patterns
On-prem or private cloud deployment with performance tuning
A deployment your team can operate independently.
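The pipeline above (crawl → chunk → embed → retrieve) can be sketched in miniature. This is an illustrative toy, not the deployed stack: a bag-of-words counter stands in for a real embedding model, and a plain list stands in for a vector index.

```python
import math
import re
from collections import Counter

def chunk(text, max_words=50):
    """Split a document into word-bounded chunks of roughly equal size."""
    words = text.split()
    return [" ".join(words[i:i + max_words]) for i in range(0, len(words), max_words)]

def embed(text):
    """Toy bag-of-words 'embedding'; a real deployment uses a locally hosted model."""
    return Counter(re.findall(r"[a-z0-9']+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, chunks, top_k=1):
    """Return the top_k chunks most similar to the query."""
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:top_k]

# Two hypothetical internal documents, indexed as chunks.
docs = [
    "The VPN gateway requires certificate renewal every 90 days.",
    "Lunch menus are published on the intranet every Monday.",
]
chunks = [c for d in docs for c in chunk(d)]
print(retrieve("how often do VPN certificates expire", chunks))
```

In production the shape stays the same: `embed` calls a local embedding model, the chunks live in a vector index, and the retrieved chunks are passed to an on-prem LLM as context for the answer.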
This is a one-off, fixed-scope project with knowledge transfer at handover. Ongoing support can be arranged separately.
Sensitive data stays internal. No prompts sent to external providers.
No surprise pricing changes. No per-query costs.
No dependency on closed platforms. You own the future evolution.
You own: the data, the deployment, the future.
I'm an AI / DevOps engineer at Nokia, with a PhD in Computer Science and 15+ years of experience building and operating large-scale systems.
I've worked on telecom-grade cloud platforms, CI/CD and testing infrastructure, internal developer tooling, and AI systems applied to real production constraints.
I'm also the creator of Vivalerts, a news monitoring system that tracks and ranks emerging topics based on public discussion signals.
RAGops is my way of bringing calm, engineering-first AI adoption to teams that just want systems that work.
Pricing is project-based, not hourly. Projects typically range from €8,000 to €25,000 depending on scale and complexity. Details will be shared privately, based on scope.
The site will open soon. Stay tuned for updates.