Document Chat with Local AI Model

Fullstack Developer

Interactive chat application that allows users to ask questions about corporate documents and receive precise, source-based answers. All AI processing runs locally.

Technical Highlights

  • Local LLM (Ollama): No external API calls — full data sovereignty and GDPR compliance
  • RAG (Retrieval-Augmented Generation): documents are indexed as embeddings; the most relevant passages are retrieved and passed to the LLM as context
  • Blazor WASM: Reactive chat interface with real-time streaming
  • Fluxor state management for chat history and UI state
  • MediatR pipeline for document upload, embedding generation, and chat queries
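The RAG flow in the highlights above (index documents as embeddings, retrieve the best-matching passages, feed them to the LLM as context) can be sketched in a few lines. The project itself is built in C#/.NET with Ollama; this is a language-agnostic sketch in Python, and the bag-of-words `embed()` is a self-contained stand-in for a real embedding model, not the project's actual implementation.

```python
import math

# Toy embedding: bag-of-words counts over a tiny fixed vocabulary.
# In the real pipeline this would be a local embedding model served
# by Ollama; the stand-in keeps the example self-contained.
VOCAB = ["invoice", "vacation", "policy", "days", "payment", "due", "employee"]

def embed(text: str) -> list[float]:
    words = text.lower().split()
    return [float(words.count(w)) for w in VOCAB]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# 1) Index: split documents into chunks, store each chunk's embedding.
chunks = [
    "Employees are entitled to 30 vacation days per year.",
    "Invoice payment is due within 14 days of receipt.",
]
index = [(c, embed(c)) for c in chunks]

# 2) Retrieve: embed the question, rank chunks by cosine similarity.
question = "How many vacation days do employees get?"
q = embed(question)
ranked = sorted(index, key=lambda item: cosine(q, item[1]), reverse=True)
top = ranked[0][0]

# 3) Augment: the top-ranked passages become the LLM's prompt context,
#    so answers stay grounded in the source documents.
prompt = f"Answer using only this context:\n{top}\n\nQuestion: {question}"
```

Because retrieval happens before generation, the LLM only ever sees a handful of relevant passages, which is what makes source-based answers over large document sets feasible with a locally hosted model.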

Tech Stack

Blazor WASM · Ollama · Fluxor · MediatR · LLM · RAG

What this project demonstrates

Practical experience with LLMs, RAG architectures, vector databases, and privacy-compliant AI implementation.

Planning a similar project?

Get in touch
