DokuWiki AI Chat

Use large language models to interact with your wiki content.

Chat with your Wiki!

Our DokuWiki AI Chat plugin is a revolutionary tool that uses large language models (LLMs) to make your wiki more interactive. Ask questions as you would ask a colleague and get the information you need instantly! The plugin employs vector-based similarity search to locate the right wiki pages and provide relevant context to the LLM for answering users' queries.

The plugin is a fantastic tool to enhance the utility of your wiki in your organization.

Take the first step into the exciting world of Artificial Intelligence with ease.

AI Chat Plugin Documentation

What are Large Language Models?

A large language model is a type of artificial intelligence model that has been trained on a vast amount of text data. These models are designed to generate human-like text based on the input they receive. They are capable of a wide range of tasks, such as answering questions, writing essays, summarizing text, translating languages, and even generating code.

Our plugin supports models from OpenAI, Anthropic, Mistral and Voyage, offering a broad selection of chat models. The addition of new model providers and models is seamless, ensuring the plugin stays up-to-date with the evolving AI landscape. We’re also exploring the use of local or self-hosted models for scenarios where data sharing with external providers is not permissible.

Supported providers: OpenAI, Mistral, Anthropic, VoyageAI, Reka

Retrieval-Augmented Generation (RAG)

To provide the large language model with the proper context, i.e. to make it aware of your specific wiki content, the plugin needs to select the wiki pages most relevant to the given query. This is done using semantic vector search.

The plugin maintains an index of your wiki content in the form of vectors. By comparing the distance between the query vector and the vectors of the wiki pages, the plugin can identify the most relevant pages. It then retrieves the content of these pages and feeds it to the large language model for generating the response.
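The ranking step described above can be sketched with a toy cosine-similarity search. The page names and vectors below are made up for illustration; in the actual plugin, embeddings come from the configured embedding provider and typically have hundreds or thousands of dimensions.

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

# Hypothetical page embeddings (real ones are produced by an embedding model)
pages = {
    "wiki:install": [0.9, 0.1, 0.0],
    "wiki:backup":  [0.1, 0.8, 0.3],
    "wiki:faq":     [0.4, 0.4, 0.5],
}

# Hypothetical embedding of the user's question
query = [0.85, 0.15, 0.05]

# Rank pages by similarity to the query vector; the top hits are fed
# to the LLM as context.
ranked = sorted(pages.items(),
                key=lambda kv: cosine_similarity(query, kv[1]),
                reverse=True)
print(ranked[0][0])  # the most relevant page
```

The same idea scales to a whole wiki: only the top few pages by similarity are passed to the model, keeping the prompt within the context window.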

The plugin supports multiple vector storage backends, such as a local SQLite database, Pinecone, Chroma, and Qdrant.
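To illustrate the simplest of these backends, a local SQLite store can hold page embeddings and answer queries with a brute-force similarity scan. The schema, helper names, and vectors below are hypothetical and only sketch the idea; they are not the plugin's actual implementation.

```python
import json
import sqlite3
from math import sqrt

# Illustrative schema: one row per wiki page, vector stored as JSON text
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE embeddings (page TEXT PRIMARY KEY, vec TEXT)")

def add_page(page, vec):
    conn.execute("INSERT INTO embeddings VALUES (?, ?)", (page, json.dumps(vec)))

def search(query, k=1):
    """Brute-force top-k search over all stored vectors."""
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))
    rows = conn.execute("SELECT page, vec FROM embeddings").fetchall()
    scored = [(page, cos(query, json.loads(v))) for page, v in rows]
    return sorted(scored, key=lambda t: t[1], reverse=True)[:k]

add_page("wiki:install", [0.9, 0.1, 0.0])
add_page("wiki:backup", [0.1, 0.8, 0.3])
print(search([0.85, 0.15, 0.05]))
```

A linear scan like this is fine for a few thousand pages; dedicated backends such as Pinecone or Qdrant use approximate-nearest-neighbor indexes to stay fast at larger scales.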

How can we help?

Choosing the right model depends on various factors such as speed, reasoning abilities, supported languages, pricing, and the nature of your content. We recommend testing the content of your specific wiki and the typical questions users ask to determine the best model mix for your needs.

We're happy to help you with the setup and configuration of the plugin. We can also assist you in selecting the right model and vector storage for your wiki and running tests against different configurations. Contact us to get started!

Andreas Gohr

Through CosmoCode, you have direct access to DokuWiki's "inventor". You won't find more knowledge on DokuWiki's internals anywhere else!

Contact us. We're happy to support you.