This repository showcases a conversational retrieval system that can route between multiple data sources, choosing the one most relevant to a given question. This reduces distraction from off-topic documents, which is particularly helpful for smaller models, though you can swap in more powerful models to improve performance. The system is built on Cloudflare Workers AI, Cloudflare Vectorize DBs, and LangChain.js, with a Nuxt + Vue frontend. At a high level, it works by populating vectorstores with data, routing each incoming question to the appropriate vectorstore, retrieving context documents, and generating a final answer based on the retrieved context and the question.
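The routing step described above can be sketched as follows. This is a minimal illustration rather than the repository's actual code: the data source names, the idea of a small router model that replies with a source name, and the fallback choice are all assumptions made for the example.

```typescript
// Hypothetical data source names — illustrative only, not the
// repository's actual vectorstore identifiers.
const SOURCES = ["pdf_docs", "blog_posts"] as const;
type Source = (typeof SOURCES)[number];

// A small router model is asked to reply with one source name.
// This helper normalizes the model's raw text output to a known
// source, falling back to the first source when the reply does
// not mention any of them.
function parseRoute(modelOutput: string): Source {
  const cleaned = modelOutput.trim().toLowerCase();
  for (const source of SOURCES) {
    if (cleaned.includes(source)) return source;
  }
  return SOURCES[0];
}

console.log(parseRoute("  PDF_DOCS ")); // → "pdf_docs"
console.log(parseRoute("I think blog_posts fits best")); // → "blog_posts"
```

Once a source is chosen, the question is sent to that vectorstore's retriever, and only its documents reach the answer-generation prompt.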
Here’s a step-by-step guide to setting up the project:
npm install
Next, create the required Vectorize DBs. Note: this step requires a paid Cloudflare Workers plan.
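Creating the databases might look something like this with the wrangler CLI. This is a sketch, not a command copied from the repository: the placeholder names match the note below, and the use of the --preset flag to select the embeddings model is an assumption about your wrangler version.

```
npx wrangler vectorize create [databasename1] --preset @cf/baai/bge-base-en-v1.5
npx wrangler vectorize create [databasename2] --preset @cf/baai/bge-base-en-v1.5
```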
The databases should be configured for the @cf/baai/bge-base-en-v1.5 embeddings model, and their names must match the bindings declared in the wrangler.toml file. If you choose different names, update the bindings in wrangler.toml accordingly.

To delete the databases later, run:

npx wrangler vectorize delete [databasename1]
npx wrangler vectorize delete [databasename2]
Note: Replace [databasename1] and [databasename2] with the actual names of your databases.
Finally, populate the Vectorize DBs with your data. The ingestion logic lives in the server/api/ingest.ts file.
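Ingestion endpoints for vector databases typically upsert documents in fixed-size batches, since a single insert call can only carry so many vectors. The helper below is a hypothetical sketch of that batching step, not code from server/api/ingest.ts; the batch size shown is illustrative.

```typescript
// Split an array of items (e.g. embedded document chunks) into
// fixed-size batches for sequential upsert calls.
function toBatches<T>(items: T[], batchSize: number): T[][] {
  if (batchSize <= 0) throw new Error("batchSize must be positive");
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}

// Seven chunks with a batch size of three yield batches of 3, 3, and 1.
const chunks = Array.from({ length: 7 }, (_, i) => `chunk-${i}`);
console.log(toBatches(chunks, 3).map((b) => b.length)); // → [ 3, 3, 1 ]
```

Each batch would then be passed to the vectorstore's add/upsert method in turn, keeping every request under the store's per-call limit.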