
qdurllm

Search your favorite websites and chat with them, on your desktop🌐

Docs in active development!👷‍♀️

They will soon be available at https://astrabert.github.io/qdurllm/

In the meantime, refer to the Quickstart guide in this README!

Quickstart

1. Prerequisites

Make sure you have conda and Docker (with Docker Compose) installed before proceeding.
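If you want to double-check that both tools are on your PATH before starting, a small Python sketch (not part of qdurllm itself) is enough:

```python
import shutil

# Quick prerequisite check: both executables must be discoverable on PATH
for tool in ("conda", "docker"):
    print(tool, "->", shutil.which(tool) or "NOT FOUND")
```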

2. Installation

> [!IMPORTANT]
> This guide applies only to the pre-release of v1.0.0, i.e. v1.0.0-rc.0

1. Clone the `january-2025` branch of this GitHub repo:

        git clone -b january-2025 --single-branch https://github.com/AstraBert/qdurllm.git
        cd qdurllm/

2. Create the conda environment:

        conda env create -f environment.yml

3. Pull the Qdrant image from Docker Hub:

        docker pull qdrant/qdrant

3. Launching

1. Launch the Qdrant vector database service with Docker Compose (from within the `qdurllm/` folder); an optional Python check to verify the service is reachable is sketched after this list:

        docker compose up

2. Activate the `qdurllm` conda environment you just created:

        conda activate qdurllm

3. Go inside the `app/` directory and launch the Gradio application:

        cd app/
        python3 app.py
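To confirm that Qdrant is actually up before starting the app, you can query it from Python with the `qdrant-client` package. This is a minimal sketch, assuming the compose file exposes Qdrant's default REST port 6333 on localhost and that `qdrant-client` is available in the `qdurllm` environment (the app talks to Qdrant, so it should be):

```python
from qdrant_client import QdrantClient

# Connect to the local Qdrant instance started by `docker compose up`
# (6333 is Qdrant's default REST port; adjust if your compose file maps a different one)
client = QdrantClient(url="http://localhost:6333")

# Listing collections is a cheap way to confirm the service answers;
# an empty list simply means nothing has been indexed yet
print(client.get_collections())
```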

You should see the app running at http://localhost:7860 once all the models have been downloaded from the Hugging Face Hub.
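The Gradio UI is the intended way to search your websites and chat with them, but the running app can also be scripted with `gradio_client`. The sketch below is illustrative only: the endpoint name and argument layout depend on how the interface in `app.py` is defined, so inspect the API the app actually exposes (via `view_api()` or the "Use via API" link in the UI) before calling anything:

```python
from gradio_client import Client

# Connect to the locally running qdurllm app
client = Client("http://localhost:7860")

# Print the endpoints and signatures the app actually exposes
client.view_api()

# Hypothetical call -- the api_name and arguments below are placeholders,
# not the app's real signature; fill them in from the output of view_api()
# result = client.predict("What does this site say about pricing?", api_name="/chat")
# print(result)
```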

Relies on

qdurllm builds on Qdrant (vector database), Gradio (web UI), models from the Hugging Face Hub, Docker, and conda.

Give feedback!

Comment on the discussion thread created for this release with your feedback, or open an issue :)