# qdurllm

Search your favorite websites and chat with them, on your desktop 🌐

Docs are in active development! 👷‍♀️ They will soon be available at https://astrabert.github.io/qdurllm/. In the meantime, refer to the Quickstart guide in this README!
## Quickstart
### 1. Prerequisites

- The `conda` package manager
- `docker` and `docker compose`
### 2. Installation

> [!IMPORTANT]
> This is only for the pre-release of `v1.0.0`, i.e. `v1.0.0-rc.0`.

- Clone the `january-2025` branch of this GitHub repo:

```bash
git clone -b january-2025 --single-branch https://github.com/AstraBert/qdurllm.git
cd qdurllm/
```

- Create the `conda` environment:

```bash
conda env create -f environment.yml
```

- Pull `qdrant` from Docker Hub:

```bash
docker pull qdrant/qdrant
```
### 3. Launching

- Launch the `qdrant` vector database service with `docker compose` (from within the `qdurllm` folder):

```bash
docker compose up
```

- Activate the `qdurllm` conda environment you just created:

```bash
conda activate qdurllm
```

- Go inside the `app` directory and launch the Gradio application:

```bash
cd app/
python3 app.py
```

You should see the app running at http://localhost:7860 once all the models are downloaded from Hugging Face Hub.
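If the page does not load, you can sanity-check that both services are answering on their default ports (Qdrant's REST API on 6333, the Gradio app on 7860, as above). A minimal sketch using only the Python standard library — the `is_up` helper and the port assumptions are illustrative, not part of qdurllm itself:

```python
import urllib.error
import urllib.request


def is_up(url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at `url` at all."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except urllib.error.HTTPError:
        return True   # the server answered, just with an error status
    except OSError:   # connection refused, timeout, unreachable host, ...
        return False


# Assumed defaults: Qdrant REST API on 6333, Gradio on 7860.
for name, url in [("qdrant", "http://localhost:6333"),
                  ("gradio app", "http://localhost:7860")]:
    print(f"{name}: {'up' if is_up(url) else 'down'}")
```

If `qdrant` shows as down, check the `docker compose up` output; if the Gradio app is down, the models may still be downloading.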
## Relies on
- Qwen2.5-1.5B-Instruct, with Apache 2.0 license
- nomic-ai/modernbert-embed-base, with Apache 2.0 license
- prithivida/Splade_PP_en_v1, with Apache 2.0 license
## Give feedback!

Comment on the discussion thread created for this release with your feedback, or create issues :)