# qdurllm
Search your favorite websites and chat with them, on your desktop🌐
## Docs in active development! 👷‍♀️
They will soon be available at: https://astrabert.github.io/qdurllm/
In the meantime, refer to the Quickstart guide in this README!
## Quickstart
### 1. Prerequisites

- the `conda` package manager
- `docker` and `docker compose`
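If you are unsure whether everything is already available, a quick optional check from your terminal is the following; all three commands should print a version string if the prerequisites are installed:

```bash
# Each command prints a version string if the tool is on your PATH
conda --version
docker --version
docker compose version
```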
### 2. Installation
> [!IMPORTANT]
> This installation procedure is only for the pre-release of v1.0.0, i.e. `v1.0.0-rc.0`.
- Clone the `january-2025` branch of this GitHub repo:

```bash
git clone -b january-2025 --single-branch https://github.com/AstraBert/qdurllm.git
cd qdurllm/
```
- Create the `conda` environment:

```bash
conda env create -f environment.yml
```
- Pull `qdrant` from Docker Hub:

```bash
docker pull qdrant/qdrant
```
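Optionally, you can verify that each step worked before moving on; this is just a sanity check, not a required part of the installation:

```bash
# Still inside the qdurllm/ folder: confirm you are on the expected branch
git branch --show-current        # should print: january-2025

# Confirm the conda environment was created
conda env list | grep qdurllm

# Confirm the Qdrant image is available locally
docker images qdrant/qdrant
```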
### 3. Launching
- Launch the `qdrant` vector database services with `docker compose` (from within the `qdurllm` folder; see the tip after these steps if you prefer to run it in the background):

```bash
docker compose up
```
- Activate the `qdurllm` conda environment you just created:

```bash
conda activate qdurllm
```
- Go inside the `app` directory and launch the Gradio application:

```bash
cd app/
python3 app.py
```
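Note that `docker compose up` keeps your terminal attached to Qdrant's logs, so the Gradio application has to be started from a second terminal. If you prefer to stay in a single terminal, a common alternative is to start the database in the background first:

```bash
# Start Qdrant detached, then check that the container is up
docker compose up -d
docker compose ps

# Follow the logs if something looks wrong
docker compose logs -f
```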
You should see the app running at http://localhost:7860 once all the models have been downloaded from the HuggingFace Hub.
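If the page does not come up, you can check that both services are reachable; this sketch assumes the compose file maps Qdrant's default REST port 6333 to the host:

```bash
# Qdrant answers on its REST port with a small JSON version banner
curl http://localhost:6333

# The Gradio UI should answer on its default port
curl -I http://localhost:7860
```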
## Relies on
- Qwen2.5-1.5B-Instruct, with Apache 2.0 license
- nomic-ai/modernbert-embed-base, with Apache 2.0 license
- prithivida/Splade_PP_en_v1, with Apache 2.0 license
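The first launch downloads all three models, which can take a while on a slow connection. If you would rather pre-fetch them into your local HuggingFace cache, a sketch using `huggingface-cli` (assuming the `huggingface_hub` package is present in the `qdurllm` environment, and assuming the standard Hub repo IDs for the models listed above) is:

```bash
conda activate qdurllm

# Repo IDs assumed from the list above; the Qwen/ org prefix follows the usual Hub naming
huggingface-cli download Qwen/Qwen2.5-1.5B-Instruct
huggingface-cli download nomic-ai/modernbert-embed-base
huggingface-cli download prithivida/Splade_PP_en_v1
```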
## Give feedback!
Comment on the discussion thread created for this release with your feedback or create issues :)