
Commit dd6d6db

Simplify the python dependency installation
1 parent f699bdc commit dd6d6db

File tree

- CHANGELOG.md
- rebuild_llama.cpp.ps1
- requirements_override.txt
- vendor/llama.cpp

4 files changed: +23 -13 lines changed


CHANGELOG.md

Lines changed: 6 additions & 0 deletions
@@ -4,6 +4,12 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+## [1.20.0] - 2024-06-13
+
+### Changed
+
+- [Build] Simplify the python dependency installation
+- [Build] Downgrade the "torch" package to 2.1.2+cu121
+
 ## [1.19.0] - 2024-06-13
 
 ### Added

rebuild_llama.cpp.ps1

Lines changed: 9 additions & 8 deletions
@@ -189,17 +189,18 @@ cmake `
 
 Copy-Item -Path "../../OpenBLAS/bin/libopenblas.dll" -Destination "./bin/Release/libopenblas.dll"
 
-Set-Location -Path "../"
+Set-Location -Path "../../../"
 
-conda activate llama.cpp
-
-# We are installing the latest available version of the dependencies.
-pip install --upgrade --upgrade-strategy "eager" -r ./requirements.txt
+Write-Host "[Python] Installing dependencies..." -ForegroundColor "Yellow"
 
-Set-Location -Path "../../"
+conda activate llama.cpp
 
-# We are enforcing specific versions on some packages.
-pip install -r ./requirements_override.txt
+# We are installing the latest available version of all llama.cpp
+# project dependencies and also overriding some package versions.
+pip install `
+    --upgrade `
+    --upgrade-strategy "eager" `
+    --requirement ./requirements_override.txt
 
 conda list
 
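With the install step consolidated into a single pip call, a quick follow-up check can confirm that the pinned CUDA build of torch was actually resolved. This is only an illustrative sketch, not part of the commit; it assumes the llama.cpp conda environment created by the script above:

# Optional sanity check (not part of this commit): confirm that pip resolved
# the pinned CUDA build of torch after the consolidated install step.
conda activate llama.cpp
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
# Expected to print something like: 2.1.2+cu121 True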
requirements_override.txt

Lines changed: 7 additions & 4 deletions
@@ -1,4 +1,7 @@
-# We are using a specific version of the "torch"
-# package which supports a specific CUDA version.
---extra-index-url https://download.pytorch.org/whl/nightly/cu121
-torch==2.4.0.dev20240516+cu121
+# We are importing the llama.cpp project dependencies.
+--requirement ./vendor/llama.cpp/requirements.txt
+
+# We are overriding the "torch" package version with a
+# specific compatible version that also supports CUDA.
+--extra-index-url https://download.pytorch.org/whl/cu121
+torch==2.1.2+cu121

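Because the override file now chains the upstream dependency list through a nested --requirement line, that path only resolves once the vendor/llama.cpp submodule is checked out. A minimal sketch of the resulting install flow, assuming the repository root as the working directory (not part of this commit):

# Minimal sketch (assumption: run from the repository root). The nested
# "--requirement ./vendor/llama.cpp/requirements.txt" line only resolves
# if the submodule has been initialized first.
git submodule update --init --recursive
pip install --upgrade --upgrade-strategy "eager" --requirement ./requirements_override.txt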
vendor/llama.cpp

Lines changed: 1 addition & 1 deletion (submodule reference update)
