4 files changed: +23 −13 lines changed

The changelog gains a new release entry:

```diff
@@ -4,6 +4,12 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+## [1.20.0] - 2024-06-13
+
+### Changed
+- [Build] Simplify the python dependency installation
+- [Build] Downgrade the "torch" package to 2.1.2+cu121
+
 ## [1.19.0] - 2024-06-13
 
 ### Added
```
The Windows build instructions (PowerShell) consolidate the two pip invocations into a single install step:

```diff
@@ -189,17 +189,18 @@ cmake `
 
 Copy-Item -Path "../../OpenBLAS/bin/libopenblas.dll" -Destination "./bin/Release/libopenblas.dll"
 
-Set-Location -Path "../"
+Set-Location -Path "../../../"
 
-conda activate llama.cpp
-
-# We are installing the latest available version of the dependencies.
-pip install --upgrade --upgrade-strategy "eager" -r ./requirements.txt
+Write-Host "[Python] Installing dependencies..." -ForegroundColor "Yellow"
 
-Set-Location -Path "../../"
+conda activate llama.cpp
 
-# We are enforcing specific versions on some packages.
-pip install -r ./requirements_override.txt
+# We are installing the latest available version of all llama.cpp
+# project dependencies and also overriding some package versions.
+pip install `
+    --upgrade `
+    --upgrade-strategy "eager" `
+    --requirement ./requirements_override.txt
 
 conda list
 
```
The requirements override file is rewritten to import the upstream dependency list and then pin "torch":

```diff
@@ -1,4 +1,7 @@
-# We are using a specific version of the "torch"
-# package which supports a specific CUDA version.
---extra-index-url https://download.pytorch.org/whl/nightly/cu121
-torch==2.4.0.dev20240516+cu121
+# We are importing the llama.cpp project dependencies.
+--requirement ./vendor/llama.cpp/requirements.txt
+
+# We are overriding the "torch" package version with a
+# specific compatible version that also supports CUDA.
+--extra-index-url https://download.pytorch.org/whl/cu121
+torch==2.1.2+cu121
```
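The override file works because pip requirements files can include other requirements files via `--requirement`/`-r` lines, which pip expands recursively. A minimal sketch of that flattening (illustrative only; it ignores most of pip's real syntax such as constraints files, environment markers, and line continuations):

```python
# Sketch: flatten nested "--requirement"/"-r" includes in pip
# requirements files. Illustrative only; not a full pip parser.
from pathlib import Path

def flatten_requirements(path):
    """Return the non-include lines of a requirements file,
    following nested --requirement/-r includes recursively."""
    lines = []
    base = Path(path).parent
    for raw in Path(path).read_text().splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        for prefix in ("--requirement ", "-r "):
            if line.startswith(prefix):
                # Include paths are resolved relative to the including file.
                nested = base / line[len(prefix):].strip()
                lines.extend(flatten_requirements(nested))
                break
        else:
            lines.append(line)
    return lines
```

Under this model, the upstream llama.cpp list is pulled in first and the `torch==2.1.2+cu121` pin follows it, with `--extra-index-url` telling pip where to find the CUDA builds.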