Commit 8a58c5f

Merge pull request #838 from SciML/optbasev2.2: "Update optimisers extensions and tests"
2 parents: 4127df2 + 51eaf46
File tree: 14 files changed (+78 / -91 lines)

NEWS.md

Lines changed: 4 additions & 6 deletions
```diff
@@ -1,9 +1,7 @@
 # v4 Breaking changes
 
-1. The main change in this breaking release has been the way mini-batching is handled. The data argument in the solve call and the implicit iteration of that in the callback has been removed,
-the stochastic solvers (Optimisers.jl and Sophia) now handle it explicitly. You would now pass in a DataLoader to OptimziationProblem as the second argument to the objective etc (p) if you
-want to do minibatching, else for full batch just pass in the full data.
+1. The main change in this breaking release has been the way mini-batching is handled. The data argument in the solve call and the implicit iteration of that in the callback has been removed,
+the stochastic solvers (Optimisers.jl and Sophia) now handle it explicitly. You would now pass in a DataLoader to OptimizationProblem as the second argument to the objective etc (p) if you
+want to do minibatching, else for full batch just pass in the full data.
 
-2. The support for extra returns from objective function has been removed. Now the objective should only return a scalar loss value, hence callback doesn't take extra arguments other than the state and loss value.
-
-
+2. The support for extra returns from objective function has been removed. Now the objective should only return a scalar loss value, hence callback doesn't take extra arguments other than the state and loss value.
```
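The two breaking changes in NEWS.md can be sketched together in a short example. This is a hypothetical minimal objective, not code from the repository; the `epochs` keyword is assumed from the Optimisers backend:

```julia
using Optimization, OptimizationOptimisers, MLUtils

# Hypothetical least-squares objective; `data` is one mini-batch when a
# DataLoader is passed as `p`, or the full dataset otherwise.
loss(u, data) = sum(abs2, data .- u[1])

data = rand(100)
dataloader = MLUtils.DataLoader(data; batchsize = 10)

optf = OptimizationFunction(loss, Optimization.AutoZygote())
# Pass the DataLoader in the `p` slot for mini-batching; pass `data`
# itself instead for full-batch optimization.
prob = OptimizationProblem(optf, zeros(1), dataloader)

# The callback now receives only the optimization state and the scalar loss.
cb = (state, l) -> false
sol = solve(prob, Optimisers.Adam(0.05); callback = cb, epochs = 10)
```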

Project.toml

Lines changed: 1 addition & 7 deletions
```diff
@@ -1,6 +1,6 @@
 name = "Optimization"
 uuid = "7f7a1694-90dd-40f0-9382-eb1efda571ba"
-version = "4.0.2"
+version = "4.0.3"
 
 [deps]
 ADTypes = "47edcb42-4c32-4615-8424-f2b9edc5f35b"
@@ -11,7 +11,6 @@ LBFGSB = "5be7bae1-8223-5378-bac3-9e7378a2f6e6"
 LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
 Logging = "56ddb016-857b-54e1-b83d-db4d58db5568"
 LoggingExtras = "e6f89c97-d47a-5376-807f-9c37f3926c36"
-MLUtils = "f1d291b0-491e-4a28-83b9-f70985020b54"
 OptimizationBase = "bca83a33-5cc9-4baa-983d-23429ab6bcbb"
 Printf = "de0858da-6303-5e67-8744-51eddeeeb8d7"
 ProgressLogging = "33c8b6b6-d38a-422a-b730-caa89a2f386c"
@@ -29,16 +28,11 @@ LBFGSB = "0.4.1"
 LinearAlgebra = "1.10"
 Logging = "1.10"
 LoggingExtras = "0.4, 1"
-MLUtils = "0.4.4"
 OptimizationBase = "2"
 Printf = "1.10"
 ProgressLogging = "0.1"
 Reexport = "1.2"
 SciMLBase = "2.39.0"
 SparseArrays = "1.10"
-Symbolics = "5.12"
 TerminalLoggers = "0.1"
 julia = "1.9"
-
-[extras]
-Symbolics = "0c5d862f-8b57-4792-8d23-62f2024744c7"
```

docs/src/index.md

Lines changed: 30 additions & 17 deletions
````diff
@@ -54,110 +54,110 @@ to add the specific wrapper packages.
 ```@raw html
 <details>
 <summary><strong>BlackBoxOptim</strong></summary>
-- **Global Methods**
+- <strong>Global Methods</strong>
   - Zeroth order
   - Unconstrained
   - Box Constraints
 </details>
 <details>
 <summary><strong>CMAEvolutionaryStrategy</strong></summary>
-- **Global Methods**
+- <strong>Global Methods</strong>
   - Zeroth order
   - Unconstrained
   - Box Constraints
 </details>
 <details>
 <summary><strong>Evolutionary</strong></summary>
-- **Global Methods**
+- <strong>Global Methods</strong>
   - Zeroth order
   - Unconstrained
   - Box Constraints
   - Non-linear Constraints
 </details>
 <details>
 <summary><strong>GCMAES</strong></summary>
-- **Global Methods**
+- <strong>Global Methods</strong>
   - First order
   - Box Constraints
   - Unconstrained
 </details>
 <details>
 <summary><strong>Manopt</strong></summary>
-- **Local Methods**
+- <strong>Local Methods</strong>
   - First order
   - Second order
   - Zeroth order
   - Box Constraints
   - Constrained 🟡
-- **Global Methods**
+- <strong>Global Methods</strong>
   - Zeroth order
   - Unconstrained
 </details>
 <details>
 <summary><strong>MathOptInterface</strong></summary>
-- **Local Methods**
+- <strong>Local Methods</strong>
   - First order
   - Second order
   - Box Constraints
   - Constrained
-- **Global Methods**
+- <strong>Global Methods</strong>
   - First order
   - Second order
   - Constrained
 </details>
 <details>
 <summary><strong>MultistartOptimization</strong></summary>
-- **Global Methods**
+- <strong>Global Methods</strong>
   - Zeroth order
   - First order
   - Second order
   - Box Constraints
 </details>
 <details>
 <summary><strong>Metaheuristics</strong></summary>
-- **Global Methods**
+- <strong>Global Methods</strong>
   - Zeroth order
   - Unconstrained
   - Box Constraints
 </details>
 <details>
 <summary><strong>NOMAD</strong></summary>
-- **Global Methods**
+- <strong>Global Methods</strong>
   - Zeroth order
   - Unconstrained
   - Box Constraints
   - Constrained 🟡
 </details>
 <details>
 <summary><strong>NLopt</strong></summary>
-- **Local Methods**
+- <strong>Local Methods</strong>
   - First order
   - Zeroth order
   - Second order 🟡
   - Box Constraints
   - Local Constrained 🟡
-- **Global Methods**
+- <strong>Global Methods</strong>
   - Zeroth order
   - First order
   - Unconstrained
   - Constrained 🟡
 </details>
 <details>
 <summary><strong>Optim</strong></summary>
-- **Local Methods**
+- <strong>Local Methods</strong>
   - Zeroth order
   - First order
   - Second order
   - Box Constraints
   - Constrained
-- **Global Methods**
+- <strong>Global Methods</strong>
   - Zeroth order
   - Unconstrained
   - Box Constraints
 </details>
 <details>
 <summary><strong>PRIMA</strong></summary>
-- **Local Methods**
+- <strong>Local Methods</strong>
   - Derivative-Free: ✅
 - **Constraints**
   - Box Constraints: ✅
@@ -167,13 +167,15 @@ to add the specific wrapper packages.
 <summary><strong>QuadDIRECT</strong></summary>
 - **Constraints**
   - Box Constraints: ✅
-- **Global Methods**
+- <strong>Global Methods</strong>
   - Unconstrained: ✅
 </details>
 ```
+
 🟡 = supported in downstream library but not yet implemented in `Optimization.jl`; PR to add this functionality are welcome
 
 ## Citation
+
 ```
 @software{vaibhav_kumar_dixit_2023_7738525,
 author = {Vaibhav Kumar Dixit and Christopher Rackauckas},
@@ -185,37 +187,48 @@ to add the specific wrapper packages.
 url = {https://doi.org/10.5281/zenodo.7738525},
 year = 2023}
 ```
+
 ## Reproducibility
+
 ```@raw html
 <details><summary>The documentation of this SciML package was built using these direct dependencies,</summary>
 ```
+
 ```@example
 using Pkg # hide
 Pkg.status() # hide
 ```
+
 ```@raw html
 </details>
 ```
+
 ```@raw html
 <details><summary>and using this machine and Julia version.</summary>
 ```
+
 ```@example
 using InteractiveUtils # hide
 versioninfo() # hide
 ```
+
 ```@raw html
 </details>
 ```
+
 ```@raw html
 <details><summary>A more complete overview of all dependencies and their versions is also provided.</summary>
 ```
+
 ```@example
 using Pkg # hide
 Pkg.status(; mode = PKGMODE_MANIFEST) # hide
 ```
+
 ```@raw html
 </details>
 ```
+
 ```@eval
 using TOML
 using Markdown
````

docs/src/tutorials/certification.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -7,7 +7,7 @@ This works with the `structural_analysis` keyword argument to `OptimizationProbl
 We'll use a simple example to illustrate the convexity structure certification process.
 
 ```@example symanalysis
-using SymbolicAnalysis, Zygote, LinearAlgebra, Optimization, OptimizationMOI
+using SymbolicAnalysis, Zygote, LinearAlgebra, Optimization
 
 function f(x, p = nothing)
     return exp(x[1]) + x[1]^2
````

docs/src/tutorials/minibatch.md

Lines changed: 1 addition & 1 deletion
Original file line numberDiff line numberDiff line change
@@ -54,7 +54,7 @@ end
5454
function loss_adjoint(fullp, data)
5555
batch, time_batch = data
5656
pred = predict_adjoint(fullp, time_batch)
57-
sum(abs2, batch .- pred), pred
57+
sum(abs2, batch .- pred)
5858
end
5959
6060
k = 10
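The minibatch tutorial's loss no longer returns the prediction alongside the scalar loss, so anything previously consumed from the extra return value is now recomputed from the state inside the callback. A minimal self-contained sketch (with a stand-in predictor in place of the tutorial's `predict_adjoint`; the `state.u` field is assumed from the v4 `OptimizationState`):

```julia
# Stand-in model and data, hypothetical and much simpler than the tutorial's ODE.
predict(u, t) = u[1] .* t
data, t = [1.0, 2.0, 3.0], [1.0, 2.0, 3.0]

# Objective returns only the scalar loss (v4 requirement).
loss(u, _) = sum(abs2, data .- predict(u, t))

callback = function (state, l)
    pred = predict(state.u, t)   # recompute the prediction instead of receiving it
    return false                 # return `false` to continue optimizing
end
```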

lib/OptimizationOptimJL/Project.toml

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,7 +1,7 @@
 name = "OptimizationOptimJL"
 uuid = "36348300-93cb-4f02-beb5-3c3902f8871e"
 authors = ["Vaibhav Dixit <vaibhavyashdixit@gmail.com> and contributors"]
-version = "0.4.0"
+version = "0.4.1"
 
 [deps]
 Optim = "429524aa-4258-5aef-a3af-852621145aeb"
```

lib/OptimizationOptimJL/src/OptimizationOptimJL.jl

Lines changed: 31 additions & 19 deletions
```diff
@@ -26,6 +26,7 @@ function SciMLBase.requireshessian(opt::Union{
     true
 end
 SciMLBase.requiresgradient(opt::Optim.Fminbox) = true
+# SciMLBase.allowsfg(opt::Union{Optim.AbstractOptimizer, Optim.ConstrainedOptimizer, Optim.Fminbox, Optim.SAMIN}) = true
 
 function __map_optimizer_args(cache::OptimizationCache,
         opt::Union{Optim.AbstractOptimizer, Optim.Fminbox,
@@ -142,11 +143,11 @@ function SciMLBase.__solve(cache::OptimizationCache{
         θ = metadata[cache.opt isa Optim.NelderMead ? "centroid" : "x"]
         opt_state = Optimization.OptimizationState(iter = trace.iteration,
             u = θ,
-            objective = x[1],
+            objective = trace.value,
             grad = get(metadata, "g(x)", nothing),
             hess = get(metadata, "h(x)", nothing),
             original = trace)
-        cb_call = cache.callback(opt_state, x...)
+        cb_call = cache.callback(opt_state, trace.value)
         if !(cb_call isa Bool)
             error("The callback should return a boolean `halt` for whether to stop the optimization process.")
         end
@@ -261,11 +262,11 @@ function SciMLBase.__solve(cache::OptimizationCache{
             metadata["x"]
         opt_state = Optimization.OptimizationState(iter = trace.iteration,
             u = θ,
-            objective = x[1],
+            objective = trace.value,
             grad = get(metadata, "g(x)", nothing),
             hess = get(metadata, "h(x)", nothing),
             original = trace)
-        cb_call = cache.callback(opt_state, x...)
+        cb_call = cache.callback(opt_state, trace.value)
         if !(cb_call isa Bool)
             error("The callback should return a boolean `halt` for whether to stop the optimization process.")
         end
@@ -277,14 +278,19 @@ function SciMLBase.__solve(cache::OptimizationCache{
         __x = first(x)
         return cache.sense === Optimization.MaxSense ? -__x : __x
     end
-    fg! = function (G, θ)
-        if G !== nothing
-            cache.f.grad(G, θ)
-            if cache.sense === Optimization.MaxSense
-                G .*= -one(eltype(G))
+
+    if cache.f.fg === nothing
+        fg! = function (G, θ)
+            if G !== nothing
+                cache.f.grad(G, θ)
+                if cache.sense === Optimization.MaxSense
+                    G .*= -one(eltype(G))
+                end
             end
+            return _loss(θ)
         end
-        return _loss(θ)
+    else
+        fg! = cache.f.fg
     end
 
     gg = function (G, θ)
@@ -344,9 +350,9 @@ function SciMLBase.__solve(cache::OptimizationCache{
             u = metadata["x"],
             grad = get(metadata, "g(x)", nothing),
             hess = get(metadata, "h(x)", nothing),
-            objective = x[1],
+            objective = trace.value,
             original = trace)
-        cb_call = cache.callback(opt_state, x...)
+        cb_call = cache.callback(opt_state, trace.value)
         if !(cb_call isa Bool)
             error("The callback should return a boolean `halt` for whether to stop the optimization process.")
         end
@@ -358,15 +364,21 @@ function SciMLBase.__solve(cache::OptimizationCache{
         __x = first(x)
         return cache.sense === Optimization.MaxSense ? -__x : __x
     end
-    fg! = function (G, θ)
-        if G !== nothing
-            cache.f.grad(G, θ)
-            if cache.sense === Optimization.MaxSense
-                G .*= -one(eltype(G))
+
+    if cache.f.fg === nothing
+        fg! = function (G, θ)
+            if G !== nothing
+                cache.f.grad(G, θ)
+                if cache.sense === Optimization.MaxSense
+                    G .*= -one(eltype(G))
+                end
             end
+            return _loss(θ)
         end
-        return _loss(θ)
+    else
+        fg! = cache.f.fg
     end
 
     gg = function (G, θ)
     cache.f.grad(G, θ)
     if cache.sense === Optimization.MaxSense
@@ -434,7 +446,7 @@ PrecompileTools.@compile_workload begin
     function obj_f(x, p)
         A = p[1]
         b = p[2]
-        return sum((A * x - b) .^ 2)
+        return sum((A * x .- b) .^ 2)
     end
 
     function solve_nonnegative_least_squares(A, b, solver)
```
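The `fg!` change in this file prefers a user-supplied fused objective-and-gradient function (`cache.f.fg`) over synthesizing one from separate objective and gradient calls. The fused pattern that Optim.jl consumes can be sketched in isolation like this (hypothetical standalone names, not the package's internals):

```julia
# fg!(G, θ): write the gradient into G when requested (G !== nothing)
# and always return the objective value, so work can be shared between
# function and gradient evaluations.
f(θ) = (1 - θ[1])^2 + 100 * (θ[2] - θ[1]^2)^2   # Rosenbrock

function grad!(G, θ)
    G[1] = -2 * (1 - θ[1]) - 400 * θ[1] * (θ[2] - θ[1]^2)
    G[2] = 200 * (θ[2] - θ[1]^2)
    return G
end

function fg!(G, θ)
    G === nothing || grad!(G, θ)   # skip gradient work on value-only calls
    return f(θ)
end

G = zeros(2)
fg!(G, [0.0, 0.0])   # returns 1.0 and fills G with [-2.0, 0.0]
```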
