
Commit c68b1e3

Merge branch 'main' of github.com:JuliaStats/MixedModels.jl into pa/remove-lowerbd-field
2 parents: 5c7b8a0 + d7f9223

21 files changed: +303 -169 lines

NEWS.md

Lines changed: 7 additions & 3 deletions
@@ -1,9 +1,13 @@
  MixedModels v5.0.0 Release Notes
  ==============================
- - Optimization is now performed _without constraints_. In a post-fitting step, the the Cholesky factor is canonicalized to have non-negative diagonal elements. [#840]
- - The default optimizer has changed to NLopt's implementation of NEWUOA where possible. NLopt's implementation fails on 1-dimensional problems, so in the case A single, scalar random effect, BOBYQA is used instead. In the future, the default optimizer backend will likely change to PRIMA and NLopt support will be moved to an extension. Blocking this change in backend is an issue with PRIMA.jl when running in VSCode's built-in REPL on Linux. [#840]
+ - Optimization is now performed _without constraints_. In a post-fitting step, the Cholesky factor is canonicalized to have non-negative diagonal elements. [#840]
+ - The default optimizer has changed to NLopt's implementation of NEWUOA where possible. NLopt's implementation fails on 1-dimensional problems, so in the case of a single, scalar random effect, BOBYQA is used instead. In the future, the default optimizer backend will likely change to PRIMA and NLopt support will be moved to an extension. Blocking this change in backend is an issue with PRIMA.jl when running in VSCode's built-in REPL on Linux. [#840]
  - [BREAKING] Support for constrained optimization has been completely removed, i.e. the field `lowerbd` has been removed from `OptSummary`. [#849]
-
+ - [BREAKING] A fitlog is always kept -- the deprecated keyword argument `thin` has been removed, as has the `fitlog` keyword argument. [#850]
+ - The fitlog is now stored as a Tables.jl-compatible column table. [#850]
+ - Internal code around the default optimizer has been restructured. In particular, the NLopt backend has been moved to a submodule, which will make it easier to move it to an extension if we promote another backend to the default. [#853]
+ - Internal code around optimization in profiling has been restructured so that fitting done during calls to `profile` respects the `backend` and `optimizer` settings. [#853]
+ - The `prfit!` convenience function has been removed. [#853]

  MixedModels v4.38.0 Release Notes
  ==============================
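A minimal sketch of how the fitlog entries above play out after this release (the model and variable names are illustrative; it assumes DataFrames is installed and that the log is consumable as a Tables.jl table, as the release note states):

```julia
using MixedModels, DataFrames

fm = fit(MixedModel,
         @formula(reaction ~ 1 + days + (1 + days | subj)),
         MixedModels.dataset(:sleepstudy); progress=false)

# The fitlog is always populated now; the `fitlog=true` and `thin` keywords are gone.
log = fm.optsum.fitlog

# Each record carries the parameter vector θ and the objective at that step,
# so the log can be handed straight to any Tables.jl consumer.
DataFrame(log)
```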

docs/make.jl

Lines changed: 1 addition & 0 deletions
@@ -11,6 +11,7 @@ makedocs(;
      doctest=true,
      # pagesonly=true,
      # warnonly=true,
+     # warnonly=[:cross_references],
      pages=[
          "index.md",
          "constructors.md",

docs/src/api.md

Lines changed: 6 additions & 2 deletions
@@ -79,13 +79,17 @@ stderrror!
  varest
  ```

- ## Non-Exported Functions
+ ## Non-Exported Functions and Constants

- Note that unless discussed elsewhere in the online documentation, non-exported functions should be considered implementation details.
+ Note that unless discussed elsewhere in the online documentation, non-exported functions and constants should be considered implementation details.

  ```@autodocs
  Modules = [MixedModels]
  Public = false
  Order = [:function]
  Filter = f -> !startswith(string(f), "_")
  ```
+
+ ```@docs
+ MixedModels.OPTIMIZATION_BACKENDS
+ ```
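The newly documented constant can be inspected directly; a quick sketch (the exact contents depend on which backend packages are currently loaded):

```julia
using MixedModels

# Registered at load time by each backend; :prima only appears after `using PRIMA`.
MixedModels.OPTIMIZATION_BACKENDS
```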

docs/src/optimization.md

Lines changed: 22 additions & 7 deletions
@@ -190,10 +190,8 @@ DisplayAs.Text(ans) # hide
  ```

  More detailed information about the intermediate steps of the nonlinear optimizer can be obtained the `fitlog` field.
- By default, `fitlog` is not populated, but passing the keyword argument `fitlog=true` to `fit!` and `refit!` will result in it being populated with the values obtained at each step of optimization:

  ```@example Main
- refit!(fm2; fitlog=true)
  first(fm2.optsum.fitlog, 5)
  DisplayAs.Text(ans) # hide
  ```
@@ -222,16 +220,35 @@ Typically, the (1,1) block is the largest block in `A` and `L` and it has a spec
  `UniformBlockDiagonal`
  providing a compact representation and fast matrix multiplication or solutions of linear systems of equations.

- ### Modifying the optimization process
+ ## Modifying the optimization process

- The `OptSummary` object contains both input and output fields for the optimizer.
+ The [`OptSummary`](@ref) object contains both input and output fields for the optimizer.
  To modify the optimization process the input fields can be changed after constructing the model but before fitting it.
+ In addition to various tolerances, which we will not discuss further here, users can specify the choice of `backend` (i.e., the non-linear optimization library used) and `optimizer` (i.e., the implementation of an algorithm provided by the backend).
+
+ The current default backend is [NLopt](https://github.com/JuliaOpt/NLopt.jl), which is a direct dependency of MixedModels.jl.
+ A [PRIMA](https://github.com/libprima/PRIMA.jl/) backend is also provided as a package extension and thus only
+ available when the PRIMA package is loaded.
+ The list of currently loaded backends is available as [`MixedModels.OPTIMIZATION_BACKENDS`](@ref).
+ For each individual backend, the list of available optimizers can be inspected with the function [`MixedModels.optimizers`](@ref).
+ ```@example Main
+ MixedModels.optimizers(:nlopt)
+ ```
+ Similarly, the list of applicable optimization parameters can be inspected with the function [`MixedModels.opt_params`](@ref).
+ ```@example Main
+ MixedModels.opt_params(:nlopt)
+ ```
+
+ !!! note "Optimizer defaults subject to change"
+     The choice of default backend and optimizer is subject to change without being considered a breaking change.
+     If you want to guarantee a particular backend and optimizer, then you should
+     explicitly load the associated backend's package (e.g. NLopt or PRIMA) and manually set the `optimizer` and `backend` fields.

  Suppose, for example, that the user wishes to try a [Nelder-Mead](https://en.wikipedia.org/wiki/Nelder%E2%80%93Mead_method) optimization method instead of the default [`BOBYQA`](https://en.wikipedia.org/wiki/BOBYQA) (Bounded Optimization BY Quadratic Approximation) method.
  ```@example Main
  fm2nm = LinearMixedModel(@formula(reaction ~ 1+days+(1+days|subj)), sleepstudy);
  fm2nm.optsum.optimizer = :LN_NELDERMEAD;
- fit!(fm2nm; thin=1)
+ fit!(fm2nm)
  fm2nm.optsum
  DisplayAs.Text(ans) # hide
  ```
@@ -252,8 +269,6 @@ plot(convdf, x=:step, y=:objective, color=:algorithm, Geom.line)

  Run time can be constrained with `maxfeval` and `maxtime`.

- See the documentation for the [`NLopt`](https://github.com/JuliaOpt/NLopt.jl) package for details about the various settings.
-
  ### Convergence to singular covariance matrices

  To ensure identifiability of $\Sigma_\theta=\sigma^2\Lambda_\theta \Lambda_\theta$, the elements of $\theta$ corresponding to diagonal elements of $\Lambda_\theta$ are constrained to be non-negative.
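Following the new note about changeable defaults added above, pinning a backend and optimizer might look like the sketch below (it assumes PRIMA is installed; the field assignments mirror the removed `prfit!` helper):

```julia
using MixedModels
using PRIMA  # loading PRIMA activates the :prima backend extension

model = LinearMixedModel(@formula(reaction ~ 1 + days + (1 + days | subj)),
                         MixedModels.dataset(:sleepstudy))

# Set the input fields of OptSummary before fitting instead of relying on defaults.
model.optsum.backend = :prima
model.optsum.optimizer = :bobyqa
fit!(model)
```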

ext/MixedModelsPRIMAExt.jl

Lines changed: 48 additions & 23 deletions
@@ -2,7 +2,8 @@ module MixedModelsPRIMAExt

  using MixedModels
  using MixedModels: Statistics
- using MixedModels: ProgressMeter, ProgressUnknown, objective!, _objective!
+ using MixedModels.ProgressMeter: ProgressMeter, ProgressUnknown
+ using MixedModels: objective!, _objective!, rectify!
  using LinearAlgebra: PosDefException
  using PRIMA: PRIMA

@@ -13,25 +14,22 @@
  end

  const PRIMABackend = Val{:prima}

- function MixedModels.prfit!(m::LinearMixedModel;
-     kwargs...)
-     MixedModels.unfit!(m)
-     m.optsum.optimizer = :bobyqa
-     m.optsum.backend = :prima
-
-     return fit!(m; kwargs...)
+ function _optimizer(o::Val{O}, f, x0::Vector, args...; kwargs...) where {O}
+     x0 = copy(x0)
+     info = _optimizer!(o, f, x0, args...; kwargs...)
+     return x0, info
  end

- prima_optimizer!(::Val{:bobyqa}, args...; kwargs...) = PRIMA.bobyqa!(args...; kwargs...)
- prima_optimizer!(::Val{:cobyla}, args...; kwargs...) = PRIMA.cobyla!(args...; kwargs...)
- prima_optimizer!(::Val{:lincoa}, args...; kwargs...) = PRIMA.lincoa!(args...; kwargs...)
- prima_optimizer!(::Val{:newuoa}, args...; kwargs...) = PRIMA.newuoa!(args...; kwargs...)
+ _optimizer!(::Val{:bobyqa}, args...; kwargs...) = PRIMA.bobyqa!(args...; kwargs...)
+ _optimizer!(::Val{:cobyla}, args...; kwargs...) = PRIMA.cobyla!(args...; kwargs...)
+ _optimizer!(::Val{:lincoa}, args...; kwargs...) = PRIMA.lincoa!(args...; kwargs...)
+ _optimizer!(::Val{:newuoa}, args...; kwargs...) = PRIMA.newuoa!(args...; kwargs...)

  function MixedModels.optimize!(m::LinearMixedModel, ::PRIMABackend;
-     progress::Bool=true, fitlog::Bool=false, kwargs...)
+     progress::Bool=true, kwargs...)
      optsum = m.optsum
      prog = ProgressUnknown(; desc="Minimizing", showspeed=true)
-     fitlog && empty!(optsum.fitlog)
+     empty!(optsum.fitlog)

      function obj(x)
          val = if x == optsum.initial
@@ -51,15 +49,14 @@ function MixedModels.optimize!(m::LinearMixedModel, ::PRIMABackend;
              end
          end
          progress && ProgressMeter.next!(prog; showvalues=[(:objective, val)])
-         fitlog && push!(optsum.fitlog, (copy(x), val))
+         push!(optsum.fitlog, (; θ=copy(x), objective=val))
          return val
      end

      maxfun = optsum.maxfeval > 0 ? optsum.maxfeval : 500 * length(optsum.initial)
-     info = prima_optimizer!(Val(optsum.optimizer), obj, optsum.final;
+     info = _optimizer!(Val(optsum.optimizer), obj, optsum.final;
                         maxfun,
                         optsum.rhoend, optsum.rhobeg)
-     ProgressMeter.finish!(prog)
      optsum.feval = info.nf
      optsum.fmin = info.fx
      optsum.returnvalue = Symbol(info.status)
@@ -68,12 +65,12 @@ function MixedModels.optimize!(m::LinearMixedModel, ::PRIMABackend;
  end

  function MixedModels.optimize!(m::GeneralizedLinearMixedModel, ::PRIMABackend;
-     progress::Bool=true, fitlog::Bool=false,
+     progress::Bool=true,
      fast::Bool=false, verbose::Bool=false, nAGQ=1,
      kwargs...)
      optsum = m.optsum
      prog = ProgressUnknown(; desc="Minimizing", showspeed=true)
-     fitlog && empty!(opstum.fitlog)
+     empty!(optsum.fitlog)

      function obj(x)
          val = try
@@ -85,7 +82,7 @@ function MixedModels.optimize!(m::GeneralizedLinearMixedModel, ::PRIMABackend;
              x == optsum.initial && rethrow()
              m.optsum.finitial
          end
-         fitlog && push!(optsum.fitlog, (copy(x), val))
+         push!(optsum.fitlog, (; θ=copy(x), objective=val))
          verbose && println(round(val; digits=5), " ", x)
          progress && ProgressMeter.next!(prog; showvalues=[(:objective, val)])
          return val
@@ -107,8 +104,8 @@ function MixedModels.optimize!(m::GeneralizedLinearMixedModel, ::PRIMABackend;
          end
          sc
      end
-     info = prima_optimizer!(Val(optsum.optimizer), obj, optsum.final;
-                        maxfun,
+     info = _optimizer!(Val(optsum.optimizer), obj, optsum.final;
+                        xl=optsum.lowerbd, maxfun,
                         optsum.rhoend, optsum.rhobeg,
                         scale)
      ProgressMeter.finish!(prog)
@@ -129,6 +126,34 @@ function _check_prima_return(info::PRIMA.Info)
      return nothing
  end

- MixedModels.opt_params(::PRIMABackend) = (:rhobeg, :rhoend, :maxfeval)
+ MixedModels.opt_params(::PRIMABackend) = [:rhobeg, :rhoend, :maxfeval]
+ MixedModels.optimizers(::PRIMABackend) = [:bobyqa, :cobyla, :lincoa, :newuoa]
+
+ function MixedModels.profilevc(obj, optsum::OptSummary, ::PRIMABackend; kwargs...)
+     maxfun = optsum.maxfeval > 0 ? optsum.maxfeval : 500 * length(optsum.initial)
+     xmin, info = _optimizer(Val(optsum.optimizer), obj,
+                             copyto!(optsum.final, optsum.initial);
+                             maxfun,
+                             optsum.rhoend, optsum.rhobeg,
+                             scale=nothing) # will need to scale for GLMM
+     _check_prima_return(info)
+     fmin = info.fx
+     return fmin, xmin
+ end
+
+ function MixedModels.profileobj!(obj,
+     m::LinearMixedModel{T}, θ::AbstractVector{T}, osj::OptSummary, ::PRIMABackend;
+     kwargs...) where {T}
+     maxfun = osj.maxfeval > 0 ? osj.maxfeval : 500 * length(osj.initial)
+     xmin = copyto!(osj.final, osj.initial)
+     info = _optimizer!(Val(osj.optimizer), obj, xmin;
+                        maxfun,
+                        osj.rhoend, osj.rhobeg,
+                        scale=nothing) # will need to scale for GLMM
+     fmin = info.fx
+     _check_prima_return(info)
+     rectify!(m)
+     return fmin
+ end

  end # module
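The `_optimizer`/`_optimizer!` pair above relies on lifting the optimizer's `Symbol` into a `Val` type so that dispatch selects the matching PRIMA routine at fit time. A standalone sketch of that pattern with toy names and no PRIMA dependency:

```julia
# Toy stand-ins for the per-algorithm methods; the real code forwards to
# PRIMA.bobyqa!, PRIMA.newuoa!, etc.
solve!(::Val{:bobyqa}, f, x) = (x .-= 0.1 .* x; :bobyqa_called)
solve!(::Val{:newuoa}, f, x) = (x .-= 0.1 .* x; :newuoa_called)

# Mirrors `_optimizer`: copy the starting point, then dispatch on Val(optimizer).
function run_optimizer(optimizer::Symbol, f, x0::Vector)
    x = copy(x0)
    status = solve!(Val(optimizer), f, x)
    return x, status
end

run_optimizer(:bobyqa, sum, [1.0, 2.0])  # ([0.9, 1.8], :bobyqa_called)
```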

src/MixedModels.jl

Lines changed: 5 additions & 4 deletions
@@ -20,10 +20,10 @@ using LinearAlgebra: ldiv!, lmul!, logdet, mul!, norm, normalize, normalize!, qr
  using LinearAlgebra: rank, rdiv!, rmul!, svd, tril!
  using Markdown: Markdown
  using MixedModelsDatasets: dataset, datasets
- using NLopt: NLopt, Opt
  using PooledArrays: PooledArrays, PooledArray
+ using NLopt: NLopt
  using PrecompileTools: PrecompileTools, @setup_workload, @compile_workload
- using ProgressMeter: ProgressMeter, Progress, ProgressUnknown, finish!, next!
+ using ProgressMeter: ProgressMeter, Progress, finish!, next!
  using Random: Random, AbstractRNG, randn!
  using SparseArrays: SparseArrays, SparseMatrixCSC, SparseVector, dropzeros!, nnz
  using SparseArrays: nonzeros, nzrange, rowvals, sparse
@@ -213,8 +213,9 @@ include("grouping.jl")
  include("mimeshow.jl")
  include("serialization.jl")
  include("profile/profile.jl")
- include("nlopt.jl")
- include("prima.jl")
+ include("MixedModelsNLoptExt.jl")
+ using .MixedModelsNLoptExt
+
  include("derivatives.jl")

  # aliases with non-unicode function names

src/nlopt.jl renamed to src/MixedModelsNLoptExt.jl

Lines changed: 53 additions & 11 deletions
@@ -1,13 +1,29 @@
- push!(OPTIMIZATION_BACKENDS, :nlopt)
+ module MixedModelsNLoptExt # not actually an extension at the moment
+
+ using ..MixedModels
+ using ..MixedModels: objective!, _objective!, rectify!
+ # are part of the package's dependencies and will not be part
+ # of the extension's dependencies
+ using ..MixedModels.ProgressMeter: ProgressMeter, ProgressUnknown
+
+ # stdlib
+ using LinearAlgebra: PosDefException
+ # will be a weakdep when this is moved to an extension
+ using NLopt: NLopt, Opt
+
+ function __init__()
+     push!(MixedModels.OPTIMIZATION_BACKENDS, :nlopt)
+     return nothing
+ end

  const NLoptBackend = Val{:nlopt}

- function optimize!(m::LinearMixedModel, ::NLoptBackend;
-     progress::Bool=true, fitlog::Bool=false,
+ function MixedModels.optimize!(m::LinearMixedModel, ::NLoptBackend;
+     progress::Bool=true,
      kwargs...)
      optsum = m.optsum
      prog = ProgressUnknown(; desc="Minimizing", showspeed=true)
-     fitlog && empty!(optsum.fitlog)
+     empty!(optsum.fitlog)

      function obj(x, g)
          isempty(g) || throw(ArgumentError("g should be empty for this objective"))
@@ -28,7 +44,7 @@ function optimize!(m::LinearMixedModel, ::NLoptBackend;
          end
      end
      progress && ProgressMeter.next!(prog; showvalues=[(:objective, val)])
-     fitlog && push!(optsum.fitlog, (copy(x), val))
+     push!(optsum.fitlog, (; θ=copy(x), objective=val))
      return val
  end

@@ -42,13 +58,13 @@ function optimize!(m::LinearMixedModel, ::NLoptBackend;
      return xmin, fmin
  end

- function optimize!(m::GeneralizedLinearMixedModel, ::NLoptBackend;
-     progress::Bool=true, fitlog::Bool=false,
+ function MixedModels.optimize!(m::GeneralizedLinearMixedModel, ::NLoptBackend;
+     progress::Bool=true,
      fast::Bool=false, verbose::Bool=false, nAGQ=1,
      kwargs...)
      optsum = m.optsum
      prog = ProgressUnknown(; desc="Minimizing", showspeed=true)
-     fitlog && empty!(optsum.fitlog)
+     empty!(optsum.fitlog)

      function obj(x, g)
          isempty(g) || throw(ArgumentError("g should be empty for this objective"))
@@ -61,7 +77,7 @@ function optimize!(m::GeneralizedLinearMixedModel, ::NLoptBackend;
          x == optsum.initial && rethrow()
          optsum.finitial
      end
-     fitlog && push!(optsum.fitlog, (copy(x), val))
+     push!(optsum.fitlog, (; θ=copy(x), objective=val))
      verbose && println(round(val; digits=5), " ", x)
      progress && ProgressMeter.next!(prog; showvalues=[(:objective, val)])
      return val
@@ -119,6 +135,32 @@ function _check_nlopt_return(ret, failure_modes=_NLOPT_FAILURE_MODES)
      end
  end

- function opt_params(::NLoptBackend)
-     return (:ftol_rel, :ftol_abs, :xtol_rel, :xtol_abs, :initial_step, :maxfeval, :maxtime)
+ function MixedModels.opt_params(::NLoptBackend)
+     return [:ftol_rel, :ftol_abs, :xtol_rel, :xtol_abs, :initial_step, :maxfeval, :maxtime]
  end
+
+ function MixedModels.optimizers(::NLoptBackend)
+     return [:LN_NEWUOA, :LN_BOBYQA, :LN_COBYLA, :LN_NELDERMEAD, :LN_PRAXIS]
+ end
+
+ function MixedModels.profilevc(obj, optsum::OptSummary, ::NLoptBackend; kwargs...)
+     opt = NLopt.Opt(optsum)
+     NLopt.min_objective!(opt, obj)
+     fmin, xmin, ret = NLopt.optimize!(opt, copyto!(optsum.final, optsum.initial))
+     _check_nlopt_return(ret)
+
+     return fmin, xmin
+ end
+
+ function MixedModels.profileobj!(obj,
+     m::LinearMixedModel{T}, θ::AbstractVector{T}, osj::OptSummary, ::NLoptBackend;
+     kwargs...) where {T}
+     opt = NLopt.Opt(osj)
+     NLopt.min_objective!(opt, obj)
+     fmin, xmin, ret = NLopt.optimize(opt, copyto!(osj.final, osj.initial))
+     _check_nlopt_return(ret)
+     rectify!(m)
+     return fmin
+ end
+
+ end # module
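The `__init__` hook above is what keeps `MixedModels.OPTIMIZATION_BACKENDS` in sync with whichever backend modules are loaded. A self-contained sketch of that registration pattern, using hypothetical module names:

```julia
# Parent package holds the registry of loaded backends.
module TinyModels
    const OPTIMIZATION_BACKENDS = Symbol[]
end

# Backend submodule registers itself when it is loaded, exactly once.
module TinyNLoptBackend
    using ..TinyModels
    function __init__()
        push!(TinyModels.OPTIMIZATION_BACKENDS, :nlopt)
        return nothing
    end
end

@show TinyModels.OPTIMIZATION_BACKENDS  # Symbol[:nlopt]
```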

src/bootstrap.jl

Lines changed: 1 addition & 1 deletion
@@ -254,7 +254,7 @@ function parametricbootstrap(
  )
  samp = replicate(n; progress) do
      simulate!(rng, m; β, σ, θ)
-     refit!(m; progress=false, fitlog=false)
+     refit!(m; progress=false)
      (
          objective=ftype.(m.objective),
          σ=ismissing(m.σ) ? missing : ftype(m.σ),
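Since the fitlog is now always kept, `parametricbootstrap` simply refits without a fitlog keyword. A short usage sketch (the seed and replicate count are arbitrary):

```julia
using MixedModels, Random

fm = fit(MixedModel,
         @formula(reaction ~ 1 + days + (1 + days | subj)),
         MixedModels.dataset(:sleepstudy); progress=false)

# Each replicate simulates a response and calls refit!(m; progress=false),
# which empties and repopulates fm's fitlog on every refit.
boot = parametricbootstrap(MersenneTwister(42), 200, fm)
```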
