
Conversation

@palday (Member) commented Jan 25, 2025

closes #619, closes #689, closes #742

  • add entry in NEWS.md
  • after opening this PR, add a reference and run docs/NEWS-update.jl to update the cross-references.
  • I've bumped the version appropriately
  • refactor LMM
  • refactor GLMM
  • refactor prfit! to take advantage of this
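
For orientation, the refactor items above converge on a backend-dispatch pattern that surfaces later in this thread (OPTIMIZATION_BACKENDS, optsum.backend, and optimize! taking a Val argument). The sketch below is a guess at the general shape under those assumptions, not the PR's actual implementation; the method bodies and the error message are hypothetical.

```julia
# Hypothetical sketch of the backend-dispatch shape discussed in this PR.
# OPTIMIZATION_BACKENDS, optsum.backend, and optimize! are names that appear
# in the thread; the bodies here are illustrative, not the real source.
const OPTIMIZATION_BACKENDS = Symbol[:nlopt]  # extensions push! their symbol

function optimize!(m)
    backend = m.optsum.backend  # e.g. :nlopt or :prima
    backend in OPTIMIZATION_BACKENDS ||
        throw(ArgumentError("unknown backend $backend; is its package loaded?"))
    return optimize!(m, Val(backend))  # dispatch on the backend symbol
end

# each backend then contributes a single method, e.g. in the PRIMA extension:
# optimize!(m::LinearMixedModel, ::Val{:prima}; kwargs...) = ...
```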

codecov bot commented Jan 25, 2025

Codecov Report

Attention: Patch coverage is 97.09302% with 5 lines in your changes missing coverage. Please review.

Project coverage is 97.33%. Comparing base (1ea4083) to head (4bbd1f5).
Report is 1 commit behind head on main.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| ext/MixedModelsPRIMAExt.jl | 96.29% | 2 Missing ⚠️ |
| src/generalizedlinearmixedmodel.jl | 86.66% | 2 Missing ⚠️ |
| src/nlopt.jl | 98.52% | 1 Missing ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #802      +/-   ##
==========================================
+ Coverage   97.20%   97.33%   +0.12%     
==========================================
  Files          35       36       +1     
  Lines        3434     3488      +54     
==========================================
+ Hits         3338     3395      +57     
+ Misses         96       93       -3     
| Flag | Coverage Δ |
|---|---|
| current | 96.98% <97.09%> (+0.13%) ⬆️ |
| minimum | 97.33% <97.09%> (+0.12%) ⬆️ |
| nightly | 96.91% <97.05%> (+0.13%) ⬆️ |

Flags with carried forward coverage won't be shown.

@palday marked this pull request as ready for review January 26, 2025 05:47
@palday requested a review from dmbates January 26, 2025 05:53
@dmbates (Collaborator) left a comment

Wow! This is amazing. I am very glad to see these changes.

Thanks for disabling the thin named argument. That wasn't one of my better ideas.

I don't think I will be able to do a line-by-line check on the changes (still having trouble with my vision) but I would be happy to approve the changes and you can merge the PR whenever you wish.

Is the intent to replace MixedModels.prfit!(m::LinearMixedModel,...) by MixedModels.optimize!(m::LinearMixedModel, Val(:prima); ...)? It isn't clear to me yet how I would use the PRIMA backend without calling prfit! or instantiating a LinearMixedModel object and manually changing m.optsum.backend and m.optsum.optimizer.

I had expected that

using PRIMA, MixedModels
MixedModels.OPTIMIZATION_BACKENDS

would return a vector of two Symbols but it contains only one. Does the :prima backend need to be called before it is added? I see in ext/MixedModelsPRIMAExt.jl that a value is push!ed onto MixedModels.OPTIMIZATION_BACKENDS, but it doesn't seem to be there when I examine that object.

When I invoke prfit! on a linear mixed model I sometimes get a warning from the Fortran code in libprima like

julia> MixedModels.prfit!(m1)

Warning: BOBYQA: X0 is revised so that the distance between X0 and the inactive bounds is at least RHOBEG; set HONOUR_X0 to .TRUE. if you prefer to keep X0 unchanged.

Perhaps the rhobeg value should be set to a number less than 1?

@palday (Member, Author) commented Jan 26, 2025

> Wow! This is amazing. I am very glad to see these changes.
>
> Thanks for disabling the thin named argument. That wasn't one of my better ideas.
>
> I don't think I will be able to do a line-by-line check on the changes (still having trouble with my vision) but I would be happy to approve the changes and you can merge the PR whenever you wish.
>
> Is the intent to replace MixedModels.prfit!(m::LinearMixedModel,...) by MixedModels.optimize!(m::LinearMixedModel, Val(:prima); ...)? It isn't clear to me yet how I would use the PRIMA backend without calling prfit! or instantiating a LinearMixedModel object and manually changing m.optsum.backend and m.optsum.optimizer.

There is a "secret" kwarg that you can use:

julia> using MixedModels, PRIMA
Precompiling MixedModelsPRIMAExt...
  1 dependency successfully precompiled in 8 seconds. 137 already precompiled.

julia> fit(MixedModel, @formula(reaction ~ 1 + days + (1+days|subj)), MixedModels.dataset(:sleepstudy); optimizer=:bobyqa, backend=:prima)
Linear mixed model fit by maximum likelihood
 reaction ~ 1 + days + (1 + days | subj)
   logLik   -2 logLik     AIC       AICc        BIC    
  -875.9697  1751.9393  1763.9393  1764.4249  1783.0971

Variance components:
            Column    Variance Std.Dev.   Corr.
subj     (Intercept)  565.51525 23.78056
         days          32.68219  5.71683 +0.08
Residual              654.94105 25.59182
 Number of obs: 180; levels of grouping factors: 18

  Fixed-effects parameters:
──────────────────────────────────────────────────
                Coef.  Std. Error      z  Pr(>|z|)
──────────────────────────────────────────────────
(Intercept)  251.405      6.63228  37.91    <1e-99
days          10.4673     1.50224   6.97    <1e-11
──────────────────────────────────────────────────

julia> ans.optsum
Initial parameter vector: [1.0, 0.0, 1.0]
Initial objective value:  1784.642296192471

Backend:                  prima
Optimizer:                bobyqa
Lower bounds:             [0.0, -Inf, 0.0]
rhobeg:                   1.0
rhoend:                   1.0e-6
maxfeval:                 -1

Function evaluations:     69
xtol_zero_abs:            0.001
ftol_zero_abs:            1.0e-5
Final parameter vector:   [0.9292253632482519, 0.018165717899275697, 0.22264540632838237]
Final objective value:    1751.9393444632112
Return code:              SMALL_TR_RADIUS

I haven't advertised them yet because I've found a few cases where something goes wrong with PRIMA. For example, on Intel machines, the VSCode REPL + PRIMA causes a StackOverflowError even for a very minimal optimization problem. Running in a REPL not tied to VSCode works just fine though.

> I had expected that
>
> using PRIMA, MixedModels
> MixedModels.OPTIMIZATION_BACKENDS
>
> would return a vector of two Symbols but it contains only one. Does the :prima backend need to be called before it is added? I see in ext/MixedModelsPRIMAExt.jl that a value is push!ed onto MixedModels.OPTIMIZATION_BACKENDS, but it doesn't seem to be there when I examine that object.

Huh, this is surprising to me -- I'll take a look.
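
One plausible cause, stated here as an assumption rather than a confirmed diagnosis: top-level code in a package extension runs during precompilation, and a push! onto another module's global made at that point is not replayed when the precompiled extension is later loaded, so the registration needs to happen in the extension's __init__. A minimal sketch of the distinction:

```julia
# Simplified, hypothetical sketch of ext/MixedModelsPRIMAExt.jl illustrating
# why a top-level push! can vanish after precompilation.
module MixedModelsPRIMAExt

using MixedModels, PRIMA

# Runs at precompile time only; the mutation of MixedModels' global is not
# replayed when the cached extension is loaded:
# push!(MixedModels.OPTIMIZATION_BACKENDS, :prima)

# Runs every time the extension is loaded, so the backend stays registered:
function __init__()
    push!(MixedModels.OPTIMIZATION_BACKENDS, :prima)
    return nothing
end

end # module
```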

> When I invoke prfit! on a linear mixed model I sometimes get a warning from the Fortran code in libprima like
>
> julia> MixedModels.prfit!(m1)
>
> Warning: BOBYQA: X0 is revised so that the distance between X0 and the inactive bounds is at least RHOBEG; set HONOUR_X0 to .TRUE. if you prefer to keep X0 unchanged.
>
> Perhaps the rhobeg value should be set to a number less than 1?

PRIMA's default rhobeg is 1, and I don't yet understand what rho represents well enough to have a feel for a good starting value.
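
For background, rhobeg is BOBYQA's initial trust-region radius, and the warning fires when a component of the starting point lies within rhobeg of a bound, prompting BOBYQA to move it. Below is a minimal sketch against PRIMA.jl directly, assuming its xl/xu/rhobeg/rhoend keywords; the toy objective and the values are illustrative choices, not what MixedModels passes:

```julia
using PRIMA

# toy objective with its minimum in the interior of the feasible region
f(x) = (x[1] - 0.25)^2 + (x[2] - 0.75)^2

x0 = [0.5, 0.5]   # starting point, 0.5 away from each lower bound
xl = [0.0, 0.0]   # lower bounds, analogous to the 0.0 bounds on theta
xu = [Inf, Inf]   # no upper bounds

# With the default rhobeg = 1.0, x0 is within rhobeg of the lower bounds and
# BOBYQA revises it, emitting the Fortran warning. A rhobeg no larger than the
# distance from x0 to the nearest bound leaves x0 untouched.
x, info = PRIMA.bobyqa(f, x0; xl, xu, rhobeg = 0.25, rhoend = 1.0e-6)
```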

@dmbates (Collaborator) commented Jan 26, 2025

I had worked out that I should add backend=:prima in the call to fit(MixedModel, ...) but I didn't also add optimizer=:bobyqa. The error message just told me there was a missing method.

@palday (Member, Author) commented Jan 26, 2025

I'm going to merge later this afternoon -- I think using this ourselves is the best way to discover issues with the design. The documented, public-facing components aren't changing, so even further internal reorganization isn't technically breaking.

@dmbates (Collaborator) commented Jan 26, 2025

Sounds good to merge this. I'll just turn off notifications while I am watching football today 😄

@palday merged commit 7c2d4d9 into main Jan 26, 2025
11 of 12 checks passed
@palday deleted the pa/backend branch January 26, 2025 20:45
@kliegl commented Jan 26, 2025

Thank you very much!


Development

Successfully merging this pull request may close these issues:

- Switch to PRIMA.bobyqa for optimization?
- Refactor NLopt boilerplate into an easy function call
- Remove repeated initial entry in fitlog
