Description
I’d like to request support for modifying the current margin (raw score) used in training before each boosting round.
While DMatrix.set_base_margin() exists, it only sets the initial base margin. During training, the margin is updated via internal prediction buffers that are neither exposed nor resettable.
Use cases:
Dynamic residual correction
Hybrid models that inject new priors into each boosting step
Time-series modeling where the residual signal evolves
Proposal:
Add a before_iteration callback that allows:
Reading the current margin (model.predict(..., output_margin=True))
Overwriting it via a new method (e.g., booster.set_current_margin(...))
Alternatively:
Allow flushing the internal raw-score buffer and reloading it from base_margin each round
I’m happy to help test or discuss implementation options. This would make XGBoost much more flexible for advanced workflows.