* [WIP] Integrate with POI to improve UX
* add missing import
* temp change to proj toml
* format
* simplify method setting to use model constructor
* add possible fix to scalarize bridge error
* add pkg to project
* format
* improvements
* remove jump wrapper
* clean tests
* fix readme
* use intermediary API
* format
* Apply suggestions from code review
Co-authored-by: Benoît Legat <benoit.legat@gmail.com>
* add suggestion
* use Parameter set
* todo was fixed
* format
* update docs for newer Flux
* format
* kwargs
* remove diff model
* suggestions
* format
* fix examples
---------
Co-authored-by: Benoît Legat <benoit.legat@gmail.com>
For the `ConicProgram` backend, the package supports the following `Function-in-Set` constraints:

| MOI Function | MOI Set |
|:-------|:---------------|

and the following objective types:

| MOI Function |
|:-------|
|`VariableIndex`|
|`ScalarAffineFunction`|

Other conic sets such as `RotatedSecondOrderCone` and `PositiveSemidefiniteConeSquare` are supported through bridges.
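As an illustration, here is a minimal sketch of adding a bridged cone constraint; the choice of `SCS` as the inner conic solver and the explicit `MOI.Bridges.full_bridge_optimizer` wrapper are assumptions, and `diff_optimizer` itself is documented in the next section.

```julia
import DiffOpt
import SCS  # assumed conic solver; any MOI-compatible conic solver would do
import MathOptInterface as MOI

# Wrap the differentiable optimizer in MOI's bridge layer so that cones the
# backend does not support natively are reformulated into supported ones.
model = MOI.Bridges.full_bridge_optimizer(
    DiffOpt.diff_optimizer(SCS.Optimizer),
    Float64,
)

t = MOI.add_variable(model)
u = MOI.add_variable(model)
x = MOI.add_variables(model, 2)

# 2 * t * u >= ||x||^2 is modeled as a RotatedSecondOrderCone constraint;
# a bridge rewrites it in terms of the plain SecondOrderCone.
MOI.add_constraint(
    model,
    MOI.VectorOfVariables([t; u; x]),
    MOI.RotatedSecondOrderCone(4),
)
```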

## Creating a differentiable MOI optimizer

You can create a differentiable optimizer over an existing MOI solver by using the `diff_optimizer` utility.
```@docs
diff_optimizer
```
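
As a minimal sketch (assuming `HiGHS` as the inner solver; any MOI-compatible solver works), the resulting optimizer is built and solved through the standard MOI API:

```julia
import DiffOpt
import HiGHS  # assumed inner solver
import MathOptInterface as MOI

# diff_optimizer wraps the inner solver so the model can later be differentiated.
model = DiffOpt.diff_optimizer(HiGHS.Optimizer)
MOI.set(model, MOI.Silent(), true)

# A tiny LP: minimize x subject to x >= 1.
x = MOI.add_variable(model)
MOI.add_constraint(
    model,
    MOI.ScalarAffineFunction([MOI.ScalarAffineTerm(1.0, x)], 0.0),
    MOI.GreaterThan(1.0),
)
MOI.set(model, MOI.ObjectiveSense(), MOI.MIN_SENSE)
MOI.set(
    model,
    MOI.ObjectiveFunction{MOI.ScalarAffineFunction{Float64}}(),
    MOI.ScalarAffineFunction([MOI.ScalarAffineTerm(1.0, x)], 0.0),
)

MOI.optimize!(model)
MOI.get(model, MOI.VariablePrimal(), x)  # ≈ 1.0
```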
## Adding new sets and constraints

The DiffOpt `Optimizer` behaves similarly to other MOI Optimizers and implements the `MOI.AbstractOptimizer` API.
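
For illustration, a hedged sketch (with `HiGHS` assumed as the inner solver) of the usual MOI queries applying unchanged to a DiffOpt optimizer:

```julia
import DiffOpt
import HiGHS  # assumed inner solver
import MathOptInterface as MOI

model = DiffOpt.diff_optimizer(HiGHS.Optimizer)

# Standard MOI introspection works because DiffOpt implements the
# MOI.AbstractOptimizer interface.
MOI.supports_constraint(
    model,
    MOI.ScalarAffineFunction{Float64},
    MOI.GreaterThan{Float64},
)  # -> true
MOI.supports(model, MOI.ObjectiveSense())  # -> true
```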
## Projections on cone sets
DiffOpt requires taking projections and finding projection gradients of vectors while computing the Jacobians. For this purpose, we use [MathOptSetDistances.jl](https://github.com/matbesancon/MathOptSetDistances.jl), a dedicated package for computing set distances, projections, and projection gradients.
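
As a small, hedged sketch of what that package provides (the function names follow my reading of MathOptSetDistances.jl and should be checked against its documentation):

```julia
import MathOptSetDistances as MOD
import MathOptInterface as MOI

v = [1.0, -2.0, 3.0]
set = MOI.Nonnegatives(3)

# Euclidean projection of v onto the nonnegative orthant ...
proj_v = MOD.projection_on_set(MOD.DefaultDistance(), v, set)

# ... and the derivative (Jacobian) of that projection with respect to v,
# which is the ingredient DiffOpt needs when assembling its Jacobians.
dproj_v = MOD.projection_gradient_on_set(MOD.DefaultDistance(), v, set)
```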

In light of the above, DiffOpt differentiates the program variables ``x``, ``s``, ``y`` with respect to perturbations in the conic data (`dA`, `db`, `dc`).

- OptNet: Differentiable Optimization as a Layer in Neural Networks

### Backward Pass vector

One possible point of confusion in finding Jacobians is the role of the backward pass vector (see the discussion above equation (7) of *OptNet: Differentiable Optimization as a Layer in Neural Networks*). When differentiating convex programs, we often do not need the full Jacobians themselves; instead, we are interested in the product of the Jacobians with a *backward pass vector*, as used in backpropagation in machine learning/automatic differentiation. This is what happens in the `DiffOpt` backends.
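
As a hedged sketch of how that Jacobian-vector product is requested in practice (the attribute and function names reflect my reading of the DiffOpt API, with `HiGHS` assumed as the inner solver):

```julia
import DiffOpt
import HiGHS  # assumed inner solver
import MathOptInterface as MOI

model = DiffOpt.diff_optimizer(HiGHS.Optimizer)
MOI.set(model, MOI.Silent(), true)

# minimize x  subject to  x >= 1
x = MOI.add_variable(model)
c = MOI.add_constraint(
    model,
    MOI.ScalarAffineFunction([MOI.ScalarAffineTerm(1.0, x)], 0.0),
    MOI.GreaterThan(1.0),
)
MOI.set(model, MOI.ObjectiveSense(), MOI.MIN_SENSE)
MOI.set(
    model,
    MOI.ObjectiveFunction{MOI.ScalarAffineFunction{Float64}}(),
    MOI.ScalarAffineFunction([MOI.ScalarAffineTerm(1.0, x)], 0.0),
)
MOI.optimize!(model)

# The backward pass vector: the seed dL/dx coming from the surrounding
# automatic-differentiation pipeline.
MOI.set(model, DiffOpt.ReverseVariablePrimal(), x, 1.0)

# Propagate the seed back to the problem data; the result is the product of
# the Jacobian with the backward pass vector, not the full Jacobian.
DiffOpt.reverse_differentiate!(model)
dc = MOI.get(model, DiffOpt.ReverseConstraintFunction(), c)
```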