Commit 0d186a4
Add per-sample gradient norm computation as a functionality (#724)
Summary:
Pull Request resolved: #724
Per-sample gradient norms are already computed for Ghost Clipping, but they can be useful more generally. This change exposes them as a standalone feature:
```
...
loss.backward()
# per-sample gradient norms for the current batch, populated during backward()
per_sample_norms = model.per_sample_gradient_norms
```
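
A slightly fuller sketch of how this might look in practice, assuming the ghost clipping wrapper `GradSampleModuleFastGradientClipping` from `opacus.grad_sample` (the directory touched by this diff) is the module that exposes the new `per_sample_gradient_norms` property; the constructor arguments and import path below are assumptions based on the commit summary, not taken from the diff itself.

```
import torch
import torch.nn as nn

# Assumed import path; the ghost clipping module lives under opacus/grad_sample.
from opacus.grad_sample import GradSampleModuleFastGradientClipping

base_model = nn.Linear(16, 2)

# max_grad_norm and use_ghost_clipping are assumed constructor arguments.
model = GradSampleModuleFastGradientClipping(
    base_model, max_grad_norm=1.0, use_ghost_clipping=True
)

x = torch.randn(8, 16)           # batch of 8 samples
y = torch.randint(0, 2, (8,))
loss = nn.functional.cross_entropy(model(x), y)

# Backward pass triggers the ghost clipping hooks that compute per-sample norms.
loss.backward()

# New functionality from this commit: one gradient norm per sample in the batch.
per_sample_norms = model.per_sample_gradient_norms  # expected shape: (8,)
```

The norms are computed by the same hooks Ghost Clipping already uses, so reading them does not require materializing per-sample gradients.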
Reviewed By: iden-kalemaj
Differential Revision: D68634969
fbshipit-source-id: 7d5cb8a05de11d7492d3c1ae7f7384243cc03c73
Parent: e4eb3fb · Commit: 0d186a4
File tree: 1 file changed in opacus/grad_sample (+16 −0)
[Diff content not captured: 16 lines added to a file under opacus/grad_sample, around diff lines 123, 135, and 236–249.]