
Commit 9d555f0

inverse design seminar notebooks
1 parent 91107bb commit 9d555f0

16 files changed: 4581 additions & 2 deletions

2025-10-09-invdes-seminar/00_setup_guide.ipynb

Lines changed: 344 additions & 0 deletions

2025-10-09-invdes-seminar/01_bayes.ipynb

Lines changed: 556 additions & 0 deletions

2025-10-09-invdes-seminar/02_adjoint.ipynb

Lines changed: 523 additions & 0 deletions

2025-10-09-invdes-seminar/03_sensitivity.ipynb

Lines changed: 1032 additions & 0 deletions

2025-10-09-invdes-seminar/04_adjoint_robust.ipynb

Lines changed: 441 additions & 0 deletions

2025-10-09-invdes-seminar/05_robust_comparison.ipynb

Lines changed: 427 additions & 0 deletions

2025-10-09-invdes-seminar/06_measurement_calibration.ipynb

Lines changed: 474 additions & 0 deletions
Lines changed: 43 additions & 0 deletions
@@ -0,0 +1,43 @@
Inverse Design Seminar Demos
============================

These notebooks track the inverse-designed dual-layer grating coupler workflow presented during the October 9, 2025 seminar. Start with the simulation setup, follow the optimization and robustness studies, and finish with a calibration example that ties measurements back into the digital twin.

Seminar recording: `YouTube link <https://www.youtube.com/watch?v=OpVBJmomzoo>`_

Repository Layout
-----------------

- ``00_setup_guide.ipynb`` - builds the baseline Tidy3D simulation for a dual-layer grating coupler and visualizes the initial, uniform geometry.
- ``01_bayes.ipynb`` - performs a five-parameter Bayesian optimization to locate a high-performing uniform grating without gradient information.
- ``02_adjoint.ipynb`` - expands to per-tooth parameters and applies adjoint gradients with Adam to apodize the grating and boost peak efficiency.
- ``03_sensitivity.ipynb`` - quantifies fabrication variability through ±20 nm bias sweeps, Monte Carlo sampling, and adjoint-based sensitivity analysis.
- ``04_adjoint_robust.ipynb`` - optimizes the adjoint design against nominal, over-etch, and under-etch corners by penalizing performance variance.
- ``05_robust_comparison.ipynb`` - reruns the Monte Carlo experiment with the robust and nominal designs side by side to measure yield improvements.
- ``06_measurement_calibration.ipynb`` - demonstrates how adjoint gradients can back-fit SiN widths so simulated spectra line up with measured (synthetic) data.

Supporting Assets
-----------------

- ``setup.py`` - shared simulation utilities, geometry constraints, and helper routines used across the series.
- ``optim.py`` - lightweight, autograd-friendly Adam implementation plus parameter-clipping helpers.
- ``results/`` - JSON snapshots of intermediate designs (Bayesian best guess, adjoint refinements, robust solution) consumed by later notebooks.

Getting Started
---------------

#. Install dependencies (Python 3.10 or newer recommended):

   .. code-block:: bash

      pip install tidy3d bayes_opt autograd pandas matplotlib scipy

   You also need an active Tidy3D account and API access, since every notebook submits jobs with ``tidy3d.web.run`` (see the sketch after this list).

#. Launch Jupyter and open the notebooks in numerical order; each one assumes the prior results exist in ``results/``.
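For orientation, the submission step in each notebook looks roughly like the following. This is a minimal sketch, assuming ``sim`` is the ``td.Simulation`` assembled in ``00_setup_guide.ipynb``; the task name is illustrative:

.. code-block:: python

   from tidy3d import web

   # `sim` is assumed to be the td.Simulation built in 00_setup_guide.ipynb.
   # web.run uploads the task, waits for it to finish, and downloads the
   # resulting SimulationData.
   sim_data = web.run(sim, task_name="grating-coupler-baseline", verbose=True)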
Suggested Workflow
------------------

- Use ``00_setup_guide.ipynb`` to verify your environment and understand the baseline geometry.
- Iterate through the optimization notebooks (``01`` to ``04``) to see how global and local methods complement each other.
- Leverage the sensitivity and comparison notebooks (``03`` and ``05``) when you need wafer-level statistics.
- Apply ``06_measurement_calibration.ipynb`` after you gather measured spectra to keep your model synced with hardware.

Enjoy the seminar content, and reach out if you adapt these workflows to your own devices.

2025-10-09-invdes-seminar/optim.py

Lines changed: 132 additions & 0 deletions
@@ -0,0 +1,132 @@
"""Utility routines for functional-style optimization in the tutorial notebooks.

The helpers here avoid mutating inputs so they play nicely with autograd.
"""

import autograd.numpy as np
from autograd.misc import flatten


def clip_params(params, bounds):
    """Clip a parameter dictionary according to per-key bounds.

    Parameters
    ----------
    params : dict[str, np.ndarray]
        Dictionary mapping parameter names to array values.
    bounds : dict[str, tuple[float | None, float | None]]
        Lower and upper limits for each parameter. Missing keys default to no
        clipping. ``None`` disables a bound on that side.

    Returns
    -------
    dict[str, np.ndarray]
        New dictionary with values clipped to the requested interval.
    """
    clipped = {}
    for key, value in params.items():
        lo, hi = bounds.get(key, (None, None))
        lo_val = -np.inf if lo is None else lo
        hi_val = np.inf if hi is None else hi
        clipped[key] = np.clip(value, lo_val, hi_val)
    return clipped


def _flatten(tree):
    """Return a flat representation of a pytree and its inverse transform."""
    flat, unflatten = flatten(tree)
    return np.array(flat, dtype=float), unflatten


def init_adam(params, lr=1e-2, beta1=0.9, beta2=0.999, eps=1e-8):
    """Initialize Adam optimizer state for a parameter pytree.

    Parameters
    ----------
    params : dict[str, np.ndarray]
        Current parameter values used to size the optimizer state.
    lr : float = 1e-2
        Learning rate applied to each step.
    beta1 : float = 0.9
        Exponential decay applied to the first moment estimate.
    beta2 : float = 0.999
        Exponential decay applied to the second moment estimate.
    eps : float = 1e-8
        Numerical stabilizer added inside the square-root denominator.

    Returns
    -------
    dict[str, object]
        Dictionary holding the Adam accumulator vectors and hyperparameters.
    """
    flat_params, unflatten = _flatten(params)
    state = {
        "t": 0,
        "m": np.zeros_like(flat_params),
        "v": np.zeros_like(flat_params),
        "unflatten": unflatten,
        "lr": lr,
        "beta1": beta1,
        "beta2": beta2,
        "eps": eps,
    }
    return state


def adam_update(grads, state):
    """Compute Adam parameter updates from gradients and state.

    Parameters
    ----------
    grads : dict[str, np.ndarray]
        Gradient pytree with the same structure as the parameters.
    state : dict[str, object]
        Optimizer state returned by :func:`init_adam`.

    Returns
    -------
    updates : dict[str, np.ndarray]
        Parameter deltas that should be subtracted from the current values.
    new_state : dict[str, object]
        Updated optimizer state after incorporating the gradients.
    """
    g_flat, _ = _flatten(grads)
    t = state["t"] + 1

    # Exponential moving averages of the gradient and its elementwise square.
    beta1 = state["beta1"]
    beta2 = state["beta2"]
    m = (1 - beta1) * g_flat + beta1 * state["m"]
    v = (1 - beta2) * (g_flat * g_flat) + beta2 * state["v"]

    # Bias correction compensates for the zero-initialized accumulators.
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    updates_flat = state["lr"] * (m_hat / (np.sqrt(v_hat) + state["eps"]))

    new_state = {
        **state,
        "t": t,
        "m": m,
        "v": v,
    }
    updates = state["unflatten"](updates_flat)
    return updates, new_state


def apply_updates(params, updates):
    """Subtract optimizer updates from a parameter pytree.

    Parameters
    ----------
    params : dict[str, np.ndarray]
        Original parameter dictionary.
    updates : dict[str, np.ndarray]
        Update dictionary produced by :func:`adam_update`.

    Returns
    -------
    dict[str, np.ndarray]
        New dictionary with ``updates`` subtracted element-wise.
    """
    p_flat, unflatten = _flatten(params)
    u_flat, _ = _flatten(updates)
    return unflatten(p_flat - u_flat)
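One way to drive these helpers end to end, sketched with a toy quadratic objective standing in for the notebooks' simulation-based adjoint objective; the parameter name ``widths`` and the bounds are illustrative:

.. code-block:: python

   from autograd import grad
   import autograd.numpy as np

   from optim import adam_update, apply_updates, clip_params, init_adam

   # Toy stand-in for the notebooks' adjoint objective: minimized at width 0.5.
   def objective(params):
       return np.sum((params["widths"] - 0.5) ** 2)

   params = {"widths": np.array([0.3, 0.7, 0.6])}
   bounds = {"widths": (0.1, 1.0)}
   state = init_adam(params, lr=1e-1)

   for _ in range(50):
       grads = grad(objective)(params)  # gradient pytree, same structure as params
       updates, state = adam_update(grads, state)
       # Step, then project back into the feasible interval.
       params = clip_params(apply_updates(params, updates), bounds)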
Lines changed: 73 additions & 0 deletions
@@ -0,0 +1,73 @@
{
  "widths_si": [
    0.48262618509355615,
    0.5207667397076212,
    0.45559630741287455,
    0.46361599591364383,
    0.44415937806339206,
    0.4725359297284237,
    0.4897650205171269,
    0.43401055421044743,
    0.5470685569556949,
    0.36090417995022805,
    0.35112952011499815,
    0.25182851621035,
    0.2287991538613288,
    0.21772272282716135,
    0.5871457399636976
  ],
  "gaps_si": [
    0.6608377498745214,
    0.7153002966535659,
    0.6755416250853287,
    0.762711911245917,
    0.6957603543580327,
    0.6485980725930465,
    0.7270242877193821,
    0.6569877864900205,
    0.7434394276954258,
    0.8910689853995577,
    0.92010444487145,
    0.887662287039533,
    0.8439724990649012,
    0.7880932609023489,
    0.7992416233438039
  ],
  "widths_sin": [
    0.7891411537966333,
    0.6441362131696193,
    0.5221408734233975,
    0.31370712049190075,
    0.6036396259080945,
    0.5709134822507435,
    0.6102929883304251,
    0.5666814968867978,
    0.5911545201167835,
    0.5511240455381605,
    0.6759490391650566,
    0.424347404772533,
    0.4917036091769178,
    0.5990651442832533,
    0.7041841301345496
  ],
  "gaps_sin": [
    0.4541255482246594,
    0.4802605152344745,
    0.3,
    0.4914339475589058,
    0.5480930702315364,
    0.6026168524939672,
    0.6561924161853298,
    0.5934806415337143,
    0.478494886109227,
    0.44772190354423175,
    0.7331937769153588,
    0.6299485623886972,
    0.48849470041329063,
    0.35636407607194925,
    0.5135103145142313
  ],
  "first_gap_si": -0.6720330444742626,
  "first_gap_sin": 0.5035568088634116,
  "target_power": 0.5676497430872463
}
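Later notebooks consume snapshots like this one with plain ``json`` loading. A minimal sketch, assuming a hypothetical filename ``results/design.json`` (the actual snapshot names are set inside the notebooks):

.. code-block:: python

   import json

   # Hypothetical path; substitute the snapshot name used by the notebook.
   with open("results/design.json") as f:
       design = json.load(f)

   widths_si = design["widths_si"]  # per-tooth Si widths (Tidy3D works in microns)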
