
Commit ae1eeb1

revert back to norming queries in linear attention, seems to lead to instability

1 parent: be290b8

2 files changed: 2 additions, 2 deletions


setup.py

Lines changed: 1 addition & 1 deletion

@@ -8,7 +8,7 @@
       'stylegan2_pytorch = stylegan2_pytorch.cli:main',
     ],
   },
-  version = '1.2.0',
+  version = '1.2.1',
   license='GPLv3+',
   description = 'StyleGan2 in Pytorch',
   author = 'Phil Wang',

stylegan2_pytorch/stylegan2_pytorch.py

Lines changed: 1 addition & 1 deletion

@@ -120,7 +120,7 @@ def forward(self, x):
 # one layer of self-attention and feedforward, for images

 attn_and_ff = lambda chan: nn.Sequential(*[
-    Residual(Rezero(ImageLinearAttention(chan, norm_queries = False))),
+    Residual(Rezero(ImageLinearAttention(chan, norm_queries = True))),
     Residual(Rezero(nn.Sequential(nn.Conv2d(chan, chan * 2, 1), leaky_relu(), nn.Conv2d(chan * 2, chan, 1))))
 ])
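The toggled norm_queries flag controls whether the attention queries are normalized before the linear-attention product. As a rough illustration only — assuming a generic efficient/linear-attention formulation, not the actual internals of ImageLinearAttention — the effect of the flag looks roughly like this (the function name, tensor layout, and softmax axes below are illustrative assumptions):

import torch

def linear_attention(q, k, v, norm_queries = True):
    # q, k: (batch, heads, dim_k, n); v: (batch, heads, dim_v, n),
    # where n is the number of flattened spatial positions
    k = k.softmax(dim = -1)                              # normalize keys across positions
    if norm_queries:
        q = q.softmax(dim = -2)                          # optionally normalize queries across the key dimension
    context = torch.einsum('bhdn,bhen->bhde', k, v)      # aggregate value features per key dimension
    return torch.einsum('bhde,bhdn->bhen', context, q)   # mix aggregated features back into per-position outputs

With norm_queries = True, each position's output is a bounded mixture of the aggregated value features, which lines up with the commit message's observation that dropping the query normalization seemed to lead to instability.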