
can you share the experiment? #1

@csufangyu

Description


Hi, thanks very much. I am reading the code, but I am confused: the `lr` argument of `LyapunovFunction` does not seem to be used anywhere inside the class. If it is unused, does training still work as intended? Could you share some of your experiments?

import torch
import torch.nn as nn


class LyapunovFunction(nn.Module):
    def __init__(self,
                 input_shape,
                 smooth_relu_thresh=0.1,
                 layer_sizes=[64, 64],
                 lr=3e-4,   # accepted here but never stored or used inside the class
                 eps=1e-3):
        super(LyapunovFunction, self).__init__()
        self._d = smooth_relu_thresh
        # ICNN (input-convex neural network) is defined elsewhere in the repository
        self._icnn = ICNN(input_shape, layer_sizes, self.smooth_relu)
        self._eps = eps

    def forward(self, x):
        # V(x) = smooth_relu(g(x) - g(0)) + eps * ||x||^2,
        # so V(0) = 0 and V(x) > 0 for x != 0
        g = self._icnn(x)
        g0 = self._icnn(torch.zeros_like(x))
        return self.smooth_relu(g - g0) + self._eps * x.pow(2).sum()

    def smooth_relu(self, x):
        relu = x.relu()
        # TODO: Is there a clean way to avoid computing both of these on all elements?
        sq = relu.pow(2) / (2 * self._d)  # quadratic on [0, d); 0 for x <= 0
        lin = x - self._d / 2             # linear tail for x >= d, continuous at d

        return torch.where(relu < self._d, sq, lin)
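For reference, here is the piecewise function that `smooth_relu` is meant to compute, sketched in plain Python for a single scalar. This is my reading of the code, not the authors' reference implementation; `d = 0.1` matches the class default `smooth_relu_thresh`:

```python
d = 0.1  # matches the default smooth_relu_thresh

def smooth_relu_scalar(x, d=d):
    """0 for x <= 0, x^2/(2d) on (0, d), x - d/2 for x >= d."""
    r = max(x, 0.0)
    if r < d:
        return r * r / (2 * d)  # quadratic blend near zero
    return x - d / 2            # linear tail

# Both pieces meet at x = d with value d/2 and slope 1,
# so the function is continuously differentiable.
```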
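And a toy check of why `forward` is built this way, with a hypothetical scalar stand-in for the ICNN (not the repo's actual network): subtracting `g(0)` and adding the `eps * ||x||^2` term makes the candidate zero at the origin and strictly positive elsewhere, which is what a Lyapunov function needs:

```python
eps = 1e-3
d = 0.1

def smooth_relu(x, d=d):
    r = max(x, 0.0)
    return r * r / (2 * d) if r < d else x - d / 2

def g(x):
    # hypothetical stand-in for the ICNN output; any function works here
    return (x - 0.3) ** 2

def V(x):
    # mirrors forward(): smooth_relu(g(x) - g(0)) + eps * ||x||^2
    return smooth_relu(g(x) - g(0.0)) + eps * x * x
```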
