Severe BUG when initializing parameters!!! #11

@RongfanLi98

Description

Whenever you use nn.Parameter(torch.Tensor(...)), the underlying memory is left uninitialized, so the parameter will sometimes contain nan values and cause training to fail. In case someone skips the nn.init.xavier_uniform_ call, the right way to initialize a parameter is nn.Parameter(torch.rand(...)) or nn.Parameter(torch.randn(...)). This happens, for example, in GAT.py.

See also: https://discuss.pytorch.org/t/nn-parameter-contains-nan-when-initializing/44559
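A minimal sketch of the failure mode and the two fixes described above (assuming PyTorch is installed; the shapes are illustrative, not taken from GAT.py):

```python
import torch
import torch.nn as nn

# Buggy pattern: torch.Tensor(3, 3) allocates memory without initializing
# it, so the parameter may hold nan/inf garbage if no initializer follows.
bad = nn.Parameter(torch.Tensor(3, 3))  # values are arbitrary

# Fix 1: create the parameter from a defined distribution.
w_uniform = nn.Parameter(torch.rand(3, 3))    # uniform on [0, 1)
w_normal = nn.Parameter(torch.randn(3, 3))    # standard normal

# Fix 2: allocate, then explicitly initialize in place.
w_xavier = nn.Parameter(torch.empty(3, 3))
nn.init.xavier_uniform_(w_xavier)

# All initialized parameters contain only finite values.
assert torch.isfinite(w_uniform).all()
assert torch.isfinite(w_xavier).all()
```

Either fix guarantees finite starting values; the uninitialized tensor only fails intermittently, which is what makes the bug hard to reproduce.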
