
Reproducing training procedure #82

@violetdenim

Description

Hello! I'm reproducing your training procedure and trying to resolve discrepancies between the paper and the code. In the code settings https://github.com/NVlabs/RVT/blob/master/rvt/configs/rvt2.yaml there are 15 epochs and lr 1.25e-5, while in the paper you used 10 epochs and lr 2.4e+3. The batch size is also 24 in the configuration file versus 192 in the paper. In addition, the model you provide has a 99 suffix in its name, which suggests you trained for 100 epochs.
When I tried training for 100 epochs while keeping the learning rate and batch size intact, I got NaN at the 54th epoch.
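For reference, this is a minimal sketch of the check I use to pin down exactly when the loss first becomes non-finite. The names (`model`, `optimizer`, `data_loader`, `compute_loss`) are placeholders for the corresponding pieces of the training loop, not RVT code, and the gradient clipping is just a common mitigation I tried, not something I know the repo uses:

```python
# Minimal sketch (not from the RVT codebase): stop the run as soon as the loss
# turns non-finite, so the failing epoch/step is known exactly.
import math
import torch

def train_with_nan_check(model, optimizer, data_loader, compute_loss, epochs=100):
    for epoch in range(epochs):
        for step, batch in enumerate(data_loader):
            optimizer.zero_grad()
            loss = compute_loss(model, batch)  # placeholder for the actual loss computation

            # Abort immediately on NaN/Inf instead of silently continuing.
            if not math.isfinite(loss.item()):
                raise RuntimeError(f"Non-finite loss at epoch {epoch}, step {step}")

            loss.backward()

            # Optional mitigation I experimented with when extending the schedule:
            # clip gradients to reduce the chance of blow-ups in long runs.
            torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)

            optimizer.step()
```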

I wonder what I should change in the settings to reproduce your results. I really appreciate any help you can provide!
