
Lower depth yields same performance #95

@Youyoun

Hi!

Thanks for publishing your training repository!

I trained the depth-17 model on the BSD training set using the pytorch_training scripts (which I had to fix because of some compatibility issues), and it yielded the same result as a depth-4 model (I didn't try going lower).
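
For reference, here is roughly what I mean by depth; this is a minimal sketch of a DnCNN-style network (first Conv+ReLU, then Conv+BN+ReLU blocks, then a final Conv predicting the noise residual), not the repository's exact code, and the default sizes are just the usual grayscale settings:

```python
import torch
import torch.nn as nn

class DnCNN(nn.Module):
    """DnCNN-style denoiser: depth = total number of conv layers.
    The network predicts the noise residual, which is subtracted from the input."""

    def __init__(self, depth=17, channels=1, features=64):
        super().__init__()
        layers = [nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            layers += [
                nn.Conv2d(features, features, 3, padding=1, bias=False),
                nn.BatchNorm2d(features),
                nn.ReLU(inplace=True),
            ]
        layers.append(nn.Conv2d(features, channels, 3, padding=1))
        self.body = nn.Sequential(*layers)

    def forward(self, x):
        return x - self.body(x)  # clean estimate = noisy input minus predicted noise

for d in (4, 17):
    n_params = sum(p.numel() for p in DnCNN(depth=d).parameters())
    print(f"depth={d}: {n_params / 1e6:.2f}M parameters")
```

Printing the parameter counts makes the size gap between the two configurations explicit, which is part of why the identical results surprised me.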

It just feels odd that I get the same mean training loss and PSNR with a much shallower model. I have only tried grayscale images.
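
To be explicit about the metric, this is the PSNR definition I am assuming (a sketch with a placeholder image size and noise level, not the repository's evaluation code). With pixel values in [0, 1], PSNR is just -10 * log10(MSE), so identical mean MSE and identical PSNR necessarily go together; the question is whether the MSE itself should differ between depth 4 and depth 17.

```python
import torch

def psnr(clean, denoised, max_val=1.0):
    # Peak signal-to-noise ratio in dB for images scaled to [0, max_val]
    mse = torch.mean((clean - denoised) ** 2)
    return 10 * torch.log10(max_val ** 2 / mse)

# Quick reference point: PSNR of a noisy image against its clean version
clean = torch.rand(1, 1, 180, 180)                                # stand-in for a BSD image
noisy = (clean + 25 / 255 * torch.randn_like(clean)).clamp(0, 1)  # sigma = 25 Gaussian noise
print(f"noisy-input PSNR: {psnr(clean, noisy).item():.2f} dB")
```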

Is it possible that something is wrong with the code, or is this result normal? Is there a thorough study of how DnCNN performance varies with depth?

Thank you in advance for your response.
