
Changing the ReLU function on MobileNetV2 #10


Description

@dagnichz

I'm trying to do guided backpropagation on MobileNet using your code with TensorFlow 1.14.
Unfortunately, TensorFlow does not respond to the override you suggested:

import tensorflow as tf
from tensorflow.python.framework import ops
from tensorflow.python.ops import gen_nn_ops

@ops.RegisterGradient("GuidedRelu")
def _GuidedReluGrad(op, grad):
    # Guided backprop: block negative upstream gradients, then apply the normal ReLU gradient.
    return tf.where(0. < grad, gen_nn_ops.relu_grad(grad, op.outputs[0]), tf.zeros_like(grad))

I also tried changing the override line to "Relu6", but still nothing:
with graph.gradient_override_map({'Relu6': 'GuidedRelu'}):
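
For reference, this is the full pattern I'd expect to work, assuming two things: the gradient registered for the override has to use the ReLU6 gradient op (since MobileNetV2 uses Relu6, not Relu), and gradient_override_map only affects ops created inside the with block, so the network has to be built within it rather than imported from an already-frozen graph. build_mobilenet_v2 below is a hypothetical stand-in for however the model is actually constructed:

import tensorflow as tf
from tensorflow.python.framework import ops
from tensorflow.python.ops import gen_nn_ops

@ops.RegisterGradient("GuidedRelu6")
def _GuidedRelu6Grad(op, grad):
    # Same gating as above, but with the ReLU6 gradient so the op types match.
    return tf.where(0. < grad, gen_nn_ops.relu6_grad(grad, op.outputs[0]), tf.zeros_like(grad))

graph = tf.Graph()
with graph.as_default():
    # gradient_override_map only applies to ops CREATED inside this context,
    # so the model must be constructed here, not loaded from a frozen .pb.
    with graph.gradient_override_map({'Relu6': 'GuidedRelu6'}):
        images = tf.placeholder(tf.float32, [None, 224, 224, 3])
        logits = build_mobilenet_v2(images)  # hypothetical model-construction call
    guided_grads = tf.gradients(logits[:, 0], images)[0]

If the graph is imported from a frozen checkpoint, the ops were created without the override, which might explain why the map appears to do nothing.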

Maybe you tried doing it yourself on MobileNet and can offer a solution?
