Fix Ivy Failing Test: jax - norms.batch_norm #28920
Conversation
…latest changes and resolve conflicts 'main' of https://github.com/Ajay6601/ivy
Reformat layers: conv1d_transpose
You seem to be doing multiple things all as part of the same PR, can you change this PR to only make the necessary changes to fix the test for ivy.batch_norm, and open separate PRs for resolving any other issues? Also see my other comments. Thanks!
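For context, the test under discussion targets `ivy.batch_norm`, which applies standard batch normalization. A minimal NumPy sketch of the underlying math (an illustration only, not Ivy's actual implementation; the name `batch_norm_reference` is hypothetical):

```python
import numpy as np

def batch_norm_reference(x, mean, variance, scale=None, offset=None, eps=1e-5):
    """Normalize x with the given mean/variance, then optionally
    apply an affine scale and offset (as in standard batch norm)."""
    inv_std = 1.0 / np.sqrt(variance + eps)
    x_norm = (x - mean) * inv_std
    if scale is not None:
        x_norm = x_norm * scale
    if offset is not None:
        x_norm = x_norm + offset
    return x_norm
```

Normalizing a batch against its own per-feature mean and variance should yield roughly zero mean and unit variance per feature.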
ivy/data_classes/array/layers.py
Outdated
{
    a: ivy.Shape(1, 56, 6),
    b: ivy.Shape(1, 112, 6)
}
I don't understand why you've changed this to refer to ivy.Container when these methods are on ivy.Array. Can you revert your changes to this file?
Thank you, Sam. I have changed it.
I thought that in Ivy there are separate implementations for:
Regular array operations (on ivy.Array)
Container operations (on ivy.Container), which apply functions to nested arrays
So when working with a single array instead of a container, you would use ivy.conv1d_transpose or ivy.static_conv1d_transpose directly on an ivy.Array. That's why I thought that for multiple arrays we must use a container. But now I understand, having read through the docs again.
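The Array/Container split described above can be sketched with a toy stand-in (the `Container` class and `scale` function here are hypothetical illustrations, not Ivy's API): an array operation is called directly on a single array, while a container maps that same operation over its nested leaves.

```python
import numpy as np

class Container(dict):
    """Toy stand-in for ivy.Container: a dict that can map a
    function over every leaf array it holds."""
    def map(self, fn):
        return Container({k: fn(v) for k, v in self.items()})

def scale(x):
    # Stand-in for a single-array operation such as conv1d_transpose.
    return x * 2

arr = np.ones(3)
cont = Container({"a": np.ones(3), "b": np.zeros(3)})

scaled_arr = scale(arr)        # direct call on one array
scaled_cont = cont.map(scale)  # the container applies the op per leaf
```

This is why the instance methods on ivy.Array document plain arrays, while ivy.Container has its own wrappers for nested structures.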
x: Union[tf.Tensor, tf.Variable],
filters: Union[tf.Tensor, tf.Variable],
x: Union[ivy.Array, ivy.NativeArray],
filters: Union[ivy.Array, ivy.NativeArray],
This was correct before.
Oh yeah, that's correct. I have been working on another issue in parallel, so I might have changed this code by mistake.
Thank you!
@@ -45,11 +45,16 @@ def assert_all_close(
    ret_np, ret_from_gt_np = ivy.promote_types_of_inputs(ret_np, ret_from_gt_np)
    ret_dtype = str(ret_np.dtype)
    ret_from_gt_dtype = str(ret_from_gt_np.dtype).replace("longlong", "int64")
    assert ret_dtype == ret_from_gt_dtype, (
We probably shouldn't be making any changes to this file.
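For reference, the `replace("longlong", "int64")` in the diff above exists because NumPy exposes `np.longlong` as an alias that can stringify differently from `np.int64` on some platforms even when both are 64-bit. A minimal sketch of that normalization (the helper name `normalized_dtype` is hypothetical):

```python
import numpy as np

def normalized_dtype(arr):
    # np.longlong and np.int64 are the same width on common platforms
    # but may stringify differently; normalize before comparing.
    return str(arr.dtype).replace("longlong", "int64")

a = np.array([1, 2], dtype=np.int64)
b = np.array([1, 2], dtype=np.longlong)
```

Comparing the normalized strings avoids spurious dtype-mismatch failures between a backend's result and the ground-truth result.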
Co-authored-by: Sam Armstrong <88863522+Sam-Armstrong@users.noreply.github.com>
Closes #28919
Closes #28921