Reconsider equivalence between type.broadcastable and type.shape #1170
ricardoV94 started this conversation in Ideas · 1 comment · 1 reply

Right now

TensorType("float64", shape=(None, 5, 1)).broadcastable == (False, False, True)

However, with dynamic broadcasting enabled this is not correct: that type can still broadcast along the first dimension when it turns out to have a size of 1 at runtime.
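For illustration, here's a minimal sketch of that runtime behavior in plain NumPy (not tied to the library's own broadcasting machinery):

```python
import numpy as np

# shape=(None, 5, 1): the first dimension is unknown at compile time.
# If it turns out to be 1 at runtime, it broadcasts after all:
a = np.zeros((1, 5, 1))   # a valid runtime shape for (None, 5, 1)
b = np.zeros((4, 5, 3))
print((a + b).shape)      # (4, 5, 3) -- the "non-broadcastable" dim broadcast
```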
In many rewrites we check for conditions like

var1.type.broadcastable == var2.type.broadcastable

but we shouldn't assume the entries are the same when they come from None type shape dimensions, as the sketch below shows.
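A small sketch of the failure mode this causes for such checks, again in plain NumPy (the bcast1/bcast2 tuples stand in for the current broadcastable flags):

```python
import numpy as np

# Two variables with the same static shape (None, 5) have identical
# broadcast flags today, so the rewrite's precondition passes:
bcast1 = (False, False)  # broadcastable of shape=(None, 5)
bcast2 = (False, False)  # broadcastable of shape=(None, 5)
assert bcast1 == bcast2

# Yet at runtime the None dimensions can behave differently:
x = np.ones((1, 5))       # first dim is 1 -> it broadcasts
y = np.ones((4, 5))       # first dim is 4 -> it does not
print((x + y).shape)      # (4, 5): "equal" flags, different behavior
```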
One option would be to change to the following:

TensorType("float64", shape=(None, 5, 1)).broadcastable == (np.nan, False, True)
or any other value that respects (self == self) is False, as np.nan does.
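A minimal sketch of how such a sentinel would behave in those equality checks; broadcastable_from_shape is a hypothetical helper introduced only for illustration, not an existing API:

```python
import numpy as np

def broadcastable_from_shape(shape):
    # Hypothetical helper: np.nan marks dimensions whose broadcastability
    # is unknown at compile time (static size None).
    return tuple(np.nan if s is None else s == 1 for s in shape)

b1 = broadcastable_from_shape((None, 5, 1))
b2 = broadcastable_from_shape((None, 5, 1))

# Unknown dims never compare equal, so rewrites stop assuming that two
# None dimensions broadcast the same way:
print([x == y for x, y in zip(b1, b2)])  # [False, True, True]

# Caveat: b1 == b2 as whole tuples would still be True, since Python's
# tuple comparison short-circuits on identity and np.nan is one shared
# object; element-wise checks (or fresh float("nan") values) avoid that.
```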
Related to #1089 and #1122
Reply:

Don't forget that we're trying to phase out use of broadcastable. The only reason …