Some problems with rnncapsule #1
When running rnncapsule, the following error occurred:
ValueError Traceback (most recent call last)
in
----> 1 model = Gru_Capsule_Model(word_seq_len, word_embedding,classification)
in Gru_Capsule_Model(sent_length, embeddings_weight, class_num)
25 embed = SpatialDropout1D(0.2)(embedding(content))
26 x = Bidirectional(CuDNNGRU(200, return_sequences=True))(embed)
---> 27 capsule = Capsule(num_capsule=Num_capsule, dim_capsule=Dim_capsule, routings=Routings, share_weights=True)(x)
28 capsule = Flatten()(capsule)
29 x = Dense(1000)(capsule)
~/anaconda3/lib/python3.7/site-packages/keras/engine/base_layer.py in call(self, inputs, **kwargs)
487 # Actually call the layer,
488 # collecting output(s), mask(s), and shape(s).
--> 489 output = self.call(inputs, **kwargs)
490 output_mask = self.compute_mask(inputs, previous_mask)
491
~/ZDY/2018-daguan-competition-master/biGruModel/glove/util.py in call(self, u_vecs)
95 outputs = self.activation(K.batch_dot(c, u_hat_vecs, [2, 2]))
96 if i < self.routings - 1:
---> 97 b = K.batch_dot(outputs, u_hat_vecs, [2, 2])
98
99 return outputs
~/anaconda3/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py in batch_dot(x, y, axes)
1497 str(x_shape) + ' and ' + str(y_shape) +
1498 ' with axes=' + str(axes) + '. x.shape[%d] != '
-> 1499 'y.shape[%d] (%d != %d).' % (axes[0], axes[1], d1, d2))
1500
1501 # backup ndims. Need them later.
ValueError: Can not do batch_dot on inputs with shapes (None, 10, 10, 16) and (None, 10, None, 16) with axes=[2, 3]. x.shape[2] != y.shape[3] (10 != 16).
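The check that fails is the agreement update in the dynamic-routing loop of util.py (line 97 in the frame above), where K.batch_dot is asked to contract outputs with u_hat_vecs; different Keras releases interpret the axes of that call differently for inputs of unequal rank. Below is a minimal sketch, not the repo's code, of that routing step written with tf.einsum so the contraction axes are spelled out explicitly. The names dynamic_routing and squash are placeholders, and the shape comments assume the usual layout of u_hat_vecs in this family of Capsule layers, i.e. (batch, num_capsule, input_num_capsule, dim_capsule).

import tensorflow as tf

def squash(vectors, axis=-1, epsilon=1e-7):
    # Capsule squashing non-linearity; stands in for the layer's self.activation.
    s_norm = tf.reduce_sum(tf.square(vectors), axis=axis, keepdims=True)
    scale = s_norm / (1.0 + s_norm) / tf.sqrt(s_norm + epsilon)
    return scale * vectors

def dynamic_routing(u_hat_vecs, routings=3):
    # u_hat_vecs: (batch, num_capsule, input_num_capsule, dim_capsule)
    b = tf.zeros_like(u_hat_vecs[:, :, :, 0])  # (batch, num_capsule, input_num_capsule)
    outputs = None
    for i in range(routings):
        # Softmax over the output-capsule axis (the usual choice in this
        # Capsule implementation family, done via permute + softmax upstream).
        c = tf.nn.softmax(b, axis=1)
        # Weighted sum over input capsules -> (batch, num_capsule, dim_capsule).
        outputs = squash(tf.einsum('bni,bnid->bnd', c, u_hat_vecs))
        if i < routings - 1:
            # Agreement between outputs and predictions, contracted over dim_capsule;
            # this is what the failing K.batch_dot(outputs, u_hat_vecs, [2, 2]) intends.
            b = tf.einsum('bnd,bnid->bni', outputs, u_hat_vecs)
    return outputs

If the shapes match that assumption, the first einsum is the weighted sum over input capsules that produces each output capsule, and the second is the agreement term the failing batch_dot call is trying to compute; neither depends on how a particular backend version reinterprets batch_dot axes.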
Comments
Could it be the wrong Keras version?
Thanks, I'll give it a try.
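If the version question above is on the right track, a first diagnostic step is simply to record which framework versions the environment is actually running; this is only a sketch, and the thread itself does not pin any specific version.

# Print the installed framework versions. Capsule implementations built on
# K.batch_dot with explicit axes tend to be sensitive to the exact
# Keras/TensorFlow release, so these numbers are the first thing to compare
# against the environment the repo was developed with.
import keras
import tensorflow as tf

print('keras:', keras.__version__)
print('tensorflow:', tf.__version__)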