Layer::Forward does not work when inference faster-rcnn #90
All reshape work is done in PlaceMemory. When the shape of an input blob changes, every internal blob changes its shape and reallocates its memory buffer.
Yes, I have read the code. It reshapes every layer of the net BEFORE the net forwards. Some networks, like Faster R-CNN, change the shapes of downstream layers at forward time, so I think PlaceMemory won't fix it.
The layer itself has all the shape info about its input blobs, so it should be able to compute the shape of its output blobs when the reshape function is called.
For the proposal layer, check out the code here. We set a maximum shape for the output rois.
@luoyetx I got this strange problem:

`C:\workspace\opensource\mini-caffe\src\net.cpp:277: Cannot copy param 0 weights from layer '221'; shape mismatch. Source param shape is 1 64 1 1 (64); target param shape is 64 1 1 1 (64). To learn this layer's parameters from scratch rather than copying from a saved net, rename the layer.`

My last 2 layers have the prototxt as below. Under CPU mode, everything is fine: my conv layer's target weight param has shape 1 x 64 x 1 x 1, with bias shape (1), and loads correctly from the caffemodel into caffe::Net. I have tried adding a special case at line 259 of net.cpp,
Recently I did some code reading and debugging. The result shows that the master code of Layer::Forward has been changed to this:
```cpp
inline void Layer::Forward(const vector<Blob*>& bottom,
                           const vector<Blob*>& top) {
  switch (Caffe::mode()) {
  case Caffe::CPU:
    Forward_cpu(bottom, top);
    break;
  case Caffe::GPU:
    Forward_gpu(bottom, top);
    break;
  default:
    LOG(FATAL) << "Unknown caffe mode.";
  }
}
```
There is no Reshape() call before the layer actually runs its forward pass.
This change does not support Faster R-CNN, because that kind of network reshapes downstream layers at runtime based on the outputs of upstream layers.
I think this feature should be considered. Thanks.