8000 parameter is not found in .ckpt, when i restore the checkpoint for resnet_V2 fine_tuning training · Issue #2527 · tensorflow/models · GitHub
Closed
Zzmc opened this issue Oct 12, 2017 · 8 comments
Labels
stat:awaiting model gardener Waiting on input from TensorFlow model gardener

Comments

@Zzmc
Zzmc commented Oct 12, 2017

Hi,
When I restore the checkpoint, I get this error:
NotFoundError (see above for traceback): Tensor name "resnet_v2_50/block1/unit_1/bottleneck_v2/conv2/biases" not found in checkpoint files ./fine_tuning/resnet_v2_50.ckpt
What is wrong here? Could you give me some suggestions?

@reedwm
Member
reedwm commented Oct 12, 2017

I apologize but I am having a hard time understanding what the problem is, where the problem is, and what version it affects. Please resubmit and pay attention to the issue template (https://github.com/tensorflow/tensorflow/issues/new) . Please provide all the information it asks. Thank you.

@reedwm reedwm added the stat:awaiting response Waiting on input from the contributor label Oct 12, 2017
@HanqingWangAI
HanqingWangAI commented Dec 11, 2017

This issue also happened to me.

It seems that the following biases are missing in the checkpoint of resnet_v2_50 downloaded from https://github.com/tensorflow/models/tree/master/research/slim#pre-trained-models.

resnet_v2_50/block1/unit_1/bottleneck_v2/conv1/biases
resnet_v2_50/block1/unit_1/bottleneck_v2/conv2/biases
resnet_v2_50/block1/unit_2/bottleneck_v2/conv1/biases
resnet_v2_50/block1/unit_2/bottleneck_v2/conv2/biases
resnet_v2_50/block1/unit_3/bottleneck_v2/conv1/biases
resnet_v2_50/block1/unit_3/bottleneck_v2/conv2/biases
resnet_v2_50/block2/unit_1/bottleneck_v2/conv1/biases
resnet_v2_50/block2/unit_1/bottleneck_v2/conv2/biases
resnet_v2_50/block2/unit_2/bottleneck_v2/conv1/biases
resnet_v2_50/block2/unit_2/bottleneck_v2/conv2/biases
resnet_v2_50/block2/unit_3/bottleneck_v2/conv1/biases
resnet_v2_50/block2/unit_3/bottleneck_v2/conv2/biases
resnet_v2_50/block2/unit_4/bottleneck_v2/conv1/biases
resnet_v2_50/block2/unit_4/bottleneck_v2/conv2/biases
resnet_v2_50/block3/unit_1/bottleneck_v2/conv1/biases
resnet_v2_50/block3/unit_1/bottleneck_v2/conv2/biases
resnet_v2_50/block3/unit_2/bottleneck_v2/conv1/biases
resnet_v2_50/block3/unit_2/bottleneck_v2/conv2/biases
resnet_v2_50/block3/unit_3/bottleneck_v2/conv1/biases
resnet_v2_50/block3/unit_3/bottleneck_v2/conv2/biases
resnet_v2_50/block3/unit_4/bottleneck_v2/conv1/biases
resnet_v2_50/block3/unit_4/bottleneck_v2/conv2/biases
resnet_v2_50/block3/unit_5/bottleneck_v2/conv1/biases
resnet_v2_50/block3/unit_5/bottleneck_v2/conv2/biases
resnet_v2_50/block3/unit_6/bottleneck_v2/conv1/biases
resnet_v2_50/block3/unit_6/bottleneck_v2/conv2/biases
resnet_v2_50/block4/unit_1/bottleneck_v2/conv1/biases
resnet_v2_50/block4/unit_1/bottleneck_v2/conv2/biases
resnet_v2_50/block4/unit_2/bottleneck_v2/conv1/biases
resnet_v2_50/block4/unit_2/bottleneck_v2/conv2/biases
resnet_v2_50/block4/unit_3/bottleneck_v2/conv1/biases
resnet_v2_50/block4/unit_3/bottleneck_v2/conv2/biases
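One way to confirm which names are absent is to list the checkpoint's tensors and diff them against the graph's variable names. Reading the checkpoint side would use `tf.train.list_variables`; the diff itself is plain set logic, sketched here with hard-coded example names for illustration:

```python
# Sketch: diff graph variable names against checkpoint tensor names.
# In a real session the checkpoint names would come from something like
#   [name for name, shape in tf.train.list_variables("./fine_tuning/resnet_v2_50.ckpt")]
# Here both sides are hard-coded example names.

graph_vars = {
    "resnet_v2_50/block1/unit_1/bottleneck_v2/conv2/weights",
    "resnet_v2_50/block1/unit_1/bottleneck_v2/conv2/biases",
}
ckpt_vars = {
    "resnet_v2_50/block1/unit_1/bottleneck_v2/conv2/weights",
    # note: no conv2/biases entry in the published resnet_v2 checkpoints
}

missing = sorted(graph_vars - ckpt_vars)
print(missing)  # → ['resnet_v2_50/block1/unit_1/bottleneck_v2/conv2/biases']
```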
@reedwm

@balconychy
balconychy commented Dec 21, 2017

I encountered the same issue. You need to use the proper arg scope:

arg_scope = resnet_v2.resnet_arg_scope(weight_decay=self.WEIGHT_REGULIZER_W)
with slim.arg_scope(arg_scope):
    return resnet_v2.resnet_v2_101(inputs=images, is_training=is_training, num_classes=self.CLASS_NUM)

See tensorflow/tensorflow#4249.

@reedwm
Member
reedwm commented Dec 21, 2017

/CC @sguada, can you look into why the biases are missing?

@reedwm reedwm added stat:awaiting model gardener Waiting on input from TensorFlow model gardener and removed stat:awaiting response Waiting on input from the contributor labels Dec 21, 2017
@daisysnow

@reedwm Hi, I hit the same issue, but I solved it by updating my TensorFlow from v1.1.0 to v1.2.0.
I think it is due to the parameters having different names in different versions of TF. I use it like this:

inputs_data = /your/data/
model_path = /your/path/of/checkpoint
with slim.arg_scope(resnet_v2.resnet_arg_scope(is_training=False)):
    resnet_v2.resnet_v2_152(inputs_data, 1001)
variables_to_restore = slim.get_variables_to_restore(include=["resnet_v2_152"])
init_fn = slim.assign_from_checkpoint_fn(model_path, variables_to_restore)
sess = tf.Session()
init_fn(sess)

I hope this helps.
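The include=["resnet_v2_152"] argument above restricts the restore to variables under that name scope, which is what avoids asking the checkpoint for tensors it does not contain. The selection is simple prefix matching, sketched here without TensorFlow (the variable names below are illustrative):

```python
# Sketch: how a scope-based include filter selects variables to restore.
# slim.get_variables_to_restore(include=[...]) performs the same kind of
# prefix matching over the graph's global variables; names here are made up.

def filter_by_scope(var_names, include_scopes):
    """Keep names that fall under any of the given scope prefixes."""
    return [name for name in var_names
            if any(name == s or name.startswith(s + "/") for s in include_scopes)]

graph_vars = [
    "resnet_v2_152/conv1/weights",
    "resnet_v2_152/block1/unit_1/bottleneck_v2/conv1/weights",
    "my_head/logits/weights",  # a new layer, not present in the checkpoint
]

to_restore = filter_by_scope(graph_vars, ["resnet_v2_152"])
print(to_restore)  # only the resnet_v2_152/* variables
```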

@reedwm
Member
reedwm commented Jan 30, 2018

Thanks for the info @daisysnow. I'm closing this issue since it seems to be resolved, but please reopen if anyone still has this issue on 1.5.

@reedwm reedwm closed this as completed Jan 30, 2018
@ghost
ghost commented Aug 6, 2018

This issue persists in TF 1.7 and TF 1.9.

@Cppowboy


@daisysnow's solution works.
The line with slim.arg_scope(resnet_v2.resnet_arg_scope(is_training=False)): sets the bias_initializer to None, so no bias variables are created. The default bias_initializer of layer_lib.conv2d is a zero initializer, which does create biases.
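In other words, whether a conv layer owns a bias variable depends on whether a biases initializer is set; under resnet_arg_scope, the batch norm that follows each conv supplies the offset instead, so the published checkpoints contain no ".../conv*/biases" tensors. A minimal pure-Python illustration of that rule (function and names are hypothetical, not slim's API):

```python
# Illustration of the rule described above: a conv layer creates a bias
# variable only when a biases initializer is set. Under resnet_arg_scope,
# batch norm follows each conv, so the initializer is None and no
# ".../biases" tensor ends up in the checkpoint. Names are hypothetical.

def conv_variable_names(scope, biases_initializer="zeros"):
    """Return the variable names a conv layer would create under this rule."""
    names = [scope + "/weights"]
    if biases_initializer is not None:  # None disables the bias, as in slim
        names.append(scope + "/biases")
    return names

# Default conv2d behaviour: a zero-initialized bias variable exists.
print(conv_variable_names("conv1"))
# Under resnet_arg_scope (batch norm attached): no bias variable.
print(conv_variable_names("resnet_v2_50/block1/unit_1/bottleneck_v2/conv1",
                          biases_initializer=None))
```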
