
Error message running tf_to_keras notebook #10

Open

ramiro-feria-puron opened this issue Dec 19, 2018 · 5 comments

@ramiro-feria-puron
When running the last cell of the tf_to_keras notebook (after changing the model folder and checkpoint to the latest model available), I get the following error:

Loading numpy weights from ../model/keras/npy_weights/
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-14-903bce1a4b25> in <module>
      8             weight_arr = np.load(os.path.join(npy_weights_dir, weight_file))
      9             weights.append(weight_arr)
---> 10         layer.set_weights(weights)
     11 
     12 print('Saving weights...')

~/anaconda3/envs/venv/lib/python3.6/site-packages/keras/engine/base_layer.py in set_weights(self, weights)
   1055                                  str(pv.shape) +
   1056                                  ' not compatible with '
-> 1057                                  'provided weight shape ' + str(w.shape))
   1058             weight_value_tuples.append((p, w))
   1059         K.batch_set_value(weight_value_tuples)

ValueError: Layer weight shape (1792, 128) not compatible with provided weight shape (1792, 512)

Any insight would be really appreciated...
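
For anyone debugging this, here is a minimal diagnostic sketch that confirms the 128-vs-512 mismatch before running the conversion. It assumes the notebook's `model` and weight-export directory; the `Bottleneck` layer name comes from the conversion log in the next comment, and the .npy file name is a guess at the notebook's naming scheme.

import numpy as np

# Compare what the Keras model expects against what was exported
# from the TF checkpoint. 'Bottleneck' is the final embedding layer.
layer = model.get_layer('Bottleneck')
print('model expects :', [w.shape for w in layer.get_weights()])

# Hypothetical file name; adjust to however the notebook saves weights.
arr = np.load('../model/keras/npy_weights/Bottleneck_weights.npy')
print('checkpoint has:', arr.shape)  # (1792, 512) for the newer checkpoints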

@xiezhuangping

Same problem.

Loading numpy weights from ../model/keras/npy_weights/
Conv2d_1a_3x3
Conv2d_1a_3x3_BatchNorm
Conv2d_2a_3x3
Conv2d_2a_3x3_BatchNorm
Conv2d_2b_3x3
Conv2d_2b_3x3_BatchNorm
Conv2d_3b_1x1
Conv2d_3b_1x1_BatchNorm
Conv2d_4a_3x3
Conv2d_4a_3x3_BatchNorm
Conv2d_4b_3x3
Conv2d_4b_3x3_BatchNorm
Block35_1_Branch_2_Conv2d_0a_1x1
Block35_1_Branch_2_Conv2d_0a_1x1_BatchNorm
Block35_1_Branch_1_Conv2d_0a_1x1
Block35_1_Branch_2_Conv2d_0b_3x3
Block35_1_Branch_1_Conv2d_0a_1x1_BatchNorm
Block35_1_Branch_2_Conv2d_0b_3x3_BatchNorm
Block35_1_Branch_0_Conv2d_1x1
Block35_1_Branch_1_Conv2d_0b_3x3
Block35_1_Branch_2_Conv2d_0c_3x3
Block35_1_Branch_0_Conv2d_1x1_BatchNorm
Block35_1_Branch_1_Conv2d_0b_3x3_BatchNorm
Block35_1_Branch_2_Conv2d_0c_3x3_BatchNorm
Block35_1_Conv2d_1x1
Block35_2_Branch_2_Conv2d_0a_1x1
Block35_2_Branch_2_Conv2d_0a_1x1_BatchNorm
Block35_2_Branch_1_Conv2d_0a_1x1
Block35_2_Branch_2_Conv2d_0b_3x3
Block35_2_Branch_1_Conv2d_0a_1x1_BatchNorm
Block35_2_Branch_2_Conv2d_0b_3x3_BatchNorm
Block35_2_Branch_0_Conv2d_1x1
Block35_2_Branch_1_Conv2d_0b_3x3
Block35_2_Branch_2_Conv2d_0c_3x3
Block35_2_Branch_0_Conv2d_1x1_BatchNorm
Block35_2_Branch_1_Conv2d_0b_3x3_BatchNorm
Block35_2_Branch_2_Conv2d_0c_3x3_BatchNorm
Block35_2_Conv2d_1x1
Block35_3_Branch_2_Conv2d_0a_1x1
Block35_3_Branch_2_Conv2d_0a_1x1_BatchNorm
Block35_3_Branch_1_Conv2d_0a_1x1
Block35_3_Branch_2_Conv2d_0b_3x3
Block35_3_Branch_1_Conv2d_0a_1x1_BatchNorm
Block35_3_Branch_2_Conv2d_0b_3x3_BatchNorm
Block35_3_Branch_0_Conv2d_1x1
Block35_3_Branch_1_Conv2d_0b_3x3
Block35_3_Branch_2_Conv2d_0c_3x3
Block35_3_Branch_0_Conv2d_1x1_BatchNorm
Block35_3_Branch_1_Conv2d_0b_3x3_BatchNorm
Block35_3_Branch_2_Conv2d_0c_3x3_BatchNorm
Block35_3_Conv2d_1x1
Block35_4_Branch_2_Conv2d_0a_1x1
Block35_4_Branch_2_Conv2d_0a_1x1_BatchNorm
Block35_4_Branch_1_Conv2d_0a_1x1
Block35_4_Branch_2_Conv2d_0b_3x3
Block35_4_Branch_1_Conv2d_0a_1x1_BatchNorm
Block35_4_Branch_2_Conv2d_0b_3x3_BatchNorm
Block35_4_Branch_0_Conv2d_1x1
Block35_4_Branch_1_Conv2d_0b_3x3
Block35_4_Branch_2_Conv2d_0c_3x3
Block35_4_Branch_0_Conv2d_1x1_BatchNorm
Block35_4_Branch_1_Conv2d_0b_3x3_BatchNorm
Block35_4_Branch_2_Conv2d_0c_3x3_BatchNorm
Block35_4_Conv2d_1x1
Block35_5_Branch_2_Conv2d_0a_1x1
Block35_5_Branch_2_Conv2d_0a_1x1_BatchNorm
Block35_5_Branch_1_Conv2d_0a_1x1
Block35_5_Branch_2_Conv2d_0b_3x3
Block35_5_Branch_1_Conv2d_0a_1x1_BatchNorm
Block35_5_Branch_2_Conv2d_0b_3x3_BatchNorm
Block35_5_Branch_0_Conv2d_1x1
Block35_5_Branch_1_Conv2d_0b_3x3
Block35_5_Branch_2_Conv2d_0c_3x3
Block35_5_Branch_0_Conv2d_1x1_BatchNorm
Block35_5_Branch_1_Conv2d_0b_3x3_BatchNorm
Block35_5_Branch_2_Conv2d_0c_3x3_BatchNorm
Block35_5_Conv2d_1x1
Mixed_6a_Branch_1_Conv2d_0a_1x1
Mixed_6a_Branch_1_Conv2d_0a_1x1_BatchNorm
Mixed_6a_Branch_1_Conv2d_0b_3x3
Mixed_6a_Branch_1_Conv2d_0b_3x3_BatchNorm
Mixed_6a_Branch_0_Conv2d_1a_3x3
Mixed_6a_Branch_1_Conv2d_1a_3x3
Mixed_6a_Branch_0_Conv2d_1a_3x3_BatchNorm
Mixed_6a_Branch_1_Conv2d_1a_3x3_BatchNorm
Block17_1_Branch_1_Conv2d_0a_1x1
Block17_1_Branch_1_Conv2d_0a_1x1_BatchNorm
Block17_1_Branch_1_Conv2d_0b_1x7
Block17_1_Branch_1_Conv2d_0b_1x7_BatchNorm
Block17_1_Branch_0_Conv2d_1x1
Block17_1_Branch_1_Conv2d_0c_7x1
Block17_1_Branch_0_Conv2d_1x1_BatchNorm
Block17_1_Branch_1_Conv2d_0c_7x1_BatchNorm
Block17_1_Conv2d_1x1
Block17_2_Branch_1_Conv2d_0a_1x1
Block17_2_Branch_1_Conv2d_0a_1x1_BatchNorm
Block17_2_Branch_1_Conv2d_0b_1x7
Block17_2_Branch_1_Conv2d_0b_1x7_BatchNorm
Block17_2_Branch_0_Conv2d_1x1
Block17_2_Branch_1_Conv2d_0c_7x1
Block17_2_Branch_0_Conv2d_1x1_BatchNorm
Block17_2_Branch_1_Conv2d_0c_7x1_BatchNorm
Block17_2_Conv2d_1x1
Block17_3_Branch_1_Conv2d_0a_1x1
Block17_3_Branch_1_Conv2d_0a_1x1_BatchNorm
Block17_3_Branch_1_Conv2d_0b_1x7
Block17_3_Branch_1_Conv2d_0b_1x7_BatchNorm
Block17_3_Branch_0_Conv2d_1x1
Block17_3_Branch_1_Conv2d_0c_7x1
Block17_3_Branch_0_Conv2d_1x1_BatchNorm
Block17_3_Branch_1_Conv2d_0c_7x1_BatchNorm
Block17_3_Conv2d_1x1
Block17_4_Branch_1_Conv2d_0a_1x1
Block17_4_Branch_1_Conv2d_0a_1x1_BatchNorm
Block17_4_Branch_1_Conv2d_0b_1x7
Block17_4_Branch_1_Conv2d_0b_1x7_BatchNorm
Block17_4_Branch_0_Conv2d_1x1
Block17_4_Branch_1_Conv2d_0c_7x1
Block17_4_Branch_0_Conv2d_1x1_BatchNorm
Block17_4_Branch_1_Conv2d_0c_7x1_BatchNorm
Block17_4_Conv2d_1x1
Block17_5_Branch_1_Conv2d_0a_1x1
Block17_5_Branch_1_Conv2d_0a_1x1_BatchNorm
Block17_5_Branch_1_Conv2d_0b_1x7
Block17_5_Branch_1_Conv2d_0b_1x7_BatchNorm
Block17_5_Branch_0_Conv2d_1x1
Block17_5_Branch_1_Conv2d_0c_7x1
Block17_5_Branch_0_Conv2d_1x1_BatchNorm
Block17_5_Branch_1_Conv2d_0c_7x1_BatchNorm
Block17_5_Conv2d_1x1
Block17_6_Branch_1_Conv2d_0a_1x1
Block17_6_Branch_1_Conv2d_0a_1x1_BatchNorm
Block17_6_Branch_1_Conv2d_0b_1x7
Block17_6_Branch_1_Conv2d_0b_1x7_BatchNorm
Block17_6_Branch_0_Conv2d_1x1
Block17_6_Branch_1_Conv2d_0c_7x1
Block17_6_Branch_0_Conv2d_1x1_BatchNorm
Block17_6_Branch_1_Conv2d_0c_7x1_BatchNorm
Block17_6_Conv2d_1x1
Block17_7_Branch_1_Conv2d_0a_1x1
Block17_7_Branch_1_Conv2d_0a_1x1_BatchNorm
Block17_7_Branch_1_Conv2d_0b_1x7
Block17_7_Branch_1_Conv2d_0b_1x7_BatchNorm
Block17_7_Branch_0_Conv2d_1x1
Block17_7_Branch_1_Conv2d_0c_7x1
Block17_7_Branch_0_Conv2d_1x1_BatchNorm
Block17_7_Branch_1_Conv2d_0c_7x1_BatchNorm
Block17_7_Conv2d_1x1
Block17_8_Branch_1_Conv2d_0a_1x1
Block17_8_Branch_1_Conv2d_0a_1x1_BatchNorm
Block17_8_Branch_1_Conv2d_0b_1x7
Block17_8_Branch_1_Conv2d_0b_1x7_BatchNorm
Block17_8_Branch_0_Conv2d_1x1
Block17_8_Branch_1_Conv2d_0c_7x1
Block17_8_Branch_0_Conv2d_1x1_BatchNorm
Block17_8_Branch_1_Conv2d_0c_7x1_BatchNorm
Block17_8_Conv2d_1x1
Block17_9_Branch_1_Conv2d_0a_1x1
Block17_9_Branch_1_Conv2d_0a_1x1_BatchNorm
Block17_9_Branch_1_Conv2d_0b_1x7
Block17_9_Branch_1_Conv2d_0b_1x7_BatchNorm
Block17_9_Branch_0_Conv2d_1x1
Block17_9_Branch_1_Conv2d_0c_7x1
Block17_9_Branch_0_Conv2d_1x1_BatchNorm
Block17_9_Branch_1_Conv2d_0c_7x1_BatchNorm
Block17_9_Conv2d_1x1
Block17_10_Branch_1_Conv2d_0a_1x1
Block17_10_Branch_1_Conv2d_0a_1x1_BatchNorm
Block17_10_Branch_1_Conv2d_0b_1x7
Block17_10_Branch_1_Conv2d_0b_1x7_BatchNorm
Block17_10_Branch_0_Conv2d_1x1
Block17_10_Branch_1_Conv2d_0c_7x1
Block17_10_Branch_0_Conv2d_1x1_BatchNorm
Block17_10_Branch_1_Conv2d_0c_7x1_BatchNorm
Block17_10_Conv2d_1x1
Mixed_7a_Branch_2_Conv2d_0a_1x1
Mixed_7a_Branch_2_Conv2d_0a_1x1_BatchNorm
Mixed_7a_Branch_0_Conv2d_0a_1x1
Mixed_7a_Branch_1_Conv2d_0a_1x1
Mixed_7a_Branch_2_Conv2d_0b_3x3
Mixed_7a_Branch_0_Conv2d_0a_1x1_BatchNorm
Mixed_7a_Branch_1_Conv2d_0a_1x1_BatchNorm
Mixed_7a_Branch_2_Conv2d_0b_3x3_BatchNorm
Mixed_7a_Branch_0_Conv2d_1a_3x3
Mixed_7a_Branch_1_Conv2d_1a_3x3
Mixed_7a_Branch_2_Conv2d_1a_3x3
Mixed_7a_Branch_0_Conv2d_1a_3x3_BatchNorm
Mixed_7a_Branch_1_Conv2d_1a_3x3_BatchNorm
Mixed_7a_Branch_2_Conv2d_1a_3x3_BatchNorm
Block8_1_Branch_1_Conv2d_0a_1x1
Block8_1_Branch_1_Conv2d_0a_1x1_BatchNorm
Block8_1_Branch_1_Conv2d_0b_1x3
Block8_1_Branch_1_Conv2d_0b_1x3_BatchNorm
Block8_1_Branch_0_Conv2d_1x1
Block8_1_Branch_1_Conv2d_0c_3x1
Block8_1_Branch_0_Conv2d_1x1_BatchNorm
Block8_1_Branch_1_Conv2d_0c_3x1_BatchNorm
Block8_1_Conv2d_1x1
Block8_2_Branch_1_Conv2d_0a_1x1
Block8_2_Branch_1_Conv2d_0a_1x1_BatchNorm
Block8_2_Branch_1_Conv2d_0b_1x3
Block8_2_Branch_1_Conv2d_0b_1x3_BatchNorm
Block8_2_Branch_0_Conv2d_1x1
Block8_2_Branch_1_Conv2d_0c_3x1
Block8_2_Branch_0_Conv2d_1x1_BatchNorm
Block8_2_Branch_1_Conv2d_0c_3x1_BatchNorm
Block8_2_Conv2d_1x1
Block8_3_Branch_1_Conv2d_0a_1x1
Block8_3_Branch_1_Conv2d_0a_1x1_BatchNorm
Block8_3_Branch_1_Conv2d_0b_1x3
Block8_3_Branch_1_Conv2d_0b_1x3_BatchNorm
Block8_3_Branch_0_Conv2d_1x1
Block8_3_Branch_1_Conv2d_0c_3x1
Block8_3_Branch_0_Conv2d_1x1_BatchNorm
Block8_3_Branch_1_Conv2d_0c_3x1_BatchNorm
Block8_3_Conv2d_1x1
Block8_4_Branch_1_Conv2d_0a_1x1
Block8_4_Branch_1_Conv2d_0a_1x1_BatchNorm
Block8_4_Branch_1_Conv2d_0b_1x3
Block8_4_Branch_1_Conv2d_0b_1x3_BatchNorm
Block8_4_Branch_0_Conv2d_1x1
Block8_4_Branch_1_Conv2d_0c_3x1
Block8_4_Branch_0_Conv2d_1x1_BatchNorm
Block8_4_Branch_1_Conv2d_0c_3x1_BatchNorm
Block8_4_Conv2d_1x1
Block8_5_Branch_1_Conv2d_0a_1x1
Block8_5_Branch_1_Conv2d_0a_1x1_BatchNorm
Block8_5_Branch_1_Conv2d_0b_1x3
Block8_5_Branch_1_Conv2d_0b_1x3_BatchNorm
Block8_5_Branch_0_Conv2d_1x1
Block8_5_Branch_1_Conv2d_0c_3x1
Block8_5_Branch_0_Conv2d_1x1_BatchNorm
Block8_5_Branch_1_Conv2d_0c_3x1_BatchNorm
Block8_5_Conv2d_1x1
Block8_6_Branch_1_Conv2d_0a_1x1
Block8_6_Branch_1_Conv2d_0a_1x1_BatchNorm
Block8_6_Branch_1_Conv2d_0b_1x3
Block8_6_Branch_1_Conv2d_0b_1x3_BatchNorm
Block8_6_Branch_0_Conv2d_1x1
Block8_6_Branch_1_Conv2d_0c_3x1
Block8_6_Branch_0_Conv2d_1x1_BatchNorm
Block8_6_Branch_1_Conv2d_0c_3x1_BatchNorm
Block8_6_Conv2d_1x1
Bottleneck

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-…> in <module>
      9             weights.append(weight_arr)
     10             print(layer.name)
---> 11         layer.set_weights(weights)
     12 
     13 print('Saving weights...')

D:\Anaconda3\lib\site-packages\keras\engine\base_layer.py in set_weights(self, weights)
   1055                                  str(pv.shape) +
   1056                                  ' not compatible with '
-> 1057                                  'provided weight shape ' + str(w.shape))
   1058             weight_value_tuples.append((p, w))
   1059         K.batch_set_value(weight_value_tuples)

ValueError: Layer weight shape (1792, 128) not compatible with provided weight shape (1792, 512)

@stezarpriansya

I'm facing the same problem too. Can anyone help?

@jyun-bunny-honey

`model = InceptionResNetV1(classes=512)` solved the problem.
My guess is that the newly updated model was not trained with the default configuration of Inception-ResNet V1.
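
For context, a sketch of the fix in place. It assumes the repo's inception_resnet_v1 module defines `InceptionResNetV1`; the `classes` argument sets the width of the final Bottleneck (embedding) layer, which must match the checkpoint's embedding size before the .npy weights are loaded.

# Assumes the repo's inception_resnet_v1 module.
from inception_resnet_v1 import InceptionResNetV1

# Build the graph with a 512-unit Bottleneck (the default is 128),
# so layer.set_weights() sees matching (1792, 512) shapes.
model = InceptionResNetV1(classes=512)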

@stezarpriansya

> `model = InceptionResNetV1(classes=512)` solved the problem.
> My guess is that the newly updated model was not trained with the default configuration of Inception-ResNet V1.

thank you 👍

@jyun-bunny-honey

> `model = InceptionResNetV1(classes=512)` solved the problem.
> My guess is that the newly updated model was not trained with the default configuration of Inception-ResNet V1.

I found the answer at this link: https://jekel.me/2018/512_vs_128_facenet_embedding_application_in_Tinder_data/. I rechecked the model structure, and the facenet model I used (20180408-102900) does output 512-dimensional embeddings. I think some of the demo code may need to be updated accordingly as well.
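
A quick sanity check, as a sketch, that the rebuilt model now matches the checkpoint. The 160x160x3 input size is facenet's standard and is assumed here.

import numpy as np
from inception_resnet_v1 import InceptionResNetV1  # as above, assumed module

# A model built with classes=512 should emit a 512-D embedding.
model = InceptionResNetV1(classes=512)
dummy = np.zeros((1, 160, 160, 3), dtype=np.float32)  # assumed input size
print(model.predict(dummy).shape)  # expected: (1, 512)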
