can't use weight directly #20
Same issue...
I have an error like ... Please help, @OeslleLucena, as we are using your great work!
Which Keras backend are you using? I got the negative-dimension error when I tried to run the code with the TensorFlow backend.
Something seems wrong with the input shape; the authors did not specify a suitable one, which keeps causing this error.
You can set the input shape to 112x112.
For anyone still facing the 'Negative Dimension' issue: it is caused by a likely typo in the Convolution2D() calls in the load_model() method. The two positional 3s in those calls get interpreted as kernel_size=3 and strides=3. When strides != 1, even padding='same' will not keep the feature map at its previous size, so the feature map keeps shrinking across consecutive Convolution2D calls (whereas in the VGG16 architecture, consecutive conv layers keep the same feature-map size). As a result, by the MaxPooling2D call on line 38 the feature map has already been reduced to 1x1, and max pooling with a 2x2 kernel on a 1x1 feature map obviously raises an error. I resolved it with a minor change: pass kernel_size=(3,3) and leave strides at the default (1,1). This fixed the 'Negative Dimension Error' for me. Also note that the input image size has to be 96x96, as mentioned by the author (@OeslleLucena) here. In short, because the model was trained with a 96x96 input, we are restricted to the same size at inference; otherwise, for VGG16, you can use sizes below 224x224.
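The shrinkage described above can be checked with plain arithmetic, without running Keras at all. The sketch below is illustrative (the layer sequence is abbreviated, not the repo's exact code): under padding='same' a conv outputs ceil(n / stride), and Keras's MaxPooling2D defaults to padding='valid', i.e. floor((n - k) / s) + 1.

```python
import math

def conv_same(n, stride):
    """Spatial output size of a conv with padding='same'."""
    return math.ceil(n / stride)

def pool_valid(n, k=2, s=2):
    """Spatial output size of MaxPooling2D with padding='valid' (Keras default)."""
    return (n - k) // s + 1

# Buggy interpretation: Convolution2D(filters, 3, 3) read as strides=3.
n = 96
n = conv_same(n, 3)   # 32
n = conv_same(n, 3)   # 11
n = pool_valid(n)     # 5
n = conv_same(n, 3)   # 2
n = conv_same(n, 3)   # 1  -> the next 2x2 pool on a 1x1 map fails
print(n)

# Fixed call: kernel_size=(3,3), strides=(1,1) keeps the size under 'same',
# so only the pooling layers halve the feature map, as in VGG16.
m = 96
m = conv_same(m, 1)   # 96
m = conv_same(m, 1)   # 96
m = pool_valid(m)     # 48
print(m)
```

With stride 3 the 96x96 input collapses to 1x1 after only four conv layers and one pool, which is exactly why the later 2x2 pooling produces the negative-dimension error.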
I get an error when I run test.py: ValueError: Dimension 0 in both shapes must be equal, but are 25088 and 4608. Shapes are [25088,256] and [4608,256]. for 'Assign_84' (op: 'Assign') with input shapes: [25088,256], [4608,256]
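Those two numbers decode directly to the input-size mismatch discussed above. VGG16 halves the spatial size five times, so the flattened features entering the first 256-unit dense layer have size (input/32)^2 x 512. A quick check (illustrative sketch; 512 is the standard channel count of VGG16's final block):

```python
def flatten_size(input_size, channels=512, n_pools=5):
    """Flattened feature size after VGG16's five 2x2 poolings."""
    side = input_size // (2 ** n_pools)
    return side * side * channels

print(flatten_size(224))  # 25088: model built for 224x224 input
print(flatten_size(96))   # 4608:  weights trained on 96x96 input
```

So one side of the Assign op comes from a model built at 224x224 and the other from weights saved for 96x96; building the model with a 96x96 input (as the author specifies) should make the dense-layer weight shapes match the checkpoint.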