CReLU and BatchNorm #2

Open
yulizhou opened this issue Mar 10, 2018 · 3 comments

yulizhou commented Mar 10, 2018

Hi, I'm reading the paper and am curious about your implementation.

The CReLU layer seems to be defined but never used; instead, the code re-implements it inline during layer construction.
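
For reference, a minimal PyTorch sketch of such a standalone CReLU module (the class name is my assumption; it may not match what the repo defines):

```python
import torch
import torch.nn as nn

class CReLU(nn.Module):
    """Concatenated ReLU: stack x and -x along the channel axis,
    then apply ReLU, so the output has twice the input channels."""
    def forward(self, x):
        return torch.relu(torch.cat([x, -x], dim=1))
```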

Also, the paper includes a batch norm layer, but it isn't implemented here.

What was the reasoning behind these choices? Better performance?

Thanks

lxg2015 (Owner) commented Mar 10, 2018

Either way of writing CReLU is fine. With the BN layer, FaceBoxes should get better results; I simply forgot to add the batch norm layer. Thanks!
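
For anyone landing here later, a sketch of where the missing BN could slot in, assuming a Conv -> BN -> CReLU ordering (class and argument names are illustrative, not the repo's actual code):

```python
import torch
import torch.nn as nn

class ConvBNCReLU(nn.Module):
    """Conv2d -> BatchNorm2d -> CReLU; BN runs before the concat,
    so the block's output has 2 * out_ch channels."""
    def __init__(self, in_ch, out_ch, kernel_size, stride=1, padding=0):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size, stride,
                              padding, bias=False)
        self.bn = nn.BatchNorm2d(out_ch)

    def forward(self, x):
        x = self.bn(self.conv(x))
        return torch.relu(torch.cat([x, -x], dim=1))
```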

abeardear (Contributor) commented Jul 20, 2018

@yulizhou @lxg2015
Hi, I fixed the network design in this repo; small changes such as replacing conv with conv_bn_relu gave better performance~
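
A helper like that typically looks as follows; this is a sketch, and the exact signature in the linked repo may differ:

```python
import torch.nn as nn

def conv_bn_relu(in_ch, out_ch, kernel_size, stride=1, padding=0):
    """Conv2d -> BatchNorm2d -> ReLU; the conv bias is dropped
    because BN's learned shift makes it redundant."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size, stride, padding, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )
```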

@XiaXuehai

@xiongzihua
Hi, I think the predict code is wrong. We can't resize the original image.
See this repo; its output is closer to the paper's.
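
If I read this right, the point is that the network is fully convolutional, so inference should run on the image at its native resolution, with only the normalized predictions mapped back to pixel coordinates. A minimal sketch of that last step (function name and box layout are my assumptions):

```python
import torch

def scale_boxes_to_original(boxes, orig_w, orig_h):
    """Map normalized (x1, y1, x2, y2) boxes in [0, 1] back onto
    the original image instead of resizing the image itself."""
    scale = torch.tensor([orig_w, orig_h, orig_w, orig_h],
                         dtype=boxes.dtype)
    return boxes * scale

# e.g. one normalized box on a 1280x720 image
boxes = torch.tensor([[0.25, 0.30, 0.50, 0.60]])
print(scale_boxes_to_original(boxes, 1280, 720))
# tensor([[320., 216., 640., 432.]])
```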
