Hi,

I have express-limiter in place and it works like a charm.

For SEO reasons, I'd like to whitelist:
1 - crawlers that will index the site
2 - phantomjs, which I use to prerender the Angular pages

Do you have any experience with that?

I have the impression that I can work out a solution for phantomjs (since it runs locally) by relying on a whitelist plus connection.remoteAddress, something like the sketch below. But I cannot find a solution for crawlers (other headers? which ones?).
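Here is a minimal sketch of what I have in mind, assuming the module's `whitelist` option takes a function of the request and returns a boolean to skip limiting (as I read the README); `PRERENDER_IPS` and `CRAWLER_UA` are placeholders of mine, and the User-Agent check is of course spoofable, which is exactly my concern:

```js
var express = require('express');
var app = express();
var client = require('redis').createClient();
var limiter = require('express-limiter')(app, client);

// Placeholder: addresses my local phantomjs prerenderer connects from
var PRERENDER_IPS = ['127.0.0.1', '::1', '::ffff:127.0.0.1'];

// Placeholder: naive crawler check on the User-Agent header (spoofable!)
var CRAWLER_UA = /googlebot|bingbot|yandexbot|duckduckbot/i;

limiter({
  path: '*',
  method: 'all',
  lookup: ['connection.remoteAddress'],
  total: 150,
  expire: 1000 * 60 * 60,
  whitelist: function (req) {
    // Skip rate limiting for the local prerenderer...
    if (PRERENDER_IPS.indexOf(req.connection.remoteAddress) !== -1) {
      return true;
    }
    // ...and for anything claiming to be a known crawler
    return CRAWLER_UA.test(req.headers['user-agent'] || '');
  }
});
```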
Thanks a lot for the help, and congrats on the module, it's really great!
Best regards!