There's a chance I missed something, but I don't understand why it isn't possible to have a rule that, instead of serving /public/sitemap.xml (which can be outdated), regenerates the sitemap and serves the fresh file on the fly, storing it in /public/sitemap.xml at the same time.
It would save us the need to have a whenever task running, and guarantee the sitemap is never older than some maximum age.
Is it that search engines look at the duration of the query? That would be lame, as in many situations querying /sitemap.xml doesn't follow the same process as regular page generation (static content), so it shouldn't be an SEO criterion...
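Roughly what I have in mind; just a sketch, where `Sitemap.generate_to` is a made-up stand-in for whatever actually builds the XML, and the 1-hour threshold is arbitrary:

```ruby
# config/routes.rb
get "/sitemap.xml", to: "sitemaps#show"
```

```ruby
# app/controllers/sitemaps_controller.rb
class SitemapsController < ApplicationController
  SITEMAP_PATH = Rails.root.join("public", "sitemap.xml")
  MAX_AGE = 1.hour # arbitrary freshness threshold

  def show
    # Regenerate on the fly when the stored file is missing or too
    # old, then serve (and keep) the fresh copy.
    regenerate if stale?
    send_file SITEMAP_PATH, type: "application/xml", disposition: "inline"
  end

  private

  def stale?
    !File.exist?(SITEMAP_PATH) || File.mtime(SITEMAP_PATH) < MAX_AGE.ago
  end

  def regenerate
    Sitemap.generate_to(SITEMAP_PATH) # hypothetical helper, not a real API
  end
end
```

One caveat: if the web server (or Rails' static file middleware) serves files from public/ directly, an existing public/sitemap.xml would bypass this route, so the cached copy might need to live elsewhere.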
The previous version of Dynamic Sitemaps built the sitemap dynamically on each request. However, this was unsuitable for larger sitemaps (e.g. with 2 million pages), as it would tie up the web server when it could be serving "real" requests.
Therefore sitemap generation was moved to background processing and optimized for large numbers of URLs.
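To illustrate, scheduled background regeneration with whenever looks roughly like this (assuming the generation rake task is named sitemap:generate; check the README for the exact task name):

```ruby
# config/schedule.rb (whenever)
# Rebuild the sitemap every 6 hours in the background instead of
# on each request. The interval is just an example.
every 6.hours do
  rake "sitemap:generate"
end
```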
However, I know that the dynamic sitemap feature is something a lot of people could use, especially for sites with fewer pages. I'm therefore considering a solution that would allow both dynamic and "static" sitemaps, and serve the static sitemaps via an engine to reduce the initial setup.
Thanks for the feedback. A dual system would indeed be super nice!
I realized, though, that since I'm using Heroku (where the filesystem is ephemeral), I'll have to store the file in a different location anyway. But that could still work, I guess.
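For example, generating in the background, pushing the file to S3, and pointing /sitemap.xml at it; just a sketch, with a made-up bucket and key:

```ruby
# config/routes.rb
# Redirect sitemap requests to the copy uploaded to S3 by the
# background job. "my-bucket" and the key are placeholders.
get "/sitemap.xml", to: redirect("https://my-bucket.s3.amazonaws.com/sitemaps/sitemap.xml")
```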