I recently rolled out a sitemap update with around 3,000 new pages using the /en/blog/... structure.
But Yandex Webmaster decided to ignore the /en prefix entirely and tried to crawl /blog/... instead. That created the risk of thousands of crawl errors.
The lifesaver: I already had proactive 308 redirects in place. So instead of running into a wall of 404s, the bots were seamlessly sent to the correct URLs.
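The post doesn't show the actual redirect rules, but the idea can be sketched like this, assuming a Python handler and the /en/blog/... structure described above (all names here are illustrative, not the real implementation):

```python
# Illustrative sketch: send prefix-stripped blog URLs back to their
# canonical /en/... form with a 308 (method-preserving permanent redirect).
LOCALE_PREFIX = "/en"

def resolve_redirect(path: str):
    """Return (status, location) for paths missing the locale prefix,
    or None if the path needs no redirect."""
    if path.startswith("/blog/"):
        # 308 keeps the request method and signals a permanent move,
        # so crawlers update their index instead of logging a 404.
        return 308, LOCALE_PREFIX + path
    return None
```

In nginx, the equivalent rule is roughly `location /blog/ { return 308 /en$request_uri; }`.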
Key takeaways:
- Never assume search engines will parse your sitemap.xml perfectly.
- Set up redirects before you need them. They are your safety net against traffic loss and indexing problems.