I'm currently preparing for a major SEO initiative at work. Instead of just reading theory, I decided to take a “learn-by-doing” approach through a personal side project.
The result after one month? 632 organic visitors.
It has been an eye-opening sprint. I've learned that even tiny UX details can affect rankings; I was genuinely surprised to see a correlation between font-size adjustments and performance metrics.
The Bot Anomaly
On October 15, I hit a daily peak of 83 visitors. But once I dug into the data, something felt off.
A significant share of that traffic arrived directly from China. Using Webvisor (Yandex Metrica's session-replay tool), I watched the sessions and realized the site was being accessed with CSS disabled. My estimate is that around 20 of those “users” were actually bots.
Question for the community: Why would bots target a small site via direct traffic while intentionally disabling CSS? Is it simply a way to save bandwidth while scraping?
I’d love to hear your theories as I keep digging through the logs.
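While I wait for theories, here's the heuristic I've been using on the raw access logs: a real browser that loads a page almost always requests the stylesheets too, so an IP that fetches HTML pages but never touches a `.css` file is a decent bot candidate. This is just a minimal sketch against standard combined-log-format lines; the sample entries and IPs below are made up for illustration.

```python
import re
from collections import defaultdict

# Minimal parser for the combined log format: client IP, request path, status.
LOG_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] '
    r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3})'
)

# Static assets a normal browser fetches alongside a page.
ASSET_RE = re.compile(r'\.(js|png|jpg|jpeg|gif|svg|ico|woff2?)$')

def css_less_clients(log_lines):
    """Return IPs that fetched HTML pages but never requested a stylesheet.

    A client that loads pages without ever pulling CSS is a hint of a
    scraper or headless bot rather than a real browser session.
    """
    page_hits = defaultdict(int)   # HTML-ish requests per IP
    css_hits = defaultdict(int)    # stylesheet requests per IP
    for line in log_lines:
        m = LOG_RE.match(line)
        if not m:
            continue  # skip malformed lines
        ip, path = m.group("ip"), m.group("path")
        if path.endswith(".css"):
            css_hits[ip] += 1
        elif not ASSET_RE.search(path):
            page_hits[ip] += 1
    return sorted(ip for ip, n in page_hits.items() if n > 0 and css_hits[ip] == 0)

# Hypothetical log lines: the first IP never requests CSS, the second does.
sample = [
    '203.0.113.9 - - [15/Oct/2024:10:00:01 +0000] "GET / HTTP/1.1" 200 5120',
    '203.0.113.9 - - [15/Oct/2024:10:00:04 +0000] "GET /blog/post-1 HTTP/1.1" 200 8900',
    '198.51.100.4 - - [15/Oct/2024:10:01:00 +0000] "GET / HTTP/1.1" 200 5120',
    '198.51.100.4 - - [15/Oct/2024:10:01:01 +0000] "GET /style.css HTTP/1.1" 200 1400',
]
print(css_less_clients(sample))  # → ['203.0.113.9']
```

It's a blunt instrument (proxies and cached CSS will produce false positives), but it was enough to put a rough number on how many of those direct-traffic sessions looked non-human.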