Digital Marketing Case Study
Planful Sees Organic Search Improvement by Fixing Indexed Pages Blocked by Robots.txt
Monitoring, diagnosing, and fixing issues with a site’s robots.txt file is a crucial component of technical SEO in any holistic white-hat SEO strategy, and it can deliver an incremental boost to organic search performance – especially for larger domains with many URLs. An important page category on Planful’s domain was flagged as “Indexed, though blocked by robots.txt” and was therefore earning fewer organic search impressions and clicks. A robots.txt SEO audit was needed to identify and fix the pages blocked by robots.txt so they could be shown in SERPs more often.
While Firebrand had already been improving technical SEO for this domain (Core Web Vitals, a status code audit, duplicate-content cleanup), a deeper analysis of robots.txt best practices uncovered incorrect disallow instructions that caused an entire set of pages to be indexed though blocked by the robots.txt file. After rolling out the fix for these disallow instructions – along with additional robots.txt SEO optimizations – Planful recovered valuable organic search impressions and clicks for this set of pages.
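As an illustration only (the actual Planful directives are not reproduced in this case study, and the paths below are hypothetical), a single overly broad Disallow pattern can unintentionally match an entire page category, which is how a whole set of pages ends up blocked at once:

```
# Before: this broad pattern blocks every URL under /resources/,
# not just the filtered views it was presumably meant to exclude
User-agent: *
Disallow: /resources

# After: only the unwanted parameterized URLs are blocked,
# and the page category itself is crawlable again
User-agent: *
Disallow: /resources/*?filter=
```

Because robots.txt rules are prefix matches, `Disallow: /resources` (no trailing slash) also blocks `/resources-hub/` and every page beneath `/resources/` – a common source of exactly this kind of incident.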
Organic search results after fixing the major “blocked by robots.txt” issue:
470% increase in organic search impressions (2 months after fixing the pages blocked by robots.txt)
315% increase in organic search clicks (2 months after fixing the pages blocked by robots.txt)
Organic traffic results after fixing the “blocked by robots.txt” issue:
385% increase in new users for affected pages (2 months after fixing the robots.txt disallow instructions)
106% increase in sessions for affected pages (2 months after fixing the robots.txt disallow instructions)
Conducted a comprehensive robots.txt file audit using the WordPress Yoast file editor and FTP access to the site’s hosting server to identify all Allow/Disallow instructions and errors.
Diagnosed the issues causing core SEO pages to be indexed though blocked by robots.txt, and mapped out robots.txt SEO optimizations and best practices for search engine visibility and indexability.
Improved the robots.txt file for SEO by removing the disallow instructions affecting those pages and adding further disallow instructions for indexability and security.
Validated the fixes through Google Search Console and provided full reporting to show the tangible value of fixing the pages blocked by robots.txt and implementing robots.txt best practices.
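A fix like the one described above can also be sanity-checked locally before (and alongside) Google Search Console validation. The sketch below uses Python’s standard-library `urllib.robotparser`; the rules and URLs are hypothetical stand-ins, not Planful’s actual file:

```python
import urllib.robotparser

# Hypothetical post-fix robots.txt: the broad disallow that blocked the
# affected page category has been removed; only admin paths stay blocked.
ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-admin/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A page from the previously blocked category should now be crawlable...
print(rp.can_fetch("Googlebot", "https://example.com/resources/guide"))

# ...while intentionally blocked paths remain disallowed.
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/settings"))
```

Note that `urllib.robotparser` applies rules in file order rather than Google’s longest-match precedence, so for files that mix `Allow` and `Disallow` on overlapping paths, Google Search Console’s own report remains the authoritative check.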
“Firebrand’s detailed and comprehensive Technical SEO efforts have made a tremendous impact on our site health and behavioral metrics. They have provided valuable guidance, leading to tremendous organic growth through the execution of technical audits that would not have been identified and fixed without their expertise.” – Jason Stevens, Senior Manager of Creative – Planful