CMS-specific implementation guides
Operational runbooks that translate this playbook to each major CMS, covering the hosting edges, authoring workflows, and integration seams that typically move rankings and AI retrieval outcomes.
Prefer a CMS-wide lens before tackling another topic? Review the full SEO & GEO playbook for WordPress, Shopify, Webflow, Drupal, HubSpot CMS, Contentful, or Adobe Experience Manager.
Implement Content Freshness & Decay on WordPress
A systematic playbook for detecting content decay, prioritizing refreshes, and maintaining topical relevance in AI-powered search environments, operationalized inside WordPress authoring, templating, and CDN edges.
Implement Content Freshness & Decay on Shopify
A systematic playbook for detecting content decay, prioritizing refreshes, and maintaining topical relevance in AI-powered search environments, operationalized inside Shopify authoring, templating, and CDN edges.
Implement Content Freshness & Decay on Webflow
A systematic playbook for detecting content decay, prioritizing refreshes, and maintaining topical relevance in AI-powered search environments, operationalized inside Webflow authoring, templating, and CDN edges.
Implement Content Freshness & Decay on Drupal
A systematic playbook for detecting content decay, prioritizing refreshes, and maintaining topical relevance in AI-powered search environments, operationalized inside Drupal authoring, templating, and CDN edges.
Implement Content Freshness & Decay on HubSpot CMS
A systematic playbook for detecting content decay, prioritizing refreshes, and maintaining topical relevance in AI-powered search environments, operationalized inside HubSpot CMS authoring, templating, and CDN edges.
Implement Content Freshness & Decay on Contentful
A systematic playbook for detecting content decay, prioritizing refreshes, and maintaining topical relevance in AI-powered search environments, operationalized inside Contentful authoring, templating, and CDN edges.
Implement Content Freshness & Decay on Adobe Experience Manager
A systematic playbook for detecting content decay, prioritizing refreshes, and maintaining topical relevance in AI-powered search environments, operationalized inside Adobe Experience Manager authoring, templating, and CDN edges.
What Is Content Decay?
Content decay is the gradual decline in organic search traffic a page experiences as it ages — caused by new competitors, updated information, changing search intent, and algorithm freshness preferences. It's one of the most overlooked threats to organic traffic at scale because it happens slowly and isn't surfaced by most SEO dashboards without deliberate monitoring.
Why Freshness Matters for AI SEO
AI-powered search systems are trained to prefer accurate, up-to-date information. In AI Overviews and answer engine results, pages with recent content signals are consistently favored — especially for queries where information changes rapidly (tools, statistics, regulations, best practices). A page published in 2022 with no updates competes against a 2025 article even if the original content was excellent.
Types of Content That Decay Fastest
- Statistics and data pages — Industry statistics become outdated within 1–2 years
- Tool comparisons and listicles — Software changes, pricing changes, products get discontinued
- Best practices guides — What was best practice in 2022 may be outdated by 2025
- Event-driven content — Annual reports, conference recaps, trend pieces
Types of Content That Age Well
- Foundational concept explanations ("What is X")
- Framework and methodology pages (like this one)
- Historical reference content
- Step-by-step process guides for stable workflows
How to Detect and Reverse Content Decay
- Build a content decay dashboard — Pull 12 months of Google Search Console data for all URLs; calculate month-over-month impression and click trends; flag pages with >20% YoY decline
- Prioritize by decay rate × keyword value — High-traffic pages with fast decay are urgent; low-traffic pages with slow decay can wait
- Classify content type — Determine whether each decaying page needs a light refresh (update stats, add new sections) or a full rewrite
- Conduct a competitor gap analysis — For each decaying page, audit top-ranking competitors to identify what new information they cover that you don't
- Update substantively — Add new sections, refresh all statistics, update examples, incorporate new tools; minor copy edits don't trigger freshness signals
- Update the published/modified date — Only after making substantive changes; Google detects date manipulation without content changes
- Re-promote updated content — Share on social, send to email list, build new internal links pointing to the refreshed page
- Set a review schedule — Assign refresh intervals by content type: quarterly for statistics/tools, annually for evergreen guides, continuously for news
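The first two steps above can be sketched as a small script. This is a minimal sketch: the field names and the priority formula (decay rate × keyword value × prior traffic) are illustrative assumptions, not a fixed methodology.

```python
from typing import NamedTuple

class PageTrend(NamedTuple):
    url: str
    clicks_last_12mo: int    # clicks in the most recent 12 months
    clicks_prior_12mo: int   # clicks in the 12 months before that
    keyword_value: float     # assumed proxy, e.g. avg CPC of target keywords

def decay_rate(p: PageTrend) -> float:
    """Fractional YoY decline; positive means traffic is falling."""
    if p.clicks_prior_12mo == 0:
        return 0.0
    return (p.clicks_prior_12mo - p.clicks_last_12mo) / p.clicks_prior_12mo

def refresh_queue(pages, decline_threshold: float = 0.20):
    """Flag pages past the YoY decline threshold, most urgent first."""
    flagged = [p for p in pages if decay_rate(p) > decline_threshold]
    # Urgency = how fast it decays x how valuable x how much traffic is at stake.
    return sorted(
        flagged,
        key=lambda p: decay_rate(p) * p.keyword_value * p.clicks_prior_12mo,
        reverse=True,
    )
```

A page down 21% YoY gets flagged; a high-value page down 80% jumps the queue, which matches the "decay rate × keyword value" prioritization above.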
Common Mistakes to Avoid
- Updating dates without updating content — Google's freshness algorithms detect this pattern; it can actually trigger a quality review
- Deleting old content instead of refreshing it — Aged pages often have backlinks and historical authority; 301-redirecting or refreshing is almost always better than deletion
- Refreshing low-priority pages first — Always prioritize by traffic impact; refreshing a 50-visit/month page before a 5,000-visit/month decaying page is a resource misallocation
- Not tracking the impact of updates — Set a 90-day traffic tracking checkpoint after each refresh to measure whether the update worked
- Ignoring new search intent — Sometimes content decays because the intent behind the query has changed (informational → transactional); a refresh won't fix an intent mismatch
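The 90-day checkpoint above is easy to automate once you have daily click counts per URL (e.g. from a GSC export). A sketch, assuming a simple `date → clicks` mapping:

```python
from datetime import date, timedelta

def refresh_impact(daily_clicks: dict, refresh_day: date, window_days: int = 90) -> float:
    """Compare average daily clicks in the window before vs. after a refresh.

    daily_clicks maps date -> clicks; missing days count as zero.
    Returns the fractional change; positive means the update worked.
    """
    before = [daily_clicks.get(refresh_day - timedelta(days=d), 0)
              for d in range(1, window_days + 1)]
    after = [daily_clicks.get(refresh_day + timedelta(days=d), 0)
             for d in range(1, window_days + 1)]
    avg_before = sum(before) / window_days
    avg_after = sum(after) / window_days
    if avg_before == 0:
        return 0.0
    return (avg_after - avg_before) / avg_before
```

A result of 0.5 means the refreshed page averages 50% more daily clicks than before, in line with the recovery ranges cited in the Ahrefs case study below.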
Recommended Tools
- Google Search Console — Primary source for tracking impression and click trends over time; export for decay analysis
- Ahrefs Content Gap & Organic Keywords — Identify what competitors rank for that you don't; essential for refresh gap analysis
- SEMrush Content Audit — Automated content performance scoring and refresh prioritization
- Google Sheets / Looker Studio — Build a custom content decay dashboard by pulling GSC data via API
- Clearscope — Content optimization and freshness scoring for updated pages
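If you build the dashboard from the Search Console API (the Sheets/Looker Studio route above), the per-page query body for the `searchanalytics.query` endpoint looks like this. The authenticated `service` client and site URL in the comment are assumptions you would supply via `googleapiclient` and your own verified property:

```python
def gsc_query_body(start_date: str, end_date: str, row_limit: int = 25000) -> dict:
    # Request body for searchanalytics.query: per-page clicks/impressions
    # over the window, which you append to a month-by-month history.
    return {
        "startDate": start_date,  # "YYYY-MM-DD"
        "endDate": end_date,
        "dimensions": ["page"],
        "rowLimit": row_limit,
    }

# With an authenticated client (assumed to be built elsewhere):
# response = service.searchanalytics().query(
#     siteUrl="https://example.com/",
#     body=gsc_query_body("2025-01-01", "2025-01-31"),
# ).execute()
```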
Frequently Asked Questions
How often should evergreen content be updated?
At minimum, once per year for a substantive review. High-traffic evergreen pages should be reviewed every 6 months. If you notice a 15%+ YoY traffic decline, treat it as urgent and refresh within the quarter.
Does adding a new section count as a substantive update?
Yes — adding a new H2 section with 300+ words of new content, updated statistics, or a new use case is a clear freshness signal. Fixing typos or changing a few sentences is not.
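That 300-word rule of thumb can be checked mechanically before bumping a modified date. A rough sketch using a line-level diff; the threshold is this guide's heuristic, not a Google-defined number:

```python
import difflib

def is_substantive_update(old_text: str, new_text: str, min_new_words: int = 300) -> bool:
    """Rough check: does the new version add enough material to justify a date bump?

    Counts words on lines that were inserted or rewritten relative to the
    old version; typo fixes on existing lines contribute only a few words.
    """
    old_lines = old_text.splitlines()
    new_lines = new_text.splitlines()
    matcher = difflib.SequenceMatcher(a=old_lines, b=new_lines)
    added_words = 0
    for tag, _i1, _i2, j1, j2 in matcher.get_opcodes():
        if tag in ("insert", "replace"):
            added_words += sum(len(line.split()) for line in new_lines[j1:j2])
    return added_words >= min_new_words
```

Wiring this into a publish hook (gate the `dateModified` bump on it) is one way to enforce the "no date changes without content changes" rule from the mistakes list.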
Should I redirect old content or refresh it?
Refresh first if the page has any meaningful backlinks or organic traffic history. Redirect only if the topic has fundamentally changed and a new URL better serves current search intent.
Ahrefs' Content Decay Recovery Program
Ahrefs has publicly shared data showing that systematically refreshing their blog posts — updating statistics, expanding sections, adding new examples, and republishing with updated dates — consistently recovers and often exceeds pre-decay traffic levels. In several documented cases, a refreshed post recovered 50–200% of lost traffic within 60 days of republishing. Their process: quarterly audit of all posts older than 12 months, prioritized by traffic decline, with a structured update checklist. Content refreshes are now a scheduled part of their content calendar, not an ad-hoc activity.