Technical SEO Checklist: 50 Points to Audit
A 50-point technical SEO checklist covering crawlability, indexing, site speed, mobile, structured data, and security. Fix issues that block your rankings.
A technical SEO checklist is the blueprint for making sure search engines can crawl, index, and rank your website properly. You can write the best content on the internet — but if Google can't access it, render it, or understand it, none of that content ranks.
Technical SEO accounts for the infrastructure layer of search optimization. It's the plumbing behind the walls. When it works, nobody notices. When it breaks, everything else fails.
This checklist covers 50 auditable items organized into six categories. Use it as a quarterly audit framework or as a one-time deep dive to diagnose ranking problems.
How to Use This Checklist
Work through each section in order. Crawlability and indexing issues should be fixed first — they're foundational. Speed and mobile come next because they directly affect rankings and user experience. Structured data and security are optimization layers that build on the foundation.
For each item:
- Pass — the item meets best-practice standards
- Fail — the item needs immediate attention
- N/A — not applicable to your site
If you need a broader view of how technical SEO fits into your overall strategy, start with our complete SEO for small business guide.
Section 1: Crawlability (Items 1-10)
Crawlability determines whether search engines can discover and access your pages. If Google's bots can't crawl your site, nothing else matters.
1. robots.txt Is Properly Configured
Your robots.txt file tells search engines which pages to crawl and which to skip. Check for:
- File exists at `yoursite.com/robots.txt`
- Important pages are not accidentally blocked (`Disallow: /` blocks everything)
- CSS and JavaScript files are not blocked (Google needs them to render pages)
- Staging or admin areas are blocked
- XML sitemap URL is referenced
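You can sanity-check these rules with Python's built-in `urllib.robotparser`. The robots.txt content and URLs below are placeholders for illustration, not a complete audit:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: staging and admin blocked, assets crawlable,
# sitemap referenced.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /staging/
Sitemap: https://yoursite.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Public pages and render-critical assets must be fetchable...
assert parser.can_fetch("Googlebot", "https://yoursite.com/services/seo")
assert parser.can_fetch("Googlebot", "https://yoursite.com/assets/main.css")
# ...while staging and admin areas stay blocked.
assert not parser.can_fetch("Googlebot", "https://yoursite.com/admin/login")
```

Running this against your real robots.txt (fetched with any HTTP client) catches accidental `Disallow` rules before Google does.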
2. XML Sitemap Exists and Is Submitted
Your sitemap tells search engines about every page you want indexed.
- Sitemap exists at `yoursite.com/sitemap.xml`
- Submitted to Google Search Console and Bing Webmaster Tools
- Contains only canonical, indexable URLs (no 404s, no redirected pages, no noindexed pages)
- Updates automatically when pages are added or removed
- Stays under 50,000 URLs per sitemap file (use sitemap index for larger sites)
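A sitemap generator can enforce the 50,000-URL cap automatically. This sketch uses Python's standard `xml.etree.ElementTree` and placeholder URLs:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Serialize canonical URLs into sitemap XML, enforcing the 50,000-URL cap."""
    if len(urls) > 50_000:
        raise ValueError("Over 50,000 URLs: split files under a sitemap index")
    urlset = ET.Element(f"{{{NS}}}urlset")
    for url in urls:
        entry = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(entry, f"{{{NS}}}loc").text = url
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://yoursite.com/",
    "https://yoursite.com/services/seo",
])

# Round-trip: the file parses and contains exactly the URLs we supplied.
locs = [el.text for el in ET.fromstring(sitemap).iter(f"{{{NS}}}loc")]
assert locs == ["https://yoursite.com/", "https://yoursite.com/services/seo"]
```

Feed it only canonical, indexable URLs; filtering out 404s, redirects, and noindexed pages happens upstream of this function.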
3. No Orphan Pages
Orphan pages have no internal links pointing to them. Google discovers pages primarily through links — orphan pages may never be crawled.
- Every important page is linked from at least one other page
- Check Google Search Console > Pages for indexed pages not in your sitemap
- Use Screaming Frog or Sitebulb to identify orphan pages
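Given a crawl export, orphan detection is simple set arithmetic. The page paths and link pairs below are hypothetical:

```python
def orphan_pages(all_pages, internal_links):
    """internal_links is a list of (source, target) pairs from a crawl export."""
    linked = {target for _source, target in internal_links}
    # The homepage is the crawl entry point, so it never counts as orphaned.
    return sorted(set(all_pages) - linked - {"/"})

pages = ["/", "/services", "/services/seo", "/old-landing-page"]
links = [("/", "/services"), ("/services", "/services/seo")]

# /old-landing-page has no inbound internal links, so it surfaces as an orphan.
assert orphan_pages(pages, links) == ["/old-landing-page"]
```

In practice you would populate `all_pages` from your sitemap or CMS and `internal_links` from a Screaming Frog export.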
4. Crawl Budget Is Not Wasted
For larger sites (1,000+ pages), Google allocates a crawl budget — a limit on how many pages it crawls per visit.
- Remove or noindex thin/duplicate/low-value pages
- Fix redirect chains (more than one redirect in sequence)
- Block crawling of URL parameters that create duplicate pages
- Minimize server errors (5xx responses waste crawl budget)
5. No Redirect Chains or Loops
A redirect chain happens when Page A redirects to Page B, which redirects to Page C. Each hop loses link equity and slows crawling.
- Maximum one redirect between any two URLs
- No circular redirect loops (A → B → A)
- Update internal links to point directly to final destinations
- Audit with Screaming Frog > Response Codes > Redirect Chains
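Given a map of redirects (from a crawl or your server config), chains and loops can be flagged programmatically. A minimal sketch with made-up paths:

```python
def resolve(redirects, url, max_hops=10):
    """Follow a redirect map {from: to}; return (final_url, hop_count)."""
    seen, hops = {url}, 0
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen:
            raise ValueError(f"Redirect loop through {url}")
        if hops > max_hops:
            raise ValueError("Redirect chain too long")
        seen.add(url)
    return url, hops

# /old -> /interim -> /new is a 2-hop chain: point /old straight at /new.
redirects = {"/old": "/interim", "/interim": "/new"}
final, hops = resolve(redirects, "/old")
assert (final, hops) == ("/new", 2)
assert hops > 1  # more than one redirect in sequence: flag for fixing
```

Any URL where `hops > 1` should be collapsed so the first URL redirects directly to the final destination.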
6. Clean URL Structure
URLs should be human-readable, keyword-rich, and logically organized.
- Use hyphens, not underscores (`/technical-seo`, not `/technical_seo`)
- Keep URLs short and descriptive
- Avoid dynamic parameters when possible (`/services/seo`, not `/page?id=42&cat=3`)
- Use lowercase only
- Maintain consistent trailing slash usage (either always or never)
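These conventions can be codified in a small normalizer so every generated URL follows the same rules. A sketch using only the standard library (this example drops trailing slashes; flip the logic if your site always keeps them):

```python
from urllib.parse import urlsplit, urlunsplit

def normalize(url):
    """Lowercase host and path, swap underscores for hyphens,
    and drop the trailing slash (keeping the bare root)."""
    parts = urlsplit(url)
    path = parts.path.lower().replace("_", "-")
    if len(path) > 1 and path.endswith("/"):
        path = path.rstrip("/")
    return urlunsplit((parts.scheme, parts.netloc.lower(), path, parts.query, ""))

assert normalize("https://YourSite.com/Technical_SEO/") == "https://yoursite.com/technical-seo"
```

Run it over your internal link inventory to spot URLs that deviate from the convention.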
7. Internal Links Use Crawlable HTML
Google can only follow standard HTML anchor tags. JavaScript-based navigation, onclick events, and dynamically loaded links may not be crawled.
- All navigation uses `<a href="...">` tags
- No critical links behind JavaScript click handlers
- Dropdown menus and mobile navigation are HTML-based
8. Server Response Time Is Under 200ms
Slow server response (Time to First Byte / TTFB) delays crawling and hurts user experience.
- Measure TTFB with WebPageTest or Chrome DevTools
- Target under 200ms for most pages
- Common fixes: upgrade hosting, enable server-side caching, optimize database queries, use a CDN
9. No Soft 404 Errors
Soft 404s are pages that display "not found" content but return a 200 HTTP status code. Google wastes crawl budget on these.
- Deleted pages should return a proper 404 status code
- Empty pages should return 404 or be redirected to relevant content
- Check Google Search Console > Pages for soft 404 reports
10. Pagination Is Properly Handled
For paginated content (product listings, blog archives):
- Each page is crawlable and has a unique URL
- Use `rel="next"` and `rel="prev"` link tags (still useful for Bing)
- Or use "View All" pages where practical
- Paginated pages should have self-referencing canonical tags
Section 2: Indexing (Items 11-20)
Indexing determines which of your crawled pages actually appear in Google's search results.
11. Important Pages Are Indexed
Check Google Search Console > Pages to see which pages are indexed and which are excluded.
- All key pages (homepage, service pages, product pages, blog posts) are indexed
- Search `site:yourdomain.com` in Google to see what's actually in the index
- Pages you want indexed should not have `noindex` tags
12. Noindex Tags Are Used Correctly
Use noindex to keep pages out of search results:
- Admin/login pages — noindex
- Thank-you/confirmation pages — noindex
- Internal search results pages — noindex
- Tag/category archive pages with thin content — noindex
- Do NOT noindex service pages, blog posts, or product pages
13. Canonical Tags Are Implemented
Canonical tags tell Google which version of a page is the "master" copy when similar content exists at multiple URLs.
- Every page has a self-referencing canonical tag
- Duplicate content pages point canonical to the preferred version
- HTTP pages canonical to HTTPS versions
- www and non-www versions resolve to one (via redirect AND canonical)
- Paginated pages have self-referencing canonicals (not pointing to page 1)
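To audit canonicals at scale, you can extract the tag with Python's built-in `html.parser`. A minimal sketch with a hypothetical URL (note it only reads the HTML tag; canonicals set via the `Link` HTTP header need a separate check):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Record the href of the first rel="canonical" link tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attr = dict(attrs)
        if tag == "link" and attr.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attr.get("href")

finder = CanonicalFinder()
finder.feed('<head><link rel="canonical" href="https://yoursite.com/page"></head>')

# A self-referencing canonical matches the URL the page was fetched from.
assert finder.canonical == "https://yoursite.com/page"
```

Comparing `finder.canonical` against the fetched URL for every page quickly surfaces missing or misdirected canonicals.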
14. No Duplicate Content Issues
Duplicate content confuses Google about which page to rank.
- Check for www vs. non-www duplicates
- Check for HTTP vs. HTTPS duplicates
- Check for trailing slash vs. non-trailing slash duplicates
- URL parameters should not create duplicate pages
- Printer-friendly page versions should canonical to the main version
15. Hreflang Tags for Multi-Language Sites
If your site serves multiple languages or regions:
- `hreflang` tags are present on all pages
- Tags reference all language versions, including self
- An `x-default` version is specified
- Hreflang tags are reciprocal (if EN points to FR, FR must point back to EN)
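The self-reference and reciprocity rules lend themselves to an automated check. A sketch over a hypothetical two-language site:

```python
def hreflang_errors(pages):
    """pages maps each URL to its hreflang map {lang_code: target_URL}."""
    errors = []
    for url, alternates in pages.items():
        if url not in alternates.values():
            errors.append(f"{url}: no self-referencing hreflang")
        for lang, target in alternates.items():
            # Reciprocity: the target page must list this URL as an alternate.
            if url not in pages.get(target, {}).values():
                errors.append(f"{url} -> {target} ({lang}) not reciprocated")
    return errors

pages = {
    "https://yoursite.com/en/": {"en": "https://yoursite.com/en/",
                                 "fr": "https://yoursite.com/fr/"},
    # The FR page forgot to point back at EN:
    "https://yoursite.com/fr/": {"fr": "https://yoursite.com/fr/"},
}
assert hreflang_errors(pages) == [
    "https://yoursite.com/en/ -> https://yoursite.com/fr/ (fr) not reciprocated"
]
```

Fixing the FR page so it lists the EN alternate makes the error list empty.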
16. Meta Robots Tags Are Correct
Check that meta robots tags aren't accidentally restricting indexing:
- No unexpected `noindex` on important pages
- No `nofollow` on internal links (this wastes link equity)
- `X-Robots-Tag` HTTP headers aren't blocking indexing
- Check both HTML meta tags AND HTTP headers
17. Page Titles Are Unique
Every page should have a unique title tag. Duplicate titles make Google guess which page to rank.
- No two pages share the same title tag
- Titles are descriptive and include relevant keywords
- Titles are under 60 characters
- Screaming Frog > Page Titles > Duplicate to identify issues
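If you have a crawl export of URL-to-title pairs, duplicates fall out of a simple grouping. The URLs and "Acme" titles below are made up:

```python
from collections import defaultdict

def duplicate_titles(titles_by_url):
    """Group URLs by title; return only titles used on more than one page."""
    groups = defaultdict(list)
    for url, title in titles_by_url.items():
        groups[title.strip().lower()].append(url)
    return {t: urls for t, urls in groups.items() if len(urls) > 1}

crawl = {
    "/services/seo": "SEO Services | Acme",
    "/services/seo-audit": "SEO Services | Acme",  # duplicate title
    "/about": "About Us | Acme",
}
dupes = duplicate_titles(crawl)
assert dupes == {"seo services | acme": ["/services/seo", "/services/seo-audit"]}
```

The same grouping works for meta descriptions in the next item.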
18. Meta Descriptions Are Unique
While not a ranking factor, unique meta descriptions improve click-through rates.
- Every page has a unique meta description
- Descriptions are 150-160 characters
- Descriptions include a call to action
- No duplicate descriptions across pages
19. Image Alt Text Is Present
Alt text helps Google understand images and is essential for accessibility.
- All images have descriptive alt attributes
- Alt text describes the image content, not just keywords
- Decorative images use empty alt (`alt=""`)
- No keyword stuffing in alt attributes
20. JavaScript Content Is Rendered and Indexed
If your site uses JavaScript frameworks (React, Angular, Vue), Google may not see all your content.
- Test rendering with Google Search Console > URL Inspection > Rendered Page
- Server-side rendering (SSR) or static site generation (SSG) is preferred for SEO-critical content
- Critical content is not loaded lazily below the fold without proper triggers
Our web development team builds sites with Next.js, which handles SSR and SSG natively — eliminating JavaScript rendering issues.
Section 3: Site Speed (Items 21-30)
Page speed is a confirmed Google ranking factor. Core Web Vitals — LCP, INP, and CLS — directly affect rankings since the Page Experience update.
21. Largest Contentful Paint (LCP) Under 2.5s
LCP measures how long it takes for the largest visible element (usually a hero image or headline) to load.
- Target: under 2.5 seconds
- Measure: PageSpeed Insights, Chrome DevTools, or CrUX data
- Common fixes: optimize hero images, preload critical resources, upgrade hosting
22. Interaction to Next Paint (INP) Under 200ms
INP measures responsiveness — how quickly the page responds to user interactions.
- Target: under 200ms
- Common fixes: break up long JavaScript tasks, defer non-critical JS, minimize main thread blocking
23. Cumulative Layout Shift (CLS) Under 0.1
CLS measures visual stability — elements shouldn't jump around as the page loads.
- Target: under 0.1
- Common fixes: set explicit width/height on images and videos, reserve space for ads/embeds, avoid injecting content above existing content
24. Images Are Optimized
Images are typically the heaviest assets on any web page.
- Use WebP or AVIF format (30-50% smaller than JPEG/PNG)
- Implement responsive images with `srcset`
- Lazy-load images below the fold
- Set explicit width and height attributes (prevents CLS)
- Compress images to appropriate quality (80% is usually indistinguishable from 100%)
25. CSS and JavaScript Are Minified
Removing whitespace and comments from CSS/JS files reduces file size.
- CSS and JS files are minified in production
- Unused CSS is removed (PurgeCSS or similar)
- Critical CSS is inlined in the `<head>`
- Non-critical CSS is loaded asynchronously
26. Browser Caching Is Enabled
Caching lets returning visitors load your site faster by reusing previously downloaded assets.
- Static assets have `Cache-Control` headers (target: 1 year for versioned assets)
- HTML pages use shorter cache times (or `no-cache` with `ETag` validation)
- Service worker caching for repeat visitors (if applicable)
27. CDN Is Configured
A Content Delivery Network serves your files from servers closest to each visitor.
- Static assets (images, CSS, JS) served from CDN
- CDN covers your target geographic regions
- CDN is configured with proper caching headers
28. No Render-Blocking Resources
Resources that block rendering delay the time to first paint.
- Non-critical CSS is loaded asynchronously (the `media="print"` + `onload` pattern)
- Non-critical JavaScript uses `defer` or `async` attributes
- Third-party scripts are loaded after critical content
29. Font Loading Is Optimized
Custom fonts can cause invisible text (FOIT) or layout shifts (FOUT).
- Use `font-display: swap` to show fallback text while fonts load
- Preload critical font files (`<link rel="preload" as="font">`)
- Host fonts locally instead of relying on external services
- Subset fonts to include only needed characters
30. Server Compression Is Enabled
Gzip or Brotli compression reduces transfer size by 60-80%.
- Brotli compression enabled (better than Gzip)
- HTML, CSS, and JS responses are compressed
- Test with `curl -H "Accept-Encoding: br" -I https://yoursite.com`
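You can see why compression matters with a quick experiment. Brotli isn't in Python's standard library, so this sketch uses the built-in `gzip` module on an invented, repetitive HTML-like payload:

```python
import gzip

# A repetitive HTML-like payload; real markup compresses similarly well.
html = ("<div class='card'><h2>Title</h2><p>Body copy here.</p></div>" * 200).encode()

compressed = gzip.compress(html, compresslevel=6)
savings = 1 - len(compressed) / len(html)

# Text-heavy responses routinely shrink by well over 60%.
assert savings > 0.6
```

Brotli typically does a few percent better than Gzip at comparable CPU cost, which is why the checklist prefers it when your server supports it.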
Section 4: Mobile (Items 31-37)
Google uses mobile-first indexing — the mobile version of your site is what gets ranked.
31. Site Is Fully Responsive
Every page should work correctly at all screen sizes.
- Test at 320px, 375px, 414px, 768px, 1024px, and 1440px widths
- No horizontal scrolling
- No content hidden on mobile that's visible on desktop (Google indexes mobile content)
32. Touch Targets Are Properly Sized
Buttons and links must be large enough to tap accurately.
- Minimum 44x44px touch target size
- At least 8px spacing between adjacent touch targets
- No overlapping clickable elements
33. Viewport Meta Tag Is Present
Without this tag, mobile browsers render pages at desktop width and scale down.
- `<meta name="viewport" content="width=device-width, initial-scale=1">` is present
- No `maximum-scale=1` or `user-scalable=no` (these block accessibility zoom)
34. Text Is Readable Without Zooming
- Body text is at least 16px on mobile
- Line height is at least 1.5x font size
- Adequate contrast between text and background (minimum 4.5:1 ratio)
35. Mobile Page Speed Is Fast
Mobile connections are slower than desktop. Optimize aggressively.
- Mobile LCP under 2.5 seconds on 4G connection
- Total page weight under 2MB (ideally under 1MB)
- Reduce number of HTTP requests (combine files, use sprites)
36. No Intrusive Interstitials
Google penalizes mobile pages with popup overlays that block content.
- No full-screen popups on page load
- No interstitials that block content before the user can read it
- Cookie consent banners should not cover the full screen
- Exceptions: legal requirements (age verification, cookie consent)
37. Mobile Navigation Works Correctly
- Hamburger menu opens and closes reliably
- All navigation items are accessible
- Phone numbers are tap-to-call links
- Forms are easy to fill on mobile (proper input types, auto-complete)
Section 5: Structured Data (Items 38-44)
Structured data helps Google understand your content and can earn rich results (star ratings, FAQ dropdowns, breadcrumbs, etc.) in search results.
38. Organization Schema Is Implemented
Add Organization schema to your homepage:
- Business name, logo, URL, contact information
- Social media profile links
- Founding date and description
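Organization schema is usually embedded as a JSON-LD script tag. This sketch builds one with Python's `json` module; every value is a placeholder to substitute with your real business details:

```python
import json

# Placeholder values; substitute your real business details.
organization = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Your Business",
    "url": "https://yoursite.com",
    "logo": "https://yoursite.com/logo.png",
    "description": "What the business does, in one sentence.",
    "foundingDate": "2015",
    "sameAs": ["https://www.linkedin.com/company/your-business"],
}

# Embed this snippet in the homepage <head>.
snippet = f'<script type="application/ld+json">{json.dumps(organization)}</script>'

assert json.loads(json.dumps(organization))["@type"] == "Organization"
```

Generating the markup from structured data (rather than hand-editing templates) keeps it valid as details change.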
39. Breadcrumb Schema Is Present
Breadcrumbs improve navigation and can appear in search results.
- Breadcrumb structured data matches visible breadcrumbs on the page
- Every page (except homepage) has breadcrumbs
- Hierarchy is logical (Home > Services > SEO Growth)
40. FAQ Schema on Relevant Pages
FAQ schema marks up your questions and answers for search engines. Note that since 2023, Google shows FAQ rich results only for a limited set of authoritative sites, but the markup still helps search engines understand your content (and Bing still displays it).
- FAQ structured data on pages with FAQ content
- Questions and answers match visible page content exactly
- Limited to 3-5 questions per page for best results
41. Product Schema for E-commerce
If you sell products:
- Product name, description, image, price, currency, availability
- Aggregate rating and review count
- SKU and brand information
- Offer validity dates
42. Article Schema for Blog Posts
Every blog post should have Article structured data:
- Headline, description, date published, date modified
- Author information
- Publisher (organization) information
- Featured image
43. Local Business Schema
For businesses serving a geographic area (covered in detail in our local SEO guide):
- Business type, name, address, phone, hours
- Geographic coordinates
- Service area (for service-area businesses)
44. Validate All Structured Data
- Test every schema implementation with Google's Rich Results Test
- Monitor Google Search Console > Enhancements for errors
- Fix errors within 48 hours — broken schema can cost you rich result eligibility
- Re-validate after any page template changes
Section 6: Security and HTTPS (Items 45-50)
Security is a ranking factor and a trust signal. HTTPS is non-negotiable.
45. HTTPS Is Enabled Site-Wide
- Valid SSL/TLS certificate installed
- Certificate is not expired
- All pages load via HTTPS
- No mixed content (HTTP resources loaded on HTTPS pages)
46. HTTP to HTTPS Redirects Are in Place
- All HTTP URLs 301 redirect to HTTPS
- Redirects are server-level (not JavaScript-based)
- Internal links point to HTTPS versions directly
47. Security Headers Are Configured
Security headers protect your visitors and signal a well-maintained site.
- `Strict-Transport-Security` (HSTS) — forces HTTPS connections
- `X-Content-Type-Options: nosniff` — prevents MIME type sniffing
- `X-Frame-Options: DENY` (or `SAMEORIGIN`) — prevents clickjacking
- `Content-Security-Policy` — controls which resources can load
- `Referrer-Policy` — controls referrer information sent to other sites
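A small script can verify these headers against any response you fetch. The example response dict below is invented; in a real audit you would populate it from your HTTP client:

```python
REQUIRED_HEADERS = {
    "Strict-Transport-Security",
    "X-Content-Type-Options",
    "X-Frame-Options",
    "Content-Security-Policy",
    "Referrer-Policy",
}

def missing_security_headers(response_headers):
    """Return required headers absent from a response, case-insensitively."""
    present = {name.title() for name in response_headers}
    return sorted(REQUIRED_HEADERS - present)

# Example response with only two of the five headers set.
headers = {
    "strict-transport-security": "max-age=63072000; includeSubDomains",
    "x-content-type-options": "nosniff",
}
assert missing_security_headers(headers) == [
    "Content-Security-Policy", "Referrer-Policy", "X-Frame-Options",
]
```

Run it against both your homepage and a few deep pages; headers set at the CDN level sometimes miss origin-served routes.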
48. No Malware or Security Warnings
- Site is not flagged in Google Safe Browsing
- Check Google Search Console > Security Issues for warnings
- Regular malware scans if using WordPress or other CMS
- All plugins and dependencies are up to date
49. DNSSEC Is Enabled
DNSSEC (Domain Name System Security Extensions) protects against DNS spoofing.
- Enabled through your domain registrar
- Verify with a DNSSEC analyzer tool
- Not strictly an SEO factor, but protects site integrity
50. Backup and Recovery Plan Exists
While not an SEO factor, losing your site means losing all SEO equity.
- Automated daily backups
- Backups stored off-server
- Tested recovery procedure
- Database and file system both included
Running Your Audit
The tools you need to complete this checklist:
Free tools:
- Google Search Console — indexing, crawl errors, Core Web Vitals
- Google PageSpeed Insights — speed metrics and recommendations
- Google Rich Results Test — structured data validation
- Lighthouse mobile audit in Chrome DevTools — mobile rendering check (Google's standalone Mobile-Friendly Test was retired in 2023)
- Chrome DevTools — manual inspection, Lighthouse audits
Professional tools (free tiers available):
- Screaming Frog SEO Spider — crawls up to 500 URLs free
- Ahrefs Webmaster Tools — backlink analysis, site audit
- GTmetrix — detailed speed analysis
Our recommendation: Run this audit quarterly. Technical issues creep in with every site update, plugin installation, and content change. A quarterly check catches problems before they compound.
If this checklist reveals more issues than your team can handle internally, our technical SEO services cover the full audit, fix implementation, and ongoing monitoring. We work alongside your development team — or our web development team can handle the implementation directly.
Frequently Asked Questions
How often should I run a technical SEO audit?
At minimum, quarterly. Run an additional audit after any major site change: redesigns, CMS migrations, hosting changes, or large content updates. Set up automated monitoring in Google Search Console to catch critical issues (crawl errors, security issues, Core Web Vitals drops) between full audits.
Which technical SEO issues have the biggest impact on rankings?
Crawlability issues are the highest priority — if Google can't access your pages, nothing else matters. After that: indexing errors (accidental noindex tags, canonical issues), Core Web Vitals failures (especially LCP), and mobile usability problems. Fix issues in the order they appear in this checklist — it's ordered by impact.
Can I fix technical SEO issues without a developer?
Some items are non-technical: submitting sitemaps, writing alt text, fixing meta descriptions. But most technical SEO work requires access to server configuration, HTML templates, and sometimes code changes. If you're using a CMS like WordPress, plugins like Yoast or Rank Math handle some items. For custom sites built with frameworks like React or Next.js, you'll likely need a developer.
What's the difference between technical SEO and on-page SEO?
On-page SEO focuses on content: keywords, title tags, headings, internal links — the stuff users see. Technical SEO focuses on infrastructure: crawlability, site speed, mobile rendering, structured data, security — the stuff behind the scenes. Both are essential. Think of technical SEO as the foundation and on-page SEO as the building on top.
Need a professional technical SEO audit? Reach out to our team for a comprehensive analysis of your site's technical health. We'll identify every issue, prioritize by impact, and build a fix plan with clear timelines.
Need Help With Your Project?
Our team of experts is ready to help you build, grow, and succeed. Get a free consultation today.
Book Free Consultation