The Web Developer's SEO Cheat Sheet
You live and breathe SEO and can’t understand why others don’t share
the same sense of passion for search optimisation as you.
Sound familiar?
If you are struggling to get your web developers to see why a constant stream of changes
and adjustments is needed to stay one step ahead of the game when it comes to effective
SEO, we have got exactly what you need.
META DESCRIPTION TAG

<head>
<meta name="description" content="This is an example.">
</head>

BEST PRACTICES
• Aim for 160 characters, including spaces.
• Make sure each description is unique.
• Focus on high-quality writing for maximum CTR.

WEBMASTER TOOLS

Webmaster tools allow you to see how your site is being viewed by search engines such as
Google and can help you to uncover any areas that need fixing. You can choose to block
whatever bots you don't want to be able to crawl your site using your robots.txt file.
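Most webmaster tools (Google Search Console, for example) ask you to verify ownership of
your site before they show you crawl data. One common method is an HTML meta tag placed in
the <head>; the token below is a placeholder, the real value comes from the tool itself:

<meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN" />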
URLS

URLs are a small ranking factor in SEO that helps search engines to determine a particular
web page's relevancy to a search query. There are a number of factors related to URLs that
can affect your ranking, with two of the most important being length and the addition of
keywords.

Common URL Elements
1. Protocol
2. Subdomain
3. Root domain
4. Top-level domain
5. Subfolder/path
6. Page
7. Parameter
8. Named anchor

BEST PRACTICES
• Aim for between 50-60 characters.
• Add keywords that are relevant to the page's topic.
• Avoid the use of parameters.
• Do not use spaces, underscores, or other special characters.
• Try to place content on the same subdomain for enhanced authority.
• Use the HTTPS protocol.
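To make the elements above concrete, here is a hypothetical URL with each part labelled
(the domain and values are purely illustrative):

https://blog.example.com/recipes/dinner.html?ref=newsletter#reviews

1. Protocol: https://
2. Subdomain: blog
3. Root domain: example
4. Top-level domain: .com
5. Subfolder/path: /recipes/
6. Page: dinner.html
7. Parameter: ?ref=newsletter
8. Named anchor: #reviews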
ROBOTS.TXT

Location: https://example.com/robots.txt

User-agent: googlebot
Disallow: /example.html
Sitemap: https://example.com/sitemap.xml

BEST PRACTICES
• Ensure all important pages are crawlable.
• Don't block your site's JavaScript and CSS files.
• Use proper capitalisation of directory, sub-directory, and file names.
• Add your sitemap's location to your robots.txt file.

X-ROBOTS-TAG

Location: Sent in the HTTP headers

X-Robots-Tag: noindex

BEST PRACTICES
• X-Robots-Tag can remove URLs from search results.
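If you need to set the header server-side, a minimal sketch assuming an Apache server with
mod_headers enabled (the .pdf pattern is just an illustration) would be:

<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>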
META ROBOTS

Location: In the HTML <head>

<meta name="robots" content="[PARAMETER]" />

BEST PRACTICES
• Meta Robots can remove URLs from search results.
IMPORTANT PARAMETERS
• Noindex
• Nofollow
• Noarchive
• Or a combination of noindex and nofollow

ROBOTS EXCLUSION STANDARD

The robots exclusion standard, also known as robots.txt, is a standard used by websites to
communicate with web crawlers. More specifically, it tells web robots where not to scan on
a website. This can help your SEO efforts as it cuts down the amount of time it takes a
search engine bot to crawl a site, helping it to stay within its crawl budget and, in turn,
improve a site's potential ranking.
SITEMAP SYNTAX

DEFAULT LOCATION: https://example.com/sitemap.xml

A sitemap is an integral part of any website and is needed for both users and search
engines. Essentially the structure of your website, it is similar to a book's contents
page, but within the sections are links.

• sitemap: parent tag for each sitemap
• loc: location of the sitemap
• lastmod: the last modified date
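Putting those tags together, a minimal sketch of a sitemap index file would look like this
(the child sitemap file name and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-posts.xml</loc>
    <lastmod>2024-01-15</lastmod>
  </sitemap>
</sitemapindex>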
STRUCTURED DATA

Rich Snippets, which are sometimes called Rich Results, are Google search results that
display additional data which is pulled from the Structured Data found in a web page's
HTML. Rich Snippets are more eye-catching than normal search results, which can increase
your organic click-through rate.

The use of structured data can help a website to stand out in SERPs. Stick to schema.org
for best results.

<script type="application/ld+json">
{
  "@context": "http://schema.org/",
  "@type": "Review",
  "reviewBody": "The restaurant has great ambiance.",
  "itemReviewed": {
    "@type": "Restaurant",
    "name": "Fine Dining Establishment"
  },
  "reviewRating": {
    "@type": "Rating",
    "ratingValue": 5,
    "worstRating": 1,
    "bestRating": 5,
    "reviewAspect": "Ambiance"
  }
}
</script>

Common Structured Data Types:
• Local business
• Product
• FAQ page
• Article
• Person
• Recipes
• How to
• QAPage
BREADCRUMBS

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "item": {
        "@id": "http://example.com/dinner",
        "name": "Dinner"
      }
    },
    <---Repeat markup for additional list items--->
  ]
}
</script>

SECURITY

Making sure your website is secure is crucial for SEO, as one of Google's top ranking
factors is website security.

BEST PRACTICES
• Install an SSL certificate.
• Use strong passwords.
• Get security plug-ins installed.
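Once an SSL certificate is installed, you will usually also want to redirect all HTTP
traffic to HTTPS. A minimal sketch, assuming an Apache server with mod_rewrite enabled
(placed in .htaccess):

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]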
JAVASCRIPT

JavaScript is a text-based programming language that allows you to make web pages
interactive for enhanced user engagement.

SITE PERFORMANCE

The performance of your site is intrinsically linked to your customer experience and is,
therefore, crucial if you want to stand out from your competitors. There are several
different aspects of site performance, and the following tools can help you measure them:

BEST PRACTICES
• Lighthouse: developers.google.com/web/tools/lighthouse
• GTmetrix: gtmetrix.com
• WebPageTest: webpagetest.org