Updated: September 2, 2022.
The essential on-page SEO checklist (70+ items) that every SEO should always have at hand.
Thanks to this on-page SEO checklist, you will:
- get to know 70+ on-page SEO elements that are essential if you want your site to achieve its full SEO potential,
- discover several SEO best practices for each of the on-page SEO elements discussed,
- learn a total of 500+ pro SEO tips that come from my real-life experience with SEO,
- and become a better and smarter SEO.
Let’s get started!
On-Page SEO Checklist
The purpose of this on-page SEO checklist is to ensure that you’ve taken care of almost every on-page SEO element so that your site can achieve its full SEO potential.
This on-page SEO checklist will also be beneficial for people who are midway through their on-page SEO or have already completed it. By going through this checklist, they can make sure they have not missed any steps.
Ready? Let’s get started!
⚡ Do not miss my website redesign SEO checklist that contains 20+ SEO elements you need to remember about when redesigning your site.
0. Essential SEO Tools
To be able to monitor and measure the SEO progress of a site, you need to make sure it uses the most important SEO tools.
- The essential SEO tools that each website should use are Google Analytics (GA) and Google Search Console (GSC). Microsoft Clarity is also a useful tool for monitoring user behavior.
- GA allows you to monitor and analyze your users and their behavior.
- GSC lets you analyze the performance of your site in search.
- These tools require manual verification by placing the verification code on the site (usually in the <head> section).
- In the case of Google Search Console and domain verification, you will need to add a TXT DNS record.
- The chances are that existing sites already use these tools. Check if that is the case. If not, make sure to verify GA and GSC so that they can be used.
1. XML Sitemap
An XML sitemap allows Googlebot and other search engine crawlers to discover the web pages of the site.
- XML sitemaps are essential for large websites that have more than a few thousand web pages.
- Having an XML sitemap is also a good practice for smaller websites. I recommend that every website have one.
- An XML sitemap should contain only indexable and canonical URLs.
- An XML sitemap should not use the deprecated <priority> and <changefreq> parameters. Google ignores these parameters.
- Ideally, an XML sitemap should also contain images. A separate XML image sitemap is also OK.
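Putting the points above together, a minimal XML sitemap with one URL and one image (the domain and paths are placeholders) could look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.sitemaps.org/schemas/sitemap-image/1.1">
  <url>
    <!-- Only indexable, canonical URLs belong here -->
    <loc>https://www.example.com/blog/on-page-seo/</loc>
    <lastmod>2022-09-02</lastmod>
    <!-- An image used on this page (optional but recommended) -->
    <image:image>
      <image:loc>https://www.example.com/images/on-page-seo.png</image:loc>
    </image:image>
  </url>
</urlset>
```

Note that there is no <priority> or <changefreq> here, since Google ignores both.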
⚡ Check my guide on how to find the sitemap of a website if you have problems localizing the XML sitemap.
2. Robots.txt And Crawlability
The robots.txt file is the first place a crawler visits when crawling the site. The file informs the crawler whether or not it should crawl specific website resources.
- The website does not have to have the robots.txt file. However, it is a good practice for a website to have one.
- Acceptable (successful) responses for robots.txt include HTTP 200 (OK) or 403/404/410 (file does not exist).
- Unacceptable (unsuccessful) robots.txt responses include HTTP 429/5XX. These responses make it impossible for the crawler to crawl the website.
- Robots.txt should indicate the address of an XML sitemap.
- Robots.txt should be valid and free of any syntax errors.
- Robots.txt should not block the resources that should be indexed. It does not prevent a resource from being indexed.
- Robots.txt is a great way to optimize the crawl budget for huge websites (with millions of web pages).
- You can use Google's robots.txt Tester to check if there are any syntax warnings or logic errors.
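For illustration, a simple and valid robots.txt that blocks low-value paths and indicates the XML sitemap could look like this (the domain and paths are placeholders):

```
User-agent: *
Disallow: /cart/
Disallow: /search/

Sitemap: https://www.example.com/sitemap.xml
```

Remember that Disallow only blocks crawling, not indexing, so use it for crawl budget optimization, not as an indexing control.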
⚡ If you have a WordPress website, check my guide on how to modify robots.txt in WordPress.
3. Indexability
A website or a web page can be visible in organic search only if it is indexable and accessible to Googlebot and other search engine crawlers.
- A website won’t rank if it’s invisible to search engine crawlers.
- An indexable page is one that can appear in Google search. To check if a given web page is indexed by Google, simply type site:domain.com and see if it shows up. If it's not showing up, check for the noindex tag on the page.
- Use Semrush or Screaming Frog SEO Spider to check the indexability of multiple pages (if you want to see which can be crawled and/or indexed).
- Use the SEO Indexability Check Chrome extension to check if one specific web page can be indexed.
- The most common blockers are robots.txt and noindex tags.
- It is not only the noindex tag that can prevent a page from being indexed. The site architecture might also be at fault. If a page is buried deep in the site structure, crawlers might never reach it. Crawl depth can help you identify whether that is the problem: the ideal number is three clicks or fewer from the home page. Screaming Frog SEO Spider will show you the crawl depth.
⚡ Check the list of Google search operators to perform even more advanced searches and check what is in the index.
4. Directives in Robots Meta Tag and X-Robots-Tag
The robots meta tag and X-Robots-Tag inform search engine robots about whether specific resources should be indexed.
- Use Semrush or Screaming Frog SEO Spider to check the values of the robots meta tag and X-Robots-Tag.
- Make sure that these tags block the resources that indeed should be blocked.
- Check if there are indexable website resources that should be blocked by the robots meta tag or X-Robots-Tag.
- Check that the directives given in the robots meta tag, X-Robots-Tag, and robots.txt do not contradict one another.
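As a sketch, a page can be blocked from indexing either with a robots meta tag in its HTML or with the equivalent X-Robots-Tag HTTP header (useful for non-HTML files like PDFs; the header is set in the server configuration):

```html
<!-- In the <head> of the page: do not index, but still follow the links -->
<meta name="robots" content="noindex, follow">

<!-- The equivalent as an HTTP response header sent by the server:
     X-Robots-Tag: noindex -->
```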
5. Rendering
The single best way to check if a search engine robot can see the entire content of a web page is to compare the source and the rendered code.
- Use the Mobile-Friendly Test to check if a page is rendered correctly or if there are any loading issues.
- Compare the source and rendered HTML to make sure there are no differences between the two.
- You can use Screaming Frog SEO Spider to render all web pages of the website. The tool also allows you to compare the source and the rendered HTML.
- You can also check how a page is rendered in Google Search Console. Go to URL inspection and choose LIVE TEST. Click VIEW TESTED PAGE to see how it renders and compare the code.
6. Language Versions
A multilingual or multi-regional website needs to inform search engine robots about the alternate language versions of its web pages.
- Language versions should be clearly divided.
- It is best to use a URL structure that makes it easy to geotarget a site. The possible setups include using a country-specific domain, subdomains with gTLD, or subdirectories with gTLD.
- Hreflang tags should be used on multilingual and multi-regional websites.
- The return links and x-default hreflang attributes should also be used.
- You can use Semrush or Screaming Frog SEO Spider to verify the implementation of hreflang tags.
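A sketch of hreflang tags for an English/German page pair, including the x-default fallback (the URLs are placeholders). The same full set must appear on both language versions, which is what provides the return links:

```html
<link rel="alternate" hreflang="en" href="https://www.example.com/en/page/">
<link rel="alternate" hreflang="de" href="https://www.example.com/de/page/">
<!-- Fallback for users whose language/region is not matched above -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/en/page/">
```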
7. Redirects
Redirects inform search engine crawlers about the new URL address of a given web page or resource.
- Redirects make navigating through the website smooth for both crawlers and users who do not need to come across 404 or other types of error pages when a given URL is not available anymore.
- It is important to use the correct types of redirects.
- 301 (permanent) redirects should be used in the case of permanent changes (i.e. permanent removal of a web page).
- 302 (temporary) redirects should be used to indicate a temporary change of address.
- Multiple redirects (redirect chains) should not be used.
- Meta refresh redirects should not be used at all. They are considered sneaky redirects by Google.
- The best way to check the redirects in bulk is to crawl a site with a website crawler, such as Screaming Frog SEO Spider or Semrush.
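For illustration, assuming an Apache server, a permanent and a temporary redirect can be declared in the .htaccess file like this (the paths are placeholders):

```apache
# 301: the page has moved permanently (passes ranking signals)
Redirect 301 /old-page/ https://www.example.com/new-page/

# 302: the page is only temporarily available elsewhere
Redirect 302 /sale/ https://www.example.com/summer-sale/
```

Make sure the target URL returns 200 (OK) directly, so that no redirect chain is created.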
8. Status Codes
Status codes provide information about the status of the HTTP requests sent by the browser to the server.
- Resources returning 4xx and 5xx error codes cannot be crawled; they may not be indexed or may drop out of the index.
- You can check the status codes in bulk with the help of Semrush or Screaming Frog SEO Spider.
- You want the majority of the web pages to return status 200 (OK).
- The web pages returning 4xx errors should be – ideally – redirected to the resources returning 200.
- 404 pages do not have a negative influence on SEO but they may lead to a bad user experience.
- 5xx status codes indicate problems with the server.
9. Error Page
A website should handle error pages correctly to avoid the creation of soft 404 pages that can be indexed by Google and displayed in search results.
- An error page should always return a 404 status code.
- A website should have an error page that informs the user about the error and has links to the most important web pages of the site.
- A website should have a dedicated error page that has the same layout and design as other web pages.
- Google is pretty good at recognizing and ignoring soft 404 pages but it is still a great practice to avoid them.
- You can use the Link Redirect Trace Chrome extension to quickly check the response code of any web page.
10. Duplicate Content
Duplicate content is not penalized by Google. However, duplication can have a negative effect on the organic visibility of the website.
- Duplicate content is often caused by the incorrect implementation of content sorting and filtering. For example, URLs with filter and/or sorting parameters are indexable and do not have the rel="canonical" pointing to the URL without parameters.
- A site should not be available at both the HTTPS and non-HTTPS versions. The HTTP version should permanently (301) redirect to the HTTPS version.
- The same applies to the non-WWW and WWW versions.
- The web pages of the site should not be accessible at multiple URLs that differ only in letter case.
- Duplicate or near-duplicate pages should be canonicalized, removed, or merged.
- The content of the website should be unique on the Internet.
- Duplicate content is bad for SEO because – for example – if you have a few duplicate or near-duplicate articles, you will be competing with yourself in SERPs. Only one of these articles is likely to be visible in search.
11. Canonical Link Elements
The rel="canonical" element is treated by search engines as a hint that a given URL is the main version that should be indexed.
- Canonical link elements are a great way to avoid content duplication.
- It is a good practice for a website to have canonical URLs defined for each URL.
- Make sure the web pages of your site do not declare multiple (conflicting) canonical link elements.
- Google might choose a canonical address that is different from the one indicated with the rel="canonical" element.
- Use the Google Search Console URL Inspection tool to check the canonical URL selected by Google.
- You can check the declared canonical link elements in bulk with a website crawler, such as Semrush or Screaming Frog SEO Spider.
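For example, a filtered URL can point to its clean canonical version with a single tag in the <head> (the URLs are placeholders):

```html
<!-- On https://www.example.com/shoes/?sort=price&color=red -->
<link rel="canonical" href="https://www.example.com/shoes/">
```

The canonical URL itself should be self-referencing, i.e. https://www.example.com/shoes/ should declare itself as its own canonical.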
12. HTML Code
The only thing that search engine robots can see and understand is the code of the website. Make sure robots like it!
- The better the quality of the code, the easier it is for Google to crawl, render and index the site.
- There is no such thing as an ideal ratio of content to HTML. In general, however, the code should not be excessively bloated compared to the amount of actual content on the page.
- Any unnecessary comments should be removed from the code.
- Scripts should be placed in the <head> or before the closing </body> tag.
13. URL Structure
The URL structure does not have a direct influence on SEO but may help both users and search engine robots better understand the topic of the web page.
- Keep URLs sweet, short, and as informative as possible.
- Use the main keyword in the URL.
- URLs should not contain parameters (e.g. session or user identifiers) that do not influence the content displayed.
- URLs should not contain words in a different language than the language of the website.
14. Schema Markup
The Schema markup is the language of search engine robots. It is a great way to provide additional information about the website and its topics (entities).
- The Schema markup allows search engines to better understand what the website and its web pages are about.
- The Schema markup for rich snippets helps Google return more information and data to the users, like ratings, price, product availability, etc. directly in search results.
- Since rich snippets help the user get the relevant information without even opening the link, the site is more likely to get a click-through.
- You can use Screaming Frog SEO Spider to validate Schema markup across the entire website.
- You can use the Google Rich Results Test to test if rich snippets can appear correctly for your site.
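As a sketch, JSON-LD is the format Google recommends for Schema markup. A minimal Product snippet carrying the rich-snippet data mentioned above (all names and values are placeholders) could look like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Running Shoe",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  },
  "offers": {
    "@type": "Offer",
    "price": "89.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

Paste the snippet into the Google Rich Results Test to confirm it is eligible for rich results.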
15. Website structure
Internal and external links are some of the most important on-page SEO elements. If handled incorrectly, they can make it almost impossible for a website to rank in organic search. If implemented according to the best practices (that I am about to share), they can help the site achieve its full SEO potential.
The most optimal way to build a website is to use a pyramid structure.
- The website structure should be neither too deep (more than 4-5 levels) nor too flat (1-2 levels).
- The most important web pages should be linked from the home page and the navigation menu.
- The most important web pages (parent pages) should then link to the most relevant and important subpages (child pages). Child pages should link back to parent pages.
- Thematically related web pages (articles, posts, product pages, etc.) should link to one another.
- You can use Screaming Frog SEO Spider to analyze the structure and crawl depth of the site.
16. Internal Linking
Internal links let search engine robots crawl the site. They also make it possible for users to navigate the website and discover its web pages.
- Internal links should be helpful to users and relevant.
- The anchor text for your internal links can and should contain the main keyword or its variation.
- You can help slightly boost the ranking of other web pages on your website by linking to them from your high-ranking pages.
- If you are linking multiple times to the same page, place the first link where it’s most likely to be clicked because Google may ignore all future instances of the link.
- Internal links should use preferred (canonical) URLs.
- You can analyze all the internal links and anchor texts in bulk with the help of Screaming Frog SEO Spider.
17. Breadcrumbs
Breadcrumbs are an important element of a good internal linking structure. They allow both users and search engine robots to better understand the structure of the website.
- Small websites do not need to have breadcrumbs but it is still a good practice to have them.
- Breadcrumbs should be implemented with the use of Schema.org so that they can be displayed in SERPs.
- Breadcrumbs should be used consistently across the whole website.
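A breadcrumb trail marked up with Schema.org JSON-LD so it can be displayed in SERPs (the names and URLs are placeholders; per Google's guidelines, the last item may omit the "item" URL because it represents the current page):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog", "item": "https://www.example.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "On-Page SEO" }
  ]
}
</script>
```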
18. Navigation
The navigation menu is one of the most important elements of the site. Not only does it help users navigate throughout the site, but it also informs search engine robots about the most important web pages.
- The main navigation should contain links to the most important web pages of the site.
- The navigation of the site should be implemented in the form of text links and list tags.
- Navigation elements should be visible to search engine robots.
- Navigation usually is NOT a good place to place external links.
- Navigation should work correctly on mobile devices.
- The desktop and mobile versions of the site should have identical navigation elements.
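A navigation implemented with text links and list tags, as recommended above (the labels and URLs are placeholders):

```html
<nav>
  <ul>
    <li><a href="/services/">Services</a></li>
    <li><a href="/blog/">Blog</a></li>
    <li><a href="/contact/">Contact</a></li>
  </ul>
</nav>
```

Plain text links like these are visible to search engine robots, unlike navigation rendered from images or built with scripts that crawlers cannot execute.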
19. External Links
External links are a very powerful on-page SEO element that is totally within your control.
- Linking to high-quality, authoritative websites helps your page appear more credible, well-researched, and informative to Google.
- Try linking to primary sources of information, i.e., case studies, research papers, etc.
- External links to websites with more authority and certain domains (.edu, .org, etc.) can be beneficial in terms of SEO.
- Use “relevant” anchor text, and don’t try to “force-marry” your external link to your target keywords.
- Make sure external links open in a new window/tab to keep the user from jumping away to another website.
- You should not add rel="nofollow" to high-quality external links that are your true recommendations.
20. Link Attributes
It is extremely important to let Google know about different types of links by using the proper link attributes.
- Use the rel="sponsored" or rel="nofollow" attribute for any sponsored, paid, or affiliate links.
- Google is pretty good at identifying affiliate links on a site and ignoring them, but it is still a good practice to mark those links as sponsored and/or nofollow.
- You might consider “cloaking” your affiliate links. If you have a WordPress site, you can do that with the use of a plugin like Thirsty Affiliates. Be careful, though.
- Use the rel="ugc" attribute for links added by users.
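The qualifying attributes in context (the URLs are placeholders); note that values can be combined in one rel attribute:

```html
<!-- Paid or affiliate link -->
<a href="https://shop.example.com/product?aff=123" rel="sponsored">Buy here</a>

<!-- Link you do not want to vouch for -->
<a href="https://example.org/page" rel="nofollow">Source</a>

<!-- Link added by a user, e.g. in a comment -->
<a href="https://user-site.example/" rel="ugc">My site</a>

<!-- Values can be combined -->
<a href="https://shop.example.com/deal?aff=123" rel="sponsored nofollow">Deal</a>
```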
21. Link Visibility
The appearance of links is also an important on-page SEO element.
- Based on your UI preferences, you need to decide whether a link should blend into the text or stand out by being underlined, bolded, or colored differently from the text. Whichever you choose, make sure it's consistent throughout the page.
- Hiding links by – for example – using the white font on the white background is a black hat SEO technique. Don’t do that!
- Making links more noticeable on a page can increase their click-through rate which can transfer more “power” to the linked page.
22. Broken Links
Broken links are not penalized by search engines, but they may have a negative influence on SEO.
- Audit your website/web pages regularly to ensure there are no broken links.
- The best way to check if there are broken links is to use a website crawler, such as Screaming Frog SEO Spider or Semrush.
- Broken links to valuable web pages that have backlinks will have a negative influence on a site and its visibility because the power coming from those backlinks is wasted.
- Broken links also negatively influence the user experience. Users do not like 404 pages!
- The best way to fix a broken link is to either remove it or replace it with a working link that returns status 200 (OK).
23. Low-Value Links
Low-value links can dilute the topical focus of the site and even confuse search engine robots.
- Low-value links include links with the anchor text like “click here” or “read more”.
- The anchor text of any low-value links should be replaced with text that contains some keywords. Of course, I do not mean turning all of those links into exact-match types of links.
- Image links with no ALT attributes are also low-value links that do not pass any information about the linked page.
24. Orphan pages
Orphan or “dead-end” pages are pages that have no internal links pointing to them.
- Search engine robots may not be able to crawl and index orphan pages.
- Examples of “good” orphan pages are “thank you” pages. These pages do not need to be linked from anywhere. Nor should they be indexable.
- You can detect orphan pages with the help of Screaming Frog SEO Spider.
25. Website Speed
The speed of the website influences how search engines and users experience and assess the site.
- Page speed is a (tiny) ranking factor.
- Ideally, a site should load in less than 3 seconds.
- Speed becomes more important if your site is way slower than its competition.
- It is a good idea to optimize your site speed based on the internet speed and devices of your target audience, i.e. use lighter resources and compressed images to help your site load faster.
- Use speed tools, such as Google PageSpeed Insights, Lighthouse, GTmetrix, and WebPageTest to analyze and troubleshoot the speed and performance of your site.
26. Core Web Vitals
Core Web Vitals are a set of 3 metrics that relate to the interactivity, loading performance, and visual stability of the website.
- Ideally, the site should pass all three Core Web Vitals assessments.
- The Largest Contentful Paint (LCP) which is about the loading performance of a page should occur in less than 2.5 seconds.
- The First Input Delay (FID) which is about the interactivity of a page should stay under 100 milliseconds.
- Cumulative Layout Shift (CLS) which is about the visual stability of a site should not exceed 0.1.
⚡ Make sure to check my Core Web Vitals audit with 35+ points to check.
27. Mobile-Friendliness
Every website on the internet should be mobile-friendly.
- Mobile traffic has already exceeded desktop traffic. This means there is no room for websites that are not mobile-friendly.
- When evaluating a web page, Google takes into account only its mobile version (i.e. how the site is displayed on a mobile device).
- You can use the Mobile-Friendly Test to quickly check if a site is mobile-friendly.
⚡ Check my guide on different ways to check if a site is mobile-friendly.
28. HTTPS
HTTPS has been a ranking factor since 2014.
- Your website should have an SSL certificate.
- Browsers mark websites without an SSL certificate as insecure. Users are less and less likely to trust an insecure website.
- When making a transition from HTTP to HTTPS, you need to implement 301 redirects from HTTP to the HTTPS version.
- The site is likely to have mixed content issues if some of its resources are loaded over HTTP.
- You can use a site crawler like Semrush or Screaming Frog SEO Spider to detect mixed content or other HTTPS/HTTP issues on the site.
⚡ Check my guide on the difference between HTTP and HTTPS to learn more.
29. Intrusive interstitials
Intrusive interstitials can lead to a bad user experience or make it almost impossible to use a site and its content.
- Google might penalize interstitials that obstruct the user's ability to read and access the content on a web page.
- Slide-down, slide-up messages, legally enforced interstitials, and the ones that can be easily dismissed or cover a relatively small part of the screen might pass Google’s scrutiny.
⚡ Check my guide to intrusive interstitials to learn more.
30. Safe browsing
Thanks to Google Safe Browsing, the Internet is a relatively safe place.
- Make sure the website has no security issues, or it may not be able to rank organically.
- You can use the Google Safe Browsing check or the Google Search Console Security report to check if a site has security issues.
- If you have a WordPress site, you can protect it with a security plugin like iThemes Security.
⚡ Check my guide to Google safe browsing to learn more.
31. Keyword Research
Keywords – the words that users type into the search box to find what they are looking for – are probably the most important element of SEO. To succeed in SEO, you need to know how to find relevant keywords, how to analyze them, and how to use them.
Keyword research is the first step towards creating content that can rank organically. No matter how “technically” optimized your page is, if you don’t do justice to your keyword, it might all be in vain.
- Use tools like Ahrefs, Semrush or Google Keyword Planner to conduct keyword research.
- Long-tail and localized keywords are relatively easier to rank for.
- Look for low-hanging fruits, i.e., keywords with ranking difficulties in the lower half, especially if your website’s domain authority is low (which is common for new sites).
- Research the top web pages ranking for a given keyword. You can get great insights for your content. Metrics like URL Rating (UR) for a specific page and Domain Rating (DR) for the entire website also help you determine whether you can realistically compete for that keyword.
32. Long-Tail Keywords
Long-tail keywords account for over 70% of searches on Google. Thanks to their high specificity, long-tail keywords bring highly-targeted and easily-converting traffic to the site.
- Ahrefs or Semrush will also let you look for long-tail keywords easily.
- Look for long-tail variations of the keyword you are trying to rank for using Google autocomplete suggestions and “People also ask” boxes.
- Use Google Keyword Planner (it's not just for paid search).
- Look for hot topics, insights, and queries about the keyword in online communities like Quora or Reddit or wherever your target audience is.
- Even if your keyword isn't a question, there will likely be questions stemming from it. Use AnswerThePublic (ATP) or a similar service to find out the questions around your main keyword (in the context of search intent).
33. Question keywords
Answering the questions of users is the best way to meet their needs and create a great user experience.
- You can find the questions people ask about a given topic using Google autocomplete suggestions or/and “People also ask” boxes displayed in SERPs.
- You can also use AnswerThePublic to discover the questions around any topic.
- AlsoAsked.com is also a brilliant tool that lets you do just that.
34. Keyword Cannibalization
The web pages of the site should not compete with one another for the same keywords.
- Make sure you haven’t already addressed and satisfied your main keyword on another page. Otherwise, the two would compete for ranking and bring each other down (it’s called keyword cannibalization), and you will waste precious resources in rebuilding a duplicate.
- Each web page should be loyal to (and be about) one primary keyword.
- You can use rel="canonical" to indicate the main version of a URL to avoid keyword cannibalization.
35. Search intent
The selection of keywords found through keyword research must always be accompanied by the analysis of the search intent of website users.
- Don’t just create a webpage for a keyword. Keep the user intent in mind.
- Identify the intent behind the page. Is it going to be informational, transactional, commercial, or navigational? Some people consider comparative an intent in its own right.
- If an informational page tries to sell something, it might turn people away. That usually happens when you don’t identify and satisfy the user’s intent behind that keyword search.
- You can write multiple pages on the same keyword if each page satisfies different search intents. That should be evident from the URL, headings, and titles. If there is too much overlap, it will be akin to keyword cannibalization.
36. First Sentences
The first sentences of a web page are an important on-page SEO element both for Google and website users.
- The first sentences of content determine whether the user decides to read on or return to SERPs.
- The first sentences also help search engine robots better understand the topic of the page.
- Try using the keyword in the first few sentences of the web page as well, since Google might replace the meta description with those sentences in search results.
- The first sentences should be interesting and captivating so that a user wants to keep reading.
37. Main Keyword Density
Obsessing about the keyword density is useless but totally neglecting it is also a mistake.
- If you start writing an article with keyword density in mind, you might place your keywords unnaturally in the text. Write the page as naturally as possible first, and then you can look into keyword density.
- Use a keyword density checker to identify the densities of competing pages, average them out, and see where your density stands in comparison. If the densities vary a lot between pages, consider using the median instead of the average.
- If you have a WordPress site, you can use the Rank Math or Yoast SEO plugins to analyze and check keyword density.
- You can also use tools such as Thruuu or Surfer SEO to analyze keyword density and other on-page factors.
38. Keyword Variations
Using different variations of the main keyword or synonyms can help search engine robots associate the web page with more keywords.
- Use variations of your keywords on the web page to try and answer the questions or convey the information related to the main keyword (without sacrificing the intent).
- Try using synonyms of the keyword and answer as many questions about the main keyword as you possibly can on your web page.
39. Focus On The Topic (Not Just The Keyword)
It is always a good idea to think of keywords as topics (or in some cases entities).
- If your webpage answers all the common questions relevant or associated with your main keyword (topic), the reader might not have to stray from your page to find more answers.
- You may add Frequently Asked Questions to your web page to exhaust the topic.
- Use Schema to mark up FAQs so that they can appear directly in search results.
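An FAQ marked up with Schema.org JSON-LD so that the questions can appear directly in search results (the question and answer texts are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is on-page SEO?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "On-page SEO is the practice of optimizing the elements on your own website so that it can rank in organic search."
    }
  }]
}
</script>
```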
40. Title Tags
By making sure that meta elements are SEO-optimized, you can usually achieve quick SEO wins (especially if the site has not been optimized in this way before).
Page titles are one of the most important on-page SEO elements. Not only do they convey the topic of the website to the user, but are also a clear signal to search engine robots.
- Each web page of the website should have a unique title tag.
- Your main keyword should fit snugly into your title, ideally at the beginning. This ensures it is not truncated off-screen if your title is longer than 65 characters.
- You need to create an impact in the first 65 characters of your title, so make the most of them.
- Even if Google places your title near the top, if it’s not eye-catching enough, you might not get a click. Use numbers and power words in your title and check its relative “strength” in headline analyzer tools.
- You can also add modifiers to your existing pages to attract more traffic. Use title tag modifiers like “guide”, “best”, “checklist”, etc.
- Add the current year to the title to make it appear more relevant and increase its click-through rate. This simple technique is really effective.
- You can check the titles of your site in bulk using Screaming Frog SEO Spider or Semrush.
41. Meta Description Tags
Meta description tags are not mandatory and are rewritten by Google most of the time. However, it is still a great practice to write a custom meta description for each web page.
- Use your main target keyword in the meta description where it’s easy to read.
- For an attractive and click-worthy meta, either try to answer a question in your meta or invoke the reader’s curiosity.
- Add the current year to the description (if applicable).
- Creating a custom meta description – of course – does not apply to huge websites with thousands of automatically-generated web pages. In such a case, you need a good template.
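Putting both meta elements together, the <head> of a page on this checklist's own topic might contain something like this (the wording is purely illustrative):

```html
<head>
  <title>On-Page SEO Checklist (70+ Items) - Updated 2022</title>
  <meta name="description" content="Want quick SEO wins? This on-page SEO
    checklist walks you through 70+ elements with 500+ pro tips. Updated for 2022.">
</head>
```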
42. H1 Heading
The H1 heading is a very important on-page SEO element that informs both search engines and users about the topic of the web page.
- H1 heading should include the main keyword the web page wants to rank for.
- Each web page should have one unique H1 tag.
- Try not to use multiple H1s on a page.
- The content of an H1 tag should be different from the content of the title tag.
43. H2-H4 Headings
Headings are the best way to structure your content in a way that is logical and clear to users (including screen reader users) and search engine robots.
- Break down your content into H2, H3, and H4 headings.
- Use your main keyword in at least one H2 and try using its variations in subheadings.
- Make headings comprehensive, detailed, and to the point.
- Cryptic headings that don’t tell users exactly what the content below is about might turn readers away from your web page.
44. Structure of Headings
The correct structure of headings communicates the structure of content and its subtopic in a way that is comprehensible for both users and search engines.
- The correct structure of headings is similar to the table of contents you can find in a book.
- A web page – especially if it aspires to rank for certain keywords – should use H2 and H3 headings to divide its content.
- A web page that has no headings is not optimal from an SEO standpoint.
- On the other hand, headings should not be used excessively (e.g. 50-100 headings on one web page).
- The number of headings and subheadings that should be used depends on the length of the content.
- The content of headings should be unique both within a web page and the entire website.
- You can check the structure of headings on any web page with the use of the Detailed SEO Extension or Chrome Web Developer.
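For example, a correct heading outline mirrors a table of contents (the indentation is added only for readability and the topics are illustrative):

```html
<h1>On-Page SEO Checklist</h1>
  <h2>XML Sitemap</h2>
    <h3>What to Include</h3>
    <h3>Common Mistakes</h3>
  <h2>Robots.txt</h2>
    <h3>Syntax Basics</h3>
```

Note the single H1 and the lack of skipped levels: an H3 always sits under an H2, never directly under the H1.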
45. A Clean Copy
Content is KING, so here is a bunch of the best SEO practices regarding content.
A clean copy makes it easy for users to scan and read the text. This improves the user experience. And Google likes it when users are happy.
- An overly casual or unprofessional copy riddled with grammatical errors might hurt a site’s rankings.
- Make your page scannable by dividing it into comprehensive headings and bolding (or isolating) important definitions.
- For longer pages, a simple list of on-page topics at the start of the content might help users get a feel for what the page is about without going through the entire thing.
- Use short paragraphs (two to three, at most four sentences per paragraph), and try to keep them oriented around a single point or piece of information.
- Bullet points (as evident from this checklist, I hope) and numbered lists make your content look cleaner and easier to understand. The reader feels like they can get a fresh piece of information by reading just one bullet point or get the whole picture by viewing the list.
46. An “Affable” Copy
An “affable” copy will improve the experience of your users and make Google happy.
- Break down complex concepts and ideas into easy-to-digest sentences.
- Aim for readability at or below high-school level, i.e., avoid overly complex words and very niche phrases (unless you are targeting a particular audience), and write a copy that speaks to almost all your readers.
- Your copy can be more engaging and might “speak” to the reader if it’s actually “speaking” to the reader, i.e., written in the second person.
- Anecdotes, relatable local/social events, and including a sense of community might help you grab your reader’s attention on a more personal level.
47. Content Update
Keeping content up-to-date is just as important as creating new content. A piece of content will not rank indefinitely without any maintenance or update from your side.
- Keep your webpage time-relevant, fresh, and updated.
- Stale and outdated information might push your web page down in ranking unless it is evergreen content.
- You might get better conversions, more views, and even score more SEO points by updating an existing webpage with fresh content than by creating one from scratch.
48. New Content
Adding new content is probably the best way to keep growing your site and improving its visibility in search.
- Depending on your time and budget, you should aim to create at least 1-2 new articles per week.
- The more content you create, the better.
- When creating a lot of new content, do not forget about quality. Quality is way more important than quantity.
- Each new piece of content should be created based on thorough search intent research and keyword research.
49. Content freshness
Content freshness is another highly important on-page SEO element.
- Fresh and new information regarding your main keyword might be better than just writing a longer copy than your competitors.
- For informational content, more updated data, graphical representations/charts, and information consolidated from different sources might present it as relatively unique.
- For marketing content, your competitive edge and the points that set you apart from the competition might present your web page as unique and fresh.
50. Content length
There is no ideal number of words that a piece of content should have to rank well. However, in most cases, you should pay attention to how long your content is.
- When creating a new piece of content, check the average content length of the articles that are already on the first page.
- Aim for the average or slightly above the average.
- A WordPress SEO plugin, such as Rank Math or Yoast SEO, will automatically calculate the length of any of your pages.
- You can use a tool like Thruuu or Surfer SEO to check the average length of articles ranking in TOP 10 for a given query.
51. Content depth
How deep a specific piece of content should be really depends on the search intent behind a given query.
- When creating new content or updating existing articles, you should always think about how deep and specialized the content should be.
- Analyze the web pages that are on the first page to determine the desired depth of content.
- Also take into account that the articles already in the TOP 10 may be more general simply because no one has created an in-depth guide on a given topic yet. You can be the first! This is especially true for high-authority websites, whose more general articles often rank for specific queries only because of the site’s overall authority.
52. Thin-content and/or low-content pages
Thin-content pages lead to a bad user experience and provide no value or useful information about the website to search engine robots.
- Thin-content pages are the pages that have few words (100-150 or less) and/or little original content.
- It is usually a good idea to block thin-content pages from indexing.
- Examples of thin-content pages are tag pages or category pages that have little original content except for a list of posts.
- You can use Screaming Frog SEO Spider to find low-content pages.
53. Featured Snippets
Featured snippets capture a large share of clicks, so it is usually worth optimizing for them.
- The number of featured snippets displayed has dropped significantly over the last few weeks.
- When optimizing for featured snippets, make sure to check what type of a featured snippet is displayed for a given query (and if a featured snippet is displayed).
- Remember that you can win a featured snippet only if your site is in TOP 10 for a given query.
- The three main types of featured snippets include a definition, bullet-point list, and table.
- Pro Tip: To check who is next in the queue for a featured snippet, exclude the site that currently holds it by appending -site:domain.com to the query. Update: This trick does not seem to be working correctly all the time now.
54. Image Quality
Graphic elements are a highly important on-page SEO element that you should never neglect. Here are a bunch of best practices for using images in the context of SEO.
Using quality images can improve the user experience a lot.
- Have more or a similar number of images as your best-ranking competitor.
- Drowning your web pages in stock images will make it difficult for your page to stand out from the competition.
- Even if you are using stock photos, try to use the ones your competitors haven’t.
- Investing in a photographer or a graphic designer (or learning it yourself) for high-quality images might be worth it. It might not be a direct ranking factor, but it’s likely to give you a competitive edge and a fresh look compared to your competitors.
- High-quality, relevant images help you convey a potent brand image.
- Make sure the images used correspond to the topic of the page.
- If that is applicable, use your own screenshots like I do.
55. Image Optimization
The quality of images is just as important as their proper optimization when it comes to on-page SEO.
- Use the image format that loads faster in your user’s browser and mobile device without compromising the quality.
- Make sure image elements have explicit width and height.
- If possible, upload your image in the “destination” dimensions (as they would appear on your user’s screen) to expedite loading times.
- Use an efficient cache policy on images.
- Compress your images to have a lighter size (but don’t compromise on quality).
- Defer offscreen images.
- Use next-gen formats, such as JPEG 2000, JPEG XR, and WebP, which can provide better compression than JPEG or PNG.
- Preload the Largest Contentful Paint (LCP) image.
- If you are using WordPress, you can use WP Rocket to handle many of these image optimizations on your site.
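For illustration, several of the practices above can be combined in markup like this (the file paths, dimensions, and alt text are placeholders):

```html
<!-- Explicit width/height reserve space and help prevent layout shift -->
<!-- loading="lazy" defers offscreen images; do NOT lazy-load the LCP image -->
<picture>
  <source srcset="/images/brewing-guide.webp" type="image/webp">
  <img src="/images/brewing-guide.jpg"
       alt="Pour-over coffee brewing setup"
       width="1200" height="800"
       loading="lazy">
</picture>

<!-- For the LCP (hero) image, hint high priority instead of lazy-loading -->
<img src="/images/hero.jpg" alt="Barista pouring coffee"
     width="1600" height="900" fetchpriority="high">
```

The `<picture>` element serves WebP to browsers that support it while older browsers fall back to the JPEG in the `<img>` tag.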
56. Image Meta
You can provide additional information about your images to help search engine robots better understand their content.
- Use a comprehensive file name that conveys what’s in the image.
- Write descriptive ALT tags for your web page images to let Google know what they are about.
- Use image captions to provide even more information about the image.
- Use (but don’t abuse) your main keyword or a variation in your image meta.
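Putting these together, image meta might look like this (the file name, ALT text, and caption are hypothetical examples):

```html
<!-- Descriptive file name, ALT text, and caption give search engines context -->
<figure>
  <img src="/images/french-press-plunge-step.jpg"
       alt="Slowly pressing the plunger on a French press coffee maker"
       width="800" height="533">
  <figcaption>Step 3: Press the plunger down slowly and evenly.</figcaption>
</figure>
```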
Videos are a great way to enrich your site content. Using them smartly can help your site improve its organic reach and visibility.
57. Videos
Videos can help you enrich your content and provide a better user experience. They will also give search engine robots more context about a page.
- If your competitors don’t have a video on a similar web page, creating one and adding it to your page might help you attract more readers.
- If your competitors have a video, creating a better quality one or a more concise, to-the-point video might help you retain user attention.
- A video created by someone else that’s just partly relevant to your webpage is better than no video at all, but ideally, you should invest in the video to convey the right information and brand image.
- Publishing the same video on YouTube will help you rank on two different “search” platforms.
- If your video ranks at the top, you might get more page views as well.
58. Video Optimization
Optimizing the video for both users and search queries is essential if you want the video content to help your site with SEO.
- Your video title and description should contain the main keyword.
- The video should be responsive, so users don’t have trouble watching it, regardless of the device they are using.
- By optimizing your video for specific search queries, you increase its chances of appearing on top of Google results.
E-A-T is becoming a thing in SEO, especially for YMYL websites. Most websites will have trouble ranking if they do not demonstrate a decent amount of E-A-T.
59. E-A-T (Expertise, Authoritativeness, Trustworthiness)
Expertise, Authoritativeness, and Trustworthiness (E-A-T) are the essential components of the SEO success of a website. In their Search Quality Evaluator Guidelines, Google provides a ton of useful information on E-A-T and what they value in a website.
Here are the most important takeaways regarding E-A-T:
- The website should have backlinks coming from other high-authority sites from the same industry.
- Other highly authoritative websites should mention the site or its authors.
- The content of the site should be up to date.
- The authors of the site should be recognized authorities in the field.
- The site should show its credentials, such as testimonials, certificates, awards, etc.
- The majority of the reviews of the website or its products should be positive.
- The website should provide information about its authors.
- The website should provide all the necessary contact details.
60. Your Money Your Life (YMYL)
Many websites on the internet fall into the Your Money Your Life (YMYL) category, i.e., pages that can affect a person’s health, finances, safety, or well-being. A YMYL website should pay special attention to E-A-T.
- Webpages that offer financial, health, and legal advice are under Google’s microscope (with a pretty big lens).
- Wrong advice/information on YMYL sites can cause actual harm to the readers, and Google wants no part of it.
- YMYL websites should be very careful with controversial opinions on topics on which there is a general scientific consensus.
- Write content that’s accurate, updated, and endorsed.
- Avoid using definitive language when giving advice or predicting a certain outcome.
- Use in-content and separate disclaimers.
61. Reference The Information Displayed On Your Page
A website can improve its trustworthiness a lot if it provides information about the sources and materials it uses.
- Even if you aren’t using hyperlinked external resources, adding reference links at the bottom of your page to the sources of your information or data can help you establish trust with the reader.
- It can also help you avoid getting in trouble with the original content creator.
- Prominently cite and refer to the original creator/poster when you are adding images and helpful graphics from other sources.
62. Author Visibility
It should be clear who the author or the authors of the website are.
- Google values content that is written by experts who are considered high-authority and trustworthy.
- The name and picture of the author, along with a short bio at the bottom (or on the side of the page) that shares why the author is qualified to write on the topic of the webpage, can help you establish trust with both the reader and Google.
- A separate author page that links out to the author’s social media pages, where their qualifications and experiences are on prominent display, serves to improve the authoritativeness and credibility of the content.
63. About page
Every website should have an About page.
- The About page should provide information about the website, its purpose, and its values.
- The About page should also provide information about the author or authors of the site.
- The more valuable information the website provides about the authors and why they are authorities in their field, the better.
- The About page is also a good place to present the author’s credentials, certificates, guarantees, etc.
64. Contact info
Each website should provide all the necessary contact information.
- There should be at least a few ways users can contact you. Your contact page should have a contact form, an email address, a telephone number, and/or a physical address (if applicable).
- Users need to be sure that they can contact you at any time and expect a timely response.
- If a given web page serves a specific purpose, relates to one of your multiple locations, or its CTA urges the user to contact a specific individual or department, make sure the relevant contact information is part of that page.
- If the relevant contact information is not prominent or clearly defined, your users might reach out to the wrong location (usually your head office) and get frustrated.
65. Fonts
Fonts are an important on-page SEO element.
- Don’t lose legibility for artistry, and use a font that’s clearly legible.
- Find the sweet spot for font size. It should be large enough that users on mobile phones can read it without squinting or zooming in, yet not garishly large on desktop.
- Font can also be a culprit when it comes to cumulative layout shift, so optimize for that as well.
- Make sure all text remains visible while web fonts load. You can achieve that by leveraging the font-display: swap CSS descriptor.
- You can use WP Rocket to optimize and preload fonts to avoid any issues.
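A minimal sketch of both techniques, assuming a self-hosted WOFF2 font at a hypothetical path:

```html
<!-- Preload the font file so the browser fetches it early -->
<link rel="preload" href="/fonts/my-font.woff2" as="font" type="font/woff2" crossorigin>

<style>
  @font-face {
    font-family: "MyFont";
    src: url("/fonts/my-font.woff2") format("woff2");
    /* Show fallback text immediately, swap in the web font once it loads */
    font-display: swap;
  }
</style>
```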
66. Subtle Marketing
It is OK to earn money from your site! However, you should do it in a way that is not harmful to users.
- A web page created solely for placing ads and content that’s aggressively marketing a product without creating value for the user is unlikely to rank.
- Think user-first, even when you are placing the ads or writing marketing content. Focus on how the user might benefit from the product or how it solves the problem your user might have.
67. Competing Websites
Your site should always go one extra mile to be better than its competition.
- Offer a better user experience and interactions than your competitors.
- Invest in content presentation (video, infographics, and animations) to stand out against your competition. This makes your content more attractive even if the information conveyed and the intent is the same.
- Use FAQs, offer unbiased advice, and try to help your readers learn more about the topic (not just your business/brand). If the reader feels that you are more about helping them than selling your product, they are more likely to convert.
68. Intent-Driven CTAs
Like creating content, creating CTAs should be intent-driven.
- Almost every webpage benefits from having a clear CTA that motivates users to make up their minds and perform a certain action.
- Create CTAs based on the user’s search intent for the main keyword and your page’s position in the conversion/sales funnel.
69. Social Sharing
A site can make it easier for users to share its content and increase its reach.
- Use social media sharing buttons on the page itself and individual resources (video, images, stats, charts, etc.) to encourage users to share them. You will get more eyes on your business.
- Use CTAs on informational pages asking users to share your page.
- I always ask my users at the end of each article to share the article.
70. Above the fold
The above-the-fold section of the site is very important when it comes to SEO.
- The above-the-fold section should have valuable content which will allow both users and search engine robots to determine what the web page is about.
- It is a great practice not to place ads above the fold.
71. Voice Search
With devices like Google Home becoming more commonplace, optimizing for voice searches is likely to pay off. You can:
- Identify and use conversational keywords or conversational variants of your main keyword throughout the content.
- Create content based on your target audience persona and answer questions in your content using the phrases they are most likely to use. Understand that teenagers might use different phrases and terms to ask about investments than retirees.
- Use FAQs generously.
- You can find questions about your topic using AlsoAsked.com, AnswerThePublic, or Ahrefs.
72. Local SEO
If the website you optimize is a local business, you need to remember to put all the most important local SEO practices in action.
- Make sure the location strategy is in line with the type of business (e.g. brick-and-mortar business, a home-based business, or a multi-practitioner business).
- Make sure the site has separate contact, About, and home pages.
- The contact page needs to have the complete name, address, and phone number of each location. This is known as NAP (Name, Address, Phone number).
- Add a Google Map to the contact page.
- Make sure NAP addresses are consistent across the whole website and any directories the site is added to. If the business has below 10 locations, it is a good practice to list all the locations in the footer.
- Make sure phone numbers are clickable from a mobile device.
- Use Schema structured data markup to provide information about your business in a format that is easy for Google to understand and process.
- Verify and optimize the Google Business Profile page for the site.
- Earn genuine reviews. Always respond to the reviews (whether they are negative or positive).
- Identify the right social media platforms for the site.
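As a sketch, a LocalBusiness structured-data snippet (all business details below are made-up placeholders) could look like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Dental Clinic",
  "url": "https://www.example.com",
  "telephone": "+1-555-010-0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  },
  "openingHours": "Mo-Fr 09:00-17:00"
}
</script>
```

You can validate markup like this with Google’s Rich Results Test before deploying it.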
If you like this article, you should also check these articles: