Updated: June 9, 2023.
Here are 180+ best SEO practices based on the Google Search Engine Optimization (SEO) Starter Guide.
If you don’t have time to go through the entire SEO Starter Guide from Google or you need a refresher, you are in the right place.
Here I am sharing with you my notes from Google’s guide together with my own comments and insights. I was able to find 180+ best SEO practices and tips!
P.S. I strongly recommend that you read and study both the SEO Starter Guide and this article.
How to use these Google SEO best practices & tips
These notes will be especially helpful if:
- you are a beginner SEO who wants to discover the best practices for SEO and implement them in their processes right away,
- you are an advanced SEO who wants to refresh their knowledge (I learned a few new things from this guide),
- you want to educate your client so that they better understand what you are doing, why you are doing it, and what needs to be done,
- you have read the SEO Starter Guide and want to reinforce your freshly acquired knowledge.
I divided the SEO tips into different categories for your convenience. Okay, so let’s jump right into my notes!
You will also like these guides:
- How To Do An SEO Audit
- 99+ SEO Mistakes
- 15 Example Technical SEO Issues
- SEO Tips (A List Of 70+ Tips For SEO)
- 15 Ways To Check Why Your Site’s Traffic May Be Down
- How To Use Google Search Console For Keyword Research
❓Looking to hire someone to audit your website or do SEO? Make sure to check the SEO services I offer including SEO consultations and monthly SEO services.
👉 Contact me if you have any questions, or learn more about why you may want to hire me as your SEO consultant.
SEO indexability & crawlability best practices
- Use the `site:` command to check if your site is indexed by Google.
NOTE: Keep in mind that the `site:` command will only give you a rough number of pages indexed by Google. Check the entire list of Google search operators.
- Google may not index your site for many reasons: there are no links pointing to your site on the internet, your site is brand-new and hasn’t been crawled by Google yet, your site is designed in a way that makes it impossible for Google to crawl and render it correctly, there were server errors when Google tried to access your site, or the site is blocked from indexing.
- You don’t need to submit your site to Google to get it indexed. Google crawls and indexes the web automatically. However, there is no guarantee that Google will find and index your site on its own.
NOTE: If you care about your site’s organic growth, always submit it to Google using GSC.
- Google Search Console (GSC) allows you to submit your site to Google and monitor its performance. If you are an advanced user, you may want to check my guide on how to audit a site using Google Search Console.
- Every new site owner should pay attention to vital SEO aspects, such as whether their site is indexed by Google, whether it offers quality to users, whether their local business is listed in Google Business Profile (formerly GMB), whether the site is accessible and fast, and whether it is secure.
- The best way to help Google find your site is to create and submit an XML sitemap.
NOTE: Most content management systems will generate an XML sitemap automatically. Check how to find the sitemap of a site.
- Google can also learn about your site by simply following links on other websites pointing to your site.
- Use a `robots.txt` file to block specific parts of your website from crawling.
- Subdomains are treated as separate websites, so you need to have a separate robots.txt for each subdomain.
- The `robots.txt` file needs to be placed in the root directory of the site.
- Use the Google Search Console robots.txt Tester to test your robots.txt file.
NOTE: Most content management systems (like WordPress) allow you to edit and modify robots.txt without the need to manually upload the file to the root directory. Check my guide on how to modify robots.txt in WordPress.
- Keep in mind that blocked (disallowed) pages may still be crawled by disobedient search engines that do not comply with the Robots Exclusion Standard.
- Anyone can view your `robots.txt` file and see what you are blocking, so this is not the place to block pages containing sensitive information.
- To prevent pages with sensitive information from being seen, use password-protection or remove those pages entirely.
- You should block internal search result pages from crawling.
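As a sketch, a minimal `robots.txt` that blocks internal search result pages could look like this (the `/search/` path and the sitemap URL are hypothetical examples — adjust them to your own site structure):

```txt
User-agent: *
# Hypothetical internal search results path — adjust to your site
Disallow: /search/

# Optional: point crawlers at your XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```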
- Blocking a URL in `robots.txt` does not prevent it from being indexed. A blocked page may still be indexed if there are links pointing to it on the internet.
- If a blocked URL gets indexed, then only its URL will be shown in SERPs (with no title or meta description being displayed).
- To prevent a page from being indexed and shown in Google, use the noindex tag.
NOTE: To only remove a given page from SERPs without removing it from the index, use the Removals tool in GSC.
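A minimal sketch of the noindex approach — this meta tag goes in the `<head>` of the page you want kept out of the index:

```html
<!-- Tells compliant crawlers not to index this page -->
<meta name="robots" content="noindex">
```

Remember that the page must remain crawlable (not blocked in robots.txt) for Google to see this tag.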
- The page should look the same both for users and search engine robots.
- The Google Search Console URL Inspection Tool lets you check how Google sees and renders your page.
SEO best practices for titles
- The `<title>` tag informs both users and search engines about the topic of a given page.
- The `<title>` tag you specified on a web page may be displayed in SERPs but may also be rewritten by Google.
- Create a unique `<title>` tag for each web page and place it within the `<head>` section of the page.
- Make sure your titles are both short and descriptive.
- Overly long titles may not be fully shown in SERPs. Google may choose to show only a fragment of the title.
NOTE: Recent changes to how Google generates titles show that Google does not always display the most desirable fragment of a long title tag.
- The `<title>` tag for the homepage should include the name of the site and some basic information about it (e.g. the physical location).
- Do not create titles that do not relate to the content of the page.
- Do not use titles that contain default values like “Home”, “Untitled” etc.
- Do not use the same title for a group of similar pages.
- Do not stuff keywords in your title tags.
NOTE: A lot has been going on regarding titles recently, so make sure to check the Google Search Central Blog post about how Google generates titles for web page results.
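To illustrate, here is a short, descriptive, unique `<title>` tag placed in the `<head>` (the site name and wording are hypothetical examples):

```html
<head>
  <!-- Unique, short, and descriptive — hypothetical example values -->
  <title>Brandon's Baseball Cards - Buy Cards, Baseball News</title>
</head>
```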
SEO best practices for meta descriptions
- The meta description tag is supposed to summarize the content of a web page. It should contain information that lets users decide whether they can find what they are looking for on a given page.
- The meta description tag can contain one or two sentences or even a short paragraph.
- The meta description tag is placed within the `<head>` section of a web page.
- Google may use the meta description tag as the snippet in SERPs.
- In many cases though, Google generates the snippet on its own based on the query typed by the user.
- Adding the meta description tag to the pages is not a requirement but is a good SEO practice.
- There is no maximum or minimum recommended length for the meta description. However, it’s recommended to create meta description tags long enough to be fully shown in snippets.
- Do not stuff keywords into meta description tags.
- Do not use generic descriptions, such as “This is a web page about SEO”.
- Do not write meta descriptions that do not relate to the content of the page.
- If possible, create unique meta descriptions for all web pages.
- If that is not possible (e.g. the site has thousands of pages), automatically generate meta descriptions based on the content of each page.
NOTE: Most content management systems (including WordPress with an SEO plugin like Rank Math installed) will automatically generate meta description elements based on the first sentences of text.
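A sketch of a unique, page-specific meta description in the `<head>` (the description text here is a hypothetical example):

```html
<head>
  <title>How To Do An SEO Audit</title>
  <!-- One or two user-focused sentences summarizing the page -->
  <meta name="description" content="A step-by-step walkthrough of auditing a website for SEO, from crawlability checks to content review.">
</head>
```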
SEO best practices for headings
- Use headings to indicate important topics within a web page.
- Headings help create a hierarchical structure of the content of the web pages.
- Think about headings as outlines for a large paper with main points and sub-points.
- Don’t place random text into headings. Only place text that will help indicate the structure of the page.
- Don’t use headings for styling purposes. Use CSS for styling instead.
- Aim for a logical structure of headings.
NOTE: You can use the Chrome Web Developer plugin to check the structure of headings on any site. Go to Information > View document outline. You may also want to check my entire list of SEO Chrome extensions.
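The outline idea above can be sketched as a heading hierarchy (the topics are hypothetical examples; indentation is only for readability):

```html
<!-- One main topic per page, sub-points nested logically -->
<h1>SEO Best Practices</h1>
  <h2>Title Tags</h2>
  <h2>Meta Descriptions</h2>
    <h3>Recommended Length</h3>
```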
SEO best practices for structured data
- Structured data is there to help search engines better understand the content of your web pages.
- Thanks to structured data, search engines can display your web pages in a more attractive way in SERPs, which can encourage more users to click on your snippet.
NOTE: In other words, structured data (rich results) can help increase the CTR of pages.
- This enhanced representation of pages using certain types of structured data is called rich results.
- You can use a variety of entities to mark up your business in search. Some examples include products, business location, videos, opening hours, recipes, and more.
- The Data Highlighter and Markup Helper will help you add the markup to the HTML code of the pages of your site.
- Use the Rich Results Test to check if your markup is valid and your pages can be displayed in the form of rich results.
- Use the Google Search Console Rich Results reports to monitor and troubleshoot the pages that contain specific types of rich results.
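As an illustration, here is a minimal JSON-LD snippet using the schema.org `Product` type — one of the entity types mentioned above (all values are hypothetical examples):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "image": "https://www.example.com/product.jpg",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  }
}
</script>
```

You can paste a snippet like this into the Rich Results Test to verify it is valid.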
SEO best practices for URLs
- Search engines need a unique URL to crawl and index a given piece of content.
- Different types of content should be placed on different URLs.
- URLs are divided into different sections: `protocol://hostname/path/filename?querystring#fragment`.
- It’s recommended to use the https:// protocol.
- The domain name is, in other words, the hostname.
- Google differentiates between the www and non-www versions of URLs. It also differentiates between the http and https versions.
NOTE: Each variation is a separate URL to Google.
- Path, filename, and query string determine what content can be accessed from the server.
- Path, filename, and query strings are case-sensitive, which means that FILE is a different resource than file.
- The hostname and the protocol are not case-sensitive: `SEOSLY.com` and `seosly.com` point to the same site.
- It makes no difference if you put a trailing slash after the homepage (the hostname). Both `https://seosly.com` and `https://seosly.com/` point to the same content.
- It makes a difference if you put a trailing slash after the path in the URL.
- If you don’t use the trailing slash, as in `https://seosly.com/seo`, it signals that this is a file.
- If you use the trailing slash, as in `https://seosly.com/seo/`, it signals that this is a directory.
NOTE: Content management systems like WordPress automatically add `/` at the end of URLs.
- Create a simple directory structure that organizes the content of the site well and allows visitors to know where they are on the site.
- You may try using the directory structure to indicate the type of content at a given URL (e.g. placing all blog posts under `/blog/`).
- Use directory names that relate to the content present in a given directory.
- Do not use a complex structure with many deeply nested subdirectories.
- Create friendly and descriptive URLs that are more useful and easily understandable.
- Avoid using long and cryptic URLs that contain few recognizable words.
- Avoid using generic names in URLs like “page”.
- Use real and meaningful words in URLs.
- Avoid keyword stuffing in URLs.
- Remember that URLs are displayed in some form in search results.
- Provide one version of a URL to reach a specific piece of content and refer only to this one version in your internal linking structure.
- Having different URLs for the same or very similar content can split the reputation between these URLs.
- If users are accessing the same content through different URLs, you can implement a 301 (permanent) redirect from the non-preferred to the preferred URL.
- You can also use the `rel="canonical"` link element to indicate the preferred version of a URL.
NOTE: Remember that `rel="canonical"` is treated as a hint by Google. A 301 redirect is a much stronger signal.
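A sketch of the canonical approach — the link element goes in the `<head>` of the non-preferred (duplicate) URL and points to the preferred one (URLs are hypothetical examples):

```html
<!-- On the duplicate page, pointing to the preferred version -->
<link rel="canonical" href="https://www.example.com/preferred-page/">
```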
SEO best practices for site navigation
- Navigation is important both for users and search engine robots.
- Navigation can help both users and search engines understand the most important content on the website.
- Google pays attention to the site navigation to better understand the role a given page plays in the overall structure of the site.
- The homepage is usually the most important and the most often visited web page of the site and is the starting place of navigation for both users and search engine robots.
- Unless your site has few pages, you should think about where the homepage directs users and search engine robots.
- The homepage usually should link to the more specific web pages and/or groups of specific pages (e.g. category pages).
- Breadcrumbs are a great way to help users quickly navigate to the previous section or the homepage.
- Breadcrumbs have the most general web page (the homepage) usually placed as the first (the leftmost link) and the most specific one as the last (the rightmost link).
- It’s recommended to use breadcrumb structured data for breadcrumbs.
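To illustrate, a minimal `BreadcrumbList` JSON-LD snippet for a two-level breadcrumb (names and URLs are hypothetical examples):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://www.example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog",
      "item": "https://www.example.com/blog/" }
  ]
}
</script>
```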
- Create a navigational page for users, an HTML sitemap that would show the entire structure of the website and help users better understand the hierarchy of the website and the topics it covers.
- Create a navigational page for search engines, an XML sitemap to help search engines discover new and updated content on your site.
- An XML sitemap should list all relevant URLs of the site together with the last modification dates.
- Make sure that navigational pages (whether it be an XML sitemap or an HTML sitemap) do not contain broken links.
- You should create a naturally flowing hierarchy in which users can navigate from more general content to more specific content. To achieve this, you should create navigation pages and make use of internal links.
- All the pages of the site should be accessible through internal links.
- It is a good idea to link to related pages where it makes sense.
- Avoid creating overly complex navigational structures where every page on the site links to every other page or where pages are 5+ clicks away from the homepage.
- Make sure to use text links for navigation. It makes it easier for search engines to crawl and understand the site.
- Avoid building navigation based on images only.
- If your navigation relies on JavaScript, use `<a>` elements with URLs as `href` values and generate menu items on page load so that search engines can discover and follow them.
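A crawlable navigation sketch using plain `<a>` elements with real URLs (the paths are hypothetical examples):

```html
<nav>
  <ul>
    <li><a href="/blog/">Blog</a></li>
    <li><a href="/services/">Services</a></li>
    <li><a href="/contact/">Contact</a></li>
  </ul>
</nav>
```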
- Make sure your site has a custom 404 page that guides users back to a working page or the homepage and is in line with the design of the site. You may add links to popular or similar pages on your custom 404 page.
- Don’t allow 404 pages to be indexed.
- Don’t block 404 pages in `robots.txt`.
SEO best practices for content optimization
- Create compelling and useful content.
- Use the Google Ads Keyword Planner to discover the keywords your users may use when looking for the content your site offers and learn their approximate search volumes.
- Write easy-to-read and easy-to-follow content.
- Avoid writing sloppy text with spelling and grammatical errors.
- Don’t embed important text in images or videos. Search engines cannot read this form of text.
- Clearly organize the topics you cover.
- Break up long content into logical chunks or bullet points (like this one) to make it easier for users to find the content they are interested in.
- Create fresh and unique content on a regular basis.
- Avoid having duplicate or near-duplicate versions of your content across your site.
- Create content for users but also make sure it’s accessible to search engines.
- Don’t insert unnecessary keywords only aimed at search engines.
- Don’t add frequent misspellings of keywords to your content with the purpose of ranking for those keywords!
- Don’t hide text from users while showing it to search engine robots.
NOTE: And vice versa.
SEO best practices for E-A-T & YMYL
- Aim to demonstrate expertise, authoritativeness, and trustworthiness in your specific niche.
- Provide information about who is behind the site, who writes its content, and what the goals of the site are.
- For e-commerce or financial transaction websites, always provide clear and satisfying customer service information.
- For a news site, provide information on who is responsible for the content of the site.
- Use a secure connection.
- Make sure that your site and its content are created and edited by experts on a given topic.
- Avoid representing topics and conclusions that go against the established scientific consensus.
- Make sure the content you provide is factually accurate, comprehensive, and clear.
- Avoid using distracting ads that make it difficult to access the main content of the site.
SEO best practices for links
- Use text links.
- Write good link text that is descriptive and concise (a few words or a short phrase).
- Remember that link text (also called anchor text) informs both users and search engines about the topic of the page to which it points.
- There are two types of links, internal and external. Internal links point to other pages on your site while external links point to other sites.
- Use descriptive text for text links so that it conveys at least a basic idea of what the linked page is about.
- Don’t use generic and meaningless anchor texts like “click here”, “read more”, etc.
- Don’t use anchor texts that are unrelated to the topic of the linked site.
- In most cases, you don’t want to use the URL as the anchor text.
- Don’t use entire sentences or paragraphs as anchor text.
- Make sure users can recognize links easily (e.g. use a different color).
- Don’t style links as regular text; users may then click them without realizing they are links.
- Pay a lot of attention to the anchor text of internal links. This may help both users and search engines better navigate and understand your site.
- Don’t overdo internal links by stuffing unnecessary keywords in their anchor text.
- By linking to another website you may confer some of your site’s reputation to it.
- If you don’t want to confer your reputation to the site you are linking to, use the nofollow attribute.
- If you are using a third party’s widget, make sure it does not contain links; if it does, add the nofollow attribute to them.
- Nofollowing a link means adding the `rel="nofollow"` attribute, or a more specific attribute like `rel="sponsored"`, to the link element.
- To nofollow all links on a page, use the `<meta name="robots" content="nofollow">` tag.
- Add the nofollow or ugc attribute to links in user-generated content. This includes comment sections, forums, guest books, etc.
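The link qualifiers above can be sketched like this (the URLs and anchor texts are hypothetical examples):

```html
<!-- A single link you don't vouch for -->
<a href="https://example.com/page" rel="nofollow">some link</a>

<!-- A paid or affiliate link -->
<a href="https://example.com/offer" rel="sponsored">sponsored link</a>

<!-- A link inside user-generated content, e.g. a comment -->
<a href="https://example.com/site" rel="ugc">commenter's link</a>

<!-- In the <head>: nofollow every link on the page -->
<meta name="robots" content="nofollow">
```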
- Most content management systems (including WordPress) automatically nofollow user comments, which are prone to spam.
- One of the ways to deal with automatically generated spammy comments is to use CAPTCHAs.
SEO best practices for images
- To embed images on your site, use the `<img>` or `<picture>` element.
- The `<picture>` element allows you to specify multiple image variants for different screen sizes (responsive images).
- Use the `loading="lazy"` attribute on images to make your pages load faster.
- Don’t use CSS to display images that you want to get indexed.
- Use the alt attribute and a descriptive filename.
- The alt attribute is the text that will be shown if the image cannot be displayed.
- The alt attribute is also extremely helpful for people using screen readers.
- The alt text also acts as the anchor text for graphic links.
- The alt text also helps search engine robots better understand the images on your site.
- Create an image sitemap to help search engines find your images and increase their likelihood of being found in Google Image search.
- Use standard image formats, such as JPEG, GIF, PNG, and WebP.
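Putting the image tips together, here is a sketch of a responsive, lazily loaded image with a descriptive filename and alt text (the paths and sizes are hypothetical examples):

```html
<picture>
  <!-- Smaller WebP variant for narrow screens -->
  <source media="(max-width: 600px)"
          srcset="/images/golden-retriever-puppy-small.webp"
          type="image/webp">
  <!-- Descriptive filename and alt text; lazy loading for speed -->
  <img src="/images/golden-retriever-puppy.jpg"
       alt="Golden retriever puppy playing fetch in a park"
       loading="lazy">
</picture>
```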
SEO best practices regarding mobile-friendliness
- Keep in mind that having a mobile-friendly website is critical nowadays. Check my guide on how to check if a site is mobile-friendly.
- The recommended mobile strategy is to use responsive web design.
- Use the Mobile-Friendly Test to check if the pages of your site are mobile-friendly.
- Use the Google Search Console Mobile Usability report to check if there are mobile-friendliness issues across all your web pages.
- Use the `<meta name="viewport">` tag to instruct the browser to adjust the content to the screen size of the device.
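The standard responsive viewport declaration, placed in the `<head>` of every page:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```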
- Make sure all the resources of the site are crawlable or Google may not detect that your site is mobile-friendly.
- Avoid using full-page interstitials and anything that may cause a poor user experience.
- Make sure that the site is fully functional on all devices.
- Make sure that all important images and videos are accessible on both mobile and desktop.
- Make sure that structured data and other metadata are present on all versions of pages.
SEO best practices for promoting the site
- Active promotion of your website can help it grow faster.
- In some cases, offline promotion (listing your site on business cards, posters, etc.) can also be helpful.
- Another way to promote your business and your products is to send out recurring newsletters to clients to inform them about new content on your site.
- For local businesses, creating a Google Business Profile will help reach local customers on Google Maps and Google Search.
- Use social media to promote your big campaigns.
- Avoid promoting each and every new piece of content on your site on every possible social media channel.
- Reach out to sites that cover similar topics to yours.
- Avoid spamming link requests to sites related to yours.
- Avoid purchasing links from other sites with the purpose of increasing your authority.
SEO best practices for analyzing your site and users
- Use Google Analytics (GA) to monitor and analyze the users of your site.
- Use Google Search Console (GSC) to monitor and analyze how your site is doing in search.
I hope that thanks to my notes you were able to get even more out of the Google SEO Starter Guide. If you liked this article, please share it with other SEOs so that they can become even better at what they do.