Updated: May 7, 2023.
Here is the SEO glossary that contains all the SEO terms and definitions every SEO should know.
I wish someone had given me a similar glossary of SEO terms at the start of my SEO career. This would have made things so much easier and clearer.
I’ve created this list to save you a lot of trouble and time. I don’t want to seem braggy but this is probably one of the most thorough and in-depth guides to SEO terminologies on the web!
Note that I am constantly updating this list and adding new terms so that it always stays up to date.
Use the list overview below or Control+F (Windows) or Command+F (Mac) to find a specific SEO term.
SEO Glossary A-Z
The terms are in alphabetical order.
Above The Fold
Above-the-fold is usually the first thing website users see when they land on your webpage without scrolling down. The term “above the fold” comes from newspapers since they used to be delivered folded in half, and the top half of the newspaper’s first page was considered essential since that was the first thing people saw (without unfolding the newspaper).
Algorithms (also referred to as an algo or a Google algorithm) are the formulas used by search engines to rank websites. They are always evolving and changing, which is why it is important for digital marketers to stay up to date on the latest developments. Search engines like Google, Bing, Yahoo, and DuckDuckGo use algorithms to determine where your website ranks in search engine results pages (SERPs).
An algorithm update (or algo update) is a change made to a search engine's algorithms in order to provide better search results. Google is always updating its algorithm to make it more efficient and accurate. Some updates are minor, while others, such as a change to the ranking system, can be more significant.
ALT text (alternative text) is an HTML attribute that describes the appearance and content of an image on a website. It's important both for users who rely on screen readers and for search engine robots, because it conveys the meaning of an image.
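As a quick, hypothetical example (the filename and description are made up), ALT text is added via the `alt` attribute of an `<img>` tag:

```html
<!-- The alt attribute describes the image for screen readers and crawlers -->
<img src="golden-retriever-puppy.jpg"
     alt="Golden retriever puppy playing with a red ball in the grass">
```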
Anchor text is the clickable text of a hyperlink, often displayed as a blue underlined phrase or word. It is usually found within the body of an online article and, by following the link, takes the reader to another document. Anchor text is a very important part of SEO, as it can increase your rankings on search engine results pages (SERPs) when used correctly. Anchor texts are used to associate your content with keywords, help users navigate between different pages on your site, and improve search engine optimization (SEO).
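For instance, in this hypothetical link, the words between the opening and closing tags are the anchor text:

```html
<!-- "complete guide to keyword research" is the anchor text -->
<a href="https://example.com/keyword-research">complete guide to keyword research</a>
```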
Artificial Intelligence (AI)
Artificial Intelligence (AI) is the simulation of human intelligence processes by machines, especially computer systems. It can be applied to many aspects of human life and society. For example, it could be used in education, healthcare, and even law enforcement. Artificial Intelligence is a broad term that can be used to describe any task where a machine uses some form of algorithm to complete the task, often with limited or no human input.
ASO, or App Store Optimization, refers to the process of improving the visibility and rankings of apps on app stores. The purpose and goal of ASO is to be visible within app stores (mainly on Google Play and iTunes) as this will increase downloads which in turn can increase revenue for your business.
Authority is the degree to which a website or webpage is linked with other relevant websites and pages. A website’s authority can be increased by acquiring inbound links from other authoritative websites. A high-quality webpage will typically generate more inbound links than a low-quality one. This makes it easier for search engines to understand the content of the webpage and rank it accordingly.
Baidu is the dominant search engine in China. Baidu is a Chinese-language search engine, an online advertising platform, an artificial intelligence platform, and a maker of mobile applications and services. The company offers a range of products, including Baidu App Search Engine, Baidu App Ads Platform, Baidu Post Bar, Baidu Takeout Delivery Service etc.
A backlink (also referred to as an incoming link, inbound link, or inlink) is a link from one website to another. Any external link pointing to your website is a backlink.
Below the fold
Below the fold refers to the part of a webpage that is not visible when the page first loads and only becomes visible once the user scrolls down. The term is used to describe where content sits relative to the initial viewport and to denote how far a user must scroll to see more of the page content.
Bing (or Microsoft Bing) is the second most popular search engine that is operated by Microsoft. Similar to Google, Bing also offers different search services, such as web search, video, maps, images, and more. Bing takes into account 1000+ factors when evaluating and ranking websites. The global market share of Bing is 6% with Google having a market share of 88%.
⚡ Check the list of Bing search operators to discover advanced search features of this search engine.
BERT is a powerful type of AI capable of understanding language in a more comprehensive way. BERT stands for Bidirectional Encoder Representations from Transformers; it is a neural network designed to process language accurately by analyzing the different ways in which words can be combined.
Black hat SEO
Black Hat SEO is a term for optimization techniques that go against search engine guidelines. These include keyword stuffing, link spamming, and the use of automated programs or bots to manipulate rankings. For example, site owners may purchase lists of links and add them to their site without any regard for the content of the linking sites. This technique is often used by spammers and hackers to make a site look legitimate.
Bounce rate is the percentage of your website viewers who “bounce away” from your website without taking any action, i.e., clicking on something on the page they landed on or going to another page. Let’s say a user clicked on your website’s link from Google SERPs, scrolled down to read the complete page (scrolling is not considered an “action”), and didn’t interact with the website in any other way (clicking on another link, filling up a form, etc.). That visitor would be counted as a “bounce.”
A branded keyword is an identifier that distinguishes your company and its products or services from those of other companies. Branded keywords can be a key component in a marketing, branding, and SEO strategy. They help to make your business appear as the definitive choice in the eyes of potential customers.
A breadcrumb is a navigational aid that shows the user the current location in a website or a web application, the previous locations, and (typically) possible actions. Breadcrumbs have been around for quite some time. They originated from a story about Hansel and Gretel leaving bread crumbs to find their way back home through the woods. But now they are more commonly used on websites and apps to lead users back to their original destination.
A broken link is a URL that doesn't return a webpage. The page may have been removed from the website, may never have existed (wrong URL), or may have been moved to a newer page without a proper redirect. When a user clicks on a broken link, they are greeted with a 404 error page. Broken links harm User Experience (UX) and, by extension, can hurt a website's SEO.
Cache means something stored (or hidden) in a nearby place for future use. In computers, the cache can be hardware (like the cache in processors) memory or a software allocation that saves some important or most frequently used files temporarily so they can be served faster. A web cache is the storage of part of web data in a device (user’s computer or smartphone), so if the website is accessed again by the same device, it can be served quicker than before. A web cache is stored in the browser files.
A Cached page can mean a webpage that has been saved in its entirety by the servers of a search engine (like Google or Bing) so it’s accessible to users even if the website is down, or a webpage is saved by a user’s browser (on the local device), so even if the user doesn’t have access to the internet, they might have access to some (or all) of the information on that page (through cached data). Cached pages sometimes help users find obsolete information that has been updated on the website and no longer exists on any “live” webpage.
Call to action (CTA)
A call to action (CTA) is a phrase that encourages the reader to take an action. A call to action can be a single word or sentence. For example, “order now” or “take advantage of this offer.” Calls to action are designed to get the reader engaged with the content in order for them to convert into customers.
ccTLD stands for country code top-level domain, a domain extension assigned to a specific country, such as .uk for the United Kingdom (commonly used as .co.uk) or .de for Germany. These extensions signal to users and search engines which country a website is intended for.
Clickbait is a form of digital content to attract attention, usually with a link. It is typically used by websites that generate online advertising revenue, often at the expense of quality or accuracy.
Cloaking is the practice of serving search engine crawlers a different page (or different content) than the one shown to users. As an SEO tactic, cloaking is deliberately used to mislead and manipulate search engines in order to boost rankings.
Comment spam is a type of spam that can be found on blogs, forums, and other web-based platforms. It is usually in the form of visible comments on blog posts. The goal of comment spam is to make the comment look like legitimate feedback from readers. Comment spammers usually use fake email addresses and post links to their sites or products they are trying to sell. They also post links that contain malware, which can infect your computer when clicked.
Crawlers are programs used by search engines to find the content that you have published on your site. A crawler or robot visits a page, analyzes it, then extracts information from it for use in an index to present in search results.
Crawl budget is the amount of time and resources that a search engine will spend crawling and indexing pages on your website. I also have an entire guide on crawl budget optimization, which I recommend you check.
A canonical URL is simply a web page that is deemed to be canonical, or authoritative, and identified as such using an HTML link element. The purpose of using a canonical URL is to help search engines identify which version of a web page they should use in their results. This helps avoid situations where search engine results show different versions of the same web pages and therefore provide less clarity for users.
In SEO, the process of canonicalization is used to choose one URL as the main one and show it as the sole representative to search engines. It can replace a number of other similar URLs.
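In practice, a canonical URL is declared with a `<link rel="canonical">` element in the page's `<head>` (the URL here is a placeholder):

```html
<!-- Placed in the <head> of every duplicate or variant page -->
<link rel="canonical" href="https://example.com/red-shoes/">
```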
Core Web Vitals
Core Web Vitals are a set of Google performance metrics that highlight the key aspects of the user page experience. Core Web Vitals include the 3 following metrics:
- Largest Contentful Paint (LCP), which focuses on loading time
- First Input Delay (FID), which is about interactivity
- Cumulative Layout Shift (CLS), which focuses on visual stability
Content marketing is all about the activities that take place around creating, distributing, and promoting content with a business goal. It includes creating relevant content, promoting content on social media, participating in online discussions about the topic, and spreading information through email marketing.
Competitor analysis is a research technique that helps you find out what your competitors are doing online. When you analyze your competitors, you can learn which keywords they rank for, where they rank, and estimate how much traffic those rankings bring them.
Cornerstone content is the most important, foundational content on a website, usually its main section: the articles or pages that every visitor needs to read.
Crawlability refers to how easily crawlers can navigate, understand, and find content on your website. When done well, it can improve SEO rankings, reduce bounce rates, and increase click-through rates.
A crawler is a program that automatically searches the web for content, typically following hyperlinks.
Crawling is the process by which search engines discover content on the internet, which they do by following links. The content includes web pages, image files, and videos.
CTR is short for "Click-Through Rate." The click-through rate is the number of people who click on a given link or banner advertisement divided by the number of people who see it: CTR = (links clicked) / (impressions), usually expressed as a percentage.
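The formula above can be sketched in a few lines of Python (the numbers are made up for illustration):

```python
def click_through_rate(clicks: int, impressions: int) -> float:
    """CTR as a percentage: clicks divided by impressions."""
    if impressions <= 0:
        raise ValueError("impressions must be positive")
    return clicks / impressions * 100

# Example: 50 clicks on a result shown 2,000 times
print(click_through_rate(50, 2000))  # 2.5 (%)
```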
CSS is a web development language that lets designers and developers control the look and feel of the website. The site’s appearance is determined by CSS, which can be changed according to needs.
Deep links are hyperlinks that take a visitor directly to an internal page of a website rather than its homepage. Deep linking is a technique used by site owners and webmasters to balance the distribution of traffic across the pages of their sites.
Disavowing is the opposite of endorsing a link. When you disavow a link, you are telling Google that you don't vouch for it and don't want it taken into account when your site is evaluated. The Disavow Tool is an advanced Google tool that lets you submit a list of such links, instructing Google to ignore them when indexing and ranking your site.
A dofollow link is a regular hyperlink, one without a nofollow attribute, that points to another site. Google (or any search engine) will follow this hyperlink, crawl the linked site, and pass ranking signals to it.
A doorway page is a webpage that leads visitors to another webpage, often a landing page, which is the real target of the doorway page.
Domain Authority (DA)
Domain Authority (DA) is a metric created by the SEO company Moz that combines several ranking factors into a rough, quantifiable idea of how "authoritative" your website looks to Google and how likely it is to rank higher than similar websites. Ahrefs offers a similar metric called Domain Rating (DR).
Google neither uses nor endorses Domain Authority. However, it's speculated that Google might have its own metric, or a combination of metrics, that reflects a website's authoritativeness. The term "domain authority" is sometimes also used loosely to describe a domain's relative strength as strong or weak, without referring to any quantified metric.
DuckDuckGo is a privacy-focused search engine that doesn't profile its users and shows every user the same results for a given phrase (search term), though the results may vary by location. It doesn't personalize search results, doesn't collect data about its users, and generates its revenue (in part) by displaying keyword-relevant ads.
⚡ Make sure to check the full list of search engines that don’t track to discover more similar privacy search engines like DuckDuckGo.
Duplicate content is when a webpage has content that's either a word-for-word or slightly reworded copy (with the same informational value) of another webpage on the internet. When Google sees multiple pages with the same duplicate content, it has difficulty ranking them, so it might dilute the rankings of all of them.
Dwell time is the amount of time a visitor spends on your page after clicking through from search results and before returning to them. Someone who spends five minutes on your site has a higher dwell time than someone who spends one minute.
The longer people stay on your website, the more likely they are to convert into customers.
A dynamic URL is a URL containing variables (such as query parameters) that change depending on the user's input or session, typically generated by database-driven websites.
E-A-T is an acronym for expertise, authoritativeness, and trustworthiness. It’s an idea that originated in Google Search Quality Raters guidelines, but it has been adopted by content marketers across the world as a way to judge the quality of content.
404 errors are instances where a webpage cannot be found on a server. They happen when the webpage has been moved or deleted, its URL is misspelled, or the file never existed.
External links are links from other websites that point to your website. They are very useful in SEO because they increase the visibility of your content. A web page with many quality external links pointing to it is more likely to appear on the first page of Google and other search engines.
A featured snippet is a special way of showing a search result on Google. Instead of just a link, it displays an excerpt from a page (a sentence, list, or table) at the top of the results in order to answer a user's query quickly.
Geotargeting is a technique that delivers content to users based on their geographic location. A common use of geotargeting is targeting people who live in a specific city or who are browsing from a certain country.
Googlebot is a program from Google that crawls the internet for data. It is used to index and rank content on the web. Crawlers like Googlebot have existed since the early days of the web and are essential for finding and indexing the pages of a website so they can be surfaced for a search query.
Google Alerts is an automated email service that sends you a notification when your search terms are mentioned on the internet. You can set up alerts for multiple topics and get notified when something new comes up.
Google Analytics (GA)
Google Analytics is a website analytics service developed by Google. It offers an analysis of traffic, demographics, and behavior on the Internet. Google Analytics helps website owners know how visitors engage with their website. GA is the most widely used web analytics and measurement tool in the world.
Google Search Console (GSC)
Google Search Console (GSC) is a free tool from Google that offers a way to see the traffic coming to your website from Google. GSC helps you identify issues with your site that may be reducing your search performance, such as indexing or crawl errors. It also provides insights into how Google crawls and indexes your site, including which pages are crawled and which files are downloaded by Googlebot.
Google Autocomplete is a search tool designed to help users find what they are looking for faster. This tool predicts the queries you may be typing and starts suggesting them right away. It’s an excellent way to find all the information you need without wasting time typing out your entire query.
A Google bomb is when enough backlinks force the Google algorithm to rank a webpage higher than it should be on its core merits for unrelated and irrelevant keywords. Two famous examples of Google Bombs are:
- In 1999, the search term "more evil than Satan himself" returned the Microsoft homepage as the top result.
- From December 2003, a search for "miserable failure" returned the official biography of US President George W. Bush. Google had fixed this and many other Google bombs by 2007.
Google bombing is almost obsolete now, thanks to the advances in Google search algorithms.
Google dance referred to the volatility and drastic ranking changes that used to happen when Google updated its ranking algorithms and factors every month. Websites that conformed to the new rules jumped in rankings, while previously high-ranking websites saw steep drops. Since Google changed its update frequency and methodology, rolling out changes continuously rather than monthly, Google dances have become a rarity.
Google Drive is a cloud-based Google storage service that allows for storing files on Google servers, synchronizing files across different devices, and sharing files with other users. To be able to use Google Drive, you need to have a Google account. Every user gets 15 GB of space for free.
⚡ Check the list of Google Drive search operators to learn how to search Google Drive like a pro.
Google Hummingbird was a significant update to Google’s search engine algorithm that happened in 2013. Unlike previous updates that were improvements over the core algorithm, the Hummingbird update included major changes to the core algorithm itself. It’s where Google’s aim of improving “semantic search” and displaying results that take into account a user’s search intent, instead of just the literal meaning of the keywords, was achieved.
Google Keyword Planner
Google Keyword Planner is an easy-to-use free tool for finding out what people are searching for in Google. You can use it to find keywords and phrases related to your business. Keyword planner also helps you determine which keywords or phrases are most likely to give you a high search volume and quality score.
Google Mobile-Friendly Test
Google Mobile-Friendly Test helps in checking if the website is optimized for mobile navigation. This test verifies that the website works properly on all devices and has a user-friendly responsive design.
Google My Business
Google My Business (also abbreviated as GMB, and rebranded as Google Business Profile in late 2021) is a free Google tool for managing the online presence in Google Search and Google Maps of businesses or organizations that have a physical address. Google My Business helps customers find your business and the services you offer. GMB is a fundamental tool for local SEO.
Google PageSpeed Insights
PageSpeed Insights (PSI) is an online tool that analyzes how fast a website loads. It reports on the performance of your site, including diagnostics about code quality and loading behavior, and provides separate performance scores for mobile and desktop.
⚡ Check my list of tools to check Core Web Vitals.
Google Passage Indexing
Google Passage Indexing (also called passage ranking, passage-based ranking, or passage-based indexing) means that when ranking a web page (and determining its relevance) Google may take into account only one specific passage from it instead of analyzing the whole web page and all of its content.
⚡ Feel free to check my whole guide on Google passage ranking.
Google Panda is an algorithm update that was designed to give sites with high-quality content higher rankings and demote sites with low-quality content. Google announced the update in February 2011, claiming it would affect between 3 and 3.5% of all queries. It targeted sites that used thin or duplicate content to manipulate Google's search results. Many of today's content best practices can be traced back to Panda; before it, many websites focused on producing large amounts of low-quality content (content farming) that provided little or no value to readers.
Google's Penguin algorithm, first launched in April 2012, uses a variety of signals, such as link schemes and spammy anchor text, to determine whether a site is manipulative. Penguin changed the way SEO professionals work, most notably in how backlinks are valued and in how sites with manipulative link profiles rank.
Google Pigeon was a 2014 algorithm update that focused on local search results. It optimized the searches for proximity and displayed results that were closest to the searcher’s physical location. It was done by connecting Google’s local search algorithm more closely to the web search algorithm.
Google Sandbox (not to be confused with a Google program of the same name) is a supposed phenomenon that prevents new websites from ranking high in Google SERPs. While Google denies that such a thing exists, many SEO experts believe it does, based on their experience trying to rank newer websites.
If you consider ranking factors like authoritativeness and backlinks, which new websites generally lack (even if they tick most or all of the other SEO checkboxes), it's easy to see why the Google Sandbox phenomenon might exist.
Going by a few "breadcrumbs" dropped by Google employees, a website might need to be a few months old to get out of the supposed sandbox. That's probably enough time for search engine algorithms to learn enough about a new website.
Google Search Console Crawl Stats Report
Google Search Console Crawl Stats Report (or simply crawl stats report) is a relatively new GSC feature that allows you to monitor and analyze how Google is crawling your website. The report lets you analyze the availability of your hosts, the number of crawl requests, average response time, and more. You can also group crawl requests by response, file type, purpose, or Googlebot type.
⚡ Check my full guide to the Google Search Console crawl stats report.
Google Search Operators
Google search operators give you quick, easy ways to filter out and refine your searches. They do this by making use of special commands that you can type right into Google.
Google Search Quality Evaluator Guidelines
Google Search Quality Evaluator Guidelines is a publicly available document that is given to raters, people employed by companies working for Google. The document is designed to help the raters evaluate the quality of websites and their content.
A Google penalty is a situation where a website drops out of the search engine index or its rankings for given keywords drop dramatically. A Google penalty is almost always the result of breaking Google webmaster guidelines. There are two main types of penalties you can get: manual and algorithmic.
Google Trends is a free public service by Google that allows users to search for specific topics and compare search volume, geographical interest, and time trends.
Google Webmaster Guidelines
Google Webmaster Guidelines are a set of rules and best practices for web designers and developers. The guidelines were originally created by Google engineer Matt Cutts in 2000. These guidelines are used to help website owners improve their search engine rankings on Google’s search results.
Grey hat SEO is a term used to describe the middle ground between white hat and black hat SEO. Rather than breaking the guidelines outright, grey hat practitioners skirt around the edges, for example by using a little cloaking or excessive interlinking.
Headings are a way to both organize and communicate. Headings help users and search engine robots alike better understand the content of a web page. HTML provides six heading levels (H1 to H6), with the H1 tag being the most important one. The H1 tag should be used for the major topic of a web page, usually at the top of the page.
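A minimal sketch of a sensible heading hierarchy (the topic names are invented for illustration):

```html
<h1>The Complete Guide to Coffee Brewing</h1>  <!-- one H1: the page's main topic -->
<h2>Choosing Your Beans</h2>                   <!-- major section -->
<h3>Light vs. Dark Roast</h3>                  <!-- subsection of the section above -->
<h2>Brewing Methods</h2>                       <!-- next major section -->
```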
Hidden text is text that is not visible to the reader but is still used in the code of the page.
A homepage is the main page of a website, typically the page served when a user visits the root domain (e.g., example.com), and the primary entry point to the rest of the site.
Hreflang attributes tell search engines which language (and, optionally, which region) a page is intended for, so users can be served the correct language version. It is important to use hreflang when your site has content in multiple languages or region-specific variants, to ensure each version performs well where it is viewed. Hreflang annotations can be placed in a page's HTML head, in HTTP headers, or in an XML sitemap.
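For example, a page with English and German versions might declare them in its `<head>` like this (example.com and the paths are placeholders):

```html
<link rel="alternate" hreflang="en" href="https://example.com/page/">
<link rel="alternate" hreflang="de" href="https://example.com/de/page/">
<link rel="alternate" hreflang="x-default" href="https://example.com/page/">
```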
HTML is a markup language that defines the structure of web pages, both for human readers and computers. It also includes instructions for images, videos, forms, links and other web-based media that can be displayed on a webpage. HTML is an abbreviation of HyperText Markup Language. A markup language is a means to define the content of a document in order to make it more readable for humans or computer programs.
HTTP or Hypertext Transfer Protocol is a system of rules that govern how information and data are conveyed over the internet and is considered the foundation of data communication and transfer over the web. HTTP connects a user to a website using a client-server model, where the client is the device the user is employing to access the internet (computer, phone, tablet), and the server is where that website lives. The content and data that are transferred using HTTP are in “Cleartext,” i.e., legible.
HTTPS or Secure Hypertext Transfer Protocol is simply HTTP with an additional security layer, i.e., encryption. Unlike data that’s transferred through an HTTP connection and is in Cleartext which is vulnerable and open to exploitation, data transmitted through HTTPS is encrypted. So even if someone can siphon off data when it’s being transferred between your device and a web server, it would be meaningless gibberish to them.
⚡ Check my article about the difference between HTTP and HTTPS.
Image SEO includes adding alt attributes to images and using appropriate image titles. It also includes making sure that your images are relevant to the content on your website. Image SEO can also be referred to as image search optimization or image marketing.
An image sitemap is an XML document that lists the images on a website so that search engines can discover and index them.
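A minimal image sitemap sketch using Google's image sitemap namespace (all URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/red-shoes/</loc>
    <image:image>
      <image:loc>https://example.com/images/red-shoes.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```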
An index is a directory, a database of every webpage on every website that a search engine knows about. That's why every new website or webpage needs to be indexed before it starts appearing in search results: if it's not in the index, Google (or another search engine) won't show it in its search results. Google's index contains hundreds of billions of webpages and keeps growing.
Indexability is often defined as a search engine's capability to index a webpage, but it's actually a characteristic of the webpage. An indexable webpage is one that a search engine can not just crawl but also add to its index. Indexability is one of the first things to check when auditing a website's online presence, because if a webpage is not indexable, Google doesn't know about it and it won't show up in searches.
An indexed page is a webpage that Google (or another search engine) knows about and is part of its index. The search engine knows what’s on the page, so if the user types in a relevant query, the indexed page will turn up in the results. A webpage can exist on the internet without being indexed, but it will only be accessible if a user types in the exact URL and won’t turn up in related search queries.
Indexing means storing the content discovered during the process of crawling and rendering. The information and data gathered during the crawling process are stored in the index. Indexing is not equal to ranking. Just because a web page has been indexed does not mean that it will achieve high rankings.
Information retrieval is the process by which information stored in computer databases is retrieved and presented to the user.
An internal link is a hyperlink that, when clicked, takes you to another webpage on the same website (hence the name). Internal links are useful for both users and web crawlers: they guide a user through multiple related resources on your website (or through a sales funnel), and they help crawlers see your website as a connected whole.
Interstitials are advertisements that appear before and/or after videos, audio, or other interactive content. They can be text-based or video-based advertisements. These ads typically fall into the category of intrusive advertising because they interrupt or intrude on the content experience.
Instagram SEO refers to the process of optimizing an Instagram account and the content published through it. The purpose of Instagram SEO is to increase organic visibility both within Instagram and in search engines like Google, which can index Instagram profiles and posts. Instagram SEO includes things such as optimizing an Instagram profile and making it public, choosing a proper profile image, using a searchable business name, adding keywords as hashtags, putting keywords in the account's name or username, and adding a trackable link in the profile's bio.
A keyword is a word or phrase that someone might type into a search engine to find information on the internet. Keywords are critical for reaching potential customers when they are online. Search engine optimization experts believe that picking the right keywords is just as important as anything else you do in your SEO campaign.
Keyword cannibalization is when two or more pages on the same website target the same keyword (or very close variations of it) and end up competing with each other in the search results. Because the competing pages split ranking signals such as links and clicks between them, the search engine may struggle to decide which page to rank, and all of them can end up ranking lower than a single consolidated page would.
Keyword research is the process of finding out which words or phrases people are using to find content on the internet. It can also refer to the specific process of understanding how a certain word or group of words is searched on the internet.
Keyword density is simply the ratio between the number of times a keyword is used in a piece of content and the total words of the content. For example, if the phrase “keyword density” is used five times in a 500-word blog post on the subject, the keyword density would be 1%. It’s an important metric for content optimization.
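For illustration, the calculation can be sketched in a few lines of Python (a hypothetical helper, not part of any SEO tool; it ignores punctuation for simplicity):

```python
def keyword_density(text: str, phrase: str) -> float:
    """Return keyword density: phrase occurrences per 100 words of text."""
    words = text.lower().split()
    target = phrase.lower().split()
    n = len(target)
    # Count every position where the phrase starts
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return round(100 * hits / len(words), 2) if words else 0.0

# A 500-word post using the phrase five times has a density of 1%
post = ("lorem ipsum " * 245 + "keyword density " * 5).strip()
print(keyword_density(post, "keyword density"))  # 1.0
```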
How much keyword density (and placement) actually contributes to rankings, and how much it helps search engine algorithms understand what your content is about, is debated, but it’s always recommended to use your keywords as naturally as possible and focus on creating valuable content.
Keyword stuffing refers to the unnaturally high usage of the primary keyword (or multiple important keywords) in your content in order to rank higher. It used to work for a relatively simplistic search engine algorithm (like Google’s used to be) which ranked web pages based on the number of keyword hits and keyword frequency.
The Knowledge Graph is a database of the world’s knowledge created by Google that enables search engines to have a better understanding of relationships between words or phrases. It helps them return a more accurate set of results when we query them.
Landing Page (LP)
A landing page is a web page that is designed to encourage visitors to take some action, such as signing up for a service, donating money, or buying an item. The purpose of a landing page is to convert site visitors into leads and customers.
Latent Semantic Indexing Keyword
A Latent Semantic Indexing (LSI) keyword is a word or phrase that is semantically related to a page’s main keyword, such as a synonym or a term that frequently appears alongside it. Many people believe that including these related keywords helps improve rankings in the search engine results pages of major search engines, although Google representatives have stated that Google does not use “LSI keywords.”
Link equity is the value that a link passes to the page it points to. The term is used in SEO to describe the strength of a link and its ability to positively or negatively affect the ranking of the linked web page in search engine results pages.
Link juice is an informal term for the trust and authority that a link passes from one page to another. When someone links to you, they are giving you link juice. This passed-on trust helps search engine ranking algorithms determine how relevant and authoritative a site or webpage is for its topic.
A link audit is a process of analyzing how strong the backlink profile is for a given site. There are two major types of link audits: external and internal audits. External audits analyze a site’s backlinks from sources outside of the website’s control, while an internal audit analyzes only those links found on the domain being audited.
Link building is the process of acquiring links to your website from other high-quality, thematically related websites. The best way to do this is by creating really great content that others will want to link to naturally.
A link profile is your website’s portfolio of connections (through backlinks) to other websites. For example, if most of your website’s backlinks are from a sister website or lower ranking websites within your niche, it would have a relatively weak link profile since it doesn’t have much spread or authority. Since it combines both the number and the quality of the backlinks your website has, it can be considered an important ranking factor.
A local pack is a grouping of local business listings, usually displayed alongside a map, that appears in Google’s search results for queries with local intent. It displays the names, addresses, and phone numbers of local businesses within a specific radius that people could visit.
Local SEO is the process of optimizing your website and business to rank highly in the search engine results for local people searching for your products or services.
A log file records every request that the server containing your website receives and whether that request was fulfilled or not. For example, if a crawler requested access to ten web pages, got to visit eight of them, and two resulted in a 404-error page, it will be recorded in your log file.
Log files inform you about: URLs that are requested, HTTP status codes, IP address of website visitors (partial demographic data), time/date of each request, and information regarding the request that can help you split between user and crawler traffic. A log file can be a gold mine of information that can help improve your website’s SEO.
Log file analysis
Log file analysis is an element of technical SEO that deals with drawing out valuable insights from data on user/crawler interaction with your website. It helps you understand the crawlability and indexability of your website. It also helps you identify your strongest and weakest pages and give you relevant information about the traffic of your data. It can be both qualitative and quantitative in nature.
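To make this concrete, here is a minimal Python sketch of one common log-analysis task: splitting crawler traffic from user traffic in an Apache-style “combined” log. The log format, sample lines, and the Googlebot check are assumptions made for the example; dedicated log analysis tools are far more thorough:

```python
import re
from collections import Counter

# Apache/Nginx "combined" log format (an assumption; adjust to your server)
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

def summarize(lines):
    """Count HTTP status codes overall and URLs requested by Googlebot."""
    statuses, crawler_urls = Counter(), Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if not m:
            continue
        statuses[m["status"]] += 1
        if "Googlebot" in m["agent"]:
            crawler_urls[m["url"]] += 1
    return statuses, crawler_urls

sample = [
    '66.249.66.1 - - [07/May/2023:10:00:00 +0000] "GET /blog/ HTTP/1.1" '
    '200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.9 - - [07/May/2023:10:01:00 +0000] "GET /old-page/ HTTP/1.1" '
    '404 320 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]
statuses, crawler_urls = summarize(sample)
print(statuses)      # one 200 and one 404
print(crawler_urls)  # Googlebot requested /blog/
```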
When it comes to keywords, long-tail keywords are more specific and can help you get super relevant traffic. They are usually a few words long and can be really helpful when searching for specific products or topics.
Manual action is the penalty given to a website that has been found guilty of using unethical SEO techniques and going against Google’s webmaster quality guidelines by a human reviewer. It’s different from the ranking drops that happen when the search engine algorithm penalizes your website and can be significantly more severe. Google issues a manual action if you try and bypass (or fool) the search engine algorithm to rank higher. A manual action can result in partial or full removal of your website from search results.
A meta description is the small summary of the webpage that’s often (but not always) displayed underneath your website link in the search results. It informs your readers what the webpage is about (in addition to the headline) to entice a click. Since the SERP real estate is limited, meta descriptions should ideally be under 160 characters.
Meta refresh is a technique that uses a meta tag to automatically reload or redirect a page after a set amount of time, without requiring the user to click on a link.
Meta tags (also referred to as metadata) provide information about a web page to search engines. They are practically invisible to users because they are small snippets of code. The most important meta tags are meta title, meta description, meta robots, meta viewport, and meta charset.
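For illustration, here is how the common meta tags sit in a page’s HTML head (the title and description text are placeholders):

```html
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>Example Page Title</title>
  <meta name="description" content="A short summary of the page, ideally under 160 characters.">
  <meta name="robots" content="index, follow">
</head>
```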
Mobile-first indexing means that Google predominantly uses the mobile version of a website’s content for crawling, indexing, and ranking.
Mobile SEO is the process of optimizing websites for the mobile web. This includes making sure that a website appears, and is usable, on mobile devices and taking steps to improve the rank of its pages in search engine results pages on mobile devices. Mobile SEO is important to prevent customers from abandoning their searches due to a bad experience – which can happen if your site takes too long to load.
⚡ Check my guide on how to check if a site is mobile-friendly.
Negative SEO is a strategy that is designed to harm the online reputation of a website by manipulating its search engine rankings through the use of malicious links pointing to the site. The negative SEO strategy usually involves placing links on websites with low-quality content and spammy backlinks for websites that are irrelevant to the topic or site’s niche.
A nofollow link is a link that carries the rel="nofollow" attribute. The “nofollow” attribute tells search engine crawlers not to follow the link or pass ranking credit through it, which helps webmasters avoid endorsing spammy or untrusted content. Links on social media platforms like Twitter and Facebook are typically nofollowed, and therefore do not directly affect the ranking of the linked pages in search engine results.
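In HTML, a nofollow link looks like this (the URL and anchor text are placeholders):

```html
<a href="https://example.com/untrusted-page/" rel="nofollow">some anchor text</a>
```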
A noindex tag on a webpage tells search engine crawlers that they shouldn’t add the page to their index or display it in search results. You might add a noindex tag to a webpage that’s still under construction or being modified. It ensures that users don’t get to see incomplete or wrong information on your page through the search results.
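The noindex directive is placed in the page’s head as a meta robots tag, like so (a minimal example):

```html
<head>
  <meta name="robots" content="noindex">
</head>
```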
A nosnippet tag in a page’s meta robots directives tells Google that you don’t want any text or video snippet from that page appearing in the search results. This blocks both featured and regular snippets from being displayed, but it can’t prevent a static image from appearing in search results (when it’s relevant and helpful to the user).
Off-site SEO refers to optimizations made outside the website with the purpose of increasing a website’s rankings and visibility in search engines. Off-site SEO aims at collecting external signals, such as those present on social media, to inform your website’s rank.
On-site SEO is a content marketing strategy that can help increase traffic to a website through search engine optimization. This approach involves examining and optimizing the site’s on-page factors such as page titles, meta descriptions, headings, keywords, and images. The goal is to make the content more appealing to search engines and improve the chances of it ranking higher on search results pages.
⚡ Check my on-page SEO checklist to learn more.
Organic traffic refers to the traffic that a website gets from unpaid (organic) search engine results for a given keyword or keyword phrase. It does not come from paid advertising such as Google Ads, but rather from the natural listings on SERPs.
⚡ You can easily check organic traffic in Google Analytics.
An orphan page is the (rather sad) name given to a page with no internal links pointing to it, i.e., it has been orphaned by the parent website. An orphan page can still be crawlable and indexable if it’s in the XML sitemap (or an external site points to it), but it won’t get any link juice from your website, which hurts both the page’s and the website’s overall SEO. It’s different from a dead-end page, which is a page that doesn’t link out to any internal or external page.
An outbound link is a type of external link that takes the user away from your website to a different website. These links are helpful because they help your webpage appear more credible by linking to helpful or information-endorsing resources, but they can also drive traffic away from your website. Most reference links are outbound in nature. It’s a common practice to make outbound links “no-follow” so they don’t pass on link equity to another website.
Paid search is a form of advertising whereby advertisers pay a search engine like Google to display their ads in the sponsored search results. Paid search is also called sponsored listings because the placements are paid for by the advertiser. The ads appear alongside the organic listings on search engine results pages (SERPs) and can be seen on both desktop and mobile devices.
Page speed is the measure of how long it takes for a web page to load. It is also the average time it takes for a browser to download all of the assets needed to display a web page.
PageRank (PR) is a metric that Google uses to rank its search results. The algorithm was developed in the late 1990s by Google co-founders Larry Page and Sergey Brin (and named after Page), and a version of it is still in use today. PageRank is based on the idea that a page with more links from pages that themselves have high PageRank will get higher rankings in Google’s search results.
Pogo-sticking is when a user searches for a term, clicks on the top link, and immediately moves back to click on a different link because the top link didn’t offer what the user was hoping to find. If it happens a lot, the algorithm might push that webpage down in ranking because it does not satisfy the search query as it should.
Private Blog Network
A Private Blog Network (PBN) is a network of websites built primarily to link to one central website in order to make it rank higher in search engine results pages for certain keywords. Because the links exist to manipulate rankings rather than to help users, PBNs go against Google’s guidelines and can result in a manual action.
QDD stands for Query Deserves Diversity. It is a concept attributed to Google’s ranking systems which says that for certain ambiguous queries, the results page should include a diverse mix of interpretations, topics, and sources rather than many similar pages.
QDF is an acronym for Query Deserves Freshness, a concept which says that for certain queries, such as breaking news or trending topics, fresher content should be ranked higher. This is important for Google because it aims to provide users with the most up-to-date information on time-sensitive topics.
Quality content is content that is both compelling and persuasive: it engages the reader, provides them with new knowledge, or solves a problem they are having. In short, it is what the reader actually wants to read.
RankBrain is a machine learning system that was developed by Google in 2015 and has been applied to the search engine’s web ranking algorithm. RankBrain makes use of artificial intelligence to analyze individual words and find out how they relate to different search queries. It then creates meaningful signals for complex queries, which are too ambiguous for an internet search engine to successfully return results for.
Ranking is the process of determining where a specific piece of content (usually a web page) should appear within a SERP (search engine results page). Good rankings are the main purpose of SEO.
Ranking factors are the criteria that Google (and other search engines) use to determine how high a website will rank in their search results, as well as how relevant it is to a given keyword.
Reciprocal links are a simple give-and-take exchange of links between two or more webmasters, where they agree to create external links to each other’s websites. This can help with two things: driving more traffic towards both sites by pooling the user pool and ranking. But reciprocal links solely for the purpose of improving rankings without providing value to the users can backfire.
A redirect takes users and crawlers to a different webpage from the one they clicked on. It may be temporary (HTTP 307) if the page is unavailable for a limited time while it’s being updated or optimized, and the user/crawler is directed to the next best resource. But when a website is revamped from scratch, you will need to set up permanent redirects (HTTP 301).
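As one illustration, on an Apache server a permanent redirect can be set up in the .htaccess file (the paths below are placeholders; the syntax differs on other servers such as Nginx):

```apache
# Permanently (301) redirect the old URL to the new one
Redirect 301 /old-page/ https://example.com/new-page/
```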
A referrer is a webpage that sends traffic your way, i.e., refers them to your website or the webpage. Similarly, you become a referrer when you link out to another website. How much information you want to give away to the other website when you link out to them can be controlled using referral tags. You can choose to pass on no referrer information (your website/resource information) to the website you are linking out to, or you can divulge limited information.
Reinclusion or reconsideration is the request you generate to Google to review and reindex content after your website has been slapped with a manual action, and you’ve fixed that. The Reinclusion request will make Google review your website again to see if it complies with the guidelines, and your website might be reindexed and become part of Google’s search results. The rankings may or may not have dropped once your content is reincluded.
Relevance, or more accurately, content relevance, is about how relevant the content of a webpage is to the topic/keyword it’s trying to satisfy. It’s crucial to good SEO and goes beyond content as well. Like the anchor text should be as relevant to the resource you are linking out to. The content and information for both internal and external links a webpage is linking to should be relevant. And the relevance is about more than just the literal meanings of the words and should take into account user intent as well.
Rendering is the process in which a search engine bot retrieves a web page, runs its code, and evaluates its content to understand its structure and layout. You can use Google PageSpeed Insights to see how Google renders a specific web page.
Responsiveness is a website’s ability to adapt to different devices and change to accommodate the relevant device screen better. For example, a responsive website will shrink for a mobile screen, and the content will be adjusted according to a phone’s viewport, while the same website would utilize the full viewport available on a laptop screen. Responsiveness is important for a better user experience and SEO.
Rich results are the additional information related to a search item that appears together with an organic search result. Rich results are typically displayed in the form of images, videos, text snippets, and other visual elements that provide a better understanding of what the searcher is looking for. They can also be used by marketers to create a more immersive experience for searchers who are interested in their brand or product.
Robots.txt is a text file that the server owner can put in a web server’s root directory to give instructions to the web crawlers. The rules in this file are expressed in a simple format.
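A minimal robots.txt might look like this (the disallowed paths and sitemap URL are placeholders):

```txt
User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://example.com/sitemap.xml
```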
Scraping is the practice of collecting data from a website (or multiple websites) manually or automatically. From an SEO perspective, scraping can be used to obtain selective data from your higher-ranking competitor’s website so you can learn from them and create something better.
Search engines are websites and applications that allow users to search for information across the internet, including images, videos, news articles, and more. Search engines use different types of algorithms to rank websites in their search engine results page (SERP). The algorithms are used by the company’s search engine to evaluate a web page based on relevance, authority/trustworthiness, and popularity. This evaluation is done with the main goal of providing users with the most relevant information possible.
Search engine robot
A search engine robot, also known as a crawler or spider, is a program that systematically browses the internet in order to discover pages and build a search engine’s index.
Search intent is the reason why the user has entered a query into the search engine. The intent can be informational, navigational, transactional, or commercial. The search engine’s goal is to find web pages that are closely related to the searcher’s query; the pages ranked higher will most likely represent what the searcher was looking for.
SEM is an acronym for Search Engine Marketing. SEM is a form of online marketing that involves the promotion of a company’s products or services through search engine advertising.
Search Engine Optimization
SEO refers to the process of optimizing your website so that it will rank higher in the search engine results pages when someone searches for specific keywords. This gives your content exposure so people remain aware of your brand.
Search Engine Optimizer
An SEO is a search engine optimizer, a person whose task is to audit and analyze websites in terms of their compliance with search engine standards and guidelines. An SEO is also often the person who implements on-page optimizations on a website or at least verifies their implementation.
Schema is a markup language that allows web developers to embed additional structured data in their website’s HTML code. This structured data can then be used by Google and other search engines to provide richer, more accurate results for their users.
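Schema markup is commonly embedded as a JSON-LD script in a page’s HTML. A minimal example describing an article might look like this (the headline, date, and author name are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "SEO Glossary",
  "datePublished": "2023-05-07",
  "author": { "@type": "Person", "name": "Jane Doe" }
}
</script>
```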
An SEO audit is a comprehensive analysis of your website’s visibility and performance from an SEO perspective. SEO Audits are necessary to determine what content is missing, what keywords need optimization, how well the site’s link profile is doing in terms of links, and if there are any technical problems that need to be fixed.
Search engine results pages (SERPs) are the pages that you see when you search for something on Google, Bing, Yahoo, or another search engine. SERPs have changed the way we find and share information online, as well as how we do business.
Serpstat is an all-in-one SEO platform that has a set of professional SEO tools, such as keyword research tools, site audit tools, backlink analyzer, or rank tracker. Serpstat, in comparison to other major players in the SEO market like Semrush or Ahrefs, offers a very competitive price starting at $55 per month.
⚡ Check my in-depth Serpstat review to learn more about the tool.
Sitelinks appear underneath the name of the website in the search results, and they allow the user to navigate to a different page on the website directly. They are most common in branded searches, i.e., when you search the name of the website and underneath the top search result, you see links to pages like Home, About Us, Contact, etc. This doesn’t just get you more real estate on the result page but also increases your chances of a click-through.
Sitewide links appear on most or all of the web pages of a website. Common examples are blogroll and footer links, which are repeated on every page and can take you directly to a different page or section of the website.
Share of voice refers to the ratio between a company’s online mentions and the total number of online mentions for its competitors in the same category. It is used to measure how often a company is mentioned when compared with its competitors.
Social SEO is the process of improving your ranking on social media sites like Facebook, Twitter, YouTube, and Instagram. In order for an individual or a business to rank well on social media sites, they must have high engagement with their followers.
An SSL (Secure Sockets Layer) certificate informs the users of a website that it uses encryption to protect their data and is safe to use. An SSL certificate enables HTTPS on a website, so the data is encrypted as it travels from your browser to the website’s server and vice versa. This gives users peace of mind when sharing identifiable and sensitive information, like credit card numbers, with a website.
Status codes or HTTP status codes are replies that a website server gives to a browser when the browser sends a request. For example, if you try to access a webpage from your computer and it returns a 404 error page, that’s the website server replying to your browser’s request to access that page by indicating that it doesn’t exist. There are five classes of status codes, most of which you don’t get to see, like 2xx status codes that indicate that the browser’s request has been successfully processed.
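The five classes are determined by the first digit of the code, which is easy to sketch in Python (a toy helper for illustration):

```python
def status_class(code: int) -> str:
    """Map an HTTP status code to its class by the first digit."""
    classes = {
        1: "informational",
        2: "success",
        3: "redirection",
        4: "client error",
        5: "server error",
    }
    return classes.get(code // 100, "unknown")

print(status_class(200))  # success
print(status_class(301))  # redirection
print(status_class(404))  # client error
```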
Structured data is one of the most important parts of SEO because it provides search engines with valuable information about your company, products, and services. This includes your contact information, physical address, email address, phone number(s), hours of operation, and reviews.
Subdomains are domains that go under the main domain of a website. They can also act as an alternative to subdirectories when it comes to organizing websites. They also allow for more than one website with the same domain name to exist at once since they have different subdomains.
Taxonomy is a hierarchical classification system. In SEO, it’s often used to categorize different types of pages so that both users and search engines can better understand how a website’s content is organized. A taxonomy can be thought of as a tree, branching into more sub-categories and sub-sub-categories as it goes down.
Technical SEO refers to various methods of optimizing a website that are not related to the content on the website. In other words, these are all the things that you do behind the scenes of your site. It is often said that technical SEO is done in an invisible way since it does not involve editing any content on your site. However, it can be just as important as content SEO and sometimes even more important than content SEO.
Technical SEO Audit
A technical SEO audit entails a deep dive into a website: an audit of its design and functionality, crawl performance, and site speed. It also includes checking for broken links, duplicate content, crawl barriers, and many other factors that can affect the site’s ranking on SERPs.
The title tag is a section of HTML code that appears in the header of a web page and is used to tell search engines what the page is about. Title tags are one of the most important parts of optimizing your website for search engine visibility.
A top-level domain (TLD) is the last segment of a domain name, the part that follows the final dot. The most commonly used TLDs are .com, .org, and .net, along with country-code TLDs like .cn or .de. A well-known, regulated TLD helps a website establish trust with its users.
Traffic is the number of visitors to a web page, blog, or other online location.
Unnatural links are links that were created with the intent to give the page a higher search engine ranking. They are often obtained by asking other webmasters or buying or trading for them.
Universal search is the process of integrating all types of results in the same SERP. This can include images, videos, news, tweets, social media content, and more.
User-Generated Content or UGC is any form of content (text, audio, video, etc.) that’s created by internet users and posted on a website. That’s very different from content created and posted by webmasters. UGCs, like reviews, testimonials, and endorsements, give websites transparency and authenticity (and free marketing) points. UGC can be a great way to promote a website and populate it with relevant content, but it can also be a double-edged sword.
A URL or Uniform Resource Locator is simply a unique web address that points to a unique resource on the internet, i.e., a webpage/website. Like fingerprints, every URL is unique and should ideally tell both crawlers and users about the content that’s in that webpage. URLs of web resources are used for linking as well.
URL parameters are simply values in a URL (following a question mark) that allow users and crawlers to see different variations of the same webpage (or track information related to a click-through for referral links). They are part of the query string of the URLs (hence the question mark), which are always there but not always visible. Different URL parameters can yield different results or return exactly the same page and content while changing something in the background.
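Python’s standard library can illustrate how parameters live in the query string (the URL here is made up):

```python
from urllib.parse import urlparse, parse_qs

url = "https://example.com/shoes?color=red&utm_source=newsletter"
params = parse_qs(urlparse(url).query)
print(params)  # {'color': ['red'], 'utm_source': ['newsletter']}
```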
A user-agent is what sits between you and a website and facilitates a connection. A web browser, for example, is a user agent that lets you approach and access the data on a website’s servers. User agents help websites identify useful information (like the device the browser is from) and serve a response accordingly. A user agent can be leveraged by webmasters for SEO.
User Experience or UX is how a user feels and experiences a website’s design when interacting with it. A good user experience is likely to keep the user on the website for longer and keep them returning, while a bad UX is likely to drive visitors away. In terms of SEO, UX is crucial when it comes to web design because if a website doesn’t offer the experience its users are expecting, they might bounce and never click it again. This is why Google considers UX when ranking your website.
Vertical search is search that focuses on a specific segment or type of content, such as images, videos, news, shopping, or jobs, rather than the whole web. Google Images and Google News are examples of vertical search engines.
Visibility (also called SEO visibility or search visibility) is a metric that tells you what percentage of the total organic clicks your website is getting when it appears in the SERP for a given keyword. If it’s on the second page, your visibility might be lower than 1%, but if it’s ranking in the top spot for a keyword, the visibility might be as high as 50%. It shows how many of the total “eyes” your website gets when people search for the keyword you are trying to rank for.
Voice search is a research field that is related to the study of natural language processing and speech recognition. The goal of voice search is to enable people to use their voice in order to interact with a computing system in order to retrieve information or complete particular tasks. Voice search technology is still in its infancy, with more than 98% of searches still being typed. However, voice recognition technology has become popularized by the adoption of Amazon’s Alexa and Google Home in recent years.
A web page is a single document or a collection of information on one topic (or a few closely related topics) that has its own unique URL. Each webpage of a website will have the same domain name in the URL. A webpage can have different types of content and media included in the body.
Website navigation is the way in which a website is organized and how the user can move from one page to another. It’s important that website navigation be intuitive and easy to use in order for users to have a positive experience.
A website is a collection of several web pages that all share the same domain name. The domain name of the website is usually the name of the business it represents. Most websites serve a distinct purpose (e-commerce, information, banking, etc.).
White hat SEO
White hat SEO refers to the techniques that rely on an ethical approach to search engine optimization. White hat SEOs are very concerned with how they will be perceived by Google and also by their customers. They make sure that they are abiding by the rules of search engines and that their techniques aren’t overused, which is why they are often referred to as “ethical” or “moral” SEOs.
Word count is the number of words in any document, article, or other distinct piece of content. Examples include 1,000-word blog posts, 4,000-word articles or 30,000-word e-books. Different content types have different word count norms and good practices.
WordPress SEO is the practice of optimizing a WordPress website for search engines. It’s a process that starts with the basics and moves into more advanced techniques. All websites, regardless of their content management system, should have some basic steps in place to ensure SEO success.
WP Rocket is a paid WordPress caching plugin that is becoming extremely popular among WordPress users because it significantly improves the speed and performance of websites. The main features of WP Rocket include one-click installation and set-up (no technical knowledge required), advanced page caching, cache preloading, sitemap preloading, GZIP compression, database optimization, minification, JS deferring, DNS prefetching, and more.
⚡ Check my in-depth review of WP Rocket.
An XML sitemap is an XML-format file that lists the URLs of a website’s pages, optionally with metadata such as when each page was last modified. It is ideal for helping search engines find and crawl your site more efficiently.
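A bare-bones XML sitemap with two entries looks like this (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2023-05-07</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/</loc>
  </url>
</urlset>
```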
⚡ Check my quick and to-the-point guide on how to find the sitemap of a website.
Yahoo was the first popular search engine that came online in 1994. It started out as a web directory, but when it got sizeable enough, the creators added search functionality to it. It expanded its product range to add several different things, and Yahoo Search was one of them. But its usage has been declining since the rise of Google.
Yandex (or Yandex Search) is the most popular and biggest search engine in Russia. Yandex Search is owned by a company named Yandex that provides internet products and services like email, ads, analytics, and more. Yandex works similar to other search engines like Google or Bing. The global market share of Yandex is 0.5% while in Russia it is as much as 40%.
⚡ To refine and narrow down your search results in the biggest Russian search engine, you can use Yandex search operators.
YouTube SEO is all about having your video show up in the first few results when people use YouTube to search for relevant terms. The key to YouTube SEO is making sure that you have a compelling title and description for your video, and that you are using the right keywords.
⚡ Check the list of YouTube search operators to learn how to search YouTube like a pro!