Last updated on June 22, 2021.
Here is the SEO glossary that contains all the SEO terms and definitions every SEO should know.
I wish someone had given me a similar glossary of SEO terms at the start of my SEO career. This would have made things so much easier and clearer.
I’ve created this list to save you a lot of trouble and time. I don’t want to seem braggy, but this is probably one of the most thorough and in-depth guides to SEO terminology on the web!
Note that I am constantly updating this list and adding new terms so that it always stays up to date.
Use the list overview below, or press Control+F (Windows) or Command+F (Mac), to find a specific SEO term.
⚡ If you are new to SEO, make sure to check SEO FAQs where I answer 39+ frequently asked questions about SEO.
SEO Glossary A-Z
The terms are in alphabetical order.
Above The Fold
Above-the-fold is usually the first thing website users see when they land on your webpage without scrolling down. The term “above the fold” comes from newspapers since they used to be delivered folded in half, and the top half of the newspaper’s first page was considered essential since that was the first thing people saw (without unfolding the newspaper).
In the same vein, the first thing a user notices about a website or webpage should be captivating enough to keep the viewer browsing.
Algorithm
Algorithms (also referred to as an algo or a Google algorithm) are very complex tools whose task is to retrieve data from the index and present it in the form of the best and the most relevant search results possible.
Of course, Google uses combinations of different algorithms (also called “baby algorithms”) that take into account different types of signals to rank web pages.
Some of the most popular Google algorithms (or components of the Google search algorithm) are called RankBrain, BERT, or Caffeine.
Algorithm update
An algorithm update (or an algo update) means a change (big or small) to the way Google algorithms assess, value, and rank websites. Google algorithms are updated multiple times a day! It is not possible to track all of those changes and adjust your website to each one of them.
PRO TIP: You can use one of many SEO “weather” forecast tools that show ranking fluctuations on a daily basis. One such tool is Mozcast.
A few times a year Google releases a broad core update which is a major algorithm update that usually impacts the rankings of many websites.
In the past, Google algorithm updates received special names like Panda or Penguin. Now they are simply called core updates like December 2020 Core Update.
ALT text
ALT text (alternative text) is the HTML attribute used to describe the appearance and content of an image used on a website. ALT text is important both for users of screen readers and search engine robots because it conveys the meaning of the image.
PRO TIP: Remember that ALT text is not the place to stuff your keywords but to describe the image.
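To make this concrete, here is a minimal Python sketch (standard library only; the image filenames are invented for the example) that flags `img` tags missing ALT text:

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collects the src of every <img> tag that lacks a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if not attr_map.get("alt"):
                self.missing.append(attr_map.get("src", "(unknown)"))

checker = AltChecker()
checker.feed('<img src="cat.jpg" alt="A tabby cat sleeping on a sofa">'
             '<img src="logo.png">')
print(checker.missing)  # → ['logo.png']
```

A check like this is handy in a quick audit script; for real-world pages, a dedicated crawler or SEO tool will do the same job more robustly.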
Anchor text
Anchor text is the content (usually just a few words) that is hyperlinked to another webpage. Anchor text should ideally be relevant to the page’s topic or the content it’s linking out to. It’s clickable and usually stands out from the rest of the content because it’s a different color and underlined, though you may choose to blend it into the content.
There are different types of anchor text, including exact match, partial match, and generic (like click here). Relevant anchor text helps both users and crawlers understand how the current content is connected to the webpage it’s linking out to. For example, my link to the technical SEO audit guide contains relevant anchor text.
ASO
ASO (App Store Optimization) refers to the process of improving the visibility and rankings of mobile applications (apps) within app stores (mainly Google Play and iTunes). ASO is also referred to as mobile app SEO or app store marketing. The purpose of ASO is mainly to increase app downloads.
There are a lot of similarities between SEO and ASO as both rely on similar ranking signals, such as quality, user experience, indexability, user signals, etc.
Similarly, lots of traditional SEO techniques are also true for mobile app SEO. These include doing keyword research, optimization of app name and title, indexation of apps in Google, app ratings and reviews, optimization of app CTR, and more.
Backlink
A backlink (also referred to as an incoming link, inbound link, or inlink) is a link from one website to another. Any external link pointing to your website is a backlink.
Google, Bing, and other search engines use backlinks as a ranking factor, and they openly admit it. In this respect, backlinks are treated as “votes of confidence” by search engines.
Since backlinks play a very important role in most search engine algorithms, they are very often overused or created purely with the purpose to mislead search engines.
Bing
Bing (or Microsoft Bing) is the second most popular search engine that is operated by Microsoft. Similar to Google, Bing also offers different search services, such as web search, video, maps, images, and more. Bing takes into account 1000+ factors when evaluating and ranking websites. The global market share of Bing is 6% with Google having a market share of 88%.
⚡ Check the list of Bing search operators to discover advanced search features of this search engine.
BERT
BERT stands for Bidirectional Encoder Representations from Transformers and is a Google neural network-based machine learning technique used for natural language processing and pre-training.
The purpose of BERT is to help computers better understand natural language and better discern the context of words in complex search queries.
BERT is part of the Google search algorithm and is now used in almost all English queries. Thanks to BERT, Google can return more relevant results to users.
Black hat SEO
Black hat SEO is the exact opposite of white hat SEO. Black hat SEO refers to the practices that are against search engine rules and recommendations. The purpose of black hat SEO is to deceive a search engine so that it ranks a website higher than it actually deserves.
Some common black hat SEO practices include keyword stuffing, link schemes, cloaking, or hidden text. Google is quite clear about what it considers black hat SEO and describes it in detail in its quality guidelines.
Black hat SEO techniques often result in a website being punished either algorithmically or by means of a manual action. A penalty is usually connected with a huge drop in traffic or a website being removed from search results.
Bounce rate
Bounce rate is the percentage of your website visitors who “bounce away” from your website without taking any action, i.e., without clicking on anything on the page they landed on or going to another page. Let’s say a user clicked on your website’s link in Google SERPs, scrolled down to read the complete page (scrolling is not considered an “action”), and didn’t interact with the website in any other way (clicking on another link, filling out a form, etc.). That visitor would be counted as a “bounce.”
As a webmaster, your job is to keep your bounce rate relatively low, but a high bounce rate doesn’t necessarily mean your webpage is not doing its job or that the content needs to be changed. Use time-on-page to help evaluate that.
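Since bounce rate is just bounced sessions divided by total sessions, the calculation can be sketched in a few lines of Python (the numbers are invented):

```python
def bounce_rate(bounced_sessions, total_sessions):
    """Percentage of sessions that ended without any interaction on the landing page."""
    if total_sessions == 0:
        return 0.0
    return bounced_sessions / total_sessions * 100

# e.g., 320 of 800 visitors left the landing page without clicking anything
print(f"{bounce_rate(320, 800):.1f}%")  # → 40.0%
```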
Breadcrumb navigation (also known simply as breadcrumbs) is a secondary type of navigation that helps users better understand their location within a website’s structure.
Breadcrumbs are an important element of the correct internal linking structure. They are usually located on top of a web page and contain links to higher-level pages like the category or homepage.
Broken link
A broken link is a URL that doesn’t return a webpage. That page:
- May have been removed from the website.
- Doesn’t exist (wrong URL).
- May have been moved to a newer page without a proper redirect.
When a user clicks on a broken link, they are greeted with a 404 error page. Broken links can be harmful to User Experience (UX), and by extension, hurt a website’s SEO.
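One rough way to detect broken links programmatically is to request each URL and inspect the HTTP status code; 404 (Not Found) and 410 (Gone) are the classic broken-link responses. A minimal sketch using only the Python standard library (the example URL is a placeholder):

```python
import urllib.request
import urllib.error

def link_status(url, timeout=5):
    """Return the HTTP status code for a URL, using a HEAD request."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return response.status
    except urllib.error.HTTPError as err:
        return err.code

def is_broken(status_code):
    # 404 Not Found and 410 Gone both mean the page no longer resolves
    return status_code in (404, 410)

# is_broken(link_status("https://example.com/old-page"))  # real network call
```

Dedicated crawlers do the same thing at scale, and also treat timeouts and DNS failures as broken links.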
Cache
Cache means something stored (or hidden) in a nearby place for future use. In computing, a cache can be hardware (like the cache memory in processors) or a software allocation that temporarily stores important or frequently used files so they can be served faster.
A web cache is the storage of part of web data in a device (user’s computer or smartphone), so if the website is accessed again by the same device, it can be served quicker than before. A web cache is stored in the browser files.
Cached page
A cached page can mean two things:
- A webpage that has been saved in its entirety by the servers of a search engine (like Google or Bing) so it’s accessible to users even if the website is down.
- A webpage is saved by a user’s browser (on the local device), so even if the user doesn’t have access to the internet, they might have access to some (or all) of the information on that page (through cached data).
Cached pages sometimes help users find obsolete information that has been updated on the website and no longer exists on any “live” webpage.
Crawl budget
Crawl budget is the number of pages that Googlebot crawls (or goes through) every time it visits your website. Once your website exhausts that budget, the crawler moves on to a different website, but that doesn’t mean it will only revisit the “known” old webpages and miss any new ones you’ve set up.
The budget is set and updated automatically by Google. It’s based upon your website size (number of web pages), update frequency, website speed, and internal linking, so it usually takes into account any new pages you have set up and updates your crawl budget accordingly.
Crawl budget doesn’t impact most websites’ SEO or ranking (simply because crawlers are efficient and the budget is almost always adequate). But big websites with several thousand pages and where new pages are added very frequently should optimize for crawl budget.
⚡ Check my guide to crawl budget optimization.
Canonical URL
A canonical URL (also referred to as a canonical tag or a canonical link) is the URL of a web page which is the main or the “preferred” version of a given piece of content.
It helps prevent the problem of duplicate content when there are many similar web pages on a website.
In SEO, canonicalization is the process in which, out of a number of similar URLs, one URL is chosen as the main one and the representation of other URLs.
If you have multiple web pages with similar data that can be considered duplicate content by Google, you can tell the search engine which one is the “master copy” and thus deserves to rank higher by adding a canonical tag on a webpage. This is, however, only a hint and Google may still choose another URL as canonical if it thinks it is more relevant.
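For illustration, a canonical URL is declared with a `<link rel="canonical">` element in the page’s `<head>`. The sketch below (Python standard library only; the URL is invented) extracts that declaration from a page’s HTML:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Grabs the href of the first <link rel="canonical"> element."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "link" and attr_map.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attr_map.get("href")

finder = CanonicalFinder()
finder.feed('<head><link rel="canonical" '
            'href="https://example.com/red-shoes/"></head>')
print(finder.canonical)  # → https://example.com/red-shoes/
```

Remember that, as the text above notes, this element is a hint to Google, not a directive.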
Core Web Vitals
Core Web Vitals are a set of Google performance metrics that highlight the key aspects of the user page experience. Core Web Vitals include the 3 following metrics:
- Largest Contentful Paint (LCP), which focuses on loading time
- First Input Delay (FID), which is about interactivity
- Cumulative Layout Shift (CLS), which focuses on visual stability
The Core Web Vitals update, which makes Core Web Vitals (and other Google page experience signals) a ranking factor, has started rolling out and will be fully rolled out by the end of August 2021. That’s why it is important to start optimizing for them right now.
Crawling
Crawling is the process in which a search engine bot discovers and analyzes new or updated content on the internet. A web page, an image file, and a video are all a type of content. Search engine bots discover content via links.
Crawling is not equal to indexing. Just because a website has been crawled does not mean that it has been or will be indexed.
CTR
CTR stands for click-through rate, which is the ratio of users who click on a specific link to the users who view it. CTR is often used in paid ads (Google Ads) to measure the success of an ad campaign.
In SEO, CTR tells you what percentage of users who viewed your web page in search results actually clicked on it. You can check the CTR of your web pages in Google Search Console (under Performance).
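The CTR formula itself is simple; a quick Python sketch with invented numbers:

```python
def ctr(clicks, impressions):
    """Click-through rate as a percentage of impressions that led to a click."""
    if impressions == 0:
        return 0.0
    return clicks / impressions * 100

# e.g., a page shown 2,000 times in search results and clicked 90 times
print(f"{ctr(90, 2000):.1f}%")  # → 4.5%
```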
Disavow
Disavow means telling Google (or another search engine) not to consider specific backlinks when ranking your website. You cut any positive or negative ties to a backlink when you disavow it, so even if it’s functional (still points to your website, and you might get traffic from it), it’s “dead” in terms of SEO. You disavow any backlinks that you think are doing your website more harm than good.
When used effectively, disavowing a link prevents a spammy site from bringing your website ranking down by backlinking to it. But it can also be a double-edged sword because disavowing a link also means that it stops passing any link juice to your website/webpage.
Disavowed links are not removed, and you would need to contact the webmaster of the site or original poster to get them removed.
Do-follow link
A do-follow link is any backlink that Google considers when ranking your website. By default, all backlinks are do-follow. They have to be switched to no-follow (or you need to disavow them) if you don’t want Google to associate your website with them, e.g., when they come from ill-reputed sources and might get your rankings downgraded.
Do-follow links enable Google to determine which websites are linking back to you and calculate your ranking “worthiness.” You don’t have to add a do-follow attribute to make a link do-follow, as it’s the default configuration.
Domain Authority (DA)
Domain Authority (DA) is a metric created by the SEO company Moz that combines several ranking factors and gives you a rough, quantifiable idea of how “authoritative” your website looks to Google and how likely it is to rank higher than similar websites. Ahrefs offers a similar metric through its Website Authority checker.
It gives you a score on a hundred-point logarithmic scale, which means going from a DA score of 20 to 30 is much harder than going from 10 to 20.
Google neither uses nor endorses Domain Authority. However, it’s speculated that Google might have its own metric, or a combination of metrics, that reflects a website’s authoritativeness. The term domain authority is sometimes also used to describe a domain’s relative strength as strong or weak, but in that sense it’s not a quantified metric.
DuckDuckGo
DuckDuckGo is a privacy-focused search engine that doesn’t profile its users and shows every user the same results for a given phrase (search term), though the search results may vary by location. It doesn’t personalize search results, doesn’t collect data from its users, and generates its revenue (in part) by displaying keyword-relevant ads. DuckDuckGo doesn’t have a browser yet, but you can use its browser extension to leverage the privacy that this search engine offers.
⚡ Make sure to check the full list of search engines that don’t track to discover more similar privacy search engines like DuckDuckGo.
Duplicate content
Duplicate content is when a webpage has content that’s either a word-for-word or slightly reworded copy (with the same information “value”) of another webpage on the internet. When Google sees multiple pages with the same duplicate content, it has difficulty ranking them, so it might dilute the rankings of all of them.
It doesn’t even have to be the whole page, since Google defines duplicate content as “substantive blocks of content” that either exactly match or are “appreciably similar” to other content on the internet.
If the duplicate content is from your own website because several web pages on your website have almost identical content, you can use “canonicalization” to tell Google which webpage it should “prefer” and rank higher than others for a search query that all those pages satisfy.
E-A-T
E-A-T stands for expertise, authoritativeness, and trustworthiness and is a concept originating from Google’s Search Quality Raters’ Guidelines.
Your task as an SEO is to make sure that a website has a decent amount of E-A-T. If it does not, then you need to provide recommendations on how to improve it.
E-A-T is becoming more and more important, especially for YMYL (Your Money Your Life) websites. In some niches like medicine or law, it is now practically impossible to rank without a decent amount of E-A-T.
Featured snippet
A featured snippet is a different form of presentation of a search result. Instead of being a link to the site, a featured snippet usually displays one or two sentences from a website in an attempt to directly answer a user’s query. A featured snippet can also be a list or a table.
Featured snippets are usually placed above organic search results and below the sponsored results (ads). Another name for featured snippets is answer boxes.
Googlebot
Googlebot is the name given to the algorithms and code (i.e., the “crawler”) that go through every website and tell Google what it’s about. Googlebot is responsible for discovering and indexing all the web pages on the internet so that Google knows about almost every piece of content (on all the webpages) that’s currently available online.
It allows Google to display the most relevant and up-to-date results when someone types in a term (or phrase) for Google to search. A website should be easy for the Googlebot to crawl through. Otherwise, Google won’t know about all the web pages you have on that website. As per Google, there are two different types of Googlebot crawlers:
- A Googlebot that simulates and looks at the web from a desktop user’s perspective.
- A crawler that sees the web through a mobile user’s lens.
Google Autocomplete
Google Autocomplete is a feature available in Google Search designed to help users complete their searches faster. Once a user starts typing, Autocomplete starts to automatically predict the queries based on what is being typed.
Autocomplete allows for completing your search without the need to type all the letters and words of your query. This is a huge time saver, especially for mobile users.
Google bomb
A Google bomb is when enough backlinks force the Google algorithm to rank a webpage higher than it should be on its core merits, for unrelated and irrelevant keywords. Two famous examples of Google bombs are:
- In 1999, the search term more evil than Satan himself took users to the Microsoft homepage.
- After December 2003, searching for miserable failure returned the official biography of US President George W. Bush. Google fixed it, and many other Google bombs, by 2007.
Google bombing is almost obsolete now, thanks to the advances in Google search algorithms.
Google dance
Google dance has two definitions:
- Google dance referred to the volatility and drastic ranking changes that used to happen when Google updated its ranking algorithms and factors every month. Websites that conformed to the new rules jumped in rankings, and previously high-ranking websites saw aggressive drops. Since Google has changed how often, and how, it updates its ranking factors and algorithms, these Google dances are a rarity nowadays.
- When a relatively new website jumps ranks because a new, even minor, ranking algorithm or indexing change favors it (without disrupting the other websites’ ranks too much for the same query), it might also be considered a Google dance.
Google Drive
Google Drive is a cloud-based Google storage service that allows for storing files on Google servers, synchronizing files across different devices, and sharing files with other users. To be able to use Google Drive, you need to have a Google account. Every user gets 15 GB of space for free.
⚡ Check the list of Google Drive search operators to learn how to search Google Drive like a pro.
Google Hummingbird
Google Hummingbird was a significant update to Google’s search engine algorithm that happened in 2013. Unlike previous updates that were improvements over the core algorithm, the Hummingbird update included major changes to the core algorithm itself.
It’s where Google achieved its aim of improving “semantic search”: displaying results that take into account a user’s search intent instead of just the literal meaning of the keywords. Google Hummingbird was considered the most significant update to the search engine algorithm since 2001.
Google My Business
Google My Business (also abbreviated as GMB) is a free Google tool for managing the online presence in Google Search and Google Maps of businesses or organizations that have a physical address.
Google My Business (GMB) helps customers find your business and the services you offer. GMB is a basic tool for local SEO.
Google Passage Indexing
Google Passage Indexing (also called passage ranking, passage-based ranking, or passage-based indexing) means that when ranking a web page (and determining its relevance) Google may take into account only one specific passage from it instead of analyzing the whole web page and all of its content.
When announcing the feature for the first time, Google used the unfortunate wording “passage indexing” which caused a lot of stir in the SEO community. Google, however, confirmed that this is not indexing but rather a ranking change. Google passage ranking does not require any optimizations from the side of a website owner. Its purpose is to help websites with poor SEO optimizations and diluted content rank better.
⚡ Feel free to check my whole guide on Google passage ranking.
Google Panda
Google Panda was Google’s ranking algorithm update in 2011, which allowed the search engine to weed out low-quality, thin content and rank good-quality, authoritative content higher. Many of the excellent content practices can be traced back to Google Panda because before it, many websites focused more on producing a lot of low-quality content (content farming) that provided little or no value to the reader.
Google Panda also made it difficult (even impossible) for webpages with duplicate, untrustworthy, user-generated, and mismatched content to rank higher, and gave good content/websites that offered real value to the user a fighting chance.
Google Penguin
Google Penguin refers to a significant 2012 update to Google’s search engine and ranking algorithm, which went through several updates until 2017 and is now part of the core algorithm. The Google Penguin update penalized websites that boosted their rankings through keyword stuffing or black hat backlinking.
Websites that previously relied on buying or cultivating backlinks from unrelated or low-ranking sites, simply to rank higher by the sheer virtue of numbers, had to revert to acceptable SEO practices to rank higher on SERPs. Penguin affects websites as a whole, instead of just specific webpages.
Google Pigeon
Google Pigeon was a 2014 algorithm update that focused on local search results. It optimized searches for proximity and displayed results that were closest to the searcher’s physical location. This was done by connecting Google’s local search algorithm more closely to the web search algorithm.
Previously, the two relied on different metrics and signals for ranking, and Google Pigeon increased the overlap between them. It had a drastic impact on local business rankings, especially in the following weeks, and it also reduced the number of top location results from seven to three.
Google Sandbox
Google Sandbox (not to be confused with a Google program of the same name) is a supposed phenomenon that prevents new websites from ranking higher on Google SERPs. While Google denies such a thing exists, many SEO experts believe it does, based on their experience trying to rank newer websites.
If you consider ranking factors like authoritativeness and backlinks that new websites generally lack (even if they tick most or all of the other SEO checkboxes), it’s easy to see why the Google Sandbox phenomenon might exist.
However, judging by a few “breadcrumbs” dropped by Google employees, a website might need to be a few months old to get out of the supposed sandbox. That’s probably enough time for search engine algorithms to learn enough about your new website.
Google Search Console
Google Search Console or simply GSC (formerly known as Google Webmaster Tools) is a free Google tool that allows website owners to monitor the process of indexing and crawling their website.
GSC also provides a wealth of data about the rankings of a website, number of clicks, CTR, and more. It also helps website owners better understand how their website is performing, what issues it is facing, and how to troubleshoot potential problems.
⚡ If you have a new WordPress website, you might want to check how to verify Google Search Console in WordPress. If you already have Google Search Console set up and want someone else to analyze its data, check how to add a new user to Google Search Console.
Google Search Console Crawl Stats Report
Google Search Console Crawl Stats Report (or simply crawl stats report) is a relatively new GSC feature that allows you to monitor and analyze how Google is crawling your website. The report lets you analyze the availability of your hosts, the number of crawl requests, average response time, and more. You can also group crawl requests by the response, filetype, purpose, or Googlebot type.
⚡ Check my full guide to the Google Search Console crawl stats report.
Google Search Operators
Google Search Operators (also called search commands or search parameters) are special commands and characters that you can type straight into Google to filter, refine, or narrow down the results. With search operators, it is way easier to find specific information or search a specific website. The most common Google Search operators include:
- site: (displaying the results only for the domain specified)
- "" (forcing an exact word match)
- - (excluding a certain word from your search)
⚡ My list of Google search operators contains 50+ search operators that will let you search Google like a pro!
Google Search Quality Evaluator Guidelines
Google Search Quality Evaluator Guidelines (also called Google Search Quality Raters Guidelines) is a publicly available document that is given to raters, people employed by companies working for Google. The raters’ task is to rate websites based on those criteria.
The raters do not influence the rankings of a website. Their ratings are used to improve Google algorithms so that they can return better results in the future.
The guidelines outline all the necessary elements that should be taken into account when evaluating a website. One of the very important elements discussed within guidelines is E-A-T (expertise, authoritativeness, and trustworthiness).
The Search Quality Evaluator Guidelines contain a ton of useful information and should be read by absolutely every SEO, at least several times.
Google penalty
A Google penalty is a situation where a website drops out of the search engine index or its rankings for given keywords drop dramatically. A Google penalty is almost always the result of breaking Google webmaster guidelines. There are two main types of penalties you can get: manual and algorithmic.
Google is getting smarter and smarter at recognizing different mischievous SEO techniques. That’s why more and more penalties are now handled algorithmically.
Google Webmaster Guidelines
Google Webmaster Guidelines are a set of rules and guidelines provided by Google with regards to optimizing a website for search and doing it the white hat SEO way.
Google webmaster guidelines should be your SEO bible. I’m not kidding. Make sure to read Google webmaster guidelines if you still haven’t done it.
Grey hat SEO
Grey hat SEO is an area somewhere between black hat SEO and white hat SEO. More often than not, it means being on the verge of breaking Google webmaster guidelines, or breaking just a few of them without going to a total extreme.
A grey hat SEO may have quite a decent site with nice content but still rely on tactics, such as buying relatively cheap links from guest posts on blogs that are part of a PBN (private blog network).
PRO INSIGHT: It is very easy to slip over to the black hat side of SEO once you start to see that some of your grey hat SEO tricks work.
Headings
Headings help both users and search engine robots better understand the content of a web page. There are six heading levels (from H1 to H6), with the H1 tag being the most important one.
Headings define different parts of a web page and how they relate to one another. They should contain the keywords you are optimizing a given web page for. Headings are also very useful for users of screen readers.
You can treat headings as chapters in a book where the web page is that book. Below is the structure of headings on my guide to Google crawl stats report.
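As a rough illustration of that book-chapter analogy, here is a short Python sketch (standard library only; the sample markup is invented) that extracts and prints a page’s heading outline:

```python
import re

def heading_outline(html):
    """Return (level, text) pairs for every h1-h6 tag, in document order."""
    pattern = re.compile(r"<h([1-6])[^>]*>(.*?)</h\1>", re.IGNORECASE | re.DOTALL)
    # strip any nested tags inside the heading text before returning it
    return [(int(level), re.sub(r"<[^>]+>", "", text).strip())
            for level, text in pattern.findall(html)]

sample = """
<h1>Google Crawl Stats Report</h1>
<h2>What the report shows</h2>
<h3>Crawl requests</h3>
<h2>How to read the data</h2>
"""
for level, text in heading_outline(sample):
    print("  " * (level - 1) + f"H{level}: {text}")
```

Regex parsing is fragile on real-world HTML; for anything beyond a quick check, a proper HTML parser is preferable.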
HTTP
HTTP or Hypertext Transfer Protocol is a system of rules that govern how information and data are conveyed over the internet and is considered the foundation of data communication and transfer over the web.
HTTP connects a user to a website using a client-server model, where the client is the device the user is employing to access the internet (computer, phone, tablet), and the server is where that website lives. The content and data that are transferred using HTTP are in “Cleartext,” i.e., legible.
HTTPS
HTTPS or Hypertext Transfer Protocol Secure is simply HTTP with an additional security layer, i.e., encryption. Unlike data that’s transferred through an HTTP connection in cleartext, which is vulnerable and open to exploitation, data transmitted through HTTPS is encrypted. So even if someone can siphon off data when it’s being transferred between your device and a web server, it would be meaningless gibberish to them.
The encryption algorithm scrambles the data when it’s being transferred, and it’s unscrambled when it reaches the other end, so it’s legible to you or the server. An HTTP website can be switched to HTTPS by obtaining and installing an SSL/TLS certificate.
⚡ Check my article about the difference between HTTP and HTTPS.
Index
An index is a directory, a database of every webpage on every website that a search engine knows about. That’s why every new website or webpage needs to be indexed before it starts appearing in search results: if it’s not in the index, Google (or another search engine) won’t show it in their search results. Google’s index contains hundreds of billions of webpages (for now), and it might soon reach trillions.
Indexability
Indexability is often defined as a search engine’s capability of indexing a webpage, but it’s actually the webpage’s characteristic. An indexable webpage is one that a search engine can not just crawl but also add to its index. Indexability is one of the first things you need to check if you are looking into a website’s online presence, because if a webpage is not indexable, Google doesn’t know about it, and it won’t show up in searches.
Indexed page
An indexed page is a webpage that Google (or another search engine) knows about and is part of its index. The search engine knows what’s on the page, so if the user types in a relevant query, the indexed page will turn up in the results. A webpage can exist on the internet without being indexed, but it will only be accessible if a user types in the exact URL and won’t turn up in related search queries.
Indexing
Indexing means storing the content discovered during the process of crawling and rendering. The information and data gathered during the crawling process are stored in the index.
Indexing is not equal to ranking. Just because a web page has been indexed does not mean that it will achieve high rankings.
Internal link
An internal link is a hyperlink that, if clicked, will take you to another webpage on the same website (hence the name). Internal links are necessary for both users and web crawlers. They help you guide a user through multiple related resources on your website (or through a sales funnel), and they help crawlers see your website as a connected whole.
Internal links are essential for good SEO as they create an “information tree” in your website and allow you to “help” your low-ranking webpages climb up by internally linking them to higher ranking and more visited pages.
Instagram SEO refers to the process of optimizing an Instagram account and the content published through it. The purpose of Instagram SEO is to increase organic visibility both within Instagram and search engines like Google which can index Instagram profiles and posts.
Instagram SEO includes tasks such as optimizing an Instagram profile and making it public, choosing a proper profile image, using a searchable business name, adding keywords as hashtags, putting keywords in the name or username of an account, and adding a trackable link in the profile’s bio.
A keyword (also known as a keyphrase or key phrase) is a word or a combination of words that defines and describes a given topic. Keywords usually refer to online content, such as articles or blog posts.
In SEO, keywords are the terms and phrases that users enter into search engines to find the information they are looking for. Keywords understood as words typed into a search box in a search engine are also called queries or search queries.
Keyword research is the process used by SEO professionals to find and analyze keywords (or search terms) that users type into search engines to find specific information, products, or services.
With keyword research, you can learn what keywords are most often searched for, how competitive they are, and how difficult it is to rank for a given keyword.
The easiest way to find keywords for SEO is to use one of many available SEO keyword tools. There are many both free and paid keyword tools you can use.
The most popular keyword tool is Google Keyword Planner which gives you keyword data straight from Google.
Keyword density is simply the ratio between the number of times a keyword is used in a piece of content and the total word count of that content. For example, if the phrase “keyword density” is used five times in a 500-word blog post on the subject, the keyword density would be 1%. It’s an important metric for content optimization.
It’s argued how important keyword density (and placement) is to the ranking and how much it helps the search engine algorithms understand what your content is about, but it’s always recommended to use your keywords as naturally as possible and focus on creating valuable content.
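The calculation above can be sketched in a few lines of Python. This is a minimal illustration of the ratio, not how real SEO tools count words:

```python
import re

def keyword_density(text, keyword):
    """Occurrences of the keyword divided by the total word count, as a percentage."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    # Count non-overlapping occurrences of the (possibly multi-word) phrase.
    occurrences = text.lower().count(keyword.lower())
    return occurrences * 100 / len(words)

# Five uses of a phrase in a 500-word post -> a density of 1%.
post = "keyword density " * 5 + "word " * 490
density = keyword_density(post, "keyword density")  # 1.0
```

Note that multi-word phrases make the “right” formula debatable (do the phrase’s own words count toward the total?); the sketch uses the simplest convention.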
Keyword stuffing refers to the unnaturally high usage of the primary keyword (or multiple important keywords) in your content in order to rank higher. It used to work for a relatively simplistic search engine algorithm (like Google’s used to be) which ranked webpages based on the number of keyword hits and keyword frequency.
Now that the algorithm has evolved and focuses more on the value that a webpage provides to the reader, keyword stuffing has become obsolete and, as a black hat SEO technique, can negatively impact a page’s ranking.
A link audit is a process of analyzing all the incoming links to a website in an attempt to assess if they may be hurting the website and negatively impacting its rankings.
Doing a link audit and submitting a disavow file to Google Search Console may prevent a website from being punished for link manipulation or may help it recover from a manual penalty. Tools like Semrush and Ahrefs can help you do a link audit.
Link building is the process of acquiring links to a website from other high-quality and thematically related websites. Acquiring does not mean buying but rather earning those links by creating such high-quality content that others genuinely want to link to it.
Examples of high-quality shareable content are stats reports, in-depth guides, or research results, to name just a few.
A link profile is your website’s portfolio of connections (through backlinks) to other websites. For example, if most of your website’s backlinks come from a sister website or lower-ranking websites within your niche, it has a relatively weak link profile, since it doesn’t have much spread or authority. Because a link profile combines both the number and the quality of the backlinks your website has, it can be considered an important ranking factor.
Local SEO is the process of optimizing a website for visibility in local search results. Local SEO, just like regular SEO, is about increasing organic (also called free or natural) traffic to a website. However, the difference between SEO and local SEO is that the latter focuses on increasing the visibility of local and usually smaller businesses that have physical locations.
A geographic location is a vital component of local SEO. An example of a local query might be “seo near me” or “seo los angeles” (assuming I am located in LA). All of the technical and mobile SEO techniques, of course, still apply to local SEO. However, there are a few local SEO only tactics as well.
The basic and the most important component of local SEO is to claim your firm’s Google My Business listing.
A log file records every request that the server containing your website receives and whether that request was fulfilled or not. For example, if a crawler requested access to ten webpages, got to visit eight of them, and two resulted in a 404-error page, it will be recorded in your log file.
Log files inform you about: requested URLs, HTTP status codes, IP addresses of website visitors (partial demographic data), the time/date of each request, and request details that can help you distinguish between user and crawler traffic. A log file can be a gold mine of information that can help improve your website’s SEO.
Log file analysis
Log file analysis is an element of technical SEO that deals with drawing out valuable insights from data on user/crawler interaction with your website. It helps you understand the crawlability and indexability of your website. It also helps you identify your strongest and weakest pages and give you relevant information about the traffic of your data. It can be both qualitative and quantitative in nature.
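As a sketch of what that analysis can look like, the snippet below parses a single access-log line and applies a crude user/crawler split. The log format (Apache “combined”), the IP, and the URL are assumptions for illustration; real bot verification should use reverse DNS rather than trusting the user-agent string:

```python
import re

# A simplified parser for the Apache "combined" log format -- an assumption;
# your server may log in a different format.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<user_agent>[^"]*)"'
)

def parse_line(line):
    """Return the log line's fields as a dict, or None if it doesn't match."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

def is_crawler(entry):
    """Very rough heuristic: look for known bot tokens in the user-agent."""
    return any(bot in entry["user_agent"] for bot in ("Googlebot", "bingbot"))

line = (
    '66.249.66.1 - - [22/Jun/2021:10:00:00 +0000] '
    '"GET /seo-glossary/ HTTP/1.1" 200 5123 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"'
)
entry = parse_line(line)
```

Aggregating such parsed entries (e.g., counting crawler hits per URL) is what reveals which pages bots visit most and which they ignore.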
A long-tail keyword is longer and more specific than a regular keyword. When users type in a long tail keyword they know exactly what they are looking for. Long-tail keywords are usually made of several words.
Long-tail keywords are usually much less competitive and are searched for much less frequently. However, there are many more long-tail keywords than regular keywords. About 15% of all the searches made on Google each day are completely new and have never been searched before.
A manual action is a penalty issued by a human reviewer to a website that has been found to use unethical SEO techniques and to violate Google’s webmaster quality guidelines. It’s different from the ranking drops that happen when the search engine algorithm penalizes your website, and it can be significantly more severe. Google issues a manual action if you try to bypass (or fool) the search engine algorithm to rank higher. A manual action can result in partial or full removal of your website from search results.
A meta description is a short summary of a webpage that’s often (but not always) displayed underneath your website link in the search results. It informs your readers what the webpage is about (in addition to the headline) to entice a click. Since SERP real estate is limited, meta descriptions should ideally be under 160 characters.
Meta tags (also referred to as metadata) provide information about a web page to search engines. Meta tags are practically invisible to users from the level of a web page because they are actually small snippets of code.
The most important meta tags are meta title, meta description, meta robots, meta viewport, and meta charset.
Search engines use the information contained in meta tags to display a web page in SERPs. Meta tags also help search engines better understand what a web page is about. High-quality and clickable content of meta tags may help improve the CTR of a web page.
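For illustration, here is what these meta tags (plus the title, often called the meta title) might look like in a page’s `<head>` section. The values are placeholders, not recommendations:

```html
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>SEO Glossary: Terms Every SEO Should Know</title>
  <meta name="description" content="An A-Z glossary of SEO terms and definitions.">
  <meta name="robots" content="index, follow">
</head>
```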
Mobile-first indexing means that Google takes into account the mobile version of a site when it comes to crawling, rendering, indexing, and ranking.
In the past, users mostly viewed websites from their desktop computers, so Google primarily used the desktop version to index and rank a site. Now the majority of Internet users use mobile devices, so the mobile version has become the new “default”.
Mobile SEO refers to the practice of optimizing the website with a focus on mobile users, mobile page experience, mobile-first indexing and the upcoming Core Web Vitals.
The purpose of mobile SEO is to make sure that a website is mobile-friendly, meets Core Web Vitals, and provides the best possible mobile page experience.
⚡ Check my guide on how to check if a site is mobile-friendly.
Negative SEO refers to black hat SEO techniques and practices that aim at harming a competing website’s rankings and visibility in search engines.
Some negative SEO practices include building thousands of spammy links pointing to a competing website, hacking the website, or copying its content and putting it in tons of different places all over the Internet.
Google is getting smarter and smarter at recognizing when someone tries to harm a website with negative SEO techniques. In most cases, Google simply ignores these practices and the website under the attack hardly ever experiences negative results.
Nofollow links are links carrying a rel="nofollow" attribute, which tells Google not to pass any “link juice” (link equity) to the page being linked to. You add the nofollow attribute to an outbound link when you don’t want Google to treat that link as your endorsement of the target page. This helps you ensure that your website only vouches for well-reputed and authoritative sources. Unlike dofollow, which is the default state of a link, nofollow has to be added explicitly to each link you don’t want associated with your website.
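In HTML, the difference is a single rel attribute (the URL below is a placeholder):

```html
<!-- A regular ("dofollow") link -- the default: -->
<a href="https://example.com/">Example</a>

<!-- The same link marked nofollow, so it passes no link equity: -->
<a href="https://example.com/" rel="nofollow">Example</a>
```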
A noindex tag on a webpage tells search engine crawlers that they shouldn’t add this page to their index or display it in search results. You might add a noindex tag to a webpage that’s still under construction or is being modified. It ensures that users don’t get to see incomplete or wrong information on your page and that search engines don’t index unfinished content.
A nosnippet tag in the page’s meta tags tells Google that you don’t want any text or video snippet from that page appearing in the search results. This blocks both featured and regular snippets from being displayed, but it can’t prevent an image thumbnail from appearing in search results (when it’s relevant and helpful to the user).
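Both directives are implemented as robots meta tags in the page’s `<head>`, for example:

```html
<!-- Keep this page out of the index entirely: -->
<meta name="robots" content="noindex">

<!-- Allow indexing, but show no text or video snippet in results: -->
<meta name="robots" content="nosnippet">
```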
Off-site SEO refers to optimizations made outside the website with the purpose of increasing a website’s rankings and visibility in search engines. Off-site SEO aims at collecting external signals, such as backlinks.
There is also off-page SEO, which aims at collecting external signals for a specific web page rather than for the whole website.
Off-site SEO is most often associated with link building which is its main part. But there are other off-page SEO techniques as well.
Off-site SEO includes any promotional activities that take place off your website: (high-quality) guest posting, getting mentions from other experts in the field, activity on social media, or even being trusted and recognized by other authorities.
On-site SEO refers to the optimizations made on the website itself, such as meta tags, headings, ALT tags, content, speed, URLs, internal links, and more. It includes both the optimization of the website’s content and its HTML code.
On-site SEO should focus on making the website accessible and easy to understand both for users and search engine robots.
⚡ Check my on-page SEO checklist to learn more.
Organic traffic is the visits to your website that come from organic listings in SERPs (search engine results pages). You can easily check organic traffic in Google Analytics.
The opposite of organic traffic is paid traffic which relates to visits that are generated through paid ads. Paid visits stop the moment you stop paying, which is not the case with organic visits.
An orphan page is the sad name given to a page with no internal links pointing to it, i.e., a page that has been orphaned by the parent website. An orphan page can still be crawlable and indexable if it’s in the XML sitemap (or an external site points to it), but it won’t get any ranking signals or link juice from your website, which hurts both the page’s and the website’s overall SEO. It’s different from a dead-end page, which is a page that doesn’t link out to any internal or external pages.
An outbound link is a type of external link that takes the user away from your website to a different website. These links are helpful because they make your webpage appear more credible by linking to helpful, authoritative resources, but they can also drive traffic away from your website. Most reference links are outbound in nature. It’s a common practice to make outbound links nofollow so they don’t pass on link equity to another website.
PageRank (PR) is an algorithm for calculating and evaluating the web page based on the quantity and quality of links pointing to it. PageRank was developed by Larry Page and Sergey Brin, the founders of Google.
Since its creation in 1998, PageRank has changed and evolved but it still operates on the same core principles. The PageRank score is not public and is now only used internally by Google.
PRO FACT: In the past, the PageRank score was public in the form of Google’s Toolbar PageRank which was regularly updated. However, the PR score was overused and manipulated by black hat SEOs, so Google removed it entirely in 2016 (and stopped updating it in 2013).
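The core idea behind PageRank can be shown with a tiny power-iteration sketch. This is an illustration of the principle only, with a made-up three-page graph, and is in no way Google’s actual implementation:

```python
def pagerank(links, damping=0.85, iterations=50):
    """A simplified PageRank over a dict mapping each page to the pages it
    links to. Illustrative sketch only -- not Google's implementation."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1 / n for page in pages}
    for _ in range(iterations):
        new_rank = {}
        for page in pages:
            # Rank flowing in from every page that links to this one,
            # split evenly among each linker's outgoing links.
            incoming = sum(
                rank[other] / len(links[other])
                for other in pages
                if page in links[other]
            )
            new_rank[page] = (1 - damping) / n + damping * incoming
        rank = new_rank
    return rank

# "a" is linked to by both "b" and "c", so it ends up with the highest score.
graph = {"a": ["b"], "b": ["a"], "c": ["a"]}
scores = pagerank(graph)
```

The sketch captures the “quantity and quality” intuition: a page’s score depends on how many pages link to it and on how strong those linking pages themselves are.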
Pogo-sticking is when a user searches for a term, clicks on the top link, and immediately moves back to click on a different link because the top link didn’t offer what the user was hoping to find. If it happens a lot, the algorithm might push that webpage down in ranking because it does not satisfy the search query as it should.
The term gets its name from the pogo stick, which can’t stay in one spot for very long. However, Google denies that it uses pogo-sticking as a ranking factor. It’s possible that Google algorithms might alter the search results for that user within a given timeframe without dropping a website in the rankings.
RankBrain is a machine learning-based system within Google’s core algorithm. The purpose of RankBrain is to identify and determine the most relevant results to user queries. It also helps Google better understand user queries and their intent.
Ranking is the process of determining where a specific piece of content (usually a web page) should appear within a SERP (search engine results page). Good rankings are the main purpose of SEO.
Ranking factors are the criteria that search engines use to assess and rank websites in search results. There are hundreds of both less and more important ranking factors (or ranking signals) influencing the website’s visibility in search.
Examples of ranking signals include:
- a website’s backlink profile,
- technical optimization and accessibility to internet robots,
- on-site SEO optimization,
- speed and performance,
- user behavior and experience,
- internal linking,
- security (HTTPS),
…to name just a few.
PRO TIP: The 3 most important Google ranking factors are content, backlinks, and internal linking.
Search engines are usually quite vague about what is a ranking factor and what isn’t. However, they give a ton of useful guidelines on how to make your website the best of its kind making it friendly both for users and search engines.
⚡ Check my list of 25 alternative search engines (other than Google). Did you know that there are so many of them?
Reciprocal links are a simple give-and-take exchange of links between two or more webmasters, where they agree to create external links to each other’s websites. This can help with two things: driving more traffic to both sites by sharing each other’s audiences, and ranking. But reciprocal links created solely to improve rankings, without providing value to users, can backfire.
A redirect takes users and crawlers to a different webpage from the one they clicked on. It may be temporary (HTTP 302 or 307) if the page is unavailable for a limited time while it’s being updated or optimized, and the user/crawler is directed to the next best resource. But when a website is revamped from scratch, you will need to set up permanent redirects (HTTP 301).
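As an illustration, here is how such redirects might be declared on an Apache server via an .htaccess file. The paths and domain are placeholders, and other servers (such as Nginx) use a different syntax:

```apache
# Hypothetical Apache .htaccess rules (assumes mod_alias is enabled).

# Permanent redirect (301): the old URL has moved for good.
Redirect 301 /old-page/ https://example.com/new-page/

# Temporary redirect (302): the page will come back at its original URL.
Redirect 302 /sale/ https://example.com/holding-page/
```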
A referrer is a webpage that sends traffic your way, i.e., refers users to your website or webpage. Similarly, you become a referrer when you link out to another website. How much information you give away to the other website when you link out to it can be controlled with the referrer policy: you can choose to pass on no referrer information (your website/resource information) to the website you are linking to, or you can divulge only limited information.
Reinclusion or reconsideration is a request you submit to Google to review your website after it has been hit with a manual action and you have fixed the issues. The reconsideration request makes Google review your website again to see if it now complies with the guidelines; if it does, your website may be reindexed and become part of Google’s search results again. Note that your rankings may or may not return to their previous levels once your content is reincluded.
Relevance, or more accurately, content relevance, is about how well the content of a webpage matches the topic/keyword it’s trying to satisfy. It’s crucial to good SEO and goes beyond content: anchor text should be relevant to the resource you are linking to, and both the internal and external pages a webpage links to should be relevant to it. Relevance is also about more than just the literal meanings of words and should take user intent into account as well.
Rendering is the process in which a search engine bot retrieves a web page, runs its code, and evaluates its content to understand its structure and layout. You can use Google PageSpeed Insights to see how Google renders a specific web page.
Responsiveness is a website’s ability to adapt to different devices and change to accommodate the relevant device screen better. For example, a responsive website will shrink for a mobile screen, and the content will be adjusted according to a phone’s viewport, while the same website would utilize the full viewport available on a laptop screen. Responsiveness is important for a better user experience and SEO.
Rich results (also called rich snippets) are regular search results with some additional information and data displayed. Rich results are displayed thanks to the presence of structured data in the HTML code of a web page.
Rich results provide additional information to the user and help search engines better understand the topic of a web page. Examples of rich snippets include recipes, reviews, FAQs, or any other visually enhanced type of search result.
Here you can see the rich results for my WP Rocket review.
A robots.txt file tells search engine bots which pages or files should or should not be crawled. Note that the purpose of a robots.txt is to protect your website from overloading it with requests. It’s not a way to keep a web page out of the Google index.
Here is what the robots.txt file of my website looks like.
⚡ If you use WordPress, you might want to check my guide on how to access robots.txt in WordPress.
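For illustration, a minimal robots.txt might look like this. The paths are placeholders typical of a WordPress site, and the sitemap URL is made up:

```txt
# Applies to all crawlers
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```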
Scraping is the practice of collecting data from a website (or multiple websites) manually or automatically. From an SEO perspective, scraping can be used to obtain selective data from your higher-ranking competitor’s website so you can learn from them and create something better.
Mostly, pre-made scraping software or custom scraping code is used for extracting data from a website, because manually scraping for information can take a lot of time. Nowadays, scraping is mostly “white hat,” but there are some legal and ethical issues that can cloud the practice, so make sure you are aware of them and employ good scraping practices.
A search engine is a program (an online tool) that provides the results based on the query (keyword) a user submits. When a user types in a query, a search engine searches for relevant information in its database and displays it in the form of a SERP (search engine results page).
Search engines discover, analyze, understand, and organize the content on the web. Their main purpose is to provide the most relevant and the best results to their users.
The most popular search engine is – not surprisingly – Google with more than 90% of the market share. Other notable search engines are Bing (by Microsoft), Baidu (popular in China), or Yandex (popular in Russia). YouTube (owned by Google) is also a search engine and it’s the second-largest search engine in the world.
Search engine robot
A search engine robot (also referred to as a bot, crawler, or spider) is a tool that search engines, such as Google or Bing, use to gather information about websites and add it to their databases. Search engine robots are similar to web browsers in how they operate, except that they don’t need human interaction.
Bots access web pages mostly through links placed on other websites or XML sitemaps. The most popular search engine robots are, of course, Google crawlers.
Search intent (also referred to as keyword intent or user intent) is the reason why a user types in a specific query into a search engine. Search intent describes the goal a user wants to achieve with the help of a search engine.
There are three main types of search intent:
- Informational intent is when a user is simply looking for information and wants to learn something. For example, a query like “how to learn SEO”.
- Navigational intent is about a user wanting to visit a specific website. Instead of typing the website’s address, they type its name into a search box. For example, “seosly seo guide”.
- Transactional intent is when a user is ready to make a purchase and is already in the buying mode. For example, “buy technical seo audit”.
SEM stands for search engine marketing which refers to using paid strategies to gain visibility in search engines. This means paying for ads that appear directly in SERPs (search engine results pages) and look similar to organic search results.
The biggest difference between SEM and SEO is that with SEM your website stops being displayed in search results the moment you stop paying. Organic listings, on the other hand, are free and don’t disappear the moment you stop doing SEO.
In some highly competitive businesses, SEM makes more sense while in others it’s only SEO or the combination of the two.
Search Engine Optimization
SEO stands for search engine optimization which simply means optimizing your website in such a way that it can be found in organic search results of search engines. In other words, it is a method of influencing or controlling what appears in search results.
The better your website is in terms of SEO, the higher its visibility in search engines is and the higher organic (free) traffic it gets. And more free traffic usually means more customers and more revenue.
Search Engine Optimizer
An SEO is a search engine optimizer, a person whose task is to audit and analyze websites in terms of their compliance with search engine standards and guidelines. An SEO is also often the person who implements on-page optimizations on a website or at least verifies their implementation.
I am an SEO and love it!
An SEO site audit (or simply an SEO audit or website audit) is the process of analyzing and evaluating a website in terms of its search engine optimization and visibility in search engines.
An SEO audit checks if a website complies with webmaster guidelines and other best SEO practices.
SERPs stand for search engine results pages which are the web pages displayed to users when they type in a search query (a keyword) into a search engine, such as Google or Bing.
SERPs contain 10 organic results on the first page, sponsored results (ads) usually on top, featured snippets (usually above organic results), and other elements like maps, images, videos, and others, depending on the query type.
In the past, some 15 or more years ago, SERPs were made of 10 blue links. You probably noticed that this is not the case anymore.
Serpstat is an all-in-one SEO platform that has a set of professional SEO tools, such as keyword research tool, site audit tool, backlink analyzer, or rank tracker. Serpstat, in comparison to other major players in the SEO market like SEMrush or Ahrefs, offers a very competitive price starting at $55 per month.
⚡ Check my in-depth Serpstat review to learn more about the tool.
Sitelinks appear underneath the name of the website in the search results, and they allow the user to navigate to a different page on the website directly. They are most common in branded searches, i.e., when you search the name of the website and underneath the top search result, you see links to pages like Home, About Us, Contact, etc. This doesn’t just get you more real estate on the result page but also increases your chances of a click-through.
Sitewide links are links that appear on most or all the webpages of a website. They might be blogroll links that take you directly to a different page or a different section of the website. Sitewide links are also commonly placed in the footer, which is shared by all the pages of a website.
Social SEO refers to the process of gaining signals from social media with the purpose of improving the visibility of a website and its rankings in search engines. Signals from social media websites do not directly influence the website’s ranking in search engines.
However, social media can bring a website a lot of valuable traffic and users if, for example, a piece of content goes viral. This may indirectly help improve the overall visibility of a website.
An SSL or Secure Sockets Layer certificate informs the users of a website that it uses encryption to protect their data and is safe to use. With an SSL certificate installed, data is encrypted as it travels from your browser to the website’s server and vice versa. This gives users peace of mind when sharing identifiable and sensitive information, like credit card numbers, with a website.
Status codes or HTTP status codes are replies that a website server gives to a browser when the browser sends a request. For example, if you try to access a webpage from your computer and it returns a 404 error page, that’s the website server replying to your browser’s request to access that page by indicating that it doesn’t exist. There are five classes of status codes, most of which you don’t get to see, like 2xx status codes that indicate that the browser’s request has been successfully processed.
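The five classes can be told apart by the first digit alone, as this small Python sketch shows:

```python
def status_class(code):
    """Map an HTTP status code to its class by its first digit."""
    classes = {
        1: "informational",  # e.g. 100 Continue
        2: "success",        # e.g. 200 OK
        3: "redirection",    # e.g. 301 Moved Permanently
        4: "client error",   # e.g. 404 Not Found
        5: "server error",   # e.g. 500 Internal Server Error
    }
    return classes.get(code // 100, "unknown")
```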
Structured data in the context of SEO simply refers to providing information about a web page or a piece of content on it in a specific format, which is, in this case, Schema.org. Structured data do not directly improve rankings but they may influence how a web page looks in search results.
In this way, structured data help search engines better understand what a web page is about and what other topics or entities it relates to.
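For example, a review rich result could be backed by a snippet like the hypothetical one below. JSON-LD is one of the formats Google accepts for Schema.org markup; all the values are placeholders:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "ratingCount": "25"
  }
}
```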
Technical SEO focuses on technical aspects of a site when it comes to improving its visibility in search engines.
Technical SEO includes, among other things, optimizations such as accessibility to internet robots or how robots understand a website’s content. Site performance and speed are also part of technical SEO, and so is security (using HTTPS).
When doing technical SEO for a website, you should know and understand terms such as internet crawler (robot), crawling, indexing, rendering, structured data, breadcrumb navigation, and internal linking. You also need to understand what a robots.txt file and an XML sitemap are and how to fix possible issues with them.
Technical SEO Audit
A technical SEO audit focuses mainly on technical errors and optimizations to be made within a website. A technical audit usually analyzes things, such as a website’s indexability, speed, performance, XML sitemaps, and robots.txt.
Technical SEO audits should highlight quick fixes and quick wins, such as noindexing thousands of thin content web pages or implementing correct redirects from outdated and irrelevant web pages.
A title tag is an HTML element that tells users and crawlers what the title of your page is. It’s what is displayed as the main clickable heading when you search for something in Google. Title tags are important for both SEO and click-through, but they arguably matter more to your users than to crawlers: search engines take the whole content of the page into account when ranking it, while users will probably decide whether to click based on the title alone.
A top-level domain or TLD is the last segment of a domain name, the part after the final dot (like .com or .org). The most commonly used TLDs are .com, .org, .net, and country-code domains like .cn or .de. A well-known TLD can help a website establish trust with its users.
Universal Search (also referred to as Enhanced Search or Blended Search) is the integration of different types of results within a SERP (search engine results page).
Universal Search means that there are different types of media in a SERP instead of just 10 blue links. Universal Search has been around since 2007. If you don’t remember how Google looked back in the old days, type “Google in 1998” into Google and see for yourself.
Universal Search results include assets like images, videos, local businesses, rich snippets, featured snippets, maps, shopping results, and more. These elements can be displayed among, below, above, or alongside regular organic listings in search engines like Google or Bing.
User-Generated Content or UGC is any form of content (text, audio, video, etc.) that’s created by internet users and posted on a website. That’s very different from content created and posted by webmasters. UGCs, like reviews, testimonials, and endorsements, give websites transparency and authenticity (and free marketing) points. UGC can be a great way to promote a website and populate it with relevant content, but it can also be a double-edged sword.
A URL or Uniform Resource Locator is simply a unique web address that points to a unique resource on the internet, i.e., a webpage or website. Like a fingerprint, every URL is unique, and it should ideally tell both crawlers and users something about the content on that webpage. URLs of web resources are also used for linking.
URL parameters are simply values in a URL (following a question mark) that allow users and crawlers to see different variations of the same webpage (or track information related to a click-through for referral links).
They are part of the query string of the URLs (hence the question mark), which are always there but not always visible. Different URL parameters can yield different results or return exactly the same page and content while changing something in the background.
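Python’s standard library can pull the parameters out of the query string, which makes the structure easy to see. The URL below is a made-up example, with a tracking parameter (utm_source) alongside two content-changing ones:

```python
from urllib.parse import urlparse, parse_qs

# A hypothetical product URL with three parameters after the question mark.
url = "https://example.com/shoes?color=red&size=42&utm_source=newsletter"
params = parse_qs(urlparse(url).query)
# params == {'color': ['red'], 'size': ['42'], 'utm_source': ['newsletter']}
```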
A user agent is the software that sits between you and a website and facilitates the connection. A web browser, for example, is a user agent that lets you access the data on a website’s servers. User agents help websites identify useful information (like the device and browser being used) and serve a response accordingly. User agents can also be leveraged by webmasters for SEO purposes.
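For instance, a crawler built in Python might identify itself by setting the User-Agent header of its requests. The crawler name and URL below are made up for illustration:

```python
from urllib.request import Request

# A hypothetical crawler identifying itself via the User-Agent header.
request = Request(
    "https://example.com/",
    headers={"User-Agent": "MyCrawler/1.0 (+https://example.com/bot)"},
)
# The server can read this header to decide what response to serve.
```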
User Experience or UX is how a user feels and experiences a website’s design when interacting with it. A good user experience is likely to keep the user on the website for longer and keep them returning, while a bad UX is likely to drive visitors away. In terms of SEO, UX is crucial when it comes to web design because if a website doesn’t offer the experience its users are expecting, they might bounce and never click it again. This is why Google considers UX when ranking your website.
Visibility (also called SEO visibility or search visibility) is a metric that tells you what percentage of the total organic clicks your website is getting when it appears in a SERP for a given keyword. If it’s on the second page, your visibility might be lower than 1%, but if it’s ranking in the top spot for a keyword, the visibility might be as high as 50%. It shows how many of the total “eyes” your website gets when people search for the keyword you are trying to rank for.
From a broader perspective, visibility can also mean the overall online visibility of your website and business. Even if you are not ranking higher or getting more organic clicks, your website’s traffic may still grow because it is getting more visibility on social media, for example.
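The narrower, metric sense of visibility can be approximated by weighting each keyword ranking with a click-through rate for its SERP position. The CTR figures below are illustrative assumptions only; real CTR curves vary by query, industry, and SERP features:

```python
# Illustrative CTR-by-position values (assumptions, not official
# figures): position 1 gets ~28% of clicks, position 2 ~15%, etc.
ASSUMED_CTR = {1: 0.28, 2: 0.15, 3: 0.11, 4: 0.08, 5: 0.07}

def visibility(rankings):
    """rankings: dict mapping keyword -> SERP position.
    Returns an estimated visibility score as a percentage."""
    total = sum(ASSUMED_CTR.get(pos, 0.01) for pos in rankings.values())
    return round(100 * total / len(rankings), 1)

# Two hypothetical keywords: one top spot, one in position 4.
print(visibility({"seo glossary": 1, "what is a url": 4}))  # 18.0
```

SEO tools compute this kind of score across hundreds of keywords at once, which is why visibility is useful for tracking overall trends rather than a single ranking.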
A web page is a single document (or a collection of information on one topic or a few closely related topics) that has its own unique URL. Every web page of a given website shares the same domain name in its URL. A web page can include different types of content and media in its body.
A website is a collection of several webpages that all share the same domain name. The domain name of the website is usually the name of the business it represents. Most websites serve a distinct purpose (e-commerce, information, banking, etc.).
White hat SEO
White hat SEO refers to website optimization tactics and methods that comply with official search engine recommendations and practices, such as the Google Webmaster Guidelines. In practice, this means playing by Google’s rules instead of trying to trick it into thinking that our website is better than it really is.
White hat SEOs focus on providing valuable and useful content that builds trust and authority. They care about users and their experience, while making the website as friendly and accessible as possible.
Word count is the number of words in a document, article, or other distinct piece of content. Examples include 1,000-word blog posts, 4,000-word articles, or 30,000-word e-books. Different content types have different word count norms and good practices.
WordPress SEO simply refers to the search engine optimization of a WordPress-based website. The general SEO recommendations apply to any website, regardless of its CMS.
However, how some of these optimizations are implemented differs depending on the CMS used. In the case of WordPress sites, implementing them is usually very quick and easy.
WP Rocket is a paid WordPress caching plugin that has become extremely popular among WordPress users because it significantly improves the speed and performance of websites. The main features of WP Rocket include one-click installation and setup (no technical knowledge required), advanced page caching, cache preloading, sitemap preloading, GZIP compression, database optimization, minification, JS deferring, DNS prefetching, and more.
⚡ Check my in-depth review of WP Rocket.
An XML sitemap is a text file that tells search engine robots which pages of a website should be indexed. It can also list other types of content, like videos or images. In short, an XML sitemap informs Google (and other search engines) about the most important web pages of the website.
PRO INSIGHT: An XML sitemap is especially important for huge websites with deep information architecture. Small websites (up to a few hundred web pages) can do without an XML sitemap. However, it’s still good practice to have one!
⚡ Check my quick and to-the-point guide on how to find the sitemap of a website.
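A bare-bones XML sitemap is simple enough to generate by hand. The sketch below uses Python’s standard library; the URLs are placeholders, and the namespace is the one defined by the sitemaps.org protocol:

```python
# Minimal sketch: build an XML sitemap listing each page's <loc>.
# The example URLs are placeholders; real sitemaps often add
# optional tags like <lastmod> per page.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"  # sitemaps.org protocol

def build_sitemap(pages):
    urlset = ET.Element("urlset", xmlns=NS)
    for page in pages:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["https://example.com/", "https://example.com/blog/"])
print(xml)
```

In practice, most CMSs and SEO plugins (WordPress plugins in particular) generate and update this file for you automatically.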
Yahoo, launched in 1994, was the first popular search engine to come online. It started out as a web directory, and once it grew sizeable enough, its creators added search functionality. Yahoo expanded its product range over time, and Yahoo Search was one of those products, but its usage has been declining since the rise of Google.
Yandex (or Yandex Search) is the biggest and most popular search engine in Russia. Yandex Search is owned by a company named Yandex that provides internet products and services like email, ads, analytics, and more. Yandex works similarly to other search engines like Google or Bing. The global market share of Yandex is 0.5%, while in Russia it is as much as 40%.
⚡ To refine and narrow down the search results in the biggest Russian search engine, you can use Yandex search operators.
YouTube SEO is the process of optimizing a YouTube channel (along with its videos) so that it ranks highly on YouTube (and in Google). SEO for YouTube involves techniques such as optimizing the channel itself (its description and metadata) and its videos (keywords, titles, descriptions, and transcriptions).
YouTube is the second-largest search engine in the world (guess who the number one is). That’s why, if you produce video content, then it’s definitely a good idea to invest your time or money in this type of SEO.
⚡ Check the list of YouTube search operators to learn how to search YouTube like a pro!
Olga Zarzeczna is a senior SEO specialist with 8+ years of experience. She has done SEO for both the biggest brands in the world and small businesses, and has performed more than 100 SEO audits so far. Olga has completed SEO courses and degrees at universities such as UC Davis, the University of Michigan, and Johns Hopkins University. She has also completed Moz Academy and, of course, holds Google certifications. She keeps learning SEO and loves it. Olga is also a Google Product Expert specializing in areas such as Google Search and Google Webmasters.