Episode #61 of the SEO Podcast by #SEOSLY features an SEO expert, Kaspar Szymanski, former senior member of the famed Google Search Quality team.
I invite you to subscribe to the SEO Podcast by #SEOSLY to keep learning SEO with me and the SEO experts I interview.
Watch the interview with Kaspar Szymanski
Listen to the interview with ex-Googler Kaspar Szymanski
You can listen to episode #61 of the SEO Podcast by #SEOSLY below.
Top SEO Tips from Ex-Googler
Kaspar Szymanski is a leading search expert, having worked on Google’s webspam team and search quality operations. He recently shared his extensive SEO knowledge and insights in an interview.
Here are 10 of Kaspar’s top SEO tips explained in depth:
Use server logs for detailed SEO analysis
Server logs record detailed technical data about how search engine bots crawl and access your site. This data allows granular analysis of bot activity. Saving raw server logs for extended periods (at least 6 months, ideally over 1 year) provides historical data to identify trends and changes over time.
Focus your server log analysis on bot activity rather than user traffic. The key is identifying patterns in how Googlebot and other search crawlers access URLs – which parts of your site they prioritize, how frequently they crawl them, and which responses they receive. Combining this data with Google Search Console can reveal insights such as pages that rank well but return ‘soft 404s’, indicating they are obsolete or have thin content.
For large ecommerce sites, server logs also help diagnose issues like products ranking but going out of stock. Overall, server logs let you align bot crawling closer to your ideal rankings and content priorities.
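As an illustration of this kind of bot-focused log analysis, here is a minimal Python sketch that counts Googlebot requests per URL and status code. It assumes logs in the common combined format; field positions vary by server configuration, and matching the user-agent string alone can be spoofed, so a real audit should verify crawler IPs via reverse DNS as Google documents.

```python
import re
from collections import Counter

# Regex for the combined log format; adjust if your server
# configuration writes fields in a different order (an assumption here).
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(lines):
    """Count (path, status) pairs for requests claiming to be Googlebot."""
    hits = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        # User-agent match only; verify IPs via reverse DNS for real audits
        if m and "Googlebot" in m.group("agent"):
            hits[(m.group("path"), m.group("status"))] += 1
    return hits
```

Feeding several months of logs through a function like this makes it easy to spot URLs Googlebot crawls heavily, or pages that consistently return non-200 responses to crawlers.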
Compare Bing and Google organic traffic
Search engines can demote sites for technical issues, poor quality, or manual spam actions. Checking both Bing and Google traffic provides useful diagnostic data to identify the likely cause of declines.
If organic traffic drops across both Bing and Google concurrently, it likely indicates a technical issue – site errors, a Core Web Vitals problem, or broken markup – that hinders all search engine crawlers.
However, a drop only affecting Google is more likely a Google-specific penalty or algorithmic demotion. So compare traffic sources to determine if the problem is technical or a Google action, and investigate accordingly.
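The decision rule above can be sketched as a simple heuristic. The 25% threshold and the wording of the verdicts are illustrative assumptions for the sketch, not figures Kaspar gives:

```python
def classify_traffic_drop(google_before, google_after,
                          bing_before, bing_after, threshold=0.25):
    """Rough diagnostic: a concurrent drop on both engines suggests a
    technical issue; a Google-only drop suggests a Google-specific action.
    The 25% threshold is an illustrative assumption, not a standard.
    """
    def dropped(before, after):
        return before > 0 and (before - after) / before >= threshold

    g = dropped(google_before, google_after)
    b = dropped(bing_before, bing_after)
    if g and b:
        return "likely technical issue (both engines affected)"
    if g:
        return "likely Google-specific penalty or demotion"
    if b:
        return "Bing-specific issue; unlikely a Google action"
    return "no significant drop detected"
```

Comparing, say, the four weeks before and after a suspected drop with a rule like this gives a quick first read before digging into logs or Search Console.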
Optimize for user experience signals
At its core, SEO is about sending the right signals to search engines to indicate your content will provide a great user experience. Crawling, indexing, ranking, and click-throughs are all tied to predictive user satisfaction.
A simple yet effective tactic is checking Google Search Console for pages with high impressions but low click-through rates. These are “low-hanging fruit” optimization opportunities: improving the clickability of titles and descriptions takes minimal effort but can significantly boost click-throughs on pages that already appear in search results.
You’re optimizing to send user experience signals, not just targeting rankings. Pages may already rank well but fail to convert visitors into satisfied users.
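A quick way to surface that low-hanging fruit is to filter a Search Console performance export. The sketch below assumes the column names of the standard GSC CSV export (“Page”, “Impressions”, “Clicks”) and picks illustrative thresholds; adjust both to your data:

```python
import csv

def low_hanging_fruit(rows, min_impressions=1000, max_ctr=0.02):
    """Return pages with many impressions but a low click-through rate.
    Thresholds are illustrative assumptions; tune them for your site.
    """
    fruit = []
    for row in rows:
        impressions = int(row["Impressions"])
        clicks = int(row["Clicks"])
        ctr = clicks / impressions if impressions else 0.0
        if impressions >= min_impressions and ctr <= max_ctr:
            fruit.append((row["Page"], impressions, round(ctr, 4)))
    # Highest-impression pages first: the biggest potential gains
    return sorted(fruit, key=lambda r: -r[1])

def load_gsc_export(path):
    """Read the 'Pages' tab of a GSC performance export saved as CSV."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))
```

Each page this returns already earns impressions, so a sharper title and description is often all that stands between it and more clicks.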
Build your online brand as a marketing channel
Brand recognition provides no direct SEO benefit, but brand building remains crucial for driving organic growth. A stronger brand means higher recognition and loyalty among searchers.
Even if you rank well, searchers are more likely to click and convert on brands they trust and find familiar. A brand website feels more authoritative.
Brand also reduces dependence on SEO as a single traffic source. When you build your brand across channels, users can find you even without search visibility. Brand building can be a channel in itself.
This all feeds back into user experience – searchers have confidence in clicking a familiar, trusted brand from the SERP. Brand equity improves your click-through rate and conversion rate.
Leverage your site’s Unique Selling Proposition (USP)
Defining and communicating your site’s unique value and differentiation is key in SEO. Help search engines instantly understand in listings what makes your business stand out as the right match for a user’s intent.
A compelling, concise USP in title tags, meta descriptions, and content helps increase click-throughs by piquing interest and relevance. Communicate your USP consistently across pages to send clear signals about your niche value.
For example, a brand of eco-friendly cleaning products can include “non-toxic” or “natural ingredients” in snippets to stand out among competitors. Identifying your USP and highlighting it sends signals to align user expectations with your content.
Follow Google’s Search Quality Evaluator Guidelines, not the Radar
Google’s Search Quality Evaluator Guidelines largely outline their ideal practices for user experience. However, Kaspar cautions against obsessing over the outdated Quality Rater’s Radar.
The Radar lists subjective rating factors like page quality, page performance, etc. However, most of these aren’t direct ranking factors. Focus instead on the core Search Quality Guideline principles like E-A-T (expertise, authoritativeness, trustworthiness) and your unique value (USP).
The Radar is outdated and distracts from actual optimization insights. Align with Google’s stated Search Quality Guidelines for positive organic visibility.
Be brief and specific in reconsideration requests
If hit with a Google manual penalty, the path to reinstatement is a reconsideration request. In these requests, be brief and specific, Kaspar explains.
Clearly state what steps you have already taken to address the specific penalty reason cited by Google. Then express your commitment to following their guidelines moving forward.
Provide only information directly relevant to the penalty and changes made. Additional context about your business, employees, etc. does not help show you resolved the flagged issues.
Understand the different types of manual penalties
Beyond links, many other types of violations can trigger a Google manual penalty. Common categories include:
- Content quality
- Thin content with little substance
- Doorway pages
- Rich snippet markup abuse
- Selling links
- Pure spam sites
Identify the exact violation your site was flagged for, then address that issue directly in your reconsideration request. There are also “soft” penalties impacting only ranking, not deindexing. Familiarize yourself with the extensive range of potential manual actions.
Fix technical problems and site errors first
Before worrying about manual penalties, ensure your site has no crawling, indexing, or functionality issues. Technical problems negatively impact user experience, so fix them first.
Common technical problems include:
- Crawl errors from blocked paths or parameters
- Slow page speed and web vitals
- Broken links and site errors
- Invalid markup and structured data
- Redirect chains and crawler traps
Resolve critical technical errors and stability issues so search engine crawlers can easily access, render, and evaluate your content. Penalties come after the basics of good site health and quality.
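One of the listed problems, redirect chains, can be flagged programmatically from crawl data. This sketch assumes you have already collected a mapping of redirecting URL to target (for example from 301/302 responses in a crawl or the server logs); the max-hop limit is an illustrative choice:

```python
def find_redirect_chains(redirects, max_hops=3):
    """Flag long redirect chains and loops in a crawl's redirect map.

    `redirects` maps each redirecting URL to its target - an assumed
    input shape, e.g. built from 301/302 responses in a crawl.
    """
    chains, loops = [], []
    for start in redirects:
        path, seen, cur = [start], {start}, start
        while cur in redirects:
            cur = redirects[cur]
            if cur in seen:  # revisited a URL: redirect loop
                loops.append(path + [cur])
                break
            path.append(cur)
            seen.add(cur)
        else:
            if len(path) - 1 > max_hops:  # hops = edges followed
                chains.append(path)
    return chains, loops
```

Collapsing any flagged chain into a single direct redirect saves crawl budget and avoids losing crawlers in loops.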
Take a long-term approach to sustainable SEO
Good SEO requires playing the long game, as Kaspar emphasizes. Google rewards sites focused on building expertise and trustworthy reputations with their content.
Don’t take shortcuts chasing quick wins, as this risks penalties. Consider that penalties, when resolved properly, can prompt business introspection and catalyze even greater organic growth.
Build your site’s foundations through honest signals aligned with search engine goals – relevance, reliability, satisfaction. Sustainable SEO comes from providing long-term value to searchers.
Follow SEO Podcast by #SEOSLY
You can subscribe to the SEO Podcast by #SEOSLY on popular podcast platforms or check the website of the podcast.
Follow Kaspar Szymanski
If you aren’t following Kaspar yet, make sure to follow him on these platforms:
- X: https://twitter.com/kas_tweets
- LinkedIn: https://www.linkedin.com/in/kasparszymanski/
- Search Brothers
Make sure to subscribe to my SEO newsletter to stay fully informed and be notified when a new episode comes out.
You can subscribe to my newsletter using the form below.
The SEO Podcast by #SEOSLY is sponsored by JetOctopus
JetOctopus is a cloud-based website crawler and SEO log analyzer. The tool allows you to analyze your website structure, check for broken links, detect technical SEO issues, and monitor your website’s ranking in search engines.
JetOctopus is a fast and affordable SaaS crawler and log analyzer without limits.
Want to become a guest or provide ideas for the show?
If you want to become a guest on the show or you want to share ideas, feel free to contact me. I will be happy to hear from you. Contact me at [email protected].