
The Evergreen SEO Framework - By Schieler Mew

Foreword:
Hey guys, Schieler here! If you have found your way to this E-Book then I assume you are
looking for an evergreen approach to SEO - if you’re looking for the next shiny tool or process,
then I would recommend not wasting your time as there is no secret sauce available here
One of the biggest points of concern I have seen in my career when working with SEOs is that they don't understand the fundamental workings of search engines. A lot of base-level knowledge gets skipped in the hope of ranking keywords right out of the gate, without the ability to apply critical thinking, due to following the latest trend, course, or guru that the community is recommending.
Let me ask you this though - if we do not understand how search engines work, how can we ever hope to know if the data being presented is accurate or true to our use case and application? We can't.
In this E-Book I have taken the time to cover the aspects of SEO that don’t change over time. I
hope you enjoy the read, as I have done my best to make it as interesting and actionable as
possible
If there’s anything I would like you to walk away with after reading this, it’s:
1. An in-depth understanding of how search engines rank pages
2. Why iterative SEO is important
3. How courses, new tools, and re-named methods distract from evergreen SEO
practices that you should be using
4. Bringing everything together to rank a page in Evergreen ways
Welcome to the Evergreen SEO Framework
Chapter 1: Understanding How Crawlers and
Algorithms Work
Before we can begin SEO, it’s important to understand how search engines work in the first
place. While this step is often overlooked, having a detailed knowledge of the process helps you
understand the logic behind the search engine results page (SERP) and how things end up
there
The Search Engine Ecosystem
As SEOs we tend to think of an index as a single entity, but in reality, there’s a lot more that
goes into this.
Below I have outlined the information in a concise way that aims to help you understand how pages get from point A (when you create them) to point B (ranked on Google), and the diverse system that has to work in tandem to rank pages efficiently.
While the following information acts more as a glossary of terms and definitions, it’s important to
understand every step of the process so we can build upon our knowledge base later
1. Web Crawlers
Web crawlers, also known as web spiders or bots, are automated programs designed to
systematically browse the internet by following hyperlinks from one web page to another.
They collect information from websites and index the content they encounter. For
example, a web crawler like Googlebot scans websites and records data to be used in
search engine results.
2. Index
The index is a vast and structured database that stores information collected by web
crawlers. It categorizes and ranks web pages, images, videos, and other online content
based on factors such as relevance and quality. This organization ensures efficient
retrieval of information when users perform online searches. For instance, when you
search for "cat videos," the index helps locate and present relevant cat videos quickly.
3. Repository:
A repository serves as a secure storage facility where copies of the data gathered by
web crawlers are stored. It ensures the long-term preservation and accessibility of web
content for future reference. Think of it as a digital archive that safeguards internet data.
4. Barrels
Barrels are thematic divisions or categories within the repository where data is further
organized. Similar to labeled shelves in a library, barrels group related content together,
making it easier to retrieve specific information. For example, in a repository, there might
be barrels for "news articles," "scientific papers," and "entertainment videos." Note: this
is why silos work so well, as they mimic thematic barrels
5. URL Server and Resolver
The URL Server and Resolver are components responsible for translating search
queries into specific web addresses (URLs). They decode user search input and
generate the precise URL that leads to the desired online resource. When you enter a
query like "openai.com" into your web browser, these components convert it into the
URL for OpenAI's website.
6. Lexicon
The lexicon is a comprehensive collection of words and phrases meticulously indexed
and cross-referenced. It serves as a linguistic resource for search engines to understand
user queries. By identifying relevant keywords and their meanings, the lexicon helps
search engines provide accurate and contextually appropriate search results. For
instance, when you search for "machine learning," the lexicon helps the search engine
recognize the importance of these terms in delivering relevant results. Note: this information is also used in retroactive data analysis for related entities.
7. Doc Index
The Doc Index is analogous to a detailed card catalog in a library, maintaining precise
references to all webpages, images, and other content stored in the index. When you
search for information, the Doc Index quickly identifies and retrieves the relevant items,
enabling efficient search result presentation.
8. Links
Links are the interconnected pathways that connect webpages on the internet. They
serve as the navigation system of the web, allowing both web crawlers and users to
move from one page to another. Like roads on a map, links facilitate seamless
exploration and discovery of online content.
9. PageRank
PageRank is a complex algorithm used to assess the popularity and importance of web
pages. It assigns a numerical score to each webpage based on factors such as the
quality of content and the number of incoming links. Higher-ranking pages appear more prominently in search results, helping users find valuable and authoritative information. (A simplified crawl-and-PageRank sketch follows this list.)
10. Searcher
The searcher is the user of the search engine, like you. Your search query initiates the
entire search process, guiding web crawlers, indexing, ranking, and retrieval. The search
engine collaborates with its components to deliver a customized search results page,
tailored to your query and preferences.
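To make the Web Crawlers and PageRank entries above more concrete, here is a minimal, illustrative Python sketch (not Google's actual implementation) that follows hyperlinks to build a small link graph and then runs a simplified PageRank calculation over it. The start URL, page limit, and damping factor are assumptions chosen purely for illustration.

# Minimal illustration of crawling + PageRank (not Google's real systems).
import re
import urllib.parse
import urllib.request

def crawl(start_url, max_pages=10):
    """Follow hyperlinks breadth-first and record a simple link graph."""
    graph = {}                       # page -> set of pages it links to
    queue = [start_url]
    while queue and len(graph) < max_pages:
        url = queue.pop(0)
        if url in graph:
            continue
        try:
            html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        except Exception:
            continue                 # skip pages that fail to fetch
        links = {urllib.parse.urljoin(url, href)
                 for href in re.findall(r'href="([^"#]+)"', html)}
        graph[url] = links
        queue.extend(links - graph.keys())
    return graph

def pagerank(graph, damping=0.85, iterations=20):
    """Toy PageRank: iteratively redistribute score along outbound links."""
    pages = list(graph)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in graph.items():
            targets = [t for t in outlinks if t in rank] or pages
            share = damping * rank[page] / len(targets)
            for t in targets:
                new_rank[t] += share
        rank = new_rank
    return rank

# Hypothetical usage:
# scores = pagerank(crawl("https://example.com"))
# print(sorted(scores.items(), key=lambda kv: -kv[1]))

The point of the sketch is only the shape of the process: discover pages by following links, then score them by how link equity flows through the graph.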
Based on the above information, there is a complex set of tasks taking place that help your pages and websites rank, and it takes the full assembly working together to determine where you come up in the SERP.
Note: as AI becomes more accessible, more pages and websites are created, leaving Google with the task of completing the above process for all of that content. This is speculatively why we now have hard limits in place in Google Search Console for manually indexing pages, and why some websites get high priority, as the above process is resource and cost intensive - especially if we take into consideration that the above doesn't cover algorithms or their integral role in scoring a page to get to the searcher's web browser.
The Role of Algorithms
Similar to how indexing a page works, there are a lot of processes taking place that help search engines like Google understand what place, value, or "rank" a page document (website page) should be returned at. These exist as a part of the indexing process, but also outside of it as multiple computational systems.
Think of it as a brain, if you will, in which you have multiple areas, each one dedicated to its own set of processes (vision processing, hearing processing, etc.). Analogously, Google also has a compartmentalized system with different parts that all house task-oriented operations for assessing its algorithms.
Generally speaking, these algorithms analyze various factors to assess the relevance, quality
and authority of web pages through things like MUM, RankBrain, PageRank etc, as well as
historical values like user metrics to understand how valuable the page is to the searcher in
relation to all the surrounding pages about the same topics and themes
We can thus break down algorithms into just a few high-level categories:
● Relevance: Relevance can be seen as the state of being closely connected, or related, to something. For example, what is the relevance of a page or document's topic to the query submitted? (This is why Google uses thematic barrels for proper sorting. A toy keyword-density sketch follows this list.)
Relevancy Signal Examples:
Page headings
Keyword density
Associated terms and entities
● Authority: Authority is the power to influence thought, opinion, and behavior. When thought of this way, you can understand why Google tries to be selective about which websites and platforms it allows to shape the aforementioned.
Authority Signal Examples:
Backlinks
Brand mentions
Author scores
● User Experience: User experience is what dictates how a customer, client, or purchaser feels about their needs being met in the context of a website.
User Experience Signal Examples:
Time on page
Clicks to action
User return rate
● Proximity: Proximity is the state of existence in time, space, or relationship. Mostly thought of in terms of a GMB or GBP, proximity also extends to contextual relationships and how words are used to convey meaning.
Proximity Signal Examples:
Physical distance to business
"Near me" searches
City mentions near keywords
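As a toy illustration of the relevancy signals above, the sketch below computes a naive keyword density for a block of text. This is only a back-of-the-envelope signal, not how any search engine actually scores relevance, and the sample text and keyword are made up for the example.

# Naive keyword-density calculation (illustrative only; real relevance
# scoring involves far more than raw density).
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

sample = "Evergreen SEO focuses on fundamentals. Evergreen content endures."
print(f"Density of 'evergreen': {keyword_density(sample, 'evergreen'):.2%}")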
Now that we have discussed these categories, let's talk about how they evolve and change through algorithmic updates, as they are never in a constant state; in fact, there are dozens of algorithmic updates per day.
How Algorithm Rollouts Work:
Search engines continuously refine their algorithms to provide better search results. Algorithm
updates can be major or minor, and they are rolled out periodically, some daily. The process
generally involves the following steps:
1. Algorithm Development: Search engines like Google have teams of engineers and
data scientists who work on improving algorithms, in addition to machine learning
(discussed below). They develop and test new algorithms or updates to existing ones.
2. Testing: Before a new algorithm is deployed widely, it undergoes rigorous testing.
Search engines use a controlled environment to evaluate how the algorithm performs
and whether it improves search results (branch algorithm).
3. Rollout: Once the algorithm is deemed effective and stable, it is gradually rolled out to a
small percentage of users. This allows the search engine to monitor its impact and
address any unforeseen issues.
4. Monitoring and Feedback: During and after the rollout, search engines closely monitor
user feedback and the impact on search results. If there are significant issues or
negative feedback, adjustments may be made. For algorithms that are created and
deployed by machine learning, this gives data scientists the opportunity to evaluate what
was rolled out.
5. Full Deployment: If the algorithm update is successful and well-received during the
testing phase, it is deployed to the entire search engine user base.
Machine Learning in Search Algorithms:
Briefly discussed above, machine learning is a subset of artificial intelligence (AI) that involves
training algorithms to learn patterns from data and make predictions or decisions. In the context
of search algorithms, machine learning can be applied to:
● Ranking: Machine learning models can learn from user behavior and feedback to improve the ranking of search results. They can adapt to user preferences and trends.
● Content Analysis: Machine learning can be used to analyze the content of web pages, categorize topics, and identify entities (e.g., people, places, products) within the content.
Supervised and Unsupervised Models:
When it comes to their modeling, two primary approaches stand out for their distinct methods of training algorithms: Supervised and Unsupervised Learning, both of which are used frequently in algorithmic changes (a minimal sketch of each follows the list below).
● Supervised Learning: In supervised machine learning, algorithms are trained on labeled data, meaning they are provided with input-output pairs. For example, in search, algorithms can be trained to predict user clicks based on past click-through data.
● Unsupervised Learning: Unsupervised learning involves training algorithms on unlabeled data, and the algorithms must find patterns or groupings on their own. In search, this can be used for clustering similar search queries or identifying related topics.
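To make the distinction concrete, here is a minimal sketch using scikit-learn (an assumption on my part; search engines obviously use their own internal tooling). The features, labels, and queries are fabricated for the example: the supervised model predicts clicks from two made-up features, and the unsupervised model clusters query text on its own.

# Minimal supervised vs. unsupervised sketch using scikit-learn.
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Supervised: labeled data (features -> clicked or not).
X = [[0.1, 3], [0.9, 1], [0.4, 2], [0.8, 1], [0.2, 4], [0.7, 2]]   # e.g., [relevance score, position]
y = [0, 1, 0, 1, 0, 1]                                             # 1 = clicked, 0 = not clicked
click_model = LogisticRegression().fit(X, y)
print(click_model.predict([[0.85, 1]]))   # predicted click for a new result

# Unsupervised: unlabeled queries grouped into clusters on their own.
queries = ["buy running shoes", "best running shoes", "how to bake bread",
           "easy bread recipe", "running shoe sale", "sourdough bread tips"]
vectors = TfidfVectorizer().fit_transform(queries)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)
print(list(zip(queries, labels)))         # related queries land in the same cluster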
Retroactive Data Processing for New Algorithms:
A lot of new algorithms are built using retroactive data analysis, which is how we end up with "Broad Core Updates".
For example, if Google, Bing, etc. are monitoring billions of pages and see that over the last 12 months users have been exhibiting a certain behavior, the algorithm then uses this data retroactively to predict future behaviors and creates a machine-learned branch algorithm, which begins testing whether the new way of expressing data in the SERPs leads to higher satisfaction among users than before.
If it does over time, then search engines will consider making this a permanent change to their
data sets
It's important to note that while machine learning algorithms are given goals and allowed to complete their supervised or unsupervised models, they are processed by humans, who are inherently biased (we call this human bias impact).
This could mean that if a model is identified that leads to a better search engine but goes against a company's primary goal, technical feasibility, or roadmap, the branch-learned algorithm may never make it into a broad core update.
Pro tip: This is why, sometimes, when you do nothing to your website between algorithm updates (or don't make a huge effort to change things), it re-ranks back to its previous position after a drop. The retroactive data can show that your website or property is still the property of choice, and it re-ranks on the next update.
Chapter 2: The Evergreen Framework:
Identifying SEO Fundamentals That Endure
Algorithm Changes
Alright, since you've made it to chapter two and have a pretty good understanding of how indexing works and what algorithms are, let's discuss what evergreen frameworks are as they pertain to SEO.
It may come as a shocker considering all the newest courses that are out, but staying ahead of
the game means not just understanding the latest trends and algorithm updates but also
recognizing the timeless principles that underpin SEO success
Imagine, if you will, SEO as a vast forest where the trees symbolize the various strategies, tactics, and techniques you can employ to improve your website's visibility in the search engine. Some of these trees are evergreen, keeping their leaves and vitality throughout the seasons, while others shed their leaves, becoming obsolete with each algorithm shift.
The Evergreen Framework is all about focusing on those enduring evergreen principles that withstand the seasons and lead to long-term yields without algorithmic penalties, because you are performing the fundamentals of search engine optimization rather than relying on tricks, gimmicks, and the sauce.
The 5 Pillars of Evergreen SEO Fundamentals
To keep things short and to the point, I’ve created the five pillars of Evergreen SEO
Fundamentals that will help you understand the larger picture taking place
In essence, these pillars reflect how we send “signals” to the above categories in our algorithmic
data sets
It is important to note, however, that many algorithms go into these broader pillars and categories. Thousands, in fact, and oftentimes there are algorithms for algorithms, to make sure there is a level of redundancy taken into account should an algorithm hit an if/else condition that wasn't thought of during its initial rollout phase.
1. High-Quality Content: At the heart of the Evergreen Framework lies the belief that content
is king. Producing high-quality, relevant, and valuable content for your audience remains a
timeless SEO strategy. Search engines reward websites that consistently deliver content that
satisfies user intent (Transactional, Informational, Directional)
2. User Adoption and Return: Regardless of algorithm updates, user experience is paramount,
including user adoption and return (direct traffic and brand searches). In addition, ensuring your
website loads quickly, is mobile-friendly, and offers a seamless and intuitive navigation
experience is essential. A user-centric approach is always in style.
3. Relevant Keywords and Entities: While the way keywords are used may change, the
importance of understanding your target audience's search intent and incorporating relevant
keywords into your content remains constant. Keyword research and optimization are
Evergreen SEO practices.
4. Backlinks from Authority Sites: Building a strong backlink profile from authoritative and
relevant websites is an enduring strategy. Quality over quantity is the key here. Natural and
valuable backlinks stand the test of time.
5. Technical SEO: A well-structured website that's easy for search engines to crawl and index
is a fundamental that never goes out of fashion. This includes optimizing title tags, meta
descriptions, headings, and fixing technical issues promptly.
These pillars serve as the On Page, Off Page and Technical SEO aspects that drive a website
to higher rankings over time and drive relevance, authority, user experience and proximity
signals to our main categories
However, even though we are taking an Evergreen approach to our SEO, it is important to remember that these signals and algorithms develop and evolve over time. Because of this, we must always prioritize adaptation of Evergreen principles.
Adapting Evergreen Principles and Iterative SEO for
Algorithm Changes
Before moving forward, I would like to make something clear: The Evergreen Framework doesn't mean being oblivious to algorithm changes; it means using these fundamental principles as a compass to navigate those changes effectively over time on your website, through ongoing maintenance and adaptations.
So what are algorithmic changes and adaptations? I’ve gone ahead and listed examples below
for you to help you connect the dots of understanding, as well as the most commonly used tools
to continue optimizing your content and adapting to algorithm updates
1. Content Adaptation:
Algorithmic Impact:
● Algorithm Updates: Adjust content to align with changes in algorithm preferences.
● User Intent Shifts: Refine content to match evolving user search intent.
● Emerging Trends: Incorporate emerging trends and topics into your content strategy.
● Keyword Relevance: Regularly update keywords to maintain relevance.
● Competitor Analysis: Stay ahead by analyzing and surpassing competitors' content.
● User Feedback: Listen to user feedback to make content improvements.
Tools for Monitoring and Measuring:
● Google Analytics: Track user behavior and engagement with your content.
● Google Search Console: Monitor keyword performance and crawl stats.
● SEO keyword research tools (e.g., SEMrush, Ahrefs): Identify emerging trends and analyze competitor content.
● User feedback platforms (e.g., surveys, user testing tools): Gather insights from your audience.
2. User Experience Enhancements:
Algorithmic Impact:
● Page Speed: Continuously optimize for faster load times.
● Mobile Optimization: Ensure a seamless mobile experience.
● User Behavior: Monitor user behavior and make navigation improvements.
● Accessibility: Implement accessibility features for a broader audience.
● A/B Testing: Experiment with design and layout for better user engagement.
● Feedback Channels: Gather user feedback for ongoing improvements.
Tools for Monitoring and Measuring:
● Google PageSpeed Insights: Evaluate page speed and receive optimization suggestions.
● Mobile-friendly testing tools: Check mobile compatibility.
● User behavior analytics tools (e.g., Hotjar, Crazy Egg): Track user interactions.
● Accessibility auditing tools (e.g., WAVE, Axe): Ensure accessibility compliance.
● A/B testing platforms (e.g., Optimizely, Google Optimize): Experiment with design variations.
● Surveys and feedback collection tools (e.g., SurveyMonkey, UsabilityHub): Gather user insights.
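As one way to monitor the page-speed signal above programmatically, here is a hedged sketch that calls the public PageSpeed Insights API. The target URL and API key are placeholders you would supply yourself, and the exact response field names may change over time.

# Sketch: query the PageSpeed Insights API for a performance score.
import json
import urllib.parse
import urllib.request

def psi_performance_score(page_url, api_key=None, strategy="mobile"):
    params = {"url": page_url, "strategy": strategy}
    if api_key:
        params["key"] = api_key                     # optional API key placeholder
    endpoint = ("https://www.googleapis.com/pagespeedonline/v5/runPagespeed?"
                + urllib.parse.urlencode(params))
    with urllib.request.urlopen(endpoint, timeout=60) as resp:
        data = json.load(resp)
    # Lighthouse reports performance as 0-1; multiply for a 0-100 score.
    return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

# print(psi_performance_score("https://example.com"))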
3. Keyword Evolution:
Algorithmic Impact:
● Keyword Research: Regularly conduct keyword research to stay updated.
● User Queries: Adapt content to match new user search queries.
● Long-Tail Keywords: Incorporate long-tail keywords based on trends.
● Semantic Search: Optimize for semantic search and natural language queries.
● Ranking Analysis: Analyze keyword ranking changes and adjust strategy.
● Content Expansion: Expand content to cover related keyword variations.
Tools for Monitoring and Measuring:
● Keyword research tools (e.g., Google Keyword Planner, Moz Keyword Explorer): Discover new keywords.
● Rank tracking tools (e.g., SEMrush, Serpstat): Monitor keyword rankings.
● Content optimization tools (e.g., SurferSEO, Clearscope): Optimize content for specific keywords.
● Semantic analysis tools: Identify related terms and concepts.
● Competitor analysis tools: Analyze competitors' keyword strategies.
4. Link Building Strategy:
Algorithmic Impact:
● Backlink Quality: Focus on acquiring high-quality, authoritative backlinks.
● Link Audits: Periodically audit and disavow low-quality or toxic backlinks.
● Outreach Strategy: Adapt outreach methods for natural link acquisition.
● Competitor Backlinks: Analyze competitors' backlinks for opportunities.
● Niche Relevance: Prioritize backlinks from relevant and niche-specific sources.
● Link Velocity: Monitor the rate at which backlinks are acquired.
Tools for Monitoring and Measuring:
● Backlink analysis tools (e.g., Moz, Majestic, Ahrefs): Assess backlink quality and quantity.
● Link auditing tools (e.g., SEMrush Backlink Audit, Google Disavow Tool): Identify and disavow toxic backlinks.
● Outreach and prospecting tools (e.g., BuzzStream, Pitchbox): Manage link-building campaigns.
● Competitor analysis tools: Analyze competitors' backlink profiles.
● Niche-specific directories and communities: Identify niche-relevant link opportunities.
5. Technical SEO Updates:
Algorithmic Impact:
● Site Speed: Continuously optimize for faster site loading.
● Mobile-Friendliness: Ensure mobile compatibility and responsiveness.
● SSL/HTTPS: Maintain a secure browsing experience with HTTPS.
● Crawl Errors: Regularly address crawl errors and broken links.
● Structured Data: Implement structured data for rich snippets and SERP enhancements.
● Robots.txt: Keep robots.txt files updated to control crawlers' access.
Tools for Monitoring and Measuring:
● Website performance testing tools (e.g., GTmetrix, Pingdom): Evaluate site speed.
● Mobile-friendliness testing tools: Check mobile compatibility.
● SSL certificate management tools: Ensure HTTPS implementation.
● Website auditing tools (e.g., Screaming Frog, Sitebulb): Identify crawl errors and technical issues.
● Structured data testing tools (e.g., Google's Structured Data Testing Tool): Validate structured data markup.
● Robots.txt testing tools: Check robots.txt file for errors.
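For the robots.txt item above, Python's standard library already includes a parser, so a quick check of whether a given crawler is allowed to fetch a URL might look like the sketch below. The domain, paths, and user agent are placeholders for illustration.

# Sketch: verify what a crawler is allowed to fetch, per robots.txt.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")   # placeholder domain
rp.read()                                      # fetch and parse the live robots.txt

for path in ["/", "/blog/", "/wp-admin/"]:
    allowed = rp.can_fetch("Googlebot", "https://example.com" + path)
    print(f"Googlebot {'may' if allowed else 'may NOT'} fetch {path}")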
6. Ongoing Maintenance:
Algorithmic Impact:
● Content Audits: Periodically audit and update content for accuracy and relevance.
● Link Maintenance: Monitor backlinks and address broken or outdated ones.
● Security Updates: Stay informed about security updates and implement them.
● Algorithm Monitoring: Keep an eye on major algorithm updates and adjust strategies.
● Content Calendar: Maintain a content calendar for regular updates and new content.
● User Experience Checkups: Continuously assess and enhance the user experience.
Tools for Monitoring and Measuring:
● Content audit tools: Identify outdated or low-performing content.
● Backlink monitoring tools: Track the health of your backlink profile.
● Security monitoring tools: Detect and respond to security threats.
● Algorithm tracking and SEO news sources: Stay informed about algorithm updates.
● Editorial calendar and project management tools (e.g., Trello, Asana): Plan and schedule content updates.
● User feedback and analytics tools: Gather insights into user experience.
All in all, adaptations in SEO refer to the specific changes and updates made to align with
evolving search engine algorithms and user behavior. This includes updating content,
refining keywords, incorporating emerging trends, and enhancing user experience based on
algorithmic impacts and user feedback
Adaptations are reactive and focused on specific areas such as content, user experience,
keywords, link building, and technical SEO. These changes are made in response to specific
shifts like algorithm updates, user intent shifts, and emerging trends
We can apply these adaptations through iterative SEO
Understanding Iterative SEO for Evergreen Principles As a Scientific Method
Usually misunderstood and under-utilized, iterative SEO is a methodical process of planning, executing, analyzing, and refining your SEO strategies over time. It involves a cyclical approach rather than a linear one, and while similar to the adaptation process, it attempts to take a proactive approach rather than a retroactive one after algorithmic shifts.
Here's how it works:
1. Planning: Begin by setting clear and measurable SEO goals. Identify target keywords,
user intent, and competitive landscape. Create a roadmap for implementation.
2. Implementation: Put your SEO plan into action, making necessary on-page and
off-page optimizations, content enhancements, technical improvements, and more.
3. Analysis: Regularly monitor the performance of your SEO efforts. Utilize analytics tools
to track rankings, traffic, conversions, and user behavior. Identify what's working and
what's not.
4. Refinement: Based on your analysis, adjust your SEO strategies. Refine keyword
targeting, update content, fix technical issues, and adapt to algorithmic changes.
5. Repeat: The iterative process doesn't end; let me repeat, it does not ever end.
Continuously revisit your SEO plan, execute improvements, analyze results, and refine
your strategies.
Why Iterative SEO Matters for Evergreen Growth
1. Adaptability to Algorithm Changes: Search engines frequently update their algorithms. An iterative approach ensures you can quickly adapt to these changes, or aren't hit by them at all, allowing for the ongoing maintenance and improvement of your rankings.
2. Staying Relevant to User Intent: User behavior evolves, and so do search queries.
Iterative SEO allows you to align your content and keyword strategy with current user
intent, ensuring your content remains valuable and evergreen.
3. Content Relevance: Evergreen content should remain relevant and up-to-date. Iterative
SEO enables you to refresh, expand, or reoptimize content as needed, keeping it
valuable to users.
4. Performance Monitoring: Iterative SEO provides ongoing insights into your website's
performance. You can identify issues like traffic drops or declining rankings early and
take corrective actions promptly.
5. Improved User Experience: User experience is a critical factor in SEO. Iterative
improvements to site speed, mobile-friendliness, and navigation enhance the user
experience, contributing to better rankings.
6. Measurable Results: An iterative approach allows you to track the impact of each
change you make. You can measure what works, which strategies drive traffic and
conversions, and allocate resources effectively.
Examples of Iterative SEO Strategies:
1. Keyword Optimization:
○ Definition: Regularly review and update your target keywords based on shifts in user search behavior and keyword performance.
○ Action: Research new keywords, optimize existing content for relevant keywords, and adjust keyword strategy.
2. Content Updates:
○ Definition: Periodically revisit and refresh your evergreen content to ensure it remains accurate and relevant.
○ Action: Update statistics, add new information, improve formatting, and enhance the overall quality of the content.
3. Technical SEO Audits:
○ Definition: Conduct regular technical audits to identify and address issues that may impact site performance and visibility.
○ Action: Fix crawl errors, optimize site speed, improve mobile-friendliness, and ensure secure browsing with HTTPS.
4. User Experience Enhancements:
○ Definition: Continuously monitor user behavior and gather feedback to enhance the overall website experience.
○ Action: Make navigation improvements, optimize page layouts, and implement user-centric design changes.
5. Competitor Analysis:
○ Definition: Regularly analyze competitor strategies to identify gaps and opportunities.
○ Action: Adjust your own SEO tactics based on competitive insights and strive to outperform competitors.
6. Performance Tracking:
○ Definition: Monitor key performance metrics such as organic traffic, conversion rates, and keyword rankings.
○ Action: Identify trends, anomalies, or drops in performance and take corrective actions accordingly.
7. Content Calendar:
○ Definition: Maintain an editorial calendar for content planning and updates.
○ Action: Schedule regular content audits, refreshes, and new content creation based on keyword research and user intent.
8. Social Media Integration:
○ Definition: Monitor the effectiveness of social media integration in boosting organic visibility.
○ Action: Adjust social media posting schedules, content sharing strategies, and engagement tactics.
Iterative SEO can be seen as a systematic approach that encompasses planning, executing, analyzing, and refining SEO strategies over time to make sure that you head off algorithm updates before they take place.
Iterative SEO is about proactively seeking improvements and optimizations, in a cyclical manner, that increase rank month over month.
Additional List of Areas for Iterative SEO:
1. Ongoing keyword research and strategy
2. On-page content optimization
3. Technical SEO audits and improvements
4. User experience enhancements
5. Competitor analysis and benchmarking
6. Analytics and performance tracking
7. Content calendar management and blogging
8. Social media integration and engagement
9. Mobile optimization and responsiveness
10. Voice search optimization
11. Local SEO optimization
12. E-commerce SEO optimizations (product listings, reviews, etc.)
13. Video SEO (YouTube, Vimeo, etc.)
14. Multilingual and international SEO
15. Image optimization and alt text management
16. Structured data markup for rich snippets
17. Page speed improvements and optimization
18. Website architecture and URL structure
19. Conversion rate optimization (CRO)
20. Site search functionality optimization
21. Blog and news section management
22. User-generated content and reviews optimization
Please remember that iterative SEO practices should be designed as an SOP by you as a freelancer, or by your agency/company, and that the above list is a starting point.
Chapter 3: Pulling It All Together
So now that we have the fundamentals of how evergreen SEO works, I'm sure you want to know how all of the above applies, right?
In this chapter we will discuss how your hosting, author profiles, backlink selection, conversion
rate optimization, page silos, brand mentions and even social signals weigh in on our
algorithmic categories and pillars of evergreen SEO
With that said, let’s start at the beginning in a cascading fashion
Domain Registration, Hosting and IP Addresses
Domain Registration
When most people think of who they’re going to purchase their website from and the platform
they are hosting through, they can’t tie it back to the fundamentals of SEO
In fact, they believe it holds no weight or bearing on how it applies to SEO, and as such they end up going with the first solution they find - usually the cheapest.
But let’s talk about a few things for a second and how what you choose does matter
There are tons of registrars, as they are called, out there; some subsidiaries even re-sell other companies' domains under their own brand and website.
But where you buy your domain actually does mean more than you would think. If you remember back to our previous chapter, the first step in the process of indexing requires a search engine to be able to find your domain, right? Well, not all domain registrars update search engines with their compiled lists of newly registered domains through third parties, and even less so if you turn off your WHOIS information.
When I was deep into technical SEO and doing log file analysis of domains, I took the time to analyze over 100,000 domain purchases on different registrars (this is public info), then wrote a program to check each domain hourly in Google with a "site:" operator and logged which domains started indexing first.
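I won't reproduce the original program, but a rough, hypothetical sketch of that kind of hourly check could look like the following. The is_indexed() helper is a placeholder: programmatically querying Google with a "site:" operator is rate-limited and often blocked, so in practice you would plug in an index-checking tool or API you are permitted to use.

# Hypothetical sketch of an hourly "is it indexed yet?" logger.
import csv
import time
from datetime import datetime, timezone

def is_indexed(domain):
    """Placeholder: return True once 'site:domain' shows any results."""
    raise NotImplementedError("Plug in your own permitted index check here.")

def watch_domains(domains, log_path="indexing_log.csv", interval_hours=1):
    pending = set(domains)
    with open(log_path, "a", newline="") as f:
        writer = csv.writer(f)
        while pending:
            for domain in sorted(pending):
                if is_indexed(domain):
                    writer.writerow([datetime.now(timezone.utc).isoformat(), domain])
                    f.flush()
                    pending.discard(domain)    # stop checking once indexed
            time.sleep(interval_hours * 3600)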
Here were my findings, and the top three registrars in the order I saw newly registered domains become available in Google's index specifically:
1. Google Domains (Now Squarespace)
2. GoDaddy
3. BlueHost
Now, please keep in mind this was about 6-7 years ago at this point, right after Google Domains became available. I had theorized that Google was picking up their own domains at a higher rate through crawling because the registration of the domain was being processed directly through Google's own systems, which proved to be true. GoDaddy came in a close second.
Pro Tip: It's important to remember that while turning off your WHOIS could lead to fewer bothersome calls, there are studies conducted throughout the SEO industry that have discussed how leaving it on could be a "trust" signal. It could also lead to faster initial indexing times due to Google being able to find the newly registered domain more quickly through registries.
Hosting Providers and IP Addresses
After you have decided where to register your domain the next logical step in the sequence is
finding a hosting provider
Without getting into who I think is the best host, because that is an entirely different topic and
rabbit hole, here are a few things to consider about hosts:
Cheaper hosts provide what is known as shared hosting. With shared hosting your total
resources are limited because they are drawn from a “pool”, rather than a dedicated hosting
solution
The reason this can matter at times is because if the pool that all of these IP addresses share has PBN networks attached to it, then the whole "neighborhood" can be penalized. While this is not typically the case any longer, due to Google recognizing this conundrum with shared hosting, it is something to be aware of.
Additionally, cheaper hosting providers usually don't have redundant IPs. Redundant IPs are when two IPs resolve to the same URL address. This means if one IP goes down, the other IP picks up in its place, leading to virtually no downtime.
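A quick way to see how many IP addresses a hostname currently resolves to (one rough indicator of redundancy) is a standard DNS lookup, sketched below with a placeholder hostname. Keep in mind this only shows A records; actual redundancy also depends on load balancers, anycast, and the host's infrastructure.

# Sketch: list the IPv4 addresses a hostname currently resolves to.
# example.com is a placeholder; a single A record does not necessarily
# mean there is no redundancy behind it (load balancers, anycast, etc.).
import socket

hostname = "example.com"
_, _, ip_addresses = socket.gethostbyname_ex(hostname)
print(f"{hostname} resolves to {len(ip_addresses)} address(es): {ip_addresses}")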
The reason the above matters can be summed up into two words. Can you guess? User
Experience
In the past we have seen websites lose rank when they went down due to maintenance issues,
for existing in a bad neighborhood, and just overall being incredibly slow due to selecting budget
hosting solutions
While these can be small signals, if Google is trying to crawl and resolve your website during its scheduled crawl budget time for your domain and it gets 500 errors due to server problems, it's going to be less likely to recommend your website in the SERP, especially if it happens continuously. Search engines will then associate this downtime with poor user experience, because if the spiders can not crawl and resolve your URL, then neither can humans.
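One hedged way to keep an eye on this yourself is to scan your server's access log for requests from Googlebot that returned 5xx errors. The log path and regex below assume a common combined log format; yours may differ, so treat this as a starting sketch rather than a drop-in tool.

# Sketch: count 5xx responses served to Googlebot in an access log.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"       # placeholder path; adjust for your server
line_re = re.compile(r'"\w+ (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3}) .*?"(?P<agent>[^"]*)"$')

errors = Counter()
with open(LOG_PATH) as log:
    for line in log:
        m = line_re.search(line)
        if m and "Googlebot" in m.group("agent") and m.group("status").startswith("5"):
            errors[m.group("path")] += 1

for path, count in errors.most_common(10):
    print(f"{count:>4}  {path}")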
STILL BEING WRITTEN….
Glossary of Terms
1. Evergreen SEO Framework: A systematic approach to search engine optimization that
focuses on enduring principles and strategies that remain effective over time, regardless
of algorithm changes.
2. SEO: Search Engine Optimization - the process of improving a website's visibility in
search engine results to increase organic (non-paid) traffic.
3. Search Engines: Online platforms like Google, Bing, and Yahoo that allow users to
search for information on the internet.
4. SERP: Search Engine Results Page - the page displayed by a search engine in
response to a user's query, showing a list of relevant web pages.
5. Web Crawlers: Automated programs or bots that systematically browse the internet,
collect data from websites, and index the content.
6. Index: A structured database that stores information collected by web crawlers and
categorizes web pages, images, videos, and other online content.
7. Repository: A secure storage facility for copies of data gathered by web crawlers,
ensuring long-term preservation and accessibility of web content.
8. Barrels: Thematic divisions or categories within the repository where data is further
organized for easier retrieval.
9. URL Server and Resolver: Components responsible for translating search queries into
specific web addresses (URLs) and generating precise URLs for desired online
resources.
10. Lexicon: A collection of words and phrases indexed and cross-referenced, serving as a
linguistic resource for search engines to understand user queries.
11. Doc Index: An index that maintains references to all webpages, images, and content
stored in the index, enabling efficient search result retrieval.
12. Links: Interconnected pathways that connect webpages on the internet, facilitating
navigation for web crawlers and users.
13. PageRank: A complex algorithm used to assess the popularity and importance of web
pages, influencing their ranking in search results.
14. Searcher: The user of a search engine who initiates the search process by entering a
query.
15. Algorithm: A set of rules and calculations used by search engines to assess and rank
web pages based on various factors.
16. Relevance: The state of being closely connected or related to a topic or query, an
important factor in ranking web pages.
17. Authority: The power to influence thought, opinion, and behavior, considered when
evaluating the credibility and influence of websites.
18. User Experience: How a user feels about their needs being met when interacting with a
website, affecting rankings.
19. Proximity: The state of existence in time, space, or relationship, considered in local
search and contextual relationships of words.
20. Algorithm Rollouts: The process of search engines refining and deploying algorithm
updates to improve search results.
21. Machine Learning: A subset of artificial intelligence (AI) involving training algorithms to
learn patterns from data and make predictions or decisions.
22. Supervised Learning: A machine learning approach where algorithms are trained on
labeled data with known input-output pairs.
23. Unsupervised Learning: A machine learning approach where algorithms are trained on
unlabeled data and must find patterns or groupings on their own.
24. Retroactive Data Processing: Using historical data to predict future behavior and
create machine-learned branch algorithms.
25. Broad Core Updates: Significant algorithm updates that impact search results and
ranking factors.
26. Iterative SEO: A methodical process of planning, executing, analyzing, and refining
SEO strategies over time to adapt to changes.
27. Content Audits: Periodically reviewing and updating content for accuracy and
relevance.
28. Backlink Monitoring: Tracking the health of a website's backlink profile and addressing
broken or outdated links.
29. Security Updates: Implementing security measures to protect a website from threats.
30. Performance Tracking: Monitoring key performance metrics such as organic traffic,
conversion rates, and keyword rankings.
31. Content Calendar: Planning and scheduling content updates based on keyword
research and user intent.
32. Social Media Integration: Incorporating social media strategies to boost organic
visibility.
33. Mobile Optimization: Ensuring a website is compatible and responsive on mobile
devices.
34. Voice Search Optimization: Optimizing content to rank well in voice-based search
queries.
35. Local SEO Optimization: Strategies for improving a website's visibility in local search
results.
36. Structured Data Markup: Adding structured data to web content to enhance rich
snippets and SERP features.
37. Page Speed Optimization: Techniques for improving website loading times and
performance.
38. Conversion Rate Optimization (CRO): Strategies to enhance the likelihood of visitors
taking desired actions on a website, such as making a purchase.
39. Site Search Functionality Optimization: Improving the search functionality on a
website for better user experience.
40. User-Generated Content: Content created by website users, such as reviews and
comments, that can be optimized for SEO.