23 Latest SEO Trends that Work in 2023


As technology continues to evolve and search engines become more sophisticated, it is important for businesses to stay up-to-date with the latest SEO trends in order to remain competitive in the online space. In this article, we will explore 23 SEO trends that are expected to work well in 2023, helping businesses improve their search engine rankings and drive more organic traffic to their websites. From voice search optimization to user experience and content creation, these trends are essential for any business looking to stay ahead of the curve in the rapidly changing world of SEO.


1. Using Brand Entity Schema to Optimize for ChatGPT Results

Rise of ChatGPT and AI-Based Search Platforms

ChatGPT is a popular language model that is used by millions of people to find information on a wide range of topics.

ChatGPT is trained on billions of data points from around the world and is updated periodically so that it can give the most up-to-date answer to a search query.

Unlike search engines, ChatGPT specialises in giving a direct, defined answer to the user's query, and it improves on that by adding related information that might be useful.

As a brand, appearing in ChatGPT’s search results can be a valuable way to increase your online visibility and reach a wider audience. One way to achieve this is through the use of entity optimization.

What Are Entities?

Entities are objects, concepts, or things that have a distinct identity and are recognized as unique by search engines and AI platforms like ChatGPT. This can include brand names, products, locations, and people. By optimizing your brand's entities, you can increase the likelihood of appearing in ChatGPT's results.

How To Define Entities on the Website

From the point of view of technical SEO, perhaps the strongest method of defining entities in your content is through schema markups. 

Schema markups are enhanced descriptions of specific objects or information on a website, which also surface as various features in the SERP.

Schema markups can be used to link entities to specific identifiers that help define them. They can also be used to create semantic relationships between different entities.

When entities are defined, they can be linked to various objects using schema markups to make them part of Google’s knowledge graph.

The knowledge graph defines a web of information for Google: it links related pieces of information across the web and helps the search engine build context for the content it crawls.


Using the Main Entity Schema

It indicates the primary entity described on a page. It can be used to identify the brand as an entity and can be implemented on the home page.

Using the Main Entity Of Page Schema

It helps indicate that a particular property (thing, object or brand) is the main entity described by the web page.

When implemented on the home page, this schema type can also help assert the brand entity to AI tools like ChatGPT.
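As an illustrative sketch (the brand name, example.com URLs and profile links are placeholders, not from the original), a home page might declare the brand as its main entity with JSON-LD like this:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "WebPage",
      "url": "https://example.com/",
      "mainEntityOfPage": "https://example.com/",
      "mainEntity": {
        "@type": "Organization",
        "name": "Example Brand",
        "url": "https://example.com/",
        "logo": "https://example.com/logo.png",
        "sameAs": [
          "https://www.linkedin.com/company/example-brand",
          "https://twitter.com/examplebrand"
        ]
      }
    }
    </script>

The sameAs links tie the brand entity to its other identifiers across the web, which is what lets the markup create semantic relationships between entities.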

Leveraging Local SEO

If your brand has physical locations, optimizing your entities for local search is essential. 

This can include optimizing your location data, creating local content, and using local keywords. 

By optimizing your brand’s entities for local search, you can increase the likelihood of appearing in ChatGPT’s local search results and attract more local customers to your business.

2. Advanced SEO Perspectives to Consider in a Content Audit

Is All Your Content Discoverable by Search Engines?

When search engine bots are unable to identify and crawl critical content on your pages, this is one of the most common JavaScript issues. This can be due to general coding errors or because the content is not easily accessible for rendering and indexation.

Essentially, we’re comparing differences between the raw HTML and the rendered HTML – as well as the visible content on the page.

How To Audit This?

Set up the crawl configuration so that JavaScript is rendered and both versions of the HTML are stored. In Screaming Frog, this means enabling JavaScript rendering (Configuration > Spider > Rendering) and storing both the original and the rendered HTML (Configuration > Spider > Extraction).

This will display the original source HTML and the rendered HTML side by side in the View Source pane, allowing you to compare differences and assess whether or not critical content and links are actually interpreted in the rendered DOM. 

The Show Differences checkbox above the original HTML window speeds up the comparison process even more.

Are the URLs Using JavaScript Fully Accessible by Search Engines?

Step 1:

To see if your URLs are available and indexable, crawl the site with Screaming Frog and create a dashboard of URLs and the data linked with them.

Step 2:

By running a crawl in Screaming Frog you can quickly identify the potential crawlability and indexation issues that can be caused by improper rendering of JavaScript.

If you have certain URLs on your site that should be indexed but aren't being found, there are problems that need to be addressed. Some of the most common issues are pages:

  • with blocked resources
  • that contain a noindex
  • that contain a nofollow
  • that contain a different canonical link
  • with redirects handled at the page level instead of at the server request
  • that utilize a fragment URL (or hash URL)

Core Web Vitals Audit

Simply defined, these metrics are intended to assess both page speed and user experience. The top three Core Web Vitals are as follows:

Largest Contentful Paint (LCP) – measures the time it takes for the primary content on a page to become visible to users. Google recommends an LCP of fewer than 2.5 seconds.

First Input Delay (FID) – measures a page’s response time when a user can interact with the page, such as clicking a link or interacting with JavaScript elements. Google recommends an FID of 100 milliseconds or less.

Cumulative Layout Shift (CLS) – measures layout shifts that reposition a page's primary content, which ultimately affects a user's ability to engage with that content. Google recommends a CLS score of 0.1 or less.

To generate a report, we use the Chrome User Experience Report with Looker Studio.
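If you prefer a scripted check, the public PageSpeed Insights API exposes the same Chrome UX Report field data. A minimal Python sketch (the URL is a placeholder; an API key is optional for light use):

    import requests

    API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    params = {"url": "https://example.com/", "strategy": "mobile"}

    data = requests.get(API, params=params, timeout=60).json()
    # Field data from the Chrome UX Report, present if the URL has enough traffic
    metrics = data.get("loadingExperience", {}).get("metrics", {})
    for name in ("LARGEST_CONTENTFUL_PAINT_MS",
                 "FIRST_INPUT_DELAY_MS",
                 "CUMULATIVE_LAYOUT_SHIFT_SCORE"):
        if name in metrics:
            print(name, metrics[name]["percentile"])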

Detecting Index Bloat and Content Pruning

What is Index Bloat?

Pages with poor quality material, duplicate content, cannibalising content, or no content should be excluded from search results. These low-value pages squander the crawl budget, dilute keywords, and cause index bloat. 

As a result, auditing index bloat is a powerful activity designed to address this issue.

Main Causes of Index Bloat

  • Dynamically generated URLs (unique and indexable pages created by functions like filters, search results, pagination, tracking parameters, categorization, or tagging).
  • User-generated content (UGC).
  • Coding mistakes (e.g. broken URL paths in a site's footer).
  • Subdomains (thin or non-search-value pages on domains you aren't paying attention to).
  • Orphan pages.

How to Detect Index Bloat?

The most common URL indexation issues occur with the following types:

  • All HTML URLs that are non-indexable
  • Blocked Resource under response Codes
  • No Response URLs under response codes
  • Redirection (3XX URLs) under response codes
  • Redirection (Javascript) under response codes
  • Client Error (4xx URLs) under response codes
  • Canonical Issues under Canonicals Tab
  • Sitemap URL issues under Sitemaps.
  • Non Indexable URLs under the Directives Tab

You Can Also Use Screaming Frog Crawl Analysis Feature to Filter Relevant URL Data

Pruning URLs contributing to Index Bloat

Here are a few ways to assess the value of URLs, which can then help you determine how they should be pruned.

  • Review organic metrics in Google Analytics, such as organic search traffic, conversions, user behaviour, and engagement to better gauge how much SEO value a URL has.
  • Review All User segment metrics as well, so you don’t accidentally prune content that’s driving business value. More about this is below.
  • In Google Search Console, use Performance > Search Results to see how certain pages perform across different queries. Near the top are filter options (Search type: Web and Date: Last 3 months will be activated by default; we prefer to review at least 12 months of data at a time to account for seasonality). Add a Page filter to show the search performance of specific URLs of interest. In addition to impressions and clicks from search, you can click into each URL to see if they rank for any specific queries.
  • Use the Link Score metric (a value range of 1-100) from the SEO Spider Crawl Analysis. URLs that have a very low Link Score typically indicate a low-value page that could perhaps be pruned via redirect or noindex/removal.
  • Remove & Redirect – In most cases, the URLs you’d like to prune from index can be removed and redirected to the next most topically relevant URL you wish to prioritize for SEO. 
  • Meta Robots Tags – Depending on the nature of the page, you can set a URL as "noindex,nofollow" or "noindex,follow" using the meta robots tag (see the combined snippet after this list).
  • Noindex,nofollow prevents search engines from indexing as well as following any internal links on the page (commonly used for pages you want to be kept entirely private, like sponsored pages, PPC landing pages, or advertorials).
  • Disallow via Robots.txt – In cases that involve tons of pages that need to be entirely omitted from crawling (e.g. complete URL paths, like all tag pages), the “disallow” function via Robots.txt file is the machete in your pruning toolkit.
  • Canonicalization – Not recommended as an end-all solution to fixing index bloat, the canonical tag is a handy tool that tells search engines the target URL you wish to prioritize indexing. The canonical tag is especially vital to ensure proper indexation of pages that are similar or duplicative in nature, like syndicated content or redundant pages that are necessary to keep for business, UX, or other purposes.
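To make the pruning options concrete, here is a hedged sketch of the three mechanisms (the URLs and paths are placeholders):

    <!-- Meta robots: keep the page out of the index but let link equity flow -->
    <meta name="robots" content="noindex,follow">

    <!-- Canonical: point near-duplicates at the URL you want indexed -->
    <link rel="canonical" href="https://example.com/preferred-page/">

    # robots.txt: block whole low-value paths from crawling
    User-agent: *
    Disallow: /tag/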

3. Using X Default and HrefLang Tags | A Short Guide to International SEO

What is International SEO?

  • SEOs today like to go niche and focus on a specific range of services for specific countries. Local businesses, and most businesses in general, have a limited geographical area where they operate, be it a state, a city or a particular country.
  • But what if a business tries to go global? How would it target and rank in each individual country? And then comes the question of language: how to show the correct language content to the correct user.
  • International SEO is the process of optimizing a website or blog so that search engines can identify which countries you want to reach and which languages you use in your business.

An Introduction to Hreflang Tags and its Usage

Hreflang tags were introduced by Google in 2011 to help webmasters specify the correct version of a website to be shown for a specific type of user, language or location, helping search engines serve the right content to the right users and improving the user experience.

For example, hreflang tags can be used to serve separate content to English-speaking audiences in the US and the UK.

Look at the following code snippet
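(A representative version, using example.com as a placeholder domain:)

    <link rel="alternate" hreflang="en-us" href="https://example.com/us/" />
    <link rel="alternate" hreflang="en-gb" href="https://example.com/uk/" />
    <link rel="alternate" hreflang="en-au" href="https://example.com/au/" />
    <link rel="alternate" hreflang="x-default" href="https://example.com/" />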

rel="alternate", coupled with an hreflang attribute, tells Google that the linked URL is an alternate variation of a similar page.

href="url link" tells Google that the exact URL mentioned in the quotes is the URL under observation.

Implementing HrefLang Tags

hreflang="en-gb" tells Google that the URL within the quotes is the right variation for UK-based, English-speaking users.

The snippet above clearly differentiates and specifies that there are different content variations for different regions, namely Australia, the USA and the UK, with each correct variation properly indicated under the href="link" element.

Ways to Implement the Hreflang and X-Default Tags

The x-default value marks the fallback URL to serve when no listed language matches the user. The following methods can be applied to implement hreflang (including x-default) on your site:

  • HTTP header of each page
  • XML sitemap
  • Header of the HTML code of each page 

4. How to Get Ranked on You.com

How Do Traditional Search Engines Work?

Traditional search engines like Google use complex AI algorithms that rank results based on a local index, created dynamically from a larger index each time a user makes a search query. The process involves:

  • Crawling
  • Indexing
  • Processing
  • Ranking
  • Results Page

How Does You.com Work?

Unlike Traditional Search Engines, You.com relies heavily on User Feedback to rank results against user queries. It works like this:

  • Based on its initial AI training, You.com ranks the most relevant results for the search query.
  • The user consumes the results and votes on the basis of how helpful each result was.
  • Based on the number of user votes, the search rankings change. This involves real-time changes to rankings as user upvotes continuously shift.
  • You.com says it wants to democratize the search experience: its social voting system is designed to give users direct control over website rankings by relying on their input rather than a complex algorithm alone.

How to Optimize for You.com?

Optimize with Schema Markups

Schema Markup Implementation Checklist:

  • Organization Schema
  • Local Business Schema, if relevant
  • Article Schema
  • Web Page Schema
  • Entity Schema
  • Main Entity of Page Schema

Here’s an example of an Entity Schema
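(An illustrative example; the brand name and URLs are placeholders.)

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Organization",
      "name": "Example Brand",
      "url": "https://example.com/",
      "logo": "https://example.com/logo.png",
      "sameAs": [
        "https://www.facebook.com/examplebrand",
        "https://www.linkedin.com/company/example-brand"
      ],
      "mainEntityOfPage": "https://example.com/"
    }
    </script>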

Build Trust Signals

How to Do it 

  • Build Relevant Backlinks from Trustworthy Pages
  • Distribute your Backlinks to Important Landing Pages, Category and Service Pages
  • Build Domain Authority and Page Authority
  • Maintain a High Trust Flow Score.
  • Follow Google EEAT Guidelines in your Content Strategy.

Expand Website Functionality with Web Apps

How to Do it 

  • You.com has found a new way of integrating app functionality in search engines, enabling the user to perform a variety of tasks without ever having to leave the search engine results page.
  • This is enabled by various web apps that are indexable by You.com. It is hence preferable to extend your website with web-app-like functionality to further enhance its visibility on You.com.

Create Apps and Submit to App Stores

How to Do it 

  • You.com is also capable of pulling results directly from the App Store and Play Store. This makes App Store Optimization a very important SEO Strategy to rank well in “Search Engines of the Future”. 
  • You.com can rank such results and also provide various public functionalities that can be performed directly on the SERP itself.

5. The Problem of Index Bloat and Crawl Budget Optimization 

What is Index Bloat?

Index Bloat is a condition in which a website exposes far more crawlable, low-value or non-indexable URLs than the number of URLs actually worth indexing.

Pages with poor quality material, duplicate content, cannibalising content, or no content should be excluded from search results. These low-value pages squander the crawl budget, dilute keywords, and cause index bloat.

Optimizing the Crawl Budget

We first look at the current number of Indexed Pages in Google for the given website.

The observed no. of indexed pages: 847

Next We look at the Daily Crawl Rate as per various Google Bots.

The observed daily crawl rate: 341

Calculate the Crawl Budget

Crawl budget = (Site Index value) / (Single day crawl stat)

The ideal score is as follows:

1. A crawl budget ratio between 1 and 3 is good.

2. A ratio between 4 and 10 is bad and needs fixing.

3. A ratio above 10 is the worst case and needs immediate technical fixes.

As per the calculation, 

Crawl budget for https://thatware.co = 847/341 = 2.48


The score for https://thatware.co is 2.48 which is in the ideal zone of 1 – 3.

How to Optimize for Crawl Budget?

Sitemap.xml Optimization

Step 1: 

Make sure the XML sitemap follows the hierarchy below:

It should start with the homepage, with priority set to 1.0 and change frequency set to daily.

This should be followed by the top navigation pages, with priority set to 0.95 and change frequency daily.

Then the top landing and service pages, with priority set to 0.90 and change frequency daily.

Then the blog pages, with priority set to 0.80 and change frequency weekly.

Finally, miscellaneous pages, with priority set to 0.70 and change frequency monthly.
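A minimal sketch of that hierarchy in sitemap XML (the example.com URLs are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
        <changefreq>daily</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>https://example.com/services/</loc>
        <changefreq>daily</changefreq>
        <priority>0.95</priority>
      </url>
      <url>
        <loc>https://example.com/blog/some-post/</loc>
        <changefreq>weekly</changefreq>
        <priority>0.80</priority>
      </url>
    </urlset>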

Step 2

After that, put the sitemap path in Robots.txt.
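The robots.txt entry is a single line (the sitemap URL is a placeholder):

    Sitemap: https://example.com/sitemap.xml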

Step 3

Also declare the sitemap in Search Console.

Check and Fix Mobile Usability Issues

As per the Mobile Usability report in Google Search Console, there are no errors on mobile, which is good.

But there is a negative point here:

Remember, we had 847 pages indexed, but the report shows only 105 valid mobile pages.

Hence, we can conclude that there is an issue with mobile-first indexing.

This issue can be solved, once the above sitemap has been optimized properly.

Fix Page Indexing Issues on Priority

Under the Pages section of Google Search Console, several indexing issues will appear that contribute to index bloat.

Fixing these issues on priority allows Google to crawl only important landing pages and helps optimize your website's crawl budget.

Removing Unwanted Pages from Google Search

Unwanted pages like website internal assets and plugin pages are unimportant for indexing and need not be shown by search engines.

Hence these URLs should be removed from Search.

This is done through the Removals tool in Google Search Console.

6. Focusing on Content Quality | Google’s EEAT Quality Guidelines

What Does the New E Stand For?

Recently, Google added a new dimension to the guideline, known as Experience, extending the popular E-A-T framework.

Including “experience” shows that Google Quality Raters can also judge content quality by understanding how much the content creator has direct experience in the topic.

Google emphasizes that “trust” is at the center of this idea and is the “most significant part of the E-E-A-T group.” 

Do Search Quality Ratings Directly Influence Search Rankings?

Google has said that quality rater scores are not used directly in its ranking systems; instead, the ratings help it evaluate and refine its algorithms.

How Do Search Quality Raters Influence Ranking Overall?

First, raters assign rating values to each website in the back end.

This data would then be made available to machine learning systems that would use it to augment the algorithms based on known signal data.


For example, if a site or group of similar sites are consistently rated High or better, the system could review all the signal data from the site(s) to look for commonality.

With this, the system would likely:

  • Produce a set of results based on what the new algorithm produces across a variety of phrases and niches.
  • Send the top-ranking sites in that set to the raters.
  • And, assuming the raters favor the new results page, push the signal adjustments into the global algorithms we all know and love, either globally or in testing.

Google Defines the Three Pillars of EEAT

With trust being the defining factor for assessing content quality, Google defines three other factors that help determine the trust score of a web page: experience, expertise and authoritativeness.

The Aspect of YMYL Topics

How Does a Search Quality Rater Distinguish between Harmful and Non Harmful Topics? 

When Does a Web Page Lack EEAT?

Why Should Web Masters or SEOs Care?

Although ratings given by search quality evaluators are not used to determine rankings, they can still help Google refine its algorithms. It is important to look at the changes made in the new version of the document compared to the old version to understand better what websites and web pages Google is more likely to rank. Google has made these amendments for a reason.

The new addition of “experience” is an important factor that Google has been alluding to for some time. It’s great to see them clarify it in the document and recognize it as one of the four core elements that define quality.

7. Leveraging Long Tail Keywords To Drive Quick Traffic and Organic Rankings in Google

What are Long Tail Keywords?

Long-tail keywords are essentially search phrases that are longer than two or three words. They are often used to pinpoint what a person is looking for in the search bar, and they are more specific than a broad term like "shoes".

For example, a person searching for a specific type of patio chair would be more likely to search for "navy blue patio chairs".

The latter is a long-tail keyword: it has a search volume of about 90, much lower than the head term "patio chairs".


Why are they called Long Tail Keywords


Contrary to popular belief, the term long-tail keywords comes from the position of these terms on the search-volume vs. number-of-keywords plot, also known as the "search demand curve".

The high-volume terms at the top are called head terms, while the tail end, containing billions of keywords with low search volume, holds the long-tail keywords.

What Makes Long Tail Keywords an Absolute Gold Mine for Ranking?

Long Tail Keywords are Less Competitive


Long Tail Keywords are Intent Specific and hence Easier to Target with Content

Let's look at the content length of a top-ranking page for the keyword "how to mine bitcoin".

Now let's do the same for the long-tail keyword "does bitcoin pay dividends".

As you can see, the word count of the top-ranking result for the long-tail query is only 1,519 words.

That is far easier to beat than the previous one.

How to Find Long Tail Keywords?

Use Google People Also Ask

Use Google Related Search Section

Use a Chrome Extension like Keywords Everywhere

Dig Deeper into Long Tail Semantically Related Topics using AlsoAsked.com

8. How To Use the Around Operator to find Relevant Keywords to Target?

What is Around Operator?

Google has its own search operator called “AROUND” for finding webpages that include words or phrases which are near to each other.

By finding the most relevant search results containing keywords that Google considers nearest to our target query, we can extract the words with higher relevancy scores and optimize our target landing page with them.

How We Use the AROUND Operator for SEO

For example, for the niche "online supplement store", we search in the SERP using the following format:

“online” around(10) “supplement”

In the same way, we add one more parameter to the existing operator by choosing the second-priority keyword with AROUND(20), like this:

“online” around(10) “supplement” around(20) “store”

Most relevant output we get: https://sourceofsupplements.com/

Again, choose the third-priority keyword with AROUND(30) alongside the existing ones:

“online” around(10) “supplement” around(20) “store” around(30) “men”

Most relevant output we get: https://www.healthkart.com/

Moving forward, we use a keyword-extractor tool like Bag of Words to extract the most relevant keywords shared between these three landing pages.

The relevancy of the keywords is determined against the target landing page that we provide.

Optimizing the target landing page content with these keywords greatly increases its chances of ranking for the target query.
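The exact Bag of Words tool isn't reproduced here, but a rough Python equivalent can be sketched with scikit-learn's CountVectorizer (the URL list and parameters are assumptions):

    import requests
    from bs4 import BeautifulSoup
    from sklearn.feature_extraction.text import CountVectorizer

    # The pages surfaced by the AROUND queries, plus your own target landing page
    urls = [
        "https://sourceofsupplements.com/",
        "https://www.healthkart.com/",
        "https://example.com/your-target-page/",  # placeholder
    ]

    texts = []
    for url in urls:
        html = requests.get(url, timeout=30).text
        texts.append(BeautifulSoup(html, "html.parser").get_text(" ", strip=True))

    # Count the most frequent unigrams and bigrams across all three pages
    vectorizer = CountVectorizer(stop_words="english", ngram_range=(1, 2), max_features=30)
    counts = vectorizer.fit_transform(texts).sum(axis=0).A1
    for term, count in sorted(zip(vectorizer.get_feature_names_out(), counts),
                              key=lambda pair: -pair[1]):
        print(term, count)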

Keywords Suggested By the Bag of Words Tool

9. HTTP/2 Protocol and its Impact On the World Wide Web

What is the HTTP/2 Protocol?

Introduction to HTTP/2.0 Protocol

HTTP/2 Protocol is a new way of sending and receiving data throughout the world wide web.

With the ever-expanding internet, the HTTP/1.0 protocol was not able to keep pace with the need to refresh content quickly, making it difficult to maintain highly interactive sites.

The newer HTTP/2 protocol fixed many of the inadequacies of HTTP/1.0 and helped enhance the world wide web.

Main Benefits

  • Boost Website Performance
  • Access To The Internet At A Lesser Cost
  • Broad Area Coverage
  • Widespread Media Exposure
  • The Use of Modern Technology is Improved
  • Improved Security

How to Check if your Website Supports HTTP/2.0 Protocol

By Using the Chrome Developer Tools

By Using a Tool like HTTP/2.0 pro
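A scripted check is also possible; here is a minimal Python sketch using the third-party httpx library (an assumption, not a tool named above):

    import httpx  # pip install "httpx[http2]"

    with httpx.Client(http2=True) as client:
        response = client.get("https://example.com/")
        # Prints "HTTP/2" when the server negotiates the newer protocol
        print(response.http_version)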

In Terms of SEO what Does HTTP/2 Imply?

Google's algorithms do not consider whether a site is HTTP/2-ready, but they do reward sites that provide a seamless user experience. As a result, migrating to HTTP/2 can indirectly improve a site's search engine optimization.

HTTP/2 will undoubtedly improve mobile performance, which has been the focus of recent speed-up efforts.

10. WebP and Image Format for the Web And Its SEO Significance

What is Webp Format?

According to Google, WebP is a modern image format that provides superior compression for images on the web.

This means that website owners can create smaller images that are visually indistinguishable from their larger counterparts. It supports both lossy and lossless compression and includes an alpha channel for transparency, which means it can also compress PNG images and keep the transparent background.

Main Benefits of WebP Format

  • WebP lossless images are 26% smaller in size compared to PNGs.
  • WebP lossy images are 25-34% smaller than comparable JPEG images
  • Lossless WebP supports transparency (also known as alpha channel) at a cost of just 22% additional bytes.
  • Lossy, lossless and transparency are all supported in animated WebP images, which can provide reduced sizes compared to GIF and APNG.

Is WebP Really Necessary for Ranking?

While there is no direct evidence that using WebP images specifically will improve your website's ranking, WebP helps optimize speed and performance across devices and platforms, which boosts user experience. The important points are:

  • Smaller image sizes help improve page load times and overall user experience.
  • WebP can reduce the amount of data needed to load images on mobile devices, making your webpage more mobile-friendly.
  • Google prioritizes faster websites, so WebP optimization can improve your chances of ranking.

How to Optimize for Webp?

Choose the Right File Format

Don’t compromise on Quality.


Understand the difference between Lossy and Lossless Compression

Lossy: This is a filter that eliminates some of the data. This will degrade the image, so you’ll have to be careful of how much to reduce the image. The file size can be reduced by a large amount. You can use tools such as Adobe Photoshop, Affinity Photo, or other image editors to adjust the quality settings of an image.


Lossless: This is a filter that compresses the data. This doesn’t reduce the quality but it will require the images to be uncompressed before they can be rendered. You can perform a lossless compression on your desktop using tools such as Photoshop, FileOptimizer, or ImageOptim.

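As a hedged illustration of the two modes, the Pillow library in Python can produce both (the file names are placeholders):

    from PIL import Image  # pip install Pillow

    img = Image.open("photo.png")

    # Lossy WebP: discards some data, giving a much smaller file
    img.save("photo-lossy.webp", "WEBP", quality=80)

    # Lossless WebP: keeps every pixel; larger than lossy but smaller than PNG
    img.save("photo-lossless.webp", "WEBP", lossless=True)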

Use Image Optimization Plugins

If you are Using WordPress or Shopify there are several Image Optimization Plugins to Choose from:

However, we recommend using:

  • Smush
  • LiteSpeed Cache (with Hostinger)

11. The Problem of Crawl Depth and HTML Sitemap

What is Crawl Depth and Why is It Bad for SEO?

Crawl Depth

Crawl depth refers to the number of clicks required to reach a particular page on a website starting from the homepage. In other words, it’s the distance of a web page from the homepage in terms of the number of clicks it takes to get there.

Search engines use crawlers to navigate through the website and index its pages.

A webpage with a high crawl depth makes it difficult for search engine bots to discover the page easily.

Without proper discoverability, the webpage might not be indexed at all. Thus never showing up in SERPs.

To improve SEO and user experience, it’s recommended to keep crawl depth to a minimum. 

This can be achieved by designing a website with a clear and intuitive hierarchy, using internal linking to make pages easily accessible, and avoiding unnecessary levels of nesting in the site’s structure.

The recommended crawl depth for a webpage is less than 3 clicks.

HTML Sitemap | The Best Fix for High Crawl Depth

An HTML sitemap is a page on a website that lists out all the pages on that website in a hierarchical format, with links to each page. Unlike the XML sitemap, which is designed primarily for search engine crawlers, the HTML sitemap is designed for human visitors to the website.

How Does It Fix Crawl Depth?

When all pages are linked from the home page in a proper hierarchical format, you are essentially creating a link path from home to every page of the site.

By restricting the link path to no more than two levels of hierarchy, you automatically optimize the site for crawl depth.
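A minimal sketch of such a page, with placeholder sections:

    <ul>
      <li><a href="/">Home</a>
        <ul>
          <li><a href="/services/">Services</a>
            <ul>
              <li><a href="/services/seo/">SEO</a></li>
            </ul>
          </li>
          <li><a href="/blog/">Blog</a></li>
        </ul>
      </li>
    </ul>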

Other Benefits of HTML Sitemap

In addition to fixing crawl depth, an HTML sitemap can:

Provide a clear and intuitive way for users to navigate the site. If a user is unable to find what they're looking for through the site's main navigation, they can turn to the HTML sitemap to locate the page they need.

12. 3 Pack Ranking And Local SEO

What is Google 3 Pack?

The Google 3-Pack is pretty much exactly what it sounds like: a pack of three local search results. These results are sourced based on a user's query and their location.

For instance, if you were to type in "SEO Services Kolkata", you'd likely get three suggestions at the very top, determined by the amount of info Google can collect on the businesses.

Why is it Significant for Local SEO?

Increased visibility: users are more likely to click through to the business's website or contact it directly.

Credibility: the Google 3-Pack is highly visible and prominently displayed, increasing the credibility of the business.

Increased traffic: more visits to a business's website or physical location, which can result in more leads and sales.

Competitive advantage: gives a business an edge over other local businesses in the same industry or niche.

How to Rank for Local 3 Pack?

Optimize your Google my Business Categories

Optimize the Business Titles and Description with Local Keywords

Add your Website and Services to the GMB Profile.

Add Regular Updates to the GMB Profile.

Geo Tag your Website to improve Search Query Relevancy.

Build Classified Links to Drive More Traffic.

Build Citations and Business Listings to drive more relevant traffic.

13. How to Optimize for Google Multisearch

What is Google MultiSearch?

Many a time, a user doesn't know the right search term to type in order to find something.

For example, while shopping you might find an item you don't know the name of but want to quickly compare prices online.

Google has identified the demand for this type of search and has actively worked to roll out a feature called Google Lens, found on mobile and in the Google apps.

How Does Google Lens Work

  • Scan and translate text
  • Identify plants and animals
  • Explore places around you
  • Find the look you like
  • Know what to order
  • Scan codes

How Google MultiSearch Works

Google Multisearch uses AI to combine images and text to make sense of the query and return relevant results, mostly product listings, or other relevant formats.

It also claims to explore ways of integrating the MUM algorithm to enhance the search experience. Google's MUM algorithm is a powerful technology with the ability to answer queries across audio, video, images and text, returning relevant information irrespective of the language of the query.

How SEO Can Benefit from MultiSearch?

With the roll out of multisearch Google has tried to shift its focus from text based content to visual content.

It is now far more important to get image, product and video content indexed in search engines, rather than just text.

Some best practices to optimize for Google MultiSearch

  • Use Images and Videos in Web Pages
  • Apply image Alt Text and Alt Attributes
  • Apply Keywords in Alt Attributes
  • Use Video Tags
  • Use Video Schema and declare images wherever possible under different schema markup rules (see the sample markup after this list).
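A minimal VideoObject sketch (all values are placeholders):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "VideoObject",
      "name": "Example product walkthrough",
      "description": "A short demo of the product.",
      "thumbnailUrl": "https://example.com/thumb.jpg",
      "uploadDate": "2023-01-15",
      "contentUrl": "https://example.com/video.mp4"
    }
    </script>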

Results from Image Search

Source: Search Console

14. Enhance User Experience by Integrating 3D Images and Augmented Reality

How Google Search Integrate 3D Images and AR?

Do you enjoy seeing animal pictures on the internet? Are you a student who loves visualizing physics concepts in 3D to understand them better? You don't need an advanced course or a safari package to access these benefits; it can all be experienced from home with a simple Google search.

In 2020, Google announced the integration of 3D images and augmented reality as a feature in the SERP, allowing users to visualize 3D images and experience them from their smartphone.

How to Visualize 3D Images in SERP?

In order to experience 3D images, you have to use a smartphone. Search for a term like "lion". You will see the usual knowledge graph with a description of a lion; the only difference is a section labeled "View in 3D" where you can experience a 3D African lion.

If you tap the button, you can view a 3D model of the lion in your room using your camera. Just tap "View in your Space," and it should appear.

Embed 3D Elements in Your Website

The best way to create a 3D model is to use Blender, which is perhaps the easiest way to create professional 3D models. Or, if you don’t have the time to do it, you can hire a 3D model designer who can. 

The embedding of the 3D model is done by simply entering an embedded code in HTML. First, you need a 3D image platform to publish your images. 

Embed 3D Structured Data Model

The 3D structured data is added in a similar way: by embedding a schema markup snippet in the page's HTML that declares the model file for search engines.

3D Schema Markup Code Sample
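(An illustrative sample using schema.org's 3DModel type; the model URL is a placeholder.)

    <script type="application/ld+json">
    {
      "@context": "https://schema.org/",
      "@type": "3DModel",
      "name": "Example product",
      "encoding": [
        {
          "@type": "MediaObject",
          "contentUrl": "https://example.com/model.glb",
          "encodingFormat": "model/gltf-binary"
        }
      ]
    }
    </script>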

Why Should SEOs Care about 3D Images and AR

  • A unique way to approach new prospects
  • Capture untapped opportunities to rank in search
  • Influence CTR for ecommerce products
  • 3D elements are the next big thing for Web 3.0 websites

15. An SEO Guide to WaterFall Diagram and How to Optimize Waterfall

What are WaterFall Diagrams?

A waterfall diagram is a graphical view of all the resources loaded by a web browser to present your page to your users, showing both the order in which those resources were loaded and how long it took to load each resource. 

Analyzing how those resources are loaded can give you insight into what’s slowing down your webpage, and what you can fix to make it faster.

Different Phases of a Waterfall Diagram?

DNS Lookup [Dark Green] – Before the browser can talk to a server it must do a DNS lookup to convert the hostname to an IP Address.

Initial Connection [Orange] – Before the browser can send a request, it must create a TCP connection. This should only happen on the first few rows of the chart, otherwise there’s a performance problem (more on this later).

SSL/TLS Negotiation [Purple] – If your page is loading resources securely over SSL/TLS, this is the time the browser spends setting up that connection. With Google now using HTTPS as a search ranking factor, SSL/TLS negotiation is more and more common.

Time To First Byte (TTFB) [Green] – The TTFB is the time it takes for the request to travel to the server, for the server to process it, and for the first byte of the response to make it back to the browser. We will use this measurement to determine whether your web server is underpowered or you need a CDN.

Downloading [Blue] – This is the time the browser spends downloading the response. The longer this phase, the larger the resource. Ideally, you can control the length of this phase by optimizing the size of your content.

Optimizing performance with a waterfall diagram

A waterfall diagram can give us insight into ways to improve a website by reducing different dimensions of the diagram.

A waterfall chart provides three great visual aids to assist with this goal:

Step 1:

First, we can optimize our site to reduce the amount of time it takes to download all the resources. This reduces the width of our waterfall. The skinnier the waterfall, the faster your site.

Step 2:

Second, we can reduce the number of requests the browser needs to make to load a page. This reduces the height of our waterfall. The shorter your waterfall, the better.

Step 3:

Finally, we can optimize the ordering of resource requests to improve rendering time. This moves the green Start Render line to the left. The further left this line, the better.

Other Factors

Is my server fast enough?

Look at the TTFB measurement. If it is longer than about 500 ms, your server may be underpowered or unoptimized.
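A quick, hedged way to approximate TTFB from a script (the URL is a placeholder):

    import requests

    # stream=True stops the body download; elapsed covers request-to-headers,
    # which approximates Time To First Byte
    with requests.get("https://example.com/", stream=True, timeout=30) as response:
        print(f"approx TTFB: {response.elapsed.total_seconds() * 1000:.0f} ms")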

Do I need a CDN?

Content Delivery Networks (CDNs) speed up your website by storing copies of your static assets (images, CSS, JavaScript files, etc) all over the world, reducing the latency for your visitors.

16. Entity Stacking for SEO

What is Entity Stacking?

Google Stacking is an authority-building SEO strategy that allows you to build backlinks on several Google platforms to other entity assets, like the company's website.

These “Google Entities” include content in assets like Google Docs, Google Sheets, Google Maps, YouTube and more. Google indexes content on its own domains/assets very quickly as well.

Difference and Benefits

Google Stacking is the Evolution of Previous Link Building Schemes like Web 2.0, which included using Social Bookmarking sites, Blogger sites and other inexpensive sites like Wix, Weebly, HubPages, Squidoo.

The main difference is instead of linking to one specific content platform like Blogger, Google stack points to as many Google entities as possible.

It serves 3 main purposes:

  • Link Building
  • Content Promotion and Faster Indexing
  • Reputation Control

Procedures

Creating a Drive Link

Creating a Google Sheet Link

Giving Information In Google Sheets

Create Link through Google Photos

Generate Google Calendar Link

  • Google Forms
  • Google Maps Links
  • Google Docs
  • Google Slides
  • Google Sites

17. BreadCrumb Navigation and its Importance in SEO

What is BreadCrumb Navigation?

Breadcrumb navigation is a type of navigation system that displays the user’s location within a website’s hierarchy. It typically appears as a trail of links at the top of a webpage and helps users to understand where they are on a website and how they got there.


Types of BreadCrumb Navigation

  • Breadcrumb navigation based on history
  • Breadcrumb navigation based on hierarchy
  • Breadcrumb navigation based on attributes

Benefits of BreadCrumb Navigation

Better site structure

Google uses Breadcrumb Navigation to better understand your website structure and categorize your different assets for ranking.
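You can reinforce the visible trail with BreadcrumbList markup; a minimal sketch with placeholder pages:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://example.com/" },
        { "@type": "ListItem", "position": 2, "name": "Blog", "item": "https://example.com/blog/" },
        { "@type": "ListItem", "position": 3, "name": "SEO Trends" }
      ]
    }
    </script>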

Improved User Experience

90% of users say they would never return to a website that is difficult to navigate.

Almost 3 out of 4 businesses fail due to poor user experience.

Indexing API

The Indexing API, launched by Google, allows a site owner to directly notify Google when a particular URL is added or updated.

This method appears more efficient than any other.

How it Works

  1. First, enable the API in the API Explorer in the Google Cloud Console.
  2. Create the appropriate service credentials from the Create New Credentials section. This will generate an email address and a unique ID.
  3. Add that same email ID as a new user in Search Console with owner-level privileges.
  4. The Indexing API is then invoked using a Python script, sketched below.
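A sketch of such a script, following Google's documented service-account flow (the key file path and URL are placeholders):

    import json

    import httplib2  # pip install httplib2 oauth2client
    from oauth2client.service_account import ServiceAccountCredentials

    SCOPES = ["https://www.googleapis.com/auth/indexing"]
    ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

    # The JSON key downloaded when creating the service credentials
    credentials = ServiceAccountCredentials.from_json_keyfile_name(
        "service_account.json", scopes=SCOPES)
    http = credentials.authorize(httplib2.Http())

    # Notify Google that a URL was added or updated
    body = json.dumps({"url": "https://example.com/updated-page", "type": "URL_UPDATED"})
    response, content = http.request(ENDPOINT, method="POST", body=body)
    print(response.status, content)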

As you can see, we updated a single URL that was not indexed in Google.

Result

Before: Not indexed

After: Fully indexed in 1 hour

18. Using the Chrome User Experience Report

The Chrome User Experience Report shares valuable insights into the Core Web Vitals performance of a website.

With the help of the API, you can get the user experience metrics for any URL.

How it Works

  1. First, enable the API from the Google Cloud Console.
  2. Go to the API section and copy the appropriate API key, which will be used to invoke the API.
  3. Head over to the official API documentation. The API allows you to get the user experience metrics for any website URL.
  4. Copy the sample query code given in curl format.
  5. Open Git Bash, insert the copied piece of code, and run it to obtain the result. Don't forget to put your API key and the target URL in the appropriate portions of the code.
  6. The program then returns the desired website's major Core Web Vitals metrics.
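The same call can also be made from Python instead of curl; a minimal sketch (the API key and URL are placeholders):

    import requests

    API_KEY = "YOUR_API_KEY"  # created in the Google Cloud Console
    ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

    payload = {"url": "https://example.com/", "formFactor": "PHONE"}
    record = requests.post(ENDPOINT, json=payload, timeout=30).json().get("record", {})
    for metric, values in record.get("metrics", {}).items():
        print(metric, values.get("percentiles"))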

19. Basics of MultiLingual SEO

Why do we need a multilingual strategy to take our Website global?

  • Expanding your Target Market

Only 25% of internet users are native English speakers; 75% aren't. Hindi is the dominant language in India, and in the US, Spanish is the second most spoken language.

  • Reduce Bounce Rates

72.1% of consumers spend most or all of their time on websites in their own language.

  • Slashing the competition

English is obviously the most competitive language. However, things may not be so tough in other languages.

  • Customer Centric Marketing

Being available in other languages is a modest gesture of personalization. What better way to improve a brand's image?

Things to Do before Implementing a MultiLingual Approach

  1. Research Audience

Develop your customer avatar and identify their demographics, language preferences, and where they mostly congregate.

  2. Language and Location

Multilingual SEO is a huge undertaking. Knowing the exact language and location of our target market can save time and improve efficiency.

  3. Keyword Research

Since content needs to be personalized, it's a good idea to have keywords ready in local languages to save time.

  4. Choose the right search engines

Although Google is the dominant search engine, some countries have a preference. Make sure to include that in your research.

20. Steps to Create a Multilingual Website

Choose the right domain and URL Structure

Create Multiple Language Versions Of Content

This is the hardest part… But Why Should you Do it?

A dedicated URL for your translated page can tell search engines that the page corresponds to a specific language, so that they know which version should be shown to speakers of different languages.

Common Mistakes

  • Do not set up automatic redirects to language-specific pages. Instead, Google recommends providing internal links to the different versions and letting the user decide.
  • Flag icons represent nations, not languages. It's best to use language names as links instead.

Translate and Create New content to make it Unique

It is not advisable to rely on AI content writers or automatic Google translations. Why?

  • Google can detect automatically translated content and decrease its position and visibility in the search results.
  • Inaccurate grammar and missed language nuances can lead to irrelevant content, a poor user experience and a higher bounce rate.

Recommendation:

  • Take the help of professional copywriting services.
  • They are relatively cheap and get the job done.

Use Hreflang Tags

Adding Hreflang tags to your website is the best way to tell search engines how to distinguish which URLs are appropriate for each language on the website.

How to do it?

  • Implement them in your language sitemap.
  • Place them in the head section of the HTML Page.
  • Place it in the http header of non-html pages.

Using a Multilingual Sitemap

If you’re using a CMS like WordPress, you can use a plugin like Yoast SEO to automatically generate a sitemap.

A multilingual sitemap doesn’t differ much from a traditional one. However, it should include additional information about each <loc> entry.
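A minimal sketch of one such entry, with placeholder URLs:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <url>
        <loc>https://example.com/en/</loc>
        <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/" />
        <xhtml:link rel="alternate" hreflang="es" href="https://example.com/es/" />
        <xhtml:link rel="alternate" hreflang="x-default" href="https://example.com/" />
      </url>
    </urlset>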

Build Local Backlinks

Why Should You Do It?

  • Helps in Gaining Local Authority for Ranking in Local Indexes.
  • Backlinks are more relevant.
  • Reduces Spam

How To Do It?

Participate in Local Events.

Get Listed in local citations.

Influencer Marketing

Guest Posting.

21. Identifying Crawl Traps For SEO

What Are Crawl Traps?

Spider traps, or crawl traps, refer to a website structure with technical SEO issues that generate a practically infinite number of URLs, making crawling difficult for the spider.

Different Types of Crawl Traps

  • Never Ending URLs
  • Mix Match Traps
  • Subdomain Redirect Trap
  • Search based Crawl Traps
  • Time Traps

Never Ending URLs

When using a crawler-based tool, you can identify these traps if any of the following occurs:

  • The URL keeps getting longer and longer without stopping.
  • The crawl runs smoothly until it reaches your site's junk pages.
  • The crawled URLs start taking a strange form that is an extension of the crawled pages.

Correction Procedure

This spider trap may be fixed by utilizing a crawler tool and configuring it to order URLs by length. Choose the longest URL to find the source of the problem.

Mix Match Crawl Traps

This problem is most common with e-Commerce platforms that allow consumers to apply many filters to find the proper product.

For a crawler, several product filters per page might cause problems.

Correction Procedure

  • Provide fewer filtering options
  • Use robots.txt to block pages with too many or too few filters
  • Implement mix-and-match filtering in JavaScript

Subdomain Redirect Crawl Trap

When your website is operating on a secure connection, yet every page on the unsecured site is pointed to your secured homepage, you’ve fallen into the trap. The trap makes it difficult for Google bots to reroute outdated, vulnerable pages. 

Correction Procedure

This type of spider trap is caused by a misconfiguration of the CMS or web server. Edit your web server configuration to fix it. You may also change the CMS and add the request URL redirect string there.

Search based Crawl Traps

The search feature isn’t supposed to be crawled or indexed by search engines. Unfortunately, many website designers overlook this fact. When this happens to your website, anyone with bad intent may easily upload indexable information to it even if they are not signed in.

Correction Procedure

Add noindex, nofollow metadata to the search result pages and get the site re-crawled to remove those results from the search engine. Then use robots.txt to block the removed pages.
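A hedged robots.txt sketch, assuming the internal search results live under a /search path or an ?s= parameter:

    User-agent: *
    Disallow: /search
    Disallow: /?s=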

Time Traps

Although Google will ultimately identify and delete useless calendars from your site, you may manually detect the trap. Go to the site’s calendar page and continually click the ‘next year’ (or ‘next month’) button. If you can go for several months or years, the site features a calendar trap.

Correction Procedure

Examine your calendar plugin’s settings to see if there are any options to limit the number of months displayed in the future. 

If there isn’t any protection, you’ll need to block the calendar pages by going to the robots.txt file and setting a sensible amount of months into the future.

How Do Spider Traps Affect SEO?

  • Google's algorithms may reduce your ranking quality.
  • They can affect the original page's ranking in circumstances where spider traps result in near-duplicate pages.
  • Search bots waste time loading irrelevant near-duplicate pages, wasting crawl budget.

22. Dead Page Optimization

What are Dead Pages?

Dead pages are pages that can't generate any traffic for the website.

  • According to Google Webmaster, those pages have fewer clicks and impressions.
  • According to Google Analytics, those pages have fewer views and active sessions.

How to Detect Dead Pages from Google Analytics?

  1. Go to Google Analytics 🡪 Behavior 🡪 Site Content 🡪 All Pages.
  2. Set the date range to 6 months for better results.
  3. Add a filter on pageviews and sort the data from lowest to highest.
  4. Export the full data from Analytics.
  5. Remove unwanted pages from this list.
  6. List the pages with 1-10 pageviews.
  7. Go to Google Search Console 🡪 Search Results 🡪 set the date range to the last 6 months.
  8. Export all the data and set the clicks range to 0-10 and the impressions range to 0-1000.

How to Optimize Dead Pages?

  • Link to those pages from high-traffic pages.
  • Add more click elements and relevant content to those pages.
  • Work on advanced off-page SEO for those pages.

23. Image Indexing Guide

Naming Images Properly

Google's guidance is clear: keeping file names simple and descriptive is a better way to tell Google what an image is about.

Yes, alt tags do help, but as SEOs we want to give all information in every way possible.

Use Descriptive Alt Text and Captions

Alt text (alternative text) describes an image. It's what screen readers announce to users, and what browsers display if there's a problem rendering the image.

A Few Examples: 
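(Illustrative examples only; the file names are placeholders.)

    <!-- Too generic -->
    <img src="IMG_0042.jpg" alt="chair">

    <!-- Descriptive -->
    <img src="navy-blue-patio-chair.jpg" alt="Navy blue patio chair on a wooden deck">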

Choosing the Best File Type is Important

Page load time is crucial for SEO. Google has confirmed that it’s a ranking factor on both desktop and mobile. Our task as SEOs is to choose the most appropriate file type for each image—i.e., the one that offers the best compression with the least reduction in quality.

A Few Examples on How this matters: 

Compress Images 

There is little noticeable difference in quality between the two, yet the first image is 58% smaller than the second (31 KB vs. 73 KB).

Google recommends three open-source tools to help with this: Guetzli, MozJPEG (by Mozilla), and pngquant. These are command line tools.

You can also use a simple tool like ImageOptim, a free tool for Mac.

Use Webp Format

The main benefits of the WebP format are the same as those covered under trend #10 above.

Create an Image Sitemap

Here are the different tags for setting up an image sitemap.
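A minimal sketch using Google's image sitemap namespace (the URLs are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <loc>https://example.com/page.html</loc>
        <image:image>
          <image:loc>https://example.com/images/photo.webp</image:loc>
        </image:image>
      </url>
    </urlset>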

Final Thoughts

Search engine optimization is continuously evolving, and it is our job as SEOs not to fear the changes but to find ways to embrace them in order to improve a business's online visibility in search engines as best we can.

The above 23 SEO tactics give a high-level view of the most up-to-date trends and tactics dominating the SEO space.


If you are confused about where to start, get a Free Customized SEO and Digital Marketing Strategy to Grow your Business and Website’s Online Visibility.

Get the Strategy Today!
