How To Calculate Organic Search Traffic Growth Rate

    What is Organic Search Traffic Growth Rate: Organic search traffic growth rate refers to the rate at which the organic (non-paid) search traffic to a website increases over a specific period of time. It measures the percentage change in organic search traffic from one period to another, such as month over month or year over year.

    Purpose: The purpose of tracking organic search traffic growth rate is to assess the effectiveness of your website’s search engine optimization (SEO) efforts and to measure the progress of your organic search visibility over time.

    How To Calculate Organic Search Traffic Growth Rate

    Organic Search Traffic Growth Rate Formula:

    Organic Search Traffic Growth Rate = ((Current Period Organic Search Traffic – Previous Period Organic Search Traffic) / Previous Period Organic Search Traffic) * 100
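As a quick sketch, the formula can be expressed as a small Python helper (the function name is my own choice, not part of any analytics API):

```python
def growth_rate(current, previous):
    """Percentage change from the previous period to the current one."""
    return (current - previous) / previous * 100
```

Calling `growth_rate(6111, 3792)` reproduces the worked example below.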

    In Google Analytics, the growth rate can be seen using the date-range comparison feature, but let’s look at the calculation behind that comparison result.

    Current Period Organic Search Traffic (May 31 – June 29, 2023):

    Here the report shows 6111 total users (traffic) from 31st May to 29th June.

    So, Current Period Organic Search Traffic: 6111

    Previous Period Organic Search Traffic (May 1 – May 30, 2023):

    Here the report shows 3792 total users (traffic) from 1st May to 30th May.

    So, Previous Period Organic Search Traffic: 3792

    Calculation Of Organic Search Traffic Growth Rate:

    Organic Search Traffic Growth Rate = ((Current Period Organic Search Traffic – Previous Period Organic Search Traffic) / Previous Period Organic Search Traffic) * 100

    Current Period Organic Search Traffic = 6111

    Previous Period Organic Search Traffic = 3792

    So,

    =((6111 – 3792) / 3792) * 100

    =(2319 / 3792) * 100

    =0.61155 * 100

    =61.155%

    So, Organic Search Traffic Growth Rate is 61.155%

    Now let’s see what traffic comparison value Google Analytics shows –

    Here it shows a value of 61.16%, while the manual calculation gives 61.155% ≈ 61.16% (approx.).

    So the calculation and the formula are correct.

    The ideal scores are as follows:

    1. If the Search Traffic Growth Rate is >10% = High traffic growth
    2. If the Search Traffic Growth Rate is 5% – 10% = Good traffic growth
    3. If the Search Traffic Growth Rate is 1% – 5% = Moderate traffic growth
    4. If the Search Traffic Growth Rate is <1% or negative = Traffic growth needs improvement.
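The four bands above can be sketched as a small Python helper (the function name and return strings are my own, chosen to mirror the list):

```python
def classify_growth(rate):
    """Map a growth-rate percentage to the bands listed above."""
    if rate > 10:
        return "High traffic growth"
    if rate >= 5:
        return "Good traffic growth"
    if rate >= 1:
        return "Moderate traffic growth"
    return "Need to improve traffic rate"
```

For example, `classify_growth(61.155)` returns "High traffic growth", matching the comparison below.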

    Now let us compare Search Traffic Growth Rate:

    The score for this website is 61.155% which is more than 10%.

    Hence, since the Search Traffic Growth Rate of this website is 61.155%, it falls in the High traffic growth band. The traffic growth is excellent for this website.

    Finding Low Content Pages Using Python

    Using this Python tool we can identify the low-content pages of a website; after the analysis, we can improve the content on those pages so that their authority and keyword rankings improve.

    Step 1:

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin, urlparse

    def extract_urls(domain):
        # Send a GET request to the domain
        response = requests.get(domain)
        # Parse the HTML content using BeautifulSoup
        soup = BeautifulSoup(response.text, 'html.parser')
        # Find all anchor tags (<a>) in the HTML
        anchor_tags = soup.find_all('a')
        urls = []
        # Extract the href attribute from each anchor tag
        for tag in anchor_tags:
            href = tag.get('href')
            if href:
                # Check if the URL is relative or absolute
                parsed_url = urlparse(href)
                if parsed_url.netloc:
                    # Absolute URL
                    urls.append(href)
                else:
                    # Relative URL: resolve it against the domain
                    # (urljoin handles leading/missing slashes correctly)
                    urls.append(urljoin(domain, href))
        return urls

    def analyze_urls(urls):
        word_counts = []
        for url in urls:
            response = requests.get(url)
            soup = BeautifulSoup(response.text, 'html.parser')
            text = soup.get_text()
            # Count the number of words on the page
            word_count = len(text.split())
            word_counts.append((url, word_count))
        return word_counts

    # Example usage
    domain = 'https://www.minto.co.nz/'
    urls = extract_urls(domain)
    url_word_counts = analyze_urls(urls)
    for url, word_count in url_word_counts:
        print(f"URL: {url}")
        print(f"Word Count: {word_count}")
        print()
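One optional refinement (my own suggestion, not part of the original script): the homepage usually links to external sites and repeats internal links, so deduplicating the extracted URLs and keeping only pages on the target domain avoids wasted requests before calling `analyze_urls`:

```python
from urllib.parse import urlparse

def filter_urls(urls, domain):
    """Keep only unique URLs that belong to the target domain."""
    target = urlparse(domain).netloc
    seen = set()
    filtered = []
    for url in urls:
        # Skip external links and URLs we have already collected
        if urlparse(url).netloc == target and url not in seen:
            seen.add(url)
            filtered.append(url)
    return filtered
```

You would call it as `urls = filter_urls(extract_urls(domain), domain)` before the analysis step.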

    Edit the code and replace the domain as per the screenshot –

    Put your desired domain here.

    Now create a folder on desktop –

    And save the code as a Python file (urls.py) in this folder –

    Step 2:

    Now open anaconda prompt –

    And go to that folder using cd command –

    Now install the required packages with pip, one by one –

    pip install beautifulsoup4

    pip install requests

    Now run the python code –

    python urls.py

    We have extracted the word count of all pages.

    Now copy the list into an Excel file –

    Now manually analyse the list and delete the landing pages that have more than 1200 words.

    Also remove irrelevant pages such as the contact us, login, and sign-up pages.

    And make a list of the pages below 1200 words for further improvement.
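The threshold step above can also be scripted; here is a minimal sketch (the function name, output file name, and 1200-word default are my own choices) that keeps only the pages under the cutoff and writes them to a CSV you can open in Excel:

```python
import csv

def save_low_content_pages(url_word_counts, threshold=1200,
                           outfile="low_content_pages.csv"):
    """Keep pages below the word-count threshold and save them to a CSV."""
    low = [(url, count) for url, count in url_word_counts if count < threshold]
    with open(outfile, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["URL", "Word Count"])
        writer.writerows(low)
    return low
```

Irrelevant pages like the contact or login page still need to be removed by hand afterwards.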

    Recommendation:

    Search for related terms using any SEO tool, write relevant content, and implement it on that page.

    Search for question terms related to that page, create some FAQs, and implement them on that page.

    This will improve the page quality and eventually the keyword ranking.
