This project is designed to solve a significant problem that many modern websites face: search engines often struggle to understand and rank websites built with JavaScript frameworks (like React, Angular, or Vue.js). The purpose of this project is to make websites built with JavaScript more accessible and understandable to search engines, ensuring that they rank higher and attract more visitors.
What Problem Does This Project Address?
- Search Engines Can’t Always Understand JavaScript:
- Websites built with JavaScript are dynamic, meaning they load content on-the-fly rather than showing a preloaded page.
- Search engine bots like Google can’t always process this type of content effectively, leading to poor rankings and reduced visibility.
- SEO Visibility Issues:
- If search engines can’t fully understand a webpage, they may fail to index it properly. This means your website might not show up in search results at all, or it might rank much lower than it should.
- Missed Opportunities for Website Owners:
- Websites with poor SEO visibility lose out on organic traffic, which is crucial for business growth, online presence, and lead generation.
What Does This Project Do?
This project implements Dynamic Rendering, a modern technique to ensure search engines see the website the way real users do. It does this by:
- Detecting Who Is Visiting:
- The system identifies whether the visitor is a search engine bot or a real user.
- Serving Optimized Content:
- For search engines: The project generates a fully-loaded, SEO-friendly version of the website that is easy for bots to understand and index.
- For real users: Visitors get the dynamic, interactive JavaScript experience as intended.
- Ensuring SEO Best Practices:
- Automatically optimizes critical SEO elements like:
- Title tags: The clickable headline in search results.
- Meta descriptions: The short summary under the headline in search results.
- Canonical URLs: Prevents duplicate content issues by showing the preferred page version.
- Structured Data: Provides search engines with extra context about the page content.
Why Is This Important?
- Improved Rankings: The project helps websites rank higher by making their content fully understandable to search engines.
- More Organic Traffic: With better rankings, websites attract more visitors from search engines.
- Business Growth: Increased visibility means more leads, more customers, and more success.
- Cost Efficiency: Fixing SEO visibility issues with dynamic rendering is more cost-effective than rebuilding a website.
Key Features of This Project:
- Processing URLs:
- The project identifies each page on the website and ensures it is rendered correctly.
- Rendering for Search Engines:
- It generates a static HTML version of JavaScript-heavy pages, ensuring they are SEO-friendly.
- Validation Reports:
- The system checks each page to ensure it meets SEO standards and reports any errors or missing data.
- Deployment:
- Static HTML files are generated and deployed for search engines to crawl and index.
- Error Handling:
- Identifies common issues like missing structured data or incorrect canonical URLs and recommends fixes.
How Can It Benefit Website Owners?
- Make Your Website Visible:
- Ensures all critical pages are indexed and ranked on search engines.
- Boost Click-Through Rates (CTR):
- Optimized titles and meta descriptions attract more clicks in search results.
- Fix Common SEO Errors:
- Automatically detects and resolves issues like missing metadata or invalid page structures.
- Save Time and Resources:
- Dynamic rendering eliminates the need for manual fixes and long SEO audits.
- Reach a Broader Audience:
- Higher visibility brings more traffic, expanding your reach to potential customers.
What Steps Should a Website Owner Take After Using This Project?
- Review the SEO Reports:
- Check which pages are optimized and if there are any errors to fix.
- Fix Missing Data:
- Ensure titles, descriptions, and structured data are added to all critical pages.
- Monitor Performance:
- Use tools like Google Search Console to track rankings and traffic after deployment.
- Update Regularly:
- Keep adding fresh, optimized content to maintain and improve rankings.
- Optimize Page Speed:
- Make sure the website loads quickly for both users and search engines.
How Does This Help Non-Tech Users?
- For Business Owners:
- They can focus on growing their business while the project ensures their website is visible and optimized.
- For Developers:
- Simplifies the process of making JavaScript-heavy websites SEO-friendly.
- For Marketers:
- Provides actionable insights into what is missing or needs improvement for better SEO performance.
Final Purpose:
This project ensures that websites built with modern JavaScript technologies are easy for search engines to understand and rank. By fixing visibility issues and optimizing the website for search engines, it helps businesses attract more visitors, improve their online presence, and grow successfully. It’s a bridge between advanced web technologies and effective SEO.
Understanding Dynamic Rendering for JavaScript SEO
Dynamic Rendering is a solution used to ensure that websites built with JavaScript frameworks are optimized for search engines and fully understood by their bots. Let’s break this down step-by-step in plain English, so even someone with no technical background can understand.
What is Dynamic Rendering for JavaScript SEO?
- Dynamic Rendering:
- A technique where a website serves two different versions of its content:
- For search engine bots: A pre-rendered, static HTML version.
- For regular users: A fully interactive JavaScript-powered version.
- Why is it Necessary?:
- Search engines, like Google, often struggle to process JavaScript-heavy websites. If the bot cannot understand the content, the page may not rank well.
- Dynamic Rendering solves this issue by ensuring bots get a simplified, static version of the page that they can easily crawl and index.
- How Does it Work?:
- A website detects if a visitor is a search engine bot or a human user.
- If it’s a bot, the website sends a pre-rendered static HTML version.
- If it’s a user, the website sends the fully functional JavaScript version.
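The bot-vs-user decision described above can be sketched in a few lines. The pattern list below is illustrative, not exhaustive; a production setup would maintain a fuller list (and ideally verify bot IP ranges, since User-Agent strings can be spoofed):

```python
# Minimal sketch of the bot-vs-user check behind dynamic rendering.
# BOT_PATTERNS is an illustrative subset of real crawler User-Agents.
import re

BOT_PATTERNS = re.compile(
    r"googlebot|bingbot|duckduckbot|baiduspider|yandex|slurp",
    re.IGNORECASE,
)

def is_search_engine_bot(user_agent: str) -> bool:
    """Return True when the User-Agent header looks like a crawler."""
    return bool(BOT_PATTERNS.search(user_agent or ""))

def choose_version(user_agent: str) -> str:
    """Pick which version of the page to serve."""
    return "static-html" if is_search_engine_bot(user_agent) else "javascript-app"
```

A web server or middleware would call `choose_version()` on each request and route bots to the pre-rendered HTML while humans get the full JavaScript application.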
Use Cases of Dynamic Rendering for JavaScript SEO
- Improving SEO for JavaScript Websites:
- Websites built using frameworks like React, Angular, or Vue.js often rely on JavaScript to load content dynamically. Dynamic rendering ensures search engines can still access all the content.
- E-Commerce Websites:
- Online stores often have JavaScript-heavy pages with product filters, sorting options, etc. Dynamic rendering ensures product details are accessible to search engines.
- News Websites:
- News websites frequently update content and rely on JavaScript for interactive features. Dynamic rendering ensures the latest articles are indexed quickly.
- Single Page Applications (SPAs):
- SPAs load content dynamically without changing the URL, making it harder for bots to crawl. Dynamic rendering solves this by providing a static version.
- Content-Driven Websites:
- Blogs, educational portals, and resource websites that heavily use JavaScript for animations or dynamic interactions benefit from improved indexing.
Real-Life Implementation of Dynamic Rendering
- Google Search Engine:
- Googlebot sometimes fails to index JavaScript-heavy websites. Websites using Dynamic Rendering ensure Google sees an optimized version, improving their ranking.
- Retail Giants:
- Retailers operating at the scale of Amazon or eBay need millions of product pages crawled and indexed accurately; dynamic rendering is one technique used to achieve this at scale.
- Travel Booking Websites:
- Travel portals with dynamic pricing and flight/hotel searches rely on JavaScript. Dynamic rendering ensures these pages rank well despite their complex nature.
What Kind of Data Does Dynamic Rendering Need?
- Input Data:
- URLs of Web Pages:
- Dynamic rendering needs to know the URLs of all pages that need to be optimized for search engines.
- Website Content:
- Text, images, meta information (like title and description), and structured data (e.g., JSON-LD or schema markup).
- JavaScript:
- The scripts responsible for loading dynamic content on the website.
What Output Does Dynamic Rendering Provide?
- Rendered HTML Output:
- A fully static, pre-rendered HTML version of your website optimized for search engines.
- SEO Analysis Report:
- A report highlighting:
- Missing titles, meta descriptions, or structured data.
- Errors in rendering specific pages.
- Validation status (whether each page is ready for indexing).
- Deployment Files:
- Static HTML files for all pre-rendered pages, ready for deployment to your server or CDN (Content Delivery Network).
- Performance Metrics:
- Insights into how quickly the rendered pages load, ensuring they meet SEO speed standards.
How Does Dynamic Rendering Ensure Semantic Optimization?
- Title and Meta Tags:
- Dynamic rendering ensures every page has optimized titles and meta descriptions, improving click-through rates.
- Canonical Tags:
- It prevents duplicate content issues by specifying the preferred version of a page.
- Structured Data:
- Adds JSON-LD or schema.org markup to provide additional context about the page to search engines, enabling features like rich snippets.
- Header Tags (H1, H2):
- Ensures proper use of semantic HTML headers for better readability by search engines.
Steps to Implement Dynamic Rendering for a Website
- Identify the Pages to Optimize:
- Gather a list of all the URLs on your website that need to be indexed by search engines.
- Pre-Render Pages:
- Use a pre-rendering service or software to generate static HTML versions of your JavaScript-heavy pages.
- Detect Bots vs. Users:
- Set up a system to differentiate between search engine bots and human visitors (based on the User-Agent header).
- Serve the Correct Version:
- Send pre-rendered pages to bots and dynamic JavaScript pages to regular users.
- Monitor Performance:
- Use tools like Google Search Console to track which pages are indexed and identify any errors.
Why is Dynamic Rendering Useful?
- Boosts SEO Rankings:
- Search engines get a clean, optimized version of the website, improving indexing and rankings.
- Increases Website Visibility:
- Ensures all pages, even those heavily reliant on JavaScript, are visible in search results.
- Enhances User Experience:
- Human visitors still enjoy the interactive, JavaScript-powered experience.
- Saves Resources:
- No need to rebuild the website from scratch to make it SEO-friendly.
Understanding the Importance and Purpose of the Code Below
This code is designed to generate a sitemap.xml file from a CSV dataset, which is particularly useful in the context of Dynamic Rendering for JavaScript SEO. Let me explain this step-by-step in a way that is simple to understand, even for someone without a technical background.
What is a Sitemap and Why is it Important?
1. Definition of a Sitemap:
- A sitemap is an XML file that lists all the important pages of a website.
- It helps search engines like Google and Bing find and index your website’s pages efficiently.
2. Purpose in SEO:
- Ensures all pages (including dynamically rendered ones) are discoverable by search engines.
- Boosts the visibility of web pages by guiding search engine bots to them.
3. Why is it Necessary for Dynamic Rendering:
- Websites using JavaScript frameworks often have pages that are not easily discoverable by search engines.
- The sitemap bridges this gap by listing these pages explicitly, ensuring they are indexed properly.
Purpose of the Code
The purpose of the code is to:
- Transform Screaming Frog SEO Spider Data:
- Use the CSV dataset generated by Screaming Frog (a popular SEO tool) to extract URLs and metadata.
- Generate an XML Sitemap:
- Create a sitemap.xml file that search engines can use to efficiently crawl and index the website.
- Improve SEO for Dynamic Websites:
- Ensure that even JavaScript-heavy pages are indexed correctly by search engines.
Key Components of the Code
Part 1: Generating the Sitemap
File Name Suggestion: generate_sitemap_from_csv.py
1. Loading the Dataset:
- Reads the CSV file (internal_all.csv) using the pandas library.
- The CSV contains columns like URL Encoded Address (the page URL) and Crawl Timestamp (last modification date).
- This data is essential for creating an accurate sitemap.
2. Building the Sitemap XML Structure:
- The <urlset> element is the container for all URLs in the sitemap.
- For each row in the CSV:
- A <url> element is added, representing a single webpage.
- Inside each <url>:
- <loc> specifies the URL.
- <lastmod> indicates when the page was last modified.
- <changefreq> suggests how often the page is updated.
- <priority> indicates the importance of the page (default is 0.5).
3. Saving the Sitemap:
- Converts the constructed XML structure into a string using tostring.
- Formats the XML for readability using minidom.
- Saves the sitemap as a file named sitemap.xml.
Part 2: Previewing the Sitemap
File Name Suggestion: preview_sitemap.py
1. Parsing the Generated Sitemap:
- The generated sitemap.xml file is read and parsed using the ElementTree library.
2. Previewing the Sitemap Entries:
- Extracts the first few entries (default: 10) for preview.
- Displays the fields:
- URL
- Last Modified Date
- Change Frequency
- Priority
- This helps verify that the sitemap is correct before deploying it.
Why is the Screaming Frog Dataset Important?
1. Source of URLs:
- Screaming Frog scans your website and identifies all the URLs (pages) that should be indexed by search engines.
- These URLs form the backbone of the sitemap.
2. Metadata Extraction:
- Columns like Crawl Timestamp provide additional details that enhance the sitemap (e.g., last modification date).
3. Ensures Coverage:
- Using this dataset ensures no important page is missed in the sitemap, especially for JavaScript-heavy or dynamically rendered pages.
How Does This Help Dynamic Rendering for JavaScript SEO?
1. Improves Discoverability:
- Ensures search engines are aware of all pages, even those that rely on JavaScript for content loading.
2. Optimizes Crawling:
- Provides metadata (like last modification dates) to help search engines prioritize which pages to crawl.
3. Fixes JavaScript Visibility Issues:
- Many JavaScript-heavy websites suffer from poor indexing. The sitemap ensures these pages are included in search engine results.
4. Supports Dynamic Rendering:
- While dynamic rendering ensures bots get a static version of JavaScript pages, the sitemap acts as a secondary layer to make sure every page is indexed.
Output Generated by the Code
1. Sitemap File (sitemap.xml):
- Contains all URLs from the CSV.
- Each URL entry includes:
- Location (<loc>): The page URL.
- Last Modified (<lastmod>): The date the page was last updated.
- Change Frequency (<changefreq>): A recommendation for how often the page changes (default: weekly).
- Priority (<priority>): The relative importance of the page (default: 0.5).
2. Preview of the Sitemap:
- Displays the first few rows of the sitemap for verification.
Steps for Using This Code
1. Input Preparation:
- Use Screaming Frog SEO Spider to generate a CSV file of your website’s URLs and metadata.
- Ensure the file has columns like URL Encoded Address and Crawl Timestamp.
2. Run the Code:
- Execute the first script (generate_sitemap_from_csv.py) to create the sitemap.
- Run the second script (preview_sitemap.py) to verify the sitemap’s contents.
3. Deploy the Sitemap:
- Upload the generated sitemap.xml file to the root directory of your website.
4. Submit to Search Engines:
- Submit the sitemap to Google Search Console and Bing Webmaster Tools.
1st Part: Sitemap Parsing and Dynamic Content Rendering
Purpose: This part of the code is designed to load a sitemap (an XML file containing website URLs), validate each URL, and render the content of JavaScript-heavy pages into static HTML using a headless browser (Selenium).
Key Features:
- Parse Sitemap: Extracts URLs from the sitemap and validates them.
- Validate URLs: Ensures only HTML pages are processed by filtering out unsupported files (e.g., images, CSS).
- Render Dynamic Pages: Uses Selenium WebDriver to load and render dynamic content into static HTML files.
- Save Rendered HTML: Saves the rendered HTML files with unique names (based on a hash of the URL) for later use.
Why It’s Important: Many websites use JavaScript to load content dynamically, which search engines may not index properly. This part ensures the content is fully rendered and saved as static HTML files, making it easier for search engines to crawl.
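The pipeline above can be sketched as three helpers: a URL filter, a hash-based file namer, and the Selenium render step. The extension list and hash length are illustrative choices; the Selenium import is deferred into `render_page()` so the helpers work on their own, while actual rendering requires selenium and a Chrome driver:

```python
# Sketch of part 1: filter non-HTML URLs, derive a stable file name from
# a hash of the URL, and render the page with headless Chrome.
import hashlib
from urllib.parse import urlparse

SKIP_EXTENSIONS = (".css", ".js", ".png", ".jpg", ".jpeg", ".gif", ".svg", ".pdf")

def is_html_url(url: str) -> bool:
    """Keep only URLs that plausibly point at HTML pages."""
    path = urlparse(url).path.lower()
    return not path.endswith(SKIP_EXTENSIONS)

def output_filename(url: str) -> str:
    """Unique, filesystem-safe name based on a hash of the URL."""
    return hashlib.sha256(url.encode("utf-8")).hexdigest()[:16] + ".html"

def render_page(url: str, out_dir: str = "rendered") -> str:
    from selenium import webdriver  # deferred: only needed when rendering
    options = webdriver.ChromeOptions()
    options.add_argument("--headless=new")
    driver = webdriver.Chrome(options=options)
    try:
        driver.get(url)
        html = driver.page_source  # final DOM after JavaScript has run
    finally:
        driver.quit()
    out_path = f"{out_dir}/{output_filename(url)}"
    with open(out_path, "w", encoding="utf-8") as f:
        f.write(html)
    return out_path
```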
2nd Part: Metadata Analysis of Rendered HTML
Purpose: This code analyzes the static HTML files generated in the first part to validate and assess their metadata for SEO compliance.
Key Features:
- Title Analysis: Checks if the <title> tag exists, its length, and whether it meets SEO guidelines (10–60 characters).
- Meta Description Validation: Ensures the <meta name="description"> tag exists and has a length of 50–160 characters.
- Canonical Link Check: Confirms the presence of <link rel="canonical"> tags to avoid duplicate content issues.
- Summary and Reporting:
- Saves a CSV report with detailed results for each file.
- Generates a JSON summary with overall statistics, such as how many files are missing metadata.
Why It’s Important: Metadata plays a crucial role in SEO. This part identifies gaps and issues, providing actionable insights to improve page rankings.
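The length rules above (titles 10–60 characters, meta descriptions 50–160) reduce to a few small checks once the strings are extracted from the HTML. A minimal sketch:

```python
# Sketch of the SEO validity checks applied to already-extracted strings.
from typing import Optional

def validate_title(title: Optional[str]) -> str:
    if not title:
        return "Invalid: Title missing"
    n = len(title.strip())
    if n < 10:
        return f"Invalid: Title too short ({n} characters)"
    if n > 60:
        return f"Invalid: Title too long ({n} characters)"
    return "Valid"

def validate_meta_description(description: Optional[str]) -> str:
    if not description:
        return "Invalid: Meta description missing"
    n = len(description.strip())
    if not 50 <= n <= 160:
        return f"Invalid: Meta description length {n} (should be 50-160)"
    return "Valid"

def validate_canonical(href: Optional[str]) -> str:
    return "Valid" if href else "Missing"
```

The per-file CSV report is then just these three results written out row by row.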
Understanding the Output
This output is a result of analyzing metadata from rendered HTML files and generating a report. The report provides a detailed analysis of SEO-related components like <title>, <meta>, and canonical tags for each processed HTML file. Here’s what each part of the output means:
1. File Path for Saved Reports
· CSV Report Path:
- Location: /content/drive/MyDrive/Dataset For Dynamic Rendering for JavaScript SEO/metadata_analysis_report.csv
- What it contains: A detailed spreadsheet report with information about each HTML file’s metadata (title, meta description, canonical links).
- Use Case: This CSV can be opened in tools like Excel or Google Sheets for review and further analysis.
· JSON Summary Path:
- Location: /content/drive/MyDrive/Dataset For Dynamic Rendering for JavaScript SEO/metadata_analysis_summary.json
- What it contains: A summary of overall statistics, like total files processed, missing titles, or missing canonical links.
- Use Case: The JSON is useful for quickly understanding the health of your metadata without going through individual files.
2. Preview of First 40 Rows
This preview shows a summary of the first 40 files analyzed. Let me explain the key columns one by one:
1. File Name:
- What it is: The name of the static HTML file (e.g., 2445840446231693084.html).
- Use Case: Helps you identify which file corresponds to which webpage in the sitemap.
2. Title:
- What it is: The content of the <title> tag in the HTML file. For example, “THATWARE® – Revolutionizing SEO with Hyper-Intelligence”.
- Use Case: Titles are important for search engines to understand the purpose of the page and attract users.
3. Title Validity:
- What it is: Whether the <title> meets SEO guidelines (10–60 characters).
- “Valid”: Title meets length requirements.
- “Invalid”: Title is too short or too long.
- Use Case: Helps you spot titles that need improvement for better SEO rankings.
4. Title Suggestion:
- What it is: Recommendations for fixing invalid titles.
- Example: “Shorten the title to 60 characters or less.”
- Use Case: Provides clear action steps to improve titles for SEO.
5. Meta Description:
- What it is: The content of the <meta name="description"> tag, which summarizes the page.
- Example: “THATWARE® is the world’s first SEO agency to specialize in hyper-intelligence.”
- Use Case: Helps search engines and users understand the page content.
6. Meta Description Validity:
- What it is: Whether the description meets SEO guidelines (50–160 characters).
- “Valid”: Description meets requirements.
- “Invalid”: Missing or not within the ideal length.
- Use Case: Shows if your descriptions are ready for search engine display.
7. Meta Description Suggestion:
- What it is: Recommendations for improving invalid meta descriptions.
- Example: “Add a meta description with 50-160 characters.”
- Use Case: Provides specific fixes to enhance click-through rates.
8. Canonical Link:
- What it is: The content of the <link rel="canonical"> tag, which indicates the preferred URL for search engines.
- Example: https://thatware.co/seo-company-delhi/
- “Missing” if not found.
- Use Case: Prevents duplicate content issues by pointing to the main version of the page.
9. Canonical Suggestion:
- What it is: Recommendations for missing canonical links.
- Example: “Ensure a canonical link is added to prevent duplicate content issues.”
- Use Case: Guides you in preventing SEO penalties due to duplicate content.
10. Has <html>, <head>, <body>:
- What it is: Confirms the presence of essential structural elements in the HTML file.
- “True”: The element exists.
- “False”: The element is missing.
- Use Case: Ensures the basic structure of the webpage is intact.
Key Observations from the Preview
- Valid Titles: Most titles are valid, but a few are too long (e.g., “Invalid: Title too long (64 characters)”). The suggestions guide you to fix them.
- Meta Description: One file (4449457235279430625.html) is missing a meta description. The suggestion is to add one.
- Canonical Links: Most files have valid canonical links, except one (4449457235279430625.html) where it is missing.
- HTML Structure: All files have essential elements (<html>, <head>, <body>), ensuring the pages are structurally sound.
What This Output Tells Us
1. SEO Health: This analysis provides a snapshot of your HTML files’ SEO readiness.
- Strong Points: Most titles and meta descriptions are valid, and all files have proper structure.
- Areas to Improve: Fix a few titles that are too long and add missing meta descriptions and canonical links.
2. Next Steps:
- Implement the suggestions from the report (e.g., shorten long titles, add meta descriptions).
- Deploy the improved HTML files to improve SEO performance.
Use Cases of This Report
- SEO Improvement: Helps you identify and fix issues that could harm your website’s visibility on search engines.
- Client Reporting: Share this analysis with your client to demonstrate the quality and readiness of their webpages.
- Future Optimizations: Use this as a baseline to track improvements over time.
3rd Part: Optimizing and Validating Static HTML
Purpose: This code optimizes the rendered HTML files to ensure they meet SEO standards and are ready for deployment.
Key Features:
- Add Missing Metadata:
- Adds a <meta name="description"> tag if it’s missing.
- Adds a <link rel="canonical"> tag to prevent duplicate content issues.
- SEO-Friendly Title Adjustment:
- Adjusts the length of <title> tags to fit SEO best practices.
- Enhance Accessibility: Adds alt attributes to images that lack them.
- Remove Redundant Elements:
- Deletes unnecessary <script> and <noscript> tags.
- Removes lazy-loading attributes and preloader elements.
- Validation: Checks the optimized HTML for compliance with SEO guidelines and logs any remaining issues.
Why It’s Important: This part ensures that the static HTML files are optimized for search engine crawlers and free from common SEO pitfalls.
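A much-simplified sketch of this optimization pass is below. The real pipeline would use an HTML parser such as BeautifulSoup rather than regexes (regexes are only adequate for well-formed input), but the sketch shows the two core fixes, removing script blocks and filling in missing metadata:

```python
# Simplified, dependency-free sketch of the optimization pass.
# A production version should use a real HTML parser instead of regexes.
import re

def optimize_html(html: str, canonical_url: str, default_description: str) -> str:
    # Remove <script> and <noscript> blocks, as described above.
    html = re.sub(r"<script\b.*?</script>", "", html, flags=re.S | re.I)
    html = re.sub(r"<noscript\b.*?</noscript>", "", html, flags=re.S | re.I)
    # Add a meta description if missing.
    if 'name="description"' not in html:
        html = html.replace(
            "</head>",
            f'<meta name="description" content="{default_description}"/></head>', 1)
    # Add a canonical link if missing.
    if 'rel="canonical"' not in html:
        html = html.replace(
            "</head>", f'<link rel="canonical" href="{canonical_url}"/></head>', 1)
    return html
```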
What Does This Output Represent?
The output shows:
- The URLs being processed: Each URL corresponds to a webpage that is being analyzed for SEO improvements.
- Rendering HTML content: The script uses tools like Selenium to load and analyze the HTML structure of these pages.
- Optimized HTML Preview: The HTML is checked and modified for better SEO.
- Validation Status: Each webpage is either marked as “Valid” or flagged with issues based on specific SEO checks.
- Log Updates: Logs are updated after each URL is processed, storing information about the page’s SEO quality.
Now let’s break down the output in detail.
Step-by-Step Explanation of the Output
1. Processing the URL
Example:
Processing URL: https://thatware.co/law-firm-seo-services/
Rendering: https://thatware.co/law-firm-seo-services/
· What is happening?
- The script starts processing a specific webpage. For example, https://thatware.co/law-firm-seo-services/ is one of the pages.
- It then renders the webpage using a browser-like tool. Rendering means the script loads the page as if a real user were visiting it in a browser.
· Why is this step needed?
- Many websites generate content dynamically using JavaScript, which regular tools cannot analyze. Rendering ensures the script captures the final webpage as seen by users.
· Use Case:
- This ensures the SEO analysis is based on the actual content users and search engines see, not just the raw code.
2. Optimized HTML Preview
Example:
===== Optimized HTML Preview =====
<html class="no-js" lang="en-US">
<!--<![endif]-->
<head>
<meta charset="utf-8"/>
<meta content="upgrade-insecure-requests" http-equiv="Content-Security-Policy"/>
…
===================================
· What is happening?
- The rendered HTML content of the webpage is displayed. This includes important elements like:
- <title>: The title of the page shown in search results.
- <meta name="description">: A summary of the page for search engines.
- <link rel="canonical">: The preferred URL to avoid duplicate content issues.
- Other metadata: Author information, image previews, and social media tags.
· Why is this step needed?
- This helps identify if the webpage has all the necessary SEO elements and whether they meet best practices.
· Use Case:
- You can see what improvements are being suggested, such as fixing a missing <meta> tag or improving the <title> length.
3. Title Tag Analysis
Example:
<title>
Law Firm SEO Services | Best Legal SEO Services Agency
</title>
· What is this?
- The title tag is a key SEO element that appears in search engine results.
- In this example, the title is: “Law Firm SEO Services | Best Legal SEO Services Agency.”
· Why is it important?
- Titles should be concise, relevant, and between 10–60 characters. They influence whether users click on the link.
· Use Case:
- If the title is too long, short, or irrelevant, the script suggests changes to improve SEO.
4. Meta Description
Example:
<meta content="Explore our Law Firm SEO Services and take the lead in the digital courtroom of search engines! Outshine competitors in the legal field." name="description"/>
· What is this?
- The meta description provides a summary of the page content.
- In this example, it’s: “Explore our Law Firm SEO Services and take the lead in the digital courtroom of search engines!”
· Why is it important?
- It helps search engines and users understand the page. It should be between 50–160 characters for optimal performance.
· Use Case:
- If the description is missing or too long, the script flags it and suggests improvements.
5. Canonical Link
Example:
<link href="https://thatware.co/law-firm-seo-services/" rel="canonical"/>
· What is this?
- The canonical tag tells search engines the preferred version of the URL.
- For example, https://thatware.co/law-firm-seo-services/ is the main URL for this page.
· Why is it important?
- It prevents search engines from indexing duplicate pages and ensures the correct page ranks higher.
· Use Case:
- Missing canonical links can cause duplicate content issues, harming SEO.
6. Validation Status
Example:
Validation Status: Valid
· What is this?
- After analyzing the HTML, the script checks if the page meets SEO standards.
- “Valid” means the page has no major issues.
· Why is this step important?
- It confirms that the page is ready for deployment without any critical SEO problems.
· Use Case:
- Valid pages can be deployed directly. Invalid pages need fixes before publishing.
7. Log Updates
Example:
Log updated for URL: https://thatware.co/law-firm-seo-services/
· What is this?
- After processing each page, the script updates a log file with the results.
· Why is it important?
- It keeps track of which pages were analyzed, their validation status, and any issues.
· Use Case:
- This log can be shared with developers or clients for transparency and further action.
Summary of Key Steps
- URL Processing: Loading and analyzing each webpage.
- HTML Rendering: Displaying the actual content and metadata.
- Title and Meta Analysis: Ensuring these elements are optimized for search engines.
- Canonical Tag Check: Preventing duplicate content issues.
- Validation: Marking pages as “Valid” or flagging them for issues.
- Logging: Documenting the results for tracking and reporting.
How to Use This Output?
- Fix Issues: Review flagged issues like missing titles, long descriptions, or duplicate content and apply the suggested fixes.
- Deploy Valid Pages: Move validated pages to the live environment for improved SEO performance.
- Monitor Progress: Use the logs to track which pages were processed and their current SEO status.
4th Part: Comprehensive SEO Report Generation
Purpose: Analyzes the optimized static HTML files to generate a detailed SEO report, highlighting issues and providing actionable recommendations.
Key Features:
- Analyze Common Issues: Identifies missing or invalid elements, such as:
- <meta name="description"> tags.
- <title> tags that are too short or too long.
- <h1> tags and structured data (JSON-LD).
- Generate Reports:
- A detailed log for each file, highlighting specific issues.
- A summary report with overall statistics (e.g., total URLs, valid/invalid URLs, most common issues).
- Actionable recommendations to fix the identified issues.
Why It’s Important: The report helps track progress, measure SEO improvements, and prioritize fixes for common issues.
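Aggregating per-page results into the totals this kind of report shows (processed/valid/invalid counts and the most common issues) is a small reduction step. A sketch, assuming each per-page result is a dict with a validity flag and an issue list:

```python
# Sketch of the summary step: roll per-page results up into report totals.
from collections import Counter

def summarize(results: list) -> dict:
    """Each result is a dict like {"url": ..., "valid": bool, "issues": [...]}."""
    issues = Counter()
    for r in results:
        issues.update(r["issues"])
    valid = sum(1 for r in results if r["valid"])
    return {
        "Total URLs Processed": len(results),
        "Valid URLs": valid,
        "Invalid URLs": len(results) - valid,
        "Common Issues": issues.most_common(5),
    }
```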
Understanding the Output
SEO Report Summary
1. “Total URLs Processed: 189”
· What this means:
- The system analyzed 189 URLs provided in the input, which likely came from a sitemap or a list of web pages.
- These URLs represent pages that were checked to see if they meet SEO standards.
· Use Case:
- This tells you the total number of pages under consideration for improving their search engine visibility.
2. “Valid URLs: 97”
· What this means:
- Out of 189 URLs, 97 pages were marked as ‘Valid’, meaning these pages passed the SEO checks and are considered optimized.
- These pages have the required metadata, structure, and other SEO components.
· Use Case:
- You can deploy these pages confidently without additional SEO changes. They are already optimized for search engines.
3. “Invalid URLs: 92”
· What this means:
- 92 URLs failed the SEO validation checks.
- These pages might have issues like:
- Missing important metadata (e.g., titles, meta descriptions, structured data).
- Problems rendering the content (e.g., server issues, slow loading, or dynamic content not loading properly).
- Incomplete or improper HTML structure.
· Real-World Cause:
- As you mentioned, some of these URLs failed because the server didn’t respond or load the content during the rendering process in Part 3 of the code.
- This can happen due to:
- Server issues: Temporary downtime or unresponsiveness.
- Dynamic content: Pages rely on JavaScript to load content, but the script couldn’t fully render them.
· Use Case:
- These pages require attention to fix the specific issues preventing them from being valid.
- If the issue is server-related, you can reprocess these URLs when the server is functioning properly.
4. “Common Issues: Missing structured data (JSON-LD): 97 occurrences”
· What this means:
- Out of all the URLs, 97 pages are missing structured data in JSON-LD format.
- Structured data is a special format of metadata that helps search engines understand the content of a page better.
· Why this matters:
- Structured data can enable features like:
- Rich snippets (e.g., star ratings, product prices) in search results.
- Better visibility on Google.
- Improved indexing and ranking by search engines.
· Use Case:
- You should prioritize adding structured data to these pages to boost their SEO performance.
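The summary above can also be triaged programmatically. Below is a minimal sketch, assuming the report is saved as a JSON-like dictionary with `valid_urls`, `invalid_urls`, and `common_issues` keys — the actual field names your pipeline writes may differ:

```python
# Sketch of triaging the SEO report; the dict keys here are assumptions —
# match them to the JSON your pipeline actually writes.
def triage(report):
    """Return (deploy_ready, needs_fixing, top_issue) from a report dict."""
    issues = sorted(report["common_issues"].items(),
                    key=lambda kv: kv[1], reverse=True)
    top_issue = issues[0][0] if issues else None
    return report["valid_urls"], report["invalid_urls"], top_issue

# Toy sample mirroring the numbers discussed above:
sample = {
    "total_urls": 189,
    "valid_urls": ["https://example.com/ok"],
    "invalid_urls": ["https://example.com/broken"],
    "common_issues": {"Missing structured data (JSON-LD)": 97},
}
ready, fix, issue = triage(sample)
print(f"{len(ready)} ready, {len(fix)} to fix; top issue: {issue}")
```

Sorting issues by occurrence count puts the most common problem (here, missing JSON-LD) at the top of the fix list.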
Recommendations:
1. Add Structured Data for Richer Search Engine Results
· What this recommendation means:
- You should add JSON-LD structured data to your web pages. JSON-LD is a format for describing your page content (like products, reviews, or articles) in a way that search engines can understand.
· Why it is important:
- Improves visibility: Structured data helps search engines like Google display rich information (like FAQs, ratings, and reviews) directly on the search results page.
- Boosts SEO ranking: Pages with structured data often rank higher because search engines can understand them better.
- Increases click-through rates: Rich snippets attract more clicks because they provide additional information at a glance.
Explaining JSON-LD and Why It’s Important
What is JSON-LD?
- JSON-LD stands for JavaScript Object Notation for Linked Data.
- It’s a simple, lightweight way to describe the structure of your content in a format search engines understand.
How JSON-LD Works:
- It adds context to your webpage by describing key elements, such as:
- Title: What the page is about.
- Description: A summary of the content.
- Author: Who created the page.
- Image: Visual content on the page.
- Type of content: Whether the page is an article, product, recipe, etc.
Example of JSON-LD for a Web Page:
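As an illustrative sketch (all values below are placeholders, not real page data), an Article-type snippet can be built and embedded like this:

```python
import json

# Minimal illustrative JSON-LD for a web page; every value is a placeholder.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Google Title Tag Update: Ways To Ignore CTR Downfall",
    "description": "A guide to fixing CTR drops caused by rewritten page titles.",
    "author": {"@type": "Organization", "name": "ThatWare"},
    "image": "https://example.com/cover.png",
}

# Embed it in the page <head> as a script tag search engines can read:
snippet = f'<script type="application/ld+json">{json.dumps(article)}</script>'
print(snippet)
```

The `@context` and `@type` keys tell search engines which schema.org vocabulary and content type to apply; the remaining keys map directly to the elements listed above (title, description, author, image).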
Benefits of Adding JSON-LD:
- Enables Rich Snippets:
- Displays additional information (like FAQs or product ratings) directly in search results.
- Improves Understanding:
- Search engines understand your content better and index it more accurately.
- Boosts Search Rankings:
- Pages with structured data are often favored in search engine rankings.
- Supports Voice Search:
- Structured data helps voice assistants like Alexa or Google Assistant retrieve accurate answers from your website.
What Should You Do Next?
1. Fix Invalid URLs:
- Re-run the rendering process for the invalid URLs (especially the ones affected by server issues) when the server is stable.
- Analyze why these pages failed and fix any technical or SEO-related issues.
2. Add Structured Data:
- For the 97 pages missing structured data, implement JSON-LD according to the type of content (e.g., articles, products, FAQs).
- Use tools like Google’s Structured Data Markup Helper to generate JSON-LD.
3. Reprocess and Validate:
- After making the fixes, reprocess the URLs to check if the structured data and other improvements are recognized.
4. Share Recommendations with Developers:
- Provide the JSON report and this explanation to your developers or SEO team for implementation.
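Step 1 above (re-running the invalid URLs) amounts to a retry loop. A rough sketch — `render_url` here is a hypothetical stand-in for the actual rendering function from Part 3:

```python
import time

def reprocess(invalid_urls, render_url, retries=3, delay=2.0):
    """Retry rendering each previously-invalid URL; return URLs that still fail."""
    still_failing = []
    for url in invalid_urls:
        for attempt in range(retries):
            try:
                render_url(url)  # your actual rendering function from Part 3
                break
            except Exception:
                time.sleep(delay)  # give a flaky server time to recover
        else:
            # for-else: no attempt succeeded (loop never hit `break`)
            still_failing.append(url)
    return still_failing
```

Usage: `reprocess(invalid_urls, render_url)` returns the URLs that still need manual attention after retries, which is exactly the list to hand to your developers.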
Why Is This Output Useful?
1. Guides Prioritization:
- Shows which pages are ready to deploy and which need fixing.
- Highlights the most common issues to focus on.
2. Improves SEO Results:
- Following the recommendations (like adding JSON-LD) ensures better performance in search engines.
3. Tracks Progress:
- The saved JSON report helps track which issues were resolved and which remain outstanding.
5th Part: Preparing Static Files for Deployment
Purpose: Filters and prepares validated static HTML files for deployment, ensuring only optimized files are deployed to the server.
Key Features:
- Filter Valid Files: Only deploys files marked as “Valid” in the rendering log.
- Deployment Directory Setup: Copies validated files to a separate deployment directory for easy management.
- Log Deployment Summary: Provides a summary of the number of files deployed and skipped.
Why It’s Important: This part ensures that only high-quality, SEO-optimized files are served to search engine bots, enhancing crawlability and indexability.
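The three features above boil down to a small filter-and-copy step. A minimal sketch, assuming the rendering log maps filenames to a status string (adapt the names and log format to your own pipeline):

```python
import shutil
from pathlib import Path

def deploy_valid_files(render_log, source_dir, deploy_dir):
    """Copy only files marked 'Valid' in the log; return (deployed, skipped)."""
    src, dst = Path(source_dir), Path(deploy_dir)
    dst.mkdir(parents=True, exist_ok=True)  # create the deployment directory
    deployed, skipped = 0, 0
    for filename, status in render_log.items():
        if status == "Valid" and (src / filename).exists():
            shutil.copy2(src / filename, dst / filename)
            deployed += 1
        else:
            skipped += 1
    # Log a deployment summary in the same style as the report output:
    print(f"Total Files Deployed: {deployed}")
    print(f"Total Files Skipped: {skipped}")
    return deployed, skipped
```

Keeping deployment as a separate copy step means the rendered originals stay untouched, so a failed deployment can simply be re-run.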
Understanding the Deployment Summary Output
Deployment Output
Deployed Files
· What this means:
- The list of deployed files (e.g., -3386425097208935956.html) represents the HTML pages that were successfully prepared for deployment.
- These files are now available in the deployment directory at:
/content/drive/MyDrive/Dataset For Dynamic Rendering for JavaScript SEO/deployment
- Each file corresponds to a unique webpage (e.g., a blog post, service page, or product page) from your website.
· Why these files were deployed:
- These files passed the SEO checks and validation processes in earlier steps.
- They were deemed ready to be deployed because they:
- Contain necessary metadata like <title>, <meta description>, and <canonical link>.
- Rendered successfully without any errors.
· Use Case:
- These deployed files are ready to be served to search engines for indexing.
- They are optimized and can improve your website’s visibility on Google and other search engines.
“Static HTML files prepared for deployment in: /content/drive/MyDrive/Dataset For Dynamic Rendering for JavaScript SEO/deployment”
· What this means:
- This is the directory where all the successfully validated and optimized HTML files were saved after rendering.
- Think of it as the final storage location for your web pages that will be uploaded to a server.
· Use Case:
- You can take these files and upload them to your live website or a Content Delivery Network (CDN).
- They are now ready to be accessed by visitors or crawled by search engines.
Understanding the Importance of “Successfully Deployed Pages”
Let’s break this down in simple terms to understand the significance of these 97 successfully deployed pages.
What Are These Deployed Pages?
- These are web pages from your website that:
- Passed all validation checks during processing.
- Were optimized for search engine friendliness.
- Are now ready for deployment (publishing or serving live on your website or a server).
- Contain critical elements like a properly formatted <title>, <meta description>, and <canonical link>.
Why Are These Pages Important?
These 97 pages are the heart of your project. Here’s why:
1. SEO-Optimized Pages Help You Rank on Google
- These pages are tailored to meet search engine requirements (like Google’s SEO standards).
- They include critical metadata (like a well-written <title> and <meta description>), which helps search engines understand:
- What your page is about.
- Whether it’s relevant to a user’s query.
- Better understanding by search engines = higher chances of ranking on Google.
2. Improves User Experience (UX)
- These pages are structured to load quickly and display relevant content.
- They are optimized to ensure the user sees the information they need (like product descriptions, blog articles, or service details) without confusion.
- Happier users are more likely to stay on your site longer.
3. Foundation for Website Growth
- These pages act as a base for your website’s online presence.
- You can expand and build new pages using the same optimized structure.
4. Prepared for Search Engine Indexing
- These files are ready to be crawled and indexed by search engines.
- Once indexed, they’ll appear in search results for relevant keywords.
- This leads to organic traffic—people finding your website for free by searching.
Where and How to Use These Pages?
1. Host Them on Your Website
- These pages are meant to be uploaded to your live server or Content Management System (CMS).
- Examples:
- If your website is hosted on platforms like WordPress, you can integrate these pages into the site.
- If you use a custom server, upload them to the corresponding directories.
2. Deploy Them on a CDN (Content Delivery Network)
- A CDN helps load pages faster for users across the world.
- Example: Use platforms like Cloudflare or AWS CloudFront to serve these pages to users globally, improving their experience.
3. Submit Them to Search Engines
- Use tools like Google Search Console to submit these pages’ URLs.
- Once submitted:
- Google crawls and indexes the pages.
- Indexed pages appear in search results, driving organic traffic to your website.
4. Use Them in Marketing Campaigns
- Share the deployed pages as part of your advertising or social media campaigns.
- Example:
- A service page (e.g., law-firm-seo-services.html) can be used in email campaigns targeting law firms looking for SEO solutions.
- Product pages can be shared on platforms like LinkedIn or Twitter to attract attention.
The Benefits of These Successfully Deployed Pages
1. Boosts Online Visibility:
- These pages are like billboards on Google’s highway—they’ll attract more visitors to your site.
2. Increases Traffic to Your Website:
- When these pages rank higher in search results, more people click on them, leading to increased website traffic.
3. Enhances Credibility:
- Optimized pages make your website look professional and trustworthy.
- Visitors are more likely to convert into customers.
4. Saves Time and Effort:
- Since these pages are validated and ready to go, you don’t have to spend additional time fixing issues.
5. Drives Conversions and Revenue:
- High-quality, well-structured pages can convert visitors into paying customers by providing relevant information quickly.
How These Pages Relate to the Entire Project
· Purpose of Writing All That Code:
- The primary goal of the project was to take raw web pages (some incomplete or poorly formatted) and turn them into high-quality, SEO-ready pages.
- Every step of the process—from reading URLs in a sitemap to rendering, validating, and deploying—was aimed at achieving this outcome.
· Final Outcome:
- These 97 deployed pages are the result of all the hard work.
- They represent the optimized and error-free content that search engines and users both love.
What to Do Next With These Pages?
Step 1: Upload to Your Server
- Use an FTP client or your hosting provider’s control panel to upload these HTML files.
- Ensure they are accessible via your website’s URLs.
Step 2: Test Live Pages
- After uploading, open the URLs in a browser to ensure everything works correctly.
- Use tools like Google’s Mobile-Friendly Test to confirm the pages display properly on all devices.
Step 3: Submit to Google
- Log in to Google Search Console.
- Add your website’s sitemap containing these deployed pages.
- This will notify Google to crawl and index your site.
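Submitting the deployed pages is easiest via a sitemap. A minimal sketch that builds one with the standard library (the URL list is a placeholder — feed it your deployed pages' live URLs):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return sitemap XML (sitemaps.org protocol) listing the given page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["https://thatware.co/google-page-title-update/"])
print(xml)
```

Save the output as `sitemap.xml` at your site root and submit that URL in Google Search Console.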
Step 4: Monitor Performance
- Use analytics tools like Google Analytics to track:
- Traffic coming to these pages.
- User behavior (time spent on page, bounce rates, etc.).
Step 5: Iterate and Improve
- Use insights from the deployed pages to:
- Fix skipped files (189 skipped pages).
- Create new pages following the same optimization standards.
Final Thoughts
The 97 successfully deployed pages are the backbone of your website’s SEO efforts. They are now ready to help you:
- Attract more visitors.
- Improve your search rankings.
- Drive business growth.
Treat these pages as a milestone—a tangible outcome of the project’s success. Celebrate this achievement while working on improving the skipped files to ensure even more success in the future!
“Total Files Skipped: 189”
· What this means:
- 189 files were skipped because they didn’t pass the validation checks.
- Reasons files might be skipped:
- Rendering issues:
- Some files couldn’t be rendered properly, often due to server issues or dynamic content failing to load.
- Missing metadata:
- These pages lacked key SEO elements like <meta description>, <title>, or <canonical link>.
- Errors in structure:
- The HTML structure was incomplete or contained errors.
- Server or connection problems:
- The page didn’t load because the server was temporarily down or slow.
· Why this happened:
- Many of these files were skipped because the server didn’t respond properly or the content couldn’t be loaded dynamically during the rendering phase (3rd part of the process).
- When the server doesn’t provide a response, the rendering tool skips those files.
· Use Case:
- You need to review these skipped files, identify the specific issues, and fix them. For server-related issues, reprocess these URLs when the server is stable.
What To Do Next?
1. Deploy the Valid Files (97 Files)
- These files are ready and can be uploaded to your web server.
- They will improve your website’s SEO because they are optimized.
2. Review and Fix Skipped Files (189 Files)
- Go through the logs or errors associated with these files to find out why they were skipped.
- Common reasons to check:
- Server connection issues: Ensure the server is responding.
- Missing metadata: Add <title>, <meta description>, and other SEO elements.
- Rendering issues: If JavaScript-heavy pages failed to load, debug them.
3. Reprocess the Skipped URLs
- After fixing the skipped files, rerun the rendering and validation process to generate optimized HTML for those pages.
4. Implement Recommendations for Skipped Files
- Focus on adding structured data (JSON-LD) to all pages, including the skipped ones.
- Use Google’s Structured Data Testing Tool or other validators to check your JSON-LD implementation.
JSON-LD and Recommendations
What is JSON-LD?
- JSON-LD stands for JavaScript Object Notation for Linked Data.
- It’s a way of describing the content of your webpage in a format search engines can easily understand.
Why Add JSON-LD?
- Richer Search Results:
- Pages with JSON-LD appear more appealing in Google search results, often showing rich snippets (e.g., star ratings, product details, FAQs).
- Better Search Engine Understanding:
- JSON-LD helps search engines understand the page’s purpose and structure better.
- Higher Click-Through Rates:
- Rich snippets lead to more clicks because they provide additional information upfront.
- Improved Rankings:
- Structured data can indirectly improve your rankings by increasing click-through rates and engagement.
Example Use Case of JSON-LD
For example, if a page provides information about a product, you can use JSON-LD to describe it:
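A sketch of such a Product snippet (every value below is an illustrative placeholder, not real pricing or rating data):

```python
import json

# Illustrative Product JSON-LD; all values are placeholders for demonstration.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Law Firm SEO Services",
    "description": "SEO packages tailored for law firms.",
    "offers": {"@type": "Offer", "priceCurrency": "USD", "price": "499.00"},
    "aggregateRating": {"@type": "AggregateRating",
                        "ratingValue": "4.8", "reviewCount": "27"},
}
print(json.dumps(product, indent=2))
```

The `offers` and `aggregateRating` objects are what enable price and star-rating rich snippets in search results.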
Summary of the Run
1. What Was Achieved:
- 97 pages were successfully optimized and are ready to improve your website’s performance on Google.
- These pages are “SEO-ready” and can be published live.
2. What Needs Attention:
- 189 pages were skipped because of errors, missing data, or server issues.
- Fix these pages to ensure they meet the same quality as the deployed ones.
3. Why JSON-LD Matters:
- Adding structured data (JSON-LD) will help Google better understand your pages and display them attractively in search results.
- This can result in higher rankings, more clicks, and better engagement.
Dynamic Rendering for JavaScript SEO Model
What is Dynamic Rendering for JavaScript SEO?
1. Problem Statement:
- Many modern websites use JavaScript frameworks (e.g., Angular, React, Vue.js).
- Search engines, like Google, sometimes struggle to fully process JavaScript-heavy websites. This can lead to:
- Pages not being indexed correctly.
- Content not appearing in search results.
2. Dynamic Rendering Solution:
- Dynamic rendering creates two different versions of your website:
- A search-engine-friendly version (fully loaded HTML) optimized for bots.
- The JavaScript-heavy version intended for real users.
3. Benefit:
- Search engines see a complete, ready-to-index version of your page, improving SEO visibility.
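The two-version idea reduces to a user-agent check at the edge. A minimal sketch — the bot list is partial and illustrative, and production setups usually also verify crawler IP ranges rather than trusting the user-agent string alone:

```python
# Partial, illustrative list of crawler user-agent substrings:
BOT_SIGNATURES = ("googlebot", "bingbot", "duckduckbot", "baiduspider", "yandex")

def wants_prerendered(user_agent):
    """True if the request looks like a search-engine crawler."""
    ua = (user_agent or "").lower()
    return any(bot in ua for bot in BOT_SIGNATURES)

def choose_response(user_agent):
    # Bots get the static, fully rendered HTML; humans get the JS app shell.
    # (File paths here are hypothetical.)
    return "static/prerendered.html" if wants_prerendered(user_agent) else "app/index.html"

print(choose_response("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # static/prerendered.html
```

Because both versions carry the same content, this is treated as dynamic rendering rather than cloaking.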
What is This Output?
The output is a report generated during the dynamic rendering process. It shows the status of each page as it is processed and optimized for search engines. This ensures that:
- Pages are being properly rendered.
- SEO-critical elements (like titles, descriptions, and canonical tags) are present and correct.
Detailed Breakdown of the Output
1. Processing URL
- Example: Processing URL: https://thatware.co/google-page-title-update/
- What it means:
- This shows the page being analyzed for SEO optimization.
- It confirms that the system is reviewing this page to ensure it works for search engines.
- Why it’s important:
- Every critical page on your website must be reviewed to ensure search engines can process them without issues.
- Action for Website Owners:
- Ensure all important pages (like product, service, and blog pages) are being processed.
2. Rendering URL
- Example: Rendering: https://thatware.co/google-page-title-update/
- What it means:
- This confirms that the dynamic rendering system has loaded the page and generated a search-engine-friendly version.
- Why it’s important:
- Rendering ensures that search engines see the fully loaded content of the page, not just incomplete JavaScript.
- Action for Website Owners:
- Verify that the rendered version contains all the necessary information (titles, descriptions, and content).
3. Optimized HTML Preview
This section contains the actual HTML code that search engines will see after the page is dynamically rendered. Let’s break down the key parts:
a. <title> Tag
- Example: <title>Google Title Tag Update: Ways To Ignore CTR Downfall – ThatWare</title>
- What it means:
- The <title> tag is the title of your page, which appears on search engine results pages (SERPs).
- Why it’s important:
- A well-crafted title improves your click-through rate (CTR).
- Action for Website Owners:
- Make sure the title is concise, includes keywords, and describes the page content effectively.
b. <meta> Tags
- Example:
<meta content="Google title tag update – Google start giving priority to the H1 or H2 as a webpage’s title which causes CTR drop. Read on the guide to fix the drop." name="description"/>
- What it means:
- Meta descriptions provide a brief summary of the page’s content. They show up under the title in search results.
- Why it’s important:
- A compelling meta description encourages users to click on your page.
- Action for Website Owners:
- Write unique, engaging meta descriptions for every page.
c. <link> Tags
- Example:
<link href="https://thatware.co/google-page-title-update/" rel="canonical"/>
- What it means:
- Canonical tags indicate the preferred version of a page. This prevents duplicate content issues.
- Why it’s important:
- Duplicate content can hurt SEO rankings. Canonical tags help search engines understand which page to prioritize.
- Action for Website Owners:
- Ensure every page has a correct canonical tag pointing to the main URL.
4. Validation Status
- Example: Validation Status: Valid
- What it means:
- This confirms that the page has been successfully optimized for search engines.
- Why it’s important:
- A valid status ensures there are no technical SEO issues.
- Action for Website Owners:
- If the status is invalid, investigate the errors (e.g., missing titles, broken links) and fix them.
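A "Valid" check like this can be approximated with the standard library. The sketch below looks only for the three tags discussed above (a real validator checks far more, e.g. structured data and rendering errors):

```python
from html.parser import HTMLParser

class SEOCheck(HTMLParser):
    """Collects whether a rendered page contains the SEO-critical tags."""
    def __init__(self):
        super().__init__()
        self.has_title = False
        self.has_description = False
        self.has_canonical = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self.has_title = True
        elif tag == "meta" and a.get("name") == "description":
            self.has_description = True
        elif tag == "link" and a.get("rel") == "canonical":
            self.has_canonical = True

def validation_status(html):
    """Return 'Valid' only if title, meta description, and canonical are present."""
    c = SEOCheck()
    c.feed(html)
    return "Valid" if (c.has_title and c.has_description and c.has_canonical) else "Invalid"
```

Usage: run `validation_status(rendered_html)` on each rendered page and log the result, mirroring the "Validation Status" lines in the report.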
How is This Output Beneficial?
1. Improves Search Engine Visibility:
- Search engines can correctly interpret and index your pages, leading to better rankings.
2. Fixes JavaScript Rendering Issues:
- Ensures that JavaScript-heavy websites don’t lose visibility due to rendering problems.
3. Boosts CTR:
- Optimized titles and meta descriptions make your page more attractive on search results.
4. Avoids Duplicate Content Issues:
- Canonical tags prevent SEO penalties for duplicate pages.
Steps for Website Owners After Receiving This Output
1. Review the Rendered HTML
- Check the <title> and <meta> tags:
- Are they relevant to the page content?
- Do they include keywords and entice users to click?
2. Fix Missing Data
- Ensure:
- Every page has a title, description, and canonical tag.
- Headings (H1, H2) are clear and descriptive.
3. Optimize Page Speed
- Use tools like Google PageSpeed Insights to ensure the rendered version loads quickly.
4. Validate URLs
- Verify that all critical pages are being rendered and indexed correctly.
5. Monitor for Errors
- Look for issues like:
- Error processing: ‘loading’
- Fix these by checking server response times or JavaScript dependencies.
Conclusion
This output is a roadmap for ensuring your website is SEO-friendly and optimized for search engines. By using dynamic rendering:
- Search engines can better understand your website.
- You can avoid technical issues with JavaScript.
- You improve your visibility, rankings, and traffic.
Understanding the Output of Dynamic Rendering for JavaScript SEO
This output provides a detailed report on how pages are processed, validated, and optimized using the Dynamic Rendering for JavaScript SEO Model.
What is the Purpose of Dynamic Rendering?
Dynamic Rendering is a process used to help search engines understand and rank websites that rely heavily on JavaScript. Sometimes, search engines like Google can’t read or process JavaScript correctly, which can hurt a website’s SEO. Dynamic Rendering solves this by:
- Generating static HTML versions of pages for search engines.
- Ensuring all important SEO elements (titles, descriptions, structured data, etc.) are included.
This is critical for improving a website’s visibility in search engine results.
What Does This Output Mean?
The output is a detailed log of:
- How many pages were analyzed.
- Which pages were successfully processed and optimized.
- Common issues found and steps recommended for improvement.
- Deployment details of optimized HTML files.
Let’s break it down part by part.
1. Logs of URLs Processed
· Example:
Total URLs Processed: 152
Valid URLs: 75
Invalid URLs: 77
· Meaning:
- 152 pages on your website were analyzed by the system.
- 75 of those pages were successfully processed and optimized.
- 77 pages had errors or issues that need attention.
· Action Steps:
- Focus on fixing the invalid URLs. These may not be visible or indexed properly by search engines.
- Check the error logs for more details about what went wrong.
2. Common Issues
· Example:
– Missing structured data (JSON-LD): 75 occurrences
· Meaning:
- Structured data is additional information in your website’s code that helps search engines understand your content better.
- Missing structured data means your website might not show rich results (e.g., product ratings, FAQs, images) in search results.
· Why It’s Important:
- Without structured data, your website may not perform as well in search rankings. For example, your competitors might have eye-catching product listings, while your pages appear plain.
· Action Steps:
1. Add structured data to the pages, using tools like Google’s Structured Data Markup Helper, or ask a developer to add JSON-LD.
2. Validate structured data using the Rich Results Test Tool.
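Before reaching for an online validator, you can sanity-check the JSON-LD locally. A rough sketch — the regex extraction is a simplification that works for typical single-page checks but is not a full HTML parser:

```python
import json
import re

# Matches <script type="application/ld+json"> blocks (simplified extraction).
LD_RE = re.compile(
    r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>',
    re.DOTALL | re.IGNORECASE,
)

def extract_jsonld(html):
    """Return parsed JSON-LD objects found in the page, skipping malformed blocks."""
    found = []
    for raw in LD_RE.findall(html):
        try:
            found.append(json.loads(raw))
        except json.JSONDecodeError:
            pass  # malformed block: a candidate for manual review
    return found
```

Pages where `extract_jsonld` returns an empty list are the ones to flag as "Missing structured data (JSON-LD)".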
3. Validation Status
· Example:
Validation Status: Valid
· Meaning:
- A valid status means the page is successfully optimized for search engines.
- If the status is invalid, it means something is broken (e.g., missing titles, broken links, or incomplete HTML).
· Action Steps:
- Review invalid pages for missing SEO elements (titles, descriptions, structured data).
- Fix errors and re-run the rendering process to ensure all pages are validated.
4. Deployment Details
· Example:
Deployed: 7965246990520139191.html
Total Files Deployed: 75
Total Files Skipped: 211
· Meaning:
- 75 files were successfully created and deployed as static HTML versions for search engines.
- 211 files were skipped because they might not need rendering or had errors.
· Why It’s Important:
- Deployment ensures that optimized HTML files are ready for search engines.
- Skipped files may mean missed opportunities for SEO improvement.
· Action Steps:
1. Verify the deployed files:
- Check if all important pages (home, products, blogs, etc.) were deployed.
2. Investigate skipped files:
- Identify why they were skipped (e.g., errors, unnecessary files).
5. Recommendations
· Example:
Recommendations:
– Add structured data for richer search engine results.
· Meaning:
- The system identified specific improvements to make your website more SEO-friendly.
· Action Steps:
- Add structured data to ensure your website shows rich results like FAQs, star ratings, or events in search engines.
- Prioritize adding structured data to the most important pages first (e.g., homepage, product pages).
Why is This Output Beneficial?
1. Improves Search Engine Ranking:
- By optimizing all pages and resolving errors, your website becomes more search-engine-friendly.
2. Fixes JavaScript Rendering Issues:
- Ensures that search engines can fully understand JavaScript-heavy websites.
3. Enhances Search Appearance:
- Adding structured data makes your pages more attractive in search results.
4. Identifies and Resolves Errors:
- Invalid pages and missing elements are flagged, giving you a clear roadmap to fix them.
5. Prepares Static Files for SEO:
- Optimized HTML files are ready to be indexed by search engines, reducing the risk of poor rankings.
Final Steps for Website Owners
Here’s a clear, step-by-step guide:
1. Review All Processed URLs:
- Check the list of valid and invalid URLs.
- Focus on fixing invalid pages first.
2. Add Missing Structured Data:
- Prioritize pages flagged for missing structured data.
- Use JSON-LD to add rich snippets for:
- Products
- Articles
- FAQs
- Reviews
3. Fix Validation Errors:
- Review errors like:
- Missing <title> or <meta> tags.
- Incorrect canonical URLs.
- Update your pages to fix these issues.
4. Optimize Skipped Pages:
- Investigate skipped files to understand why they were not processed.
- Re-run the rendering tool after fixing issues.
5. Monitor Performance:
- Use tools like Google Search Console to check indexing and performance.
- Validate structured data with Google’s Rich Results Test Tool.
Conclusion
This output is a roadmap for improving your website’s SEO. It highlights:
- Which pages need optimization.
- What errors need fixing.
- How to enhance your website’s visibility in search results.
By following these steps, you’ll ensure your website is search-engine-friendly, ranks higher, and attracts more traffic.
Thatware | Founder & CEO
Tuhin is recognized across the globe for his vision to revolutionize the digital transformation industry with the help of cutting-edge technology. He won bronze for India at the Stevie Awards USA, won the India Business Awards and the India Technology Award, was named among the Top 100 influential tech leaders by Analytics Insight and a Clutch Global Frontrunner in digital marketing, founded the fastest-growing company in Asia according to The CEO Magazine, and is a TEDx and BrightonSEO speaker.