JavaScript (JS) is very popular in the ecommerce world because it helps create a seamless and user-friendly experience for shoppers.
Take, for example, loading items on a category page, or dynamically updating products on a site using JS.
While this is great news for ecommerce sites, JavaScript poses some challenges for SEO experts.
Google is consistently working to improve its search engine, and a big part of that effort goes into ensuring its crawlers can access JavaScript content.
However, ensuring that Google crawls a JS site smoothly is not easy.
In this post, I’ll share everything you need to know about JS SEO for ecommerce and how you can improve your organic performance.
How JavaScript Works For Ecommerce Sites
When building an ecommerce site, developers use HTML for content and organization, CSS for design, and JavaScript for interaction with backend servers.
JavaScript plays three important roles in an ecommerce website.
1. Adding Interactivity To A Web Page
The purpose of adding interactivity is to allow users to see changes based on their actions, such as scrolling or filling out a form.
For example: the product image changes when the buyer hovers the mouse over it. Or hovering the mouse makes the image rotate 360 degrees, allowing shoppers to get a better view of the product.
All of these improve user experience (UX) and help shoppers decide on their purchase.
JavaScript adds such interactivity to sites, enabling marketers to engage visitors and drive sales.
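As a minimal sketch of that hover behavior in plain JavaScript (the element ID and image paths are assumptions, not from any real store):

// Hypothetical sketch: swap a product image on hover
const photo = document.getElementById('product-photo'); // assumed element ID
photo.addEventListener('mouseover', () => { photo.src = '/img/shoe-side.jpg'; });  // alternate view
photo.addEventListener('mouseout', () => { photo.src = '/img/shoe-front.jpg'; });  // default view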
2. Connecting To Backend Servers
JavaScript enables better backend integration using AJAX (Asynchronous JavaScript and XML).
This allows web applications to send and retrieve data from the server asynchronously – that is, without interfering with the appearance or behavior of the current page.
Without it, a visitor who wants updated content has to wait for the server to respond with an entirely new page. That wait is annoying and can cause buyers to leave the site.
So JavaScript enables backend-enabled dynamic interactions – such as updating an item and seeing it updated in the cart – right away.
Similarly, it powers the ability to drag and drop elements on a web page.
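As a rough sketch of that kind of asynchronous update, the snippet below uses fetch() to update a cart item without reloading the page. The /cart/update endpoint, payload shape, and element ID are assumptions for illustration, not a real store's API:

// Hypothetical sketch: update a cart item without a page reload
async function updateCartItem(itemId, quantity) {
  const response = await fetch('/cart/update', {   // assumed endpoint
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ itemId, quantity }),
  });
  const cart = await response.json();
  // Re-render only the cart badge; the rest of the page is untouched
  document.getElementById('cart-count').textContent = cart.totalItems;
}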
3. Web Tracking And Analytics
JavaScript powers real-time pageview tracking and heatmaps that show how far people scroll through and read your content.
For example, it can tell you where their mouse is or what they clicked on (click tracking).
This is how JS supports tracking user behavior and interactions on web pages.
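As a hedged illustration, a basic click tracker might look like the sketch below; the /analytics/event endpoint is an assumption, not a real analytics API:

// Hypothetical sketch: record clicks on links and buttons
document.addEventListener('click', (event) => {
  const target = event.target.closest('a, button');
  if (!target) return;
  // sendBeacon survives page unloads, so outbound clicks are still recorded
  navigator.sendBeacon('/analytics/event', JSON.stringify({
    type: 'click',
    label: target.textContent.trim(),
    path: window.location.pathname,
  }));
});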
How Do Search Bots Process JS?
Google processes JS in three stages: crawling, rendering, and indexing.
Image from Google Search Center, September 2022
As you can see in this image, Googlebot queues pages for crawling and rendering, scanning each one to discover new content.
When Googlebot retrieves a URL from the crawl queue, it first checks your robots.txt file to confirm you have allowed Google to crawl the page.
If the page is disallowed, the bot skips it and sends no HTTP request.
In the second stage, rendering, HTML, CSS, and JavaScript files are processed and converted into a format that can be easily indexed by Google.
In the final stage, indexing, the rendered content is added to Google’s index, allowing it to appear in the SERPs.
Common JavaScript SEO Challenges With Ecommerce Sites
Crawling a JavaScript site is much more complex than crawling a traditional HTML site, where the process is considerably faster.
So, with JS-rich ecommerce sites, Google finds it difficult to index content or find links before the page is rendered.
In fact, in a webinar on how to migrate a website to JavaScript, Sofiia Vatulyak, a renowned JS SEO expert, shared:
“While JavaScript offers some useful and resource-saving features for web servers, not all search engines can process them. Google takes time to render and index JS pages. So implementing JS while upholding SEO is challenging.”
Here are the top JS SEO challenges that ecommerce marketers should be aware of.
Limited Crawl Budget
Ecommerce websites often have a large (and growing!) volume of pages that are not well organized.
These sites have extensive crawl budget requirements, and in the case of JS websites, the crawling process is lengthy.
Also, outdated content, such as orphaned and zombie pages, can lead to a huge waste of crawl budget.
Limited Render Budget
As previously mentioned, to be able to see the content loaded by JS in the browser, the search bot must render it. But rendering at scale demands computational time and resources.
In other words, like a crawl budget, every website has a rendering budget. If that budget is spent, bots will walk away, delaying content discovery and consuming additional resources.
Google renders JS content in the second round of indexing.
It’s important to display your content in HTML, allowing Google to access it.
Image from Google Search Center, September 2022
Open your page in the browser, right-click to Inspect the rendered code, and look for your key content. If you can’t find it there, search engines will have a hard time accessing it.
Troubleshooting Issues For JavaScript Websites Is Tough
Most JS websites face crawling and fetching issues.
For example, JS content limits the bot’s ability to navigate the page. This affects its indexability.
Similarly, bots cannot know the context of the content on a JS page, thus limiting their ability to rank pages for certain keywords.
Such problems make it difficult for ecommerce marketers to determine the rendering status of their web pages.
In such cases, using an advanced crawler or log analyzer can help.
Tools like Semrush Log File Analyzer, Google Search Console Crawl Stats, and JetOctopus, among others, offer a complete log management solution, enabling webmasters to better understand how search bots interact with web pages.
JetOctopus, for example, has JS rendering functionality.
Check out this GIF showing how this tool sees a JS page as a Google bot.
Screenshot from JetOctopus, September 2022
Similarly, Google Search Console’s Crawl Statistics shares a useful overview of your site’s crawl performance.
Screenshot from Google Search Console Crawl Stats, September 2022
The crawl statistics are sorted by response, by file type, by purpose (discovery or refresh), and by Googlebot type.
Client-Side Rendering By Default
Ecommerce sites built in JS frameworks such as React, Angular, or Vue are, by default, set to client-side rendering (CSR).
With this setting, bots won’t be able to see what’s on the page, causing rendering and indexing issues.
Large And Unoptimized JS Files
Large, unoptimized JS files delay the loading of important website resources. This has a negative impact on UX and SEO.
Top Optimization Tactics For JavaScript Ecommerce Sites
1. Check If Your JavaScript Has SEO Issues
Here are three quick tests to run on your site’s different page templates: the homepage, category or product listing pages, product pages, blog pages, and any additional key pages.
Access the URL Inspection report in your Google Search Console.
Screenshot from Google Search Console, September 2022
Next, hit View Tested Page and open the Screenshot tab. If you see that this section is blank (as in this screenshot), Google is having problems rendering the page.
Screenshot from Google Search Console, September 2022
Repeat this step for all relevant ecommerce page templates shared earlier.
Running a site search will help you determine if a URL is in the Google index.
First, check the noindex and canonical tags. You want to make sure that your canonical tag is self-referencing and that there are no noindex tags on the page.
Next, go to Google Search and enter: site:yourdomain.com inurl:yoururl
Screenshot of a search for [Site: target.com inurl:], Google, September 2022
This screenshot shows that Target’s “About Us” page is indexed by Google.
If there’s a problem with your site’s JS, you either won’t see this result at all, or you’ll get a similar result in which Google has no meta information or anything readable.
Screenshot of a search for [Site:made.com inurl:hallway], Google, September 2022
Screenshot from search [Site:made.com inurl:homewares], Google, September 2022
Sometimes, Google can index a page, but the content is unreadable. This final test will help you assess whether Google can read your content.
Copy a snippet of content from each of your page templates and search for it on Google to see what comes up.
Let’s take some content from Macy’s.
Screenshot from Macy’s, September 2022
Screenshot from search [alfani essential capri pull-on with tummy control], Google, September 2022
But look what happened to this content on Kroger. It’s a nightmare!
Screenshot from Kroger, September 2022
Screenshot of a search for [score $8 s’mores bundle when you buy 1 Hershey], Google, September 2022
While finding JavaScript SEO issues is more complex than this, these three tests will help you quickly assess whether your ecommerce JavaScript has SEO issues.
Follow these tests with a detailed JS website audit, using an SEO crawler that can identify whether your website fails when running JS and whether some code isn’t working properly. For example, some SEO crawlers offer features that help you understand this in detail.
2. Implement Dynamic Rendering
How your website renders code affects how Google indexes your JS content. Therefore, you need to know how JavaScript rendering occurs.
With server-side rendering (SSR), the page is rendered on the server and then sent to the crawler or browser (the client) already populated. Crawling and indexing work much like they do for HTML pages.
But implementing server-side rendering (SSR) is often challenging for developers and can increase server load.
Furthermore, Time to First Byte (TTFB) is slow because the server renders the page on the go.
One thing developers should keep in mind when implementing SSR is to refrain from using functions that operate directly in the DOM.
With client-side rendering (CSR), by contrast, JavaScript is rendered by the client using the DOM. This causes computational problems when search bots try to crawl, render, and index content.
A viable alternative to SSR and CSR is dynamic rendering, which switches between client-side and server-side rendered content for specific user agents.
It allows developers to deliver site content to users who access it using JS code generated in the browser.
However, it only serves a static version to the bot. Google officially supports implementing dynamic rendering.
Image from Google Search Center, September 2022
To implement dynamic rendering, you can use tools like Prerender.io or Puppeteer.
This can help you serve a static HTML version of your JavaScript website to crawlers without any negative impact on CX.
Dynamic rendering is a great solution for ecommerce websites that typically store a lot of content that changes frequently or relies on social media sharing (containing embeddable social media walls or widgets).
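For a sense of how this looks in practice, here is a minimal Node/Express sketch of dynamic rendering. The bot list is abbreviated, and getPrerenderedHtml() is an assumed helper (in production it might be backed by Puppeteer or a Prerender.io cache), not a real library API:

// Hypothetical sketch: serve prerendered HTML to bots, the JS app to everyone else
const express = require('express');
const app = express();

const BOT_UA = /googlebot|bingbot|yandex|baiduspider|duckduckbot/i;

// Assumed helper: in practice, render with Puppeteer or fetch from a prerender cache
async function getPrerenderedHtml(url) {
  return '<html><!-- prerendered snapshot of ' + url + ' --></html>';
}

app.get('*', async (req, res, next) => {
  if (BOT_UA.test(req.headers['user-agent'] || '')) {
    const html = await getPrerenderedHtml(req.originalUrl);
    return res.send(html); // static snapshot for crawlers
  }
  next(); // regular users get the client-side rendered app
});

app.use(express.static('build')); // assumed build output directory
app.listen(3000);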
3. Route Your URLs Properly
JavaScript frameworks use routers to map clean URLs, so it is very important that the page URL updates when the content updates.
For example, JS frameworks like Angular and Vue generate URLs with a hash (#), like www.example.com/#/about-us
Google bots ignore the hash fragment during indexing, so everything after the # is effectively invisible to them. It is therefore not recommended to use #.
Instead, use a static-looking URL like https://www.example.com/about-us
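As a sketch of the difference, Vue Router (v4) lets you opt into clean history-mode URLs instead of hash URLs; the route and component here are assumptions for illustration:

// Hypothetical Vue Router 4 sketch: history mode yields /about-us, not /#/about-us
import { createRouter, createWebHistory } from 'vue-router';
import AboutUs from './AboutUs.vue'; // assumed component

const router = createRouter({
  history: createWebHistory(), // clean, indexable paths instead of hash fragments
  routes: [{ path: '/about-us', component: AboutUs }],
});

export default router;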
4. Adhere To The Internal Linking Protocol
Internal links help Google crawl your site efficiently and highlight important pages.
A poor link structure can be dangerous for SEO, especially for JS heavy sites.
One of the common problems we run into is ecommerce sites using JS for links that Google can’t crawl, such as onclick links or button types:

<a onclick="changePage('important-link')">Design this</a>
If you want Google bots to find and follow your links, make sure they’re plain HTML.
Google recommends interlinking pages using HTML anchor tags with the href attribute and asks webmasters to avoid JS event handlers.
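For contrast, here is what a crawlable version of that link looks like; the URL and anchor text are placeholders:

<!-- Crawlable: Googlebot can discover and follow the href -->
<a href="/important-link">Design this</a>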
5. Use Pagination
Pagination is essential for JS rich ecommerce websites with thousands of products that retailers often choose to spread across multiple pages for better UX.
Allowing the user to scroll indefinitely may be good for UX, but it isn’t necessarily SEO-friendly: bots don’t interact with the page, so they can’t trigger the events that load more content.
Eventually, Google hits its limit (stops scrolling) and walks away, so most of your content gets ignored, resulting in poor rankings.
Make sure you use plain <a href> links so Google can discover and reach each page of the pagination, as in the sketch below.
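Here is a minimal sketch of such pagination markup; the URL pattern is an assumption:

<!-- Each category page is reachable through a real link Googlebot can follow -->
<nav aria-label="Pagination">
  <a href="https://example.com/shoes/?page=1">1</a>
  <a href="https://example.com/shoes/?page=2">2</a>
  <a href="https://example.com/shoes/?page=3">3</a>
</nav>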
6. Lazy Load Images
Even though Google supports lazy loading, it doesn’t scroll content while visiting a page.
Instead, it resizes the page’s virtual viewport, making it longer during the crawling process. And because the “scroll” event listener is never triggered, content that waits on scrolling is never rendered.
So if you have images below the fold, as most ecommerce websites do, it’s very important to lazy-load them in a way that still lets Google see all of your content.
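One crawler-friendly approach is the IntersectionObserver pattern sketched below (the data-src convention is an assumption); native lazy loading via the loading="lazy" attribute is another option:

// Sketch: load images when they enter the viewport, not on a "scroll" event
// (Googlebot never fires scroll events, but it does resize its viewport)
const observer = new IntersectionObserver((entries, obs) => {
  entries.forEach((entry) => {
    if (entry.isIntersecting) {
      entry.target.src = entry.target.dataset.src; // swap in the real image
      obs.unobserve(entry.target);
    }
  });
});
document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));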
7. Allow Bots To Crawl JS
This may seem obvious, but on several occasions, we’ve seen ecommerce sites inadvertently block JavaScript (.js) files from being crawled.
This will cause SEO JS issues, as bots won’t be able to render and index that code.
Check your robots.txt file to confirm your JS files are open and available for crawling.
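As a quick illustration, the robots.txt directives below show a blocking rule to avoid and a safe alternative; the /assets/ paths are assumptions about your site’s structure:

# Problematic – blocks Googlebot from fetching the JS it needs to render pages
User-agent: *
Disallow: /assets/js/

# Safer – JS and CSS remain crawlable
User-agent: *
Allow: /assets/js/
Allow: /assets/css/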
8. Audit Your JS Code
Finally, make sure you audit your JavaScript code to optimize it for search engines.
Use tools like Google Search Console, Chrome Developer Tools, and Ahrefs, as well as SEO crawlers like JetOctopus, to run a successful JS SEO audit.
Google Search Console

This platform can help you optimize your site and monitor your organic performance. Use GSC to monitor Googlebot and WRS activity.
For JS websites, GSC lets you see rendering issues. It reports crawl errors and flags JS elements that are blocked from crawling.
Chrome Developer Tools

These web developer tools are built into Chrome for ease of use. They let you inspect the rendered HTML (or DOM) and the network activity of your web pages.
From its Network tab, you can easily identify the JS and CSS resources that are loaded before the DOM.
Screenshot from Chrome Developer Tools, September 2022
Ahrefs

Ahrefs allows you to effectively manage backlink generation, content audits, keyword research, and more. It can render web pages at scale and lets you check for JavaScript redirects.
You can also enable JS in Site Audit crawl to unlock more insights.
Screenshot from Ahrefs, September 2022
The Ahrefs Toolbar supports JavaScript and shows a comparison of the raw HTML against the rendered version of the page.
JetOctopus SEO Crawler And Log Analyzer
JetOctopus is an SEO crawler and log analyzer that lets you easily audit common ecommerce SEO issues.
Because it can see and render JS like a Google bot, JetOctopus helps ecommerce marketers solve JavaScript SEO problems at scale.
Its JS Performance tab offers comprehensive insight into JavaScript execution – First Paint, First Contentful Paint, and page load.
It also shows how long it takes to complete all JavaScript requests, along with the JS errors that need immediate attention.
GSC’s integration with JetOctopus can help you see the full dynamics of your site’s performance.
Ryte

Ryte is another tool capable of crawling and inspecting your JavaScript pages. It renders the pages, checks for errors, and helps you troubleshoot and assess the usability of your dynamic pages.
seoClarity

seoClarity is an enterprise platform with many features. Like the other tools, it offers dynamic rendering, letting you check how JavaScript performs on your website.
Summing Up
An ecommerce site is a vivid example of dynamic content injected using JS.
Therefore, ecommerce developers praise how JS allows them to create highly interactive ecommerce pages.
On the other hand, many SEO experts are wary of JS because they have seen organic traffic drop once sites start relying on client-side rendering.
While both are true, the fact of the matter is that websites that depend on JS can also perform well in the SERPs.
Follow the tips shared in this guide to take a step closer to utilizing JavaScript in the most effective way possible while maintaining your site’s ranking in the SERPs.
Featured Image: Visual Generation/Shutterstock