#prerender.io
Prerender.io with Redis for SEO: Boosting Visibility for JavaScript Websites

In today's fiercely competitive digital landscape, ensuring your website ranks well on search engines is non-negotiable. However, for websites primarily constructed using JavaScript frameworks such as React or Angular, achieving optimal search engine visibility can be challenging. This is because search engine crawlers often struggle to interpret and index dynamic content generated by JavaScript. Enter Prerender.io, with Redis for SEO, offering a robust solution to this dilemma. This guide will walk you through the process of installing Prerender.io, configuring it with Redis cache for improved performance, and seamlessly managing it with PM2 for efficient operation.
Prerequisites:
- Node.js and npm: These are essential for running JavaScript applications like Prerender.io. Install them using your system's package manager.
- Redis: As a widely used in-memory data store, Redis serves as the caching layer for Prerender.io, significantly enhancing performance. Refer to the official Redis website for installation instructions.
- Google Chrome: Prerender.io utilizes a headless Chrome instance for rendering web pages. Depending on your Linux distribution:
  - For Ubuntu/Debian-based systems, Chrome may be available in the repositories. However, for more control, consider installing the latest stable version directly from Google.
  - For other Linux distributions, you might need to manually download and install Chrome from the official Google website.
Installation of Node.js and npm (using nvm):
Node Version Manager (nvm) allows you to manage multiple Node.js versions on your system. Here's how to install and use it:
- Install nvm:
  curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.3/install.sh | bash
  This command downloads and executes the nvm installation script. Follow the on-screen instructions to complete the setup.
- Verify the nvm installation:
  source ~/.bashrc
  nvm -v
  The first command ensures your shell recognizes nvm commands. The second checks that nvm is installed correctly; if successful, it outputs the nvm version.
- Install a specific Node.js version:
  nvm install v20.12.0
  This command installs the specified Node.js version and sets it as the active version.
- Verify the Node.js and npm installation:
  node -v
  npm -v
  These commands should display the installed Node.js and npm versions.
Installation of Redis server
Use the APT package manager to install Redis from the official Ubuntu repositories:
sudo apt install redis-server
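Optionally, confirm that the Redis service is running before moving on (assuming a systemd-based Ubuntu install):

```sh
sudo systemctl status redis-server   # should report active (running)
redis-cli ping                       # should answer PONG
```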
Installation of Google Chrome:
- Download the Chrome DEB package from the Google website using the following command:
  wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
- Install the downloaded package:
  sudo dpkg -i google-chrome-stable_current_amd64.deb
- If installation errors occur due to missing dependencies, resolve them with:
  sudo apt-get install -f
  Then retry the installation command:
  sudo dpkg -i google-chrome-stable_current_amd64.deb
Installation of Prerender.io:
To globally install Prerender.io, execute the following command: sudo npm install -g prerender
Configuration:
Create a config.json file with the following content:
{
  "prerender": {
    "chromeLocation": "/usr/bin/google-chrome",
    "forwardHeaders": false,
    "whitelist": [],
    "blacklist": [],
    "removeScriptTags": true,
    "redisUrl": "redis://localhost:6379"
  }
}
Adjust the settings as necessary, such as specifying the path to your Google Chrome installation, defining whitelists and blacklists for URLs, removing script tags from pre-rendered HTML, and providing the connection details for your Redis server.
Starting the Server:
Initiate the Prerender.io server using the command:
node server.js --config /path/to/config.json
Replace /path/to/config.json with the actual location of your configuration file.
The server.js file: Here's an example of how your server.js file might look:
const prerender = require('prerender');
const server = prerender();
server.start();
This code initializes the Prerender.io server using the prerender package and starts it. Customize it according to your specific requirements and configuration.
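If you prefer to wire the Redis cache up explicitly in code rather than relying on config.json alone, one common approach is the prerender-redis-cache plugin. The snippet below is a minimal sketch that assumes the plugin is installed (npm install prerender-redis-cache) and that the Redis connection string is supplied through an environment variable such as REDIS_URL:

```js
// Minimal sketch assuming the prerender-redis-cache plugin is installed.
// Run with, for example: REDIS_URL=redis://localhost:6379 node server.js
const prerender = require('prerender');

const server = prerender({
  chromeLocation: '/usr/bin/google-chrome' // keep in sync with config.json
});

// Cache rendered pages in Redis so repeat crawler requests are served from cache
server.use(require('prerender-redis-cache'));

server.start();
```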
Configuring PM2:
Create an ecosystem file (ecosystem.config.js) in your project directory that tells PM2 how to run the Prerender server; an example follows.
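A minimal sketch of such an ecosystem file is shown below. The app name matches the Prerender-server name used by the PM2 commands later in this guide; the script and config paths are assumptions to adapt to your setup:

```js
// ecosystem.config.js: minimal PM2 definition for the Prerender server
module.exports = {
  apps: [
    {
      name: 'Prerender-server',              // name used by pm2 logs / pm2 restart below
      script: 'server.js',                   // the server.js file created above
      args: '--config /path/to/config.json', // adjust to your config location
      env: {
        NODE_ENV: 'production'
      }
    }
  ]
};
```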
Managing Prerender.io with PM2:
If PM2 is not installed globally, install it by running:
sudo npm install -g pm2
Then start Prerender.io using PM2:
pm2 start ecosystem.config.js
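Optionally, have PM2 restore the process after a reboot:

```sh
pm2 save      # persist the current process list
pm2 startup   # print the command that registers PM2 as a startup service
```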
Monitoring and Maintenance (Optional):
For monitoring Prerender.io logs, use:
pm2 logs Prerender-server
To manage the Prerender.io instance (for example, restarting it), execute:
pm2 restart Prerender-server
By following these steps, you'll effectively optimize your website for search engines using Prerender.io and Redis cache, ensuring improved visibility and performance in the digital realm.
What are the major differences between Traditional Search and AI-powered search?

In today's world, even a simple search like "movie with the blue alien" quickly brings up details about Avatar. This shows how powerful AI-powered search has become. It’s not something from the future—it’s already happening because of advanced technology. Search engines have changed a lot in recent years. With the rise of AI tools like ChatGPT, searches are no longer just about matching keywords. Instead, they now understand what users mean. This change has made search engines smarter and also raised people's expectations. This article will explore how traditional search differs from AI-powered search. For more details, you can read this article.
Read More - https://prerender.io/blog/traditional-search-vs-ai-powered-search-explained/

Top 10 Node.js SEO Libraries!
The right libraries can make a huge difference when optimizing Node.js websites for SEO performance. Listed below are some of the top libraries and tools that help increase the visibility and ranking of websites:
Prerender.io: Makes it easy to serve pre-rendered (server-side rendered) pages for single-page applications, whose SEO would otherwise be tough to improve.
Puppeteer: Drives headless Chrome to programmatically generate HTML snapshots that improve visibility to search engines (a sketch follows this list).
SEO.js: This library is designed specifically for Node.js applications, offering tools for meta tags, sitemaps, canonical URLs, etc.
Helmet: An Express.js middleware that sets security-related HTTP response headers, helping your site look trustworthy to users and crawlers alike.
Screaming Frog SEO Spider: A desktop crawler (not a Node.js library, but a useful companion tool) for crawling and auditing websites to find and resolve SEO issues.
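As a rough illustration of the Puppeteer approach mentioned in the list above, here is a minimal sketch of generating an HTML snapshot of a JavaScript-rendered page (the URL and wait settings are placeholders):

```js
const puppeteer = require('puppeteer');

async function snapshot(url) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  // Wait until network activity settles so client-side rendering has finished
  await page.goto(url, { waitUntil: 'networkidle0' });
  const html = await page.content(); // fully rendered HTML, ready to serve to crawlers
  await browser.close();
  return html;
}

snapshot('https://example.com').then(html => console.log(html.slice(0, 200)));
```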
Together, these tools help developers make Node.js applications perform well in search engines. Check out the full article for more details.
Best Pre-Rendering Testing Tool: Angular, Blazor, React, Vue
Getthit's pre-render testing tool for websites built as Single Page Applications (SPAs). It supports Angular, React, Vue, Next.js, Blazor, Prerender.io, and more.
Getting started with SEO for Dynamic sites like Angular, React, JavaScript etc.
A lot of modern sites are Single Page Applications (dynamic sites built with Angular, React, or plain JavaScript), which has performance and UX benefits. But these sites usually return a nearly empty HTML file initially, which makes SEO crawling difficult. Even though Google is getting better at crawling dynamic sites, it's still not great at the moment. Worry not: there is a workaround.
Dynamic Rendering – switching between client-side rendered and pre-rendered content for specific user agents. When you render the app on the server first (using pre-rendering or server-side rendering), users and bots get a fully rendered HTML page – problem solved.

Dynamic rendering for dummies
How does dynamic rendering work?
Dynamic rendering requires your web server to detect crawlers (for example, by checking the user agent). Requests from crawlers are routed to a renderer, requests from users are served normally. Where needed, the dynamic renderer serves a version of the content that's suitable to the crawler, for example, it may serve a static HTML version. You can choose to enable the dynamic renderer for all pages or on a per-page basis.
Implementing Dynamic Rendering
Install and configure a dynamic renderer to transform your content into static HTML that's easier for crawlers to consume. Some common dynamic renderers are Puppeteer, Rendertron, and prerender.io.
Choose the user agents that you think should receive your static HTML and refer to your specific configuration details on how to update or add user agents. Here's an example of a list of common user agents in the Rendertron middleware:
export const botUserAgents = [
  'googlebot',
  'google-structured-data-testing-tool',
  'bingbot',
  'linkedinbot',
  'mediapartners-google',
];
If pre-rendering slows down your server or you see a high number of pre-rendering requests, consider implementing a cache for pre-rendered content, or verifying that requests are from legitimate crawlers.
Determine if the user agents require desktop or mobile content. Use dynamic serving to provide the appropriate desktop or mobile version. Here's an example of how a configuration could determine if a user agent requires desktop or mobile content:
isPrerenderedUA = userAgent.matches(botUserAgents)
isMobileUA = userAgent.matches(['mobile', 'android'])

if (!isPrerenderedUA) { } else { servePreRendered(isMobileUA) }
In this example, use if (!isPrerenderedUA) {...} to serve regular, client-side rendered content. Use else { servePreRendered(isMobileUA)} to serve the mobile version, if needed.
Configure your server to deliver the static HTML to the crawlers that you selected. There are several ways you can do this depending on your technology; here are a few examples, followed by a small proxy sketch:
Proxy requests coming from crawlers to the dynamic renderer.
Pre-render as part of your deployment process and make your server serve the static HTML to crawlers.
Build dynamic rendering into your custom server code.
Serve static content from a pre-rendering service to crawlers.
Use a middleware for your server (for example, the rendertron middleware).
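As a sketch of the first option above (proxying crawler requests to the dynamic renderer), the Express middleware below shows the general shape. The renderer endpoint, the site hostname, and the node-fetch dependency are assumptions made for the example only:

```js
const express = require('express');
const fetch = require('node-fetch'); // assumed dependency for the proxy request

const RENDERER_URL = 'http://localhost:3000/render'; // illustrative renderer endpoint
const botUserAgents = ['googlebot', 'bingbot', 'linkedinbot'];

const isBot = (ua = '') =>
  botUserAgents.some(bot => ua.toLowerCase().includes(bot));

const app = express();

app.use(async (req, res, next) => {
  if (!isBot(req.headers['user-agent'])) return next(); // regular users get the normal SPA

  // Crawlers get static HTML from the dynamic renderer
  const pageUrl = `https://example.com${req.originalUrl}`;
  const rendered = await fetch(`${RENDERER_URL}/${encodeURIComponent(pageUrl)}`);
  res.status(rendered.status).send(await rendered.text());
});

// ...serve the client-side application to everyone else...

app.listen(8080);
```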
Verify your configuration
After implementing dynamic rendering, verify that everything is working by following the steps below.
Test your mobile content with the Mobile-Friendly Test to make sure Google can see your content.
Test your desktop content with Fetch as Google to make sure that the desktop content is also visible on the rendered page (the rendered page is how Googlebot sees your page).
If you use structured data, test that your structured data renders properly with the Structured Data Testing Tool.
Troubleshooting
Is your site still not appearing in Google search results even after following these steps? Try troubleshooting with the following checks.
Content is incomplete or looks different
What caused the issue: Your renderer might be misconfigured or your web application might be incompatible with your rendering solution. Sometimes timeouts can also cause content to not be rendered correctly.
Fix the issue: Refer to the documentation for your specific rendering solution to debug your dynamic rendering setup.
High response times
What caused the issue: Using a headless browser to render pages on demand often causes high response times, which can cause crawlers to cancel the request and not index your content. High response times can also result in crawlers reducing their crawl-rate when crawling and indexing your content.
Fix the issue
Set up a cache for the pre-rendered HTML or create a static HTML version of your content as part of your build process.
Make sure to enable the cache in your configuration (for example, by pointing crawlers to your cache).
Check that crawlers get your content quickly by using testing tools such as the Mobile-Friendly Test or webpagetest (with a custom user agent string from the list of Google Crawler user agents). Your requests should not time out.
Structured data is missing
What caused the issue: Missing the structured data user agent, or not including JSON-LD script tags in the output can cause structured data errors.
Fix the issue
Use the Structured Data Testing Tool to make sure the structured data is present on the page. Then configure the user agent for the Structured Data Testing Tool to test the pre-rendered content.
Make sure JSON-LD script tags are included in the dynamically rendered HTML of your content. Consult the documentation of your rendering solution for more information.
Smoke Tests for dynamic rendering
Here’s a little more nuance to server side rendering troubleshooting based on some real-world situations we’ve encountered.
How To Test Server Side Rendering On A New Site Before It’s Launched
It often is the case that SEOs get brought into the process well after a site has been built, but only a few days before it will be launched. We will need a way to test the new site in Google without competing in Google with the old site. For a variety of reasons we don’t want the entire new site to get crawled and indexed, but we want to know that Googlebot can index the content on a URL, that it can crawl internal links and that it can rank for relevant queries. Here’s how to do this:
Create test URLs on new site for each template (or use URLs that have already been built) and make sure they are linked from the home page.
Add a robots.txt file that allows only these test URLs to be crawled.
Here’s an example: User-Agent: Googlebot Disallow: / (this means don’t crawl the entire site) Allow: /$ (allow Gbot to crawl only the home page even though the rest of the site is blocked in the line above) Allow: /test-directory/$ (allow crawling of just the /test-directory/ URL) Allow: /test-directory/test-url (allow crawling of /test-directory/test-url)(you can add as many URLs as you want to test – the more you test, the more certain you can be, but a handful is usually fine)
Once the robots.txt is set up, verify the test site in Google Search Console.
Use the Fetch as Google tool to fetch and render the home page and request crawling of all linked URLs. We will be testing here that Google can index all of the content on the home page and can crawl the links to find the test URLs. You can view how the content on the home page looks in the Fetch tool, but I wouldn’t necessarily trust it – we sometimes see this tool out of sync with what actually appears in Google.
In a few minutes, at least the test home page should be indexed. Do exact match searches for text that appears in the title tag and in the body of the home page. If the text is generic, you may have to include site:domain.com in your query to focus only on the test domain. You are looking for your test URL to show up in the results. This is a signal that at least Google can index and understand the content on your homepage. This does not mean the page will rank well, but at least it now has a shot.
If the test links are crawlable, soon you should see the test URLs linked from the home page show up in Google. Do the same tests. If they don't show up within 24 hours, while this doesn't necessarily mean the links aren't crawlable, it's at least a signal in that direction. You can also look at the text-only cache of the indexed test home page. If the links are crawlable, you should see them there.
If you want to get more data, unblock more URLs in robots.txt and request more indexing.
Once you have finished the test, request removal of the test domain in GSC via the Remove URLs tool.
We often can get this process done in 24 hours, but we recommend that clients give it a week in case we run into any issues.
Pro-tip: If you are using Chrome and looking at a test URL for the SEO content like title tag text, often SEO extensions and viewing the source will only show the “hooks” (e.g. {metaservice.metaTitle}) and not the actual text. Open Chrome Developer Tools and look in the Elements section. The SEO stuff should be there.
Do Not Block Googlebot on Your PreRender Server
Believe it or not, we had a client do this. Someone was afraid that Googlebot was going to eat up a lot of bandwidth and cost them $. I guess they were less afraid of not making money to pay for that bandwidth.
Do Not Throttle Googlebot on Your PreRender Server
We convinced the same client to unblock Googlebot, but noticed in Google Search Console’s crawl report that pages crawled per day was very low. Again someone was trying to save money in a way that guaranteed them to lose money. There may be some threshold where you may want to limit Googlebot’s crawling, but my sense is Googlebot is pretty good at figuring that out for you.
Alternative to prerender.io for Shopify? https://www.reddit.com/r/SEO/comments/f5lzl4/alternative_to_prerenderio_for_shopify/
Are there any alternatives that will allow for prerendering JS pages on Shopify? Shopify doesn't allow apps such as prerender
With the recent developments in the JavaScript field, it’s tempting to start using the new ‘modern’ website building approach. However, not understanding the repercussions of that decision can be devastating for your SEO. To understand the SEO implications of using complex JavaScript frameworks, let’s take a look at the difference between the traditional CGI (common gateway interface) website building standard used since 1993 and the modern JavaScript framework approach.

With the traditional CGI deployment, the HTML is formulated before it’s presented to the client (web browser/crawler). The process can differ slightly depending on the back-end frameworks used; however, the result is always fully or partially rendered on the server, which in turn sends it back to the browser. The advantage of this method is that the content received by the browser is mostly ready to use and will load extremely fast with proper optimisations, e.g. Amazon.co.uk. This approach uses JavaScript as a UI support rather than the main logic processor.

The modern JavaScript framework (e.g. ReactJS) methodology is to handle most of the data rendering on the client’s side (the browser); this can include routing (deciding how the page link gets handled by the web application). It works by delivering a basic HTML structure to the browser and initialising the JavaScript framework to handle the rendering. This requires a chain of requests to come back from the server successfully, which greatly increases the initial loading time. The selling point of this approach is that once you’ve loaded everything up, you can navigate quickly through the web application without fully reloading the page.

Here is an infographic that shows the support of different search engines for JavaScript frameworks. Looking at the graphic, can we assume that ReactJS is supported by the Google bot? No, it’s a lot more complicated than that. To understand why, we need to know how crawlers get our URLs and index them.

Crawlers are simple programs that go to different URLs and process the HTML they receive to make decisions about rankings. They are not browsers and they do not process any JavaScript as far as we know. The process of extracting meaningful data from a markup language like HTML is called parsing. As it stands, there is no easy way to parse JavaScript, as it is not a markup language but a scripting language, so it needs to be interpreted by the browser or Node.js. Therefore it is a huge effort to interpret JavaScript as a browser simulation, and a lot of additional server resources are required.

Have you ever wondered what the website code looks like, right clicked on the page and chosen “view source”? You can safely assume that the crawler can see all that’s displayed to you in that view, unless special server rules are in place. The crawler will only read what is immediately returned from the server. So if you’re running a one-page JavaScript app and all that is sent is some wrapping HTML, then that is what will get indexed. For more complex debugging, try this command (Linux or Mac):
curl --user-agent "Googlebot/2.1 (+http://www.google.com/bot.html)" http://example.com

You’re probably confused, as the green ticks on the infographic suggest support. Yes, there is limited support, but it’s not the crawler interpreting the website. Google has developed a “Web Rendering Service” that is a separate piece of software, and it runs at different times to the main crawler.
The indexing process example:
1st week – crawl homepage (crawler) and schedule the “Web Rendering Service” to visit the page —> render the homepage using the “Web Rendering Service” and find all the links and relevant info (does not follow links on its own)
2nd week – crawl homepage (data from the “Web Rendering Service” is used to crawl links previously not seen) —> render the homepage using the “Web Rendering Service” and find all the links and relevant info (does not follow links on its own)

As you can see, those two pieces of software don’t run together very well, and the crawler runs on its own schedule, independently of what the “Web Rendering Service” is doing, while remaining the ultimate decision maker in the process of indexing your website. You can also notice that there is a minimum one-week lag in indexing pages returned by the “Web Rendering Service”, which can be very undesirable for quickly changing content, e.g. e-commerce shops or news websites. If you have a large website, it could take an unreasonably long time to index every page.

It’s also important to understand that the “Web Rendering Service” has a hard cut-off point after 5 seconds of loading, which means that if your website loads for longer than 5 seconds, the service will quit and not pass anything back to the crawler. This will cause your website to be invisible to Google. The “Web Rendering Service” is an experimental technology and is flaky at best. You cannot expect proper indexing, or for Google to see exactly what you see in your browser. It’s been reported that it requires most of the JavaScript to be inline to even be considered for processing. As you can imagine, it’s very expensive to run this service, and it’s very unlikely that Google will increase the 5-second load limit in the future.

But not all hope is lost. There are mechanisms that can make your website visible to the crawler:
- Isomorphic JavaScript – render your JS manually using Node.js on the server side
- https://prerender.io/ – a semi-automatic service that can be deployed using Node.js on the server side
- Anything that renders your HTML in full before it hits the browser

The main idea behind these mechanisms is to get the HTML rendered before it’s received by the browser, like the CGI method described at the beginning of this article, where the server serves pre-rendered HTML to the search engine and non-rendered content to the standard user. We can confirm that this method works even when your website takes more than 5 seconds to load and the “Web Rendering Service” sees nothing. However, we cannot confirm what SEO penalties may be applied if the crawler and the “Web Rendering Service” do not agree on the content seen. The user-agent detection is critical here, and any small error can cost rankings or incur long-term penalties.

So, should you use one-page JavaScript frameworks for a web application where you want to gain rankings?
It is a cool, trendy and new way of making websites; however, the tradeoffs in the area of SEO are too big at present:
- You may be penalised for the increased initial load time (the crawler will load your website from the beginning every time – it doesn’t behave like a user)
- Your website might completely disappear from all search engines
- Unless your users spend hours on your website at one time, the increased initial load time will actually make the UX worse
- Splitting the logic between front end and server can be a mess if the team doesn’t work well together
- The “Web Rendering Service” uses Chrome 41, which means your JavaScript needs to have been usable 3 years ago
- Other search and sharing crawlers like Bing, Facebook and Twitter will not render your JavaScript
- Larger server resources are required to handle pre-rendering of the content and caching
- Using pre-rendering services increases the danger of a cloaking penalty
- Great effort in debugging is required
- Auditing your website will be more expensive and complex
- Even using everything at your disposal cannot guarantee correct indexing

Where it is safe to use: when no ranking is required, e.g. an admin panel or content that is not publicly accessible.

The post Will your one page JavaScript app get indexed? appeared first on Blueclaw.
Update Meta Tags In AngularJS
Dynamically update meta tags and the document title for SEO purposes in your AngularJS application (a generic sketch follows the feature list below).
Main Features:
lightweight (< 1KB)
uses original meta syntax
supports prerender.io for SEO purposes
supports Open Graph protocol meta elements
supports schema.org protocol meta elements
supports link elements
update your document title dynamically
update your meta tags depending on the state your…
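The library's own directives are not reproduced here. As a generic sketch of the underlying idea (assuming ngRoute and custom title/description fields on your route definitions, not this library's actual API), dynamically updating the title and description on route changes might look like this:

```js
// Generic sketch only; not this library's actual API.
angular.module('app').run(['$rootScope', function ($rootScope) {
  $rootScope.$on('$routeChangeSuccess', function (event, current) {
    var route = (current && current.$$route) || {};

    // Update the document title for the new route
    document.title = route.title || 'Default title';

    // Update the meta description, assuming the tag exists in index.html
    var description = document.querySelector('meta[name="description"]');
    if (description) {
      description.setAttribute('content', route.description || '');
    }
  });
}]);
```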
Why choose Mean Stack for Ecommerce Web Development
Alright, I'm going to make the case for why the MEAN stack is an excellent choice for e-commerce, and why we chose it to build our open-source e-commerce platform.
I'm a big fan of MySQL and have built many products with it. Having said that, I have to admit that MySQL is slowly becoming a database of the past; it has a lot of issues with scalability. MongoDB, on the other hand, offers a rich data model, dynamic schemas, data locality, auto-sharding, and auditing, which makes handling highly varied data types a breeze and boosts performance considerably. It can also be scaled within and across multiple distributed data centers.
When it comes to server-side programming, Node.js clearly beats PHP in terms of speed. Node.js is event-driven and non-blocking, and is very good at handling concurrent requests. We all know the memory issues PHP-based e-commerce sites run into. One of the main advantages of Node.js is memory usage: if you do long-polling, Node.js shines because it doesn't require a 100 MB Apache + PHP instance to handle each request.
In our e-commerce platform, we chose to put an API layer over the Node.js modules so that building front-end templates can be sped up. On top of that, the experience and loading speed that front-end frameworks like AngularJS and ReactJS provide are fantastic.
Many may argue that front-end development with JS technologies like AngularJS is not SEO friendly. That isn't true. In fact, pages built with Angular can be made more SEO friendly than PHP-generated HTML pages. In simple words, pages can be built to serve rendered HTML content when opened by crawlers and the regular Angular application when opened by humans. There are plenty of supporting technologies, such as Prerender.io, to accomplish this efficiently.
The three main components that make any e-commerce product successful are:
Smart search: Using a mix of Angular and Node.js with refined algorithms working in the background, a genuinely good search engine can be built. There are also mature open-source search engines like Solr that have specific modules for the MEAN stack.
Recommendation engine: Smart product recommendation is the holy grail of sales for any e-commerce business. I believe that if a proper product recommendation algorithm is built and combined with the seamless experience provided by front-end JS technologies, magic can happen. In the open-source e-commerce platform we are building, we have integrated some great third-party recommendation engines, like Guesswork, that already power some of the biggest e-commerce stores in the world.
Shipping automation: We live in a world where there are countless options available for shipping (FedEx, DHL, and so on). Each of them provides its own API, but since different carriers are effective in different parts of the world, we need a more aggregated and automated approach. Fortunately, there are many startups trying to solve this, and they can be easily integrated with e-commerce stores powered by MEAN stack technologies.
Other benefits of the MEAN stack include:
• Database
MongoDB (NoSQL database): A cross-platform, document-oriented database. JSON-style documents with dynamic schemas provide simplicity and power, making the integration of data into applications (especially JavaScript-based ones) quick and easy.
• Security
MEAN is likewise a stable and secure platform.
• Cost
MEAN is more affordable to operate, since it is open source and you don't need separate specialists for the front end and back end; you can mix and match.
• Performance
MongoDB is fast, but it achieves its performance by trading off consistency (in clustered configurations). As such, MongoDB is great when you need speed and flexibility in your data model and can tolerate minor and reasonably infrequent data loss.
• Throughput
MEAN is Asynchronous.
For more details about Mean Stack Online Course CLICK HERE
Contact us for more details +919989971070 or visit us www.visualpath.in
JavaScript rendering and the problems for SEO in 2020
30-second summary:
Anyone working in enterprise SEO in 2020 will have encountered this web architecture scenario with a client at some point. Frameworks like React, Vue, and Angular make web development simpler and faster.
There are tons of case studies but one business Croud encountered migrated to a hybrid Shopify / JS framework with internal links and content rendered via JS. They proceeded to lose traffic worth an estimated $8,000 per day over the next 6 months… about $1.5m USD.
The experienced readers amongst us will soon start to get the feeling that they’re encountering familiar territory.
Croud’s VP Strategic Partnerships, Anthony Lavall discusses JavaScript frameworks that deal with the most critical SEO elements.
While running the SEO team at Croud in New York over the last three years, 60% of our clients have been through some form of migration. Another ~30% have either moved from or to a SPA (Single Page Application) often utilizing an AJAX (Asynchronous Javascript and XML) framework to varying degrees.
Anyone working in enterprise SEO in 2020 will have encountered this web architecture scenario with a client at some point. Frameworks like React, Vue, and Angular make web development more simply expedited. This is especially true when creating dynamic web applications which offer relatively quick new request interactivity (once the initial libraries powering them have loaded – Gmail is a good example) by utilizing the power of the modern browser to render the client-side code (the JavaScript). Then using web workers to offer network request functionality that doesn’t require a traditional server-based URL call.
With the increased functionality and deployment capabilities comes a cost – the question of SEO performance. I doubt any SEO reading this is a stranger to that question. However, you may be still in the dark regarding an answer.
Why is it a problem?
Revenue, in the form of lost organic traffic via lost organic rankings. It’s as simple as this. Web developers who recommended JavaScript (JS) frameworks are not typically directly responsible for long-term commercial performance. One of the main reasons SEOs exist in 2020 should be to mitigate strategic mistakes that could arise from this. Organic traffic is often taken as a given and not considered as important (or controllable), and this is where massive problems take place. There are tons of case studies but one business we encountered migrated to a hybrid Shopify / JS framework with internal links and content rendered via JS. They proceeded to lose traffic worth an estimated $8,000 per day over the next 6 months… about $1.5m USD.
What’s the problem?
There are many problems. SEOs are already trying to deal with a huge number of signals from the most heavily invested commercial algorithm ever created (Google… just in case). Moving away from a traditional server-rendered website (think Wikipedia) to a contemporary framework is potentially riddled with SEO challenges. Some of which are:
Search engine bot crawling, rendering, and indexing – search engine crawlers like Googlebot have adapted their crawling process to include the rendering of JavaScript (starting as far back as 2010) in order to be able to fully comprehend the code on AJAX web pages. We know Google is getting better at understanding complex JavaScript. Other search crawlers might not be. But this isn’t simply a question of comprehension. Crawling the entire web is no simple task and even Google’s resources are limited. They have to decide if a site is worth crawling and rendering based on assumptions that take place long before JS may have been encountered and rendered (metrics such as an estimated number of total pages, domain history, WhoIs data, domain authority, etc.).
Google’s Crawling and Rendering Process – The 2nd Render / Indexing Phase (announced at Google I/O 2018)
Speed – one of the biggest hurdles for AJAX applications. Google crawls web pages un-cached so those cumbersome first loads of single page applications can be problematic. Speed can be defined in a number of ways, but in this instance, we’re talking about the length of time it takes to execute and critically render all the resources on a JavaScript heavy page compared to a less resource intensive HTML page.
Resources and rendering – with traditional server-side code, the DOM (Document Object Model) is essentially rendered once the CSSOM (CSS Object Model) is formed or to put it more simply, the DOM doesn’t require too much further manipulation following the fetch of the source code. There are caveats to this but it is safe to say that client-side code (and the multiple libraries/resources that code might be derived from) adds increased complexity to the finalized DOM which means more CPU resources required by both search crawlers and client devices. This is one of the most significant reasons why a complex JS framework would not be preferred. However, it is so frequently overlooked.
Now, everything prior to this sentence has made the assumption that these AJAX pages have been built with no consideration for SEO. This is slightly unfair to the modern web design agency or in-house developer. There is usually some type of consideration to mitigate the negative impact on SEO (we will be looking at these in more detail). The experienced readers amongst us will now start to get the feeling that they are encountering familiar territory. A territory which has resulted in many an email discussion between the client, development, design, and SEO teams related to whether or not said migration is going to tank organic rankings (sadly, it often does).
The problem is that solutions to creating AJAX applications that work more like server-based HTML for SEO purposes are themselves mired in contention; primarily related to their efficacy. How do we test the efficacy of anything for SEO? We have to deploy and analyze SERP changes. And the results for migrations to JavaScript frameworks are repeatedly associated with drops in traffic. Take a look at the weekly stories pouring into the “JS sites in search working group” hosted by John Mueller if you want some proof.
Let’s take a look at some of the most common mitigation tactics for SEO in relation to AJAX.
The different solutions for AJAX SEO mitigation
1. Universal/Isomorphic JS
Isomorphic JavaScript, AKA Universal JavaScript, describes JS applications that run on both the client and the server; that is, either the client or the server can execute the <script> and other code delivered, not just the client. Typically, complex JavaScript applications would only be ready to execute on the client (typically a browser); isomorphic JavaScript mitigates this. One of the best explanations I’ve seen (specifically related to Angular JS) is from Andres Rutnik on Medium:
The client makes a request for a particular URL to your application server.
The server proxies the request to a rendering service which is your Angular application running in a Node.js container. This service could be (but is not necessarily) on the same machine as the application server.
The server version of the application renders the complete HTML and CSS for the path and query requested, including <script> tags to download the client Angular application.
The browser receives the page and can show the content immediately. The client application loads asynchronously and once ready, re-renders the current page and replaces the static HTML with the server rendered. Now the web site behaves like an SPA for any interaction moving forwards. This process should be seamless to a user browsing the site.
Source: Medium
To reiterate, following the request, the server renders the JS and the full DOM/CSSOM is formed and served to the client. This means that Googlebot and users have been served a pre-rendered version of the page. The difference for users is that the HTML and CSS just served is then re-rendered to replace it with the dynamic JS so it can behave like the SPA it was always intended to be.
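As a highly simplified sketch of that flow (using React and Express purely for illustration; the article's example is Angular-based, and a real Universal/Isomorphic setup involves far more than this):

```js
const express = require('express');
const React = require('react');
const { renderToString } = require('react-dom/server');

// A trivial component standing in for the real application
const App = () => React.createElement('h1', null, 'Rendered on the server');

const app = express();

app.get('*', (req, res) => {
  const markup = renderToString(React.createElement(App)); // full HTML produced server-side

  res.send(`<!doctype html>
<html>
  <body>
    <div id="root">${markup}</div>
    <!-- The client bundle hydrates this markup and takes over as an SPA -->
    <script src="/client.js"></script>
  </body>
</html>`);
});

app.listen(3000);
```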
The problems with building isomorphic web pages/applications appear to be just that… actually building the thing isn’t easy. There’s a decent series here from Matheus Marsiglio who documents his experience.
2. Dynamic rendering
Dynamic rendering is a more simple concept to understand; it is the process of detecting the user-agent making the server request and routing the correct response code based on that request being from a validated bot or a user.
This is Google’s recommended method of handling JavaScript for search. It is well illustrated here:
The Dynamic Rendering Process explained by Google
The output is a pre-rendered iteration of your code for search crawlers and the same AJAX that would have always been served to users. Google recommends a solution such as prerender.io to achieve this. It’s a reverse proxy service that pre-renders and caches your pages. There are some pitfalls with dynamic rendering, however, that must be understood:
Cloaking – In a world wide web dominated primarily by HTML and CSS, cloaking was a huge negative as far as Google was concerned. There was little reason for detecting and serving different code to Googlebot aside from trying to game search results. This is not the case in the world of JavaScript. Google’s dynamic rendering process is a direct recommendation for cloaking. They are explicitly saying, “serve users one thing and serve us another”. Why is this a problem? Google says, “As long as your dynamic rendering produces similar content, Googlebot won’t view dynamic rendering as cloaking.” But what is similar? How easy could it be to inject more content to Googlebot than is shown to users or using JS with a delay to remove text for users or manipulate the page in another way that Googlebot is unlikely to see (because it is delayed in the DOM for example).
Caching – For sites that change frequently such as large news publishers who require their content to be indexed as quickly as possible, a pre-render solution may just not cut it. Constantly adding and changing pages need to be almost immediately pre-rendered in order to be immediate and effective. The minimum caching time on prerender.io is in days, not minutes.
Frameworks vary massively – Every tech stack is different, every library adds new complexity, and every CMS will handle this all differently. Pre-render solutions such as prerender.io are not a one-stop solution for optimal SEO performance.
3. CDNs yield additional complexities… (or any reverse proxy for that matter)
Content delivery networks (such as Cloudflare) can create additional testing complexities by adding another layer to the reverse proxy network. Testing a dynamic rendering solution can be difficult as Cloudflare blocks non-validated Googlebot requests via reverse DNS lookup. Troubleshooting dynamic rendering issues therefore takes time. Time for Googlebot to re-crawl the page and then a combination of Google’s cache and a buggy new Search Console to be able to interpret those changes. The mobile-friendly testing tool from Google is a decent stop-gap but you can only analyze a page at a time.
This is a minefield! So what do I do for optimal SEO performance?
Think smart and plan effectively. Luckily only a relative handful of design elements are critical for SEO when considering the arena of web design and many of these are elements in the <head> and/or metadata. They are:
Anything in the <head> – <link> tags and <meta> tags
Header tags, e.g. <h1>, <h2>, etc.
<p> tags and all other copy / text
<table>, <ul>, <ol>, and all other crawl-able HTML elements
Links (must be <a> tags with href attributes)
Images
Every element above should be served without any JS rendering required by the client. As soon as you require JS to be rendered to yield one of the above elements you put search performance in jeopardy. JavaScript can, and should be used to enhance the user experience on your site. But if it’s used to inject the above elements into the DOM then you have got a problem that needs mitigating.
Internal links often provide the biggest SEO issues within Javascript frameworks. This is because onclick events are sometimes used in place of <a> tags, so it’s not only an issue of Googlebot rendering the JS to form the links in the DOM. Even after the JS is rendered there is still no <a> tag to crawl because it’s not used at all – the onclick event is used instead.
Every internal link needs to be the <a> tag with an href attribute containing the value of the link destination in order to be considered valid. This was confirmed at Google’s I/O event last year.
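To make the distinction concrete, here is a simplified illustration (the markup and the router call are invented for the example):

```html
<!-- Not crawlable: navigation happens only through an onclick handler, no <a> tag exists -->
<span onclick="router.navigate('/products')">Products</span>

<!-- Crawlable: a real <a> tag with an href attribute (the framework can still intercept the click) -->
<a href="/products">Products</a>
```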
To conclude
Be wary of the statement, “we can use React / Angular because we’ve got next.js / Angular Universal so there’s no problem”. Everything needs to be tested and that testing process can be tricky in itself. Factors are again myriad. To give an extreme example, what if the client is moving from a simple HTML website to an AJAX framework? The additional processing and possible issues with client-side rendering critical elements could cause huge SEO problems. What if that same website currently generates $10m per month in organic revenue? Even the smallest drop in crawling, indexing, and performance capability could result in the loss of significant revenues.
There is no avoiding modern JS frameworks and that shouldn’t be the goal – the time saved in development hours could be worth thousands in itself – but as SEOs, it’s our responsibility to vehemently protect the most critical SEO elements and ensure they are always server-side rendered in one form or another. Make Googlebot do as little leg-work as possible in order to comprehend your content. That should be the goal.
Anthony Lavall is VP Strategic Partnerships at digital agency Croud. He can be found on Twitter @AnthonyLavall.
The post JavaScript rendering and the problems for SEO in 2020 appeared first on Search Engine Watch.
Source: https://www.searchenginewatch.com/2020/05/06/javascript-rendering-and-the-problems-for-seo-in-2020/
Show HN: Rendering your JavaScript for Google so you don't have to https://prerender.io/
Digital Marketing Trends in 2019: What's important?
Once again, a lot has happened this year in the SEO and content marketing world. The online marketing agency Pixxoma has summarized the most important developments and introduces 10 SEO and content marketing trends for 2019.
SEO trends 2019: what will occupy the industry next year
Many of the topics from this year will remain relevant in 2019 as well, among other things user experience, voice search, the mobile-first index and the indexing of JavaScript applications, which we explain here for you. Here are our SEO trends for 2019, summarized for you:

1. Mobile First
The mobile-first index, rolled out this year, will continue to play a big role next year, as the transition is likely to continue for some time. As a website owner, you should make sure as soon as possible that you can offer your users a mobile-optimized site. From an SEO perspective, it's not just the fact that Google rates sites based on their mobile version. Much more important are the users: more and more people are surfing on their mobile devices, so a mobile-optimized website best meets the needs of your users.
Our tip: Create your website in a responsive web design. The mobile optimization test lets you see if your site's mobile presence is accurate and usable. Also check your markup structure: use at least one relevant markup per page and make sure the markup is the same for both the desktop version and the mobile variant.

2. Website Speed
With the "Speed Update" in July 2018, the load times of a website have become a ranking factor for mobile search. For slow websites, even small load time improvements can have a positive impact on the ranking. This gives performance an ever higher priority, although Google made it clear that this factor is just one of many influencing the ranking. Nevertheless, site operators are advised to optimize the load time of their websites. Because users want answers to their searches as quickly as possible, they often bounce from slow-loading sites and return to the search results page, which sends a negative user signal to the search engine about the website. Meanwhile, Google even ships warnings to slow websites.
Our tip: To review website performance, we recommend Google's newly redesigned PageSpeed Insights tool.

3. User signals
In 2018, user signals were a big topic, and next year they will continue to occupy the online marketing industry. User signals such as dwell time, bounce rate, CTR and the back-to-SERP rate have an impact on the ranking. The CTR (click-through rate) indicates how often users actually click through from the search results page to a website. The back-to-SERP rate, on the other hand, indicates how often users return from a webpage to the search results page because they may not have received a satisfactory answer on the website. The importance of user signals is increasing for the following reasons: user signals provide a lot of data about the behavior and intention of users; search engines increasingly rely on machine learning, i.e. artificial intelligence, to optimize their products; and artificial intelligence is particularly good at recognizing patterns in large data sets and thus drawing conclusions about users' intentions, which can in turn be used to optimize rankings. Measures to improve user signals are recommended for every website, regardless of whether they directly influence rankings on search results pages or not.
Our tip: Check how accurately your tracking tools measure user traffic from your website. If these settings do not produce meaningful data, you'd better set up your own events to track your users' behavior (here's an example of bounce tracking optimization).
4. Voice Search
The great breakthrough in voice search is still to come, but reports show increasing sales of virtual voice assistants such as Alexa, Google Assistant and co. In Germany, according to a representative survey by Next Media Hamburg in May, only 14 percent used a virtual voice assistant. Forecasts assume, however, that the number of users worldwide will increase by almost 35 percent next year. Compared to written search queries, spoken ones are usually longer and more complex; often they are even formulated as a whole question. In addition, usually only one result is read out from the hits on the search results page. Google is also currently testing so-called Speakable markup with selected news publishers. This structured data identifies passages in articles that can easily be read aloud by voice assistants. If the test is successful, every website may be given the opportunity to use this markup to provide voice assistants with readable text passages.
Our tip: To be prepared for the future in digital marketing, you should optimize your website for voice search. Find out which questions your audience is asking and answer them briefly and succinctly on your website, for example in the form of FAQ pages. Use tools such as AnswerThePublic or Ubersuggest to research questions. Alternatively, you can ask your customer support team to track the questions your customers ask.

5. OnSERP SEO
Ranking number one on Google's search results page no longer means being shown there first. Instead, Google plays more and more selected content at the so-called position 0 and plans to do so even more in the future. This content includes so-called Featured Snippets, Direct Answer boxes and carousels for images and videos. The user can therefore often find the answer to his question directly on the Google search results page, without having to go to the actual website that provides the information. This is good for users but potentially bad for you as a website owner.
Our tip: You can counteract this development in two ways. First, try tweaking the content on your website so that Google shows it at position 0; in many cases this will increase your traffic or at least attract attention to your brand. For this, the structured markup of FAQs and instructions will probably be important next year. Second, you should put a lot of emphasis on establishing your brand in the minds of your users. This will make you more independent of Google as a traffic source.

6. Rich Content
On the topic of OnSERP SEO, Google has also announced that visual content such as videos and images (so-called rich content) should become more important in search. For example, videos can increasingly be played as Featured Snippets in the future. In addition, Google has repeatedly tested upgrading snippets with small thumbnails this year. Search engines are getting better at recognizing images and videos. The breakthrough of visual search, where images can be uploaded as an alternative to search terms and the search engine actually searches for similar images based on the characteristics of the original, is also approaching. Google Lens already shows the potential of this type of search.
Our tip: Say goodbye to stock photos and create your own meaningful images and videos for your websites. At the moment, the file name, the alt tag and the caption should definitely be optimized for relevant keywords if they describe the content of the image well.
7. JavaScript and SEO
JavaScript is becoming increasingly popular and the number of websites based on JavaScript is growing. For a long time, however, JavaScript and SEO have not really gotten along, because Google and co. could not crawl JS code, which made this content useless to the search engines. With improved technology this has now changed, and Google, for example, is able to crawl and render JavaScript. From an SEO point of view, it can no longer be said that JavaScript is unsuitable for websites. In October 2018, John Mueller of Google also announced that the importance of JavaScript for SEOs will increase.
Our tip: Develop a basic understanding of JavaScript and learn what matters in the search engine optimization of JavaScript applications. Explore the possibilities of dynamic rendering and use tools like prerender.io to optimize the rendering of JS pages.
Content Marketing Trends 2019
In content marketing, too, you should keep an eye on a few topics in the new year. These include:

1. Livestreaming
How do you let your customers participate in big events, live, location-independent and in color? Via livestreams, of course. Streaming conference talks or announcing new products via stream is becoming increasingly popular. Many platforms currently offer the possibility to implement livestreams, including YouTube, Facebook, IGTV and Twitch. However, the enormous potential for companies to come into direct contact with their target group is not yet fully exploited. In addition to virtual participation in major events, livestreaming can also be used to provide insights into the daily life of a company or of particular employees.
Our tip: If you are looking for influencers for your next advertising campaign or a new distribution channel for short ads, take a look at the Twitch streaming platform. Or try a livestream yourself.

2. Stories First
Customers want approachable, likeable companies. Let customers use your stories to look into your business and participate in activities, because stories are no longer merely on the rise: they now outnumber the feeds of their respective platforms. Snapchat got the story trend going, Instagram adopted it very successfully, and Facebook, WhatsApp and other platforms use the popular feature. Facebook now wants to bring stories into the groups on its platform so that even non-followers with a specific interest can be addressed directly.
Our tip: The most popular stories are those that are close to real life. They do not have to be professionally produced, because the viewer wants to see what things are really like. Show your customers interesting, funny or new things about your company.

3. Transparency
According to a study, every second person finds influencers untrustworthy. Influencers, like marketers and sales agents, try to sell products. Above all, the result of the study shows that consumers are becoming increasingly critical. Exaggerated promises can even be perceived as fraudulent. In addition, consumers can find plenty of information and reviews on the internet, even on products with defects. That is why many companies now pursue the strategy of acting as transparently as possible. If a product obviously has a defect, it will not help to try to cover it up. Better to acknowledge your weaknesses and address negative feedback directly. In the long term, transparent communication also binds the customer to the brand.
Our tip: Communicate with your customers transparently and be open-minded about negative criticism. Also respond to feedback promptly, so that no shitstorm arises.
Conclusion
In order to be successful in digital marketing in 2019, SEOs and content marketers should not be put off by the well-known claims that have circulated in recent years: "SEO is dead!", the competition is too high, Google even joins in at position 0 as a competitor on the SERPs, and the old tactics have minimal effect. As the challenges for online marketers grow and complexity increases, the reward in the event of success will be even greater. Individual SEO and content tactics may be dying out, but optimizing websites for users and search engines will remain a relevant discipline in the new year as well; only the type of optimization is constantly changing. Include potential strategies such as voice search, visual search, OnSERP SEO or livestreaming in your marketing planning. To find the best tactics for successful optimization in these areas, you need to test what works and stay up to date. Knowledge is a hard currency, especially in the SEO and online marketing industry. For this reason, further education programs for employees and teams are essential; companies will benefit from this investment in the future.
Overcoming Angular SEO Issues Associated with AngularJS Framework
AngularJS is a javascript-based web application framework that provides dynamic functionality to extend HTML and enhance user-experience of a website by creating “single page applications,” or SPAs. Although this platform can be beneficial for user interaction and increased conversion rates, it can be challenging for a search engine optimization strategy and might significantly impact organic search traffic.
What is AngularJS?
AngularJS is like other javascript-based web platforms, such as Ember and Backbone, because it loads data asynchronously to the browser and executes primarily on the client-side as the user interacts with elements of the web page.
This provides increased flexibility with web presentation, especially when compared to using static HTML or dynamic template driven pages. SPAs serve data to the browser – where much of the processing happens – without the need to constantly reload the page after every user action.
As with most javascript programming, the major challenge comes down to the ability to render indexable content and internal navigation or links into the HTML code of the page, so search bots can crawl and index it.
What makes this even more confusing is that AngularJS was created by Google, so the simple association with the search giant fuels the perception that it’s SEO friendly. However, it can completely hide your page content from the search engines and tank your organic search traffic.
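To make that concrete, here is a minimal, hypothetical sketch (the module name, controller and API endpoint are invented for illustration) of why a crawler that does not execute javascript sees almost nothing of an AngularJS page:

// The HTML a crawler downloads for a typical AngularJS SPA is little more than a shell:
//   <body ng-app="shop" ng-controller="ProductCtrl">
//     <h1>{{ product.name }}</h1>
//   </body>
// The heading is empty in the page source; it is only filled in after this runs in a browser:
angular.module('shop', []).controller('ProductCtrl', function ($scope, $http) {
  // Data arrives asynchronously; a bot that never executes this request never sees the content.
  $http.get('/api/products/42').then(function (response) {
    $scope.product = response.data;
  });
});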
Who is using it?
The first version of Angular was officially released in 2012, but it has gained quite a bit of momentum over the last year, especially with enterprise-level clients. According to BuiltWith, 942 of the top 10,000 sites (ranked by traffic from Quantcast) are now using Angular, up dramatically from 731 sites only a year ago.
Examples of large websites that are now using Angular, in some fashion, include:
Walmart Grocery
The Guardian
CVS Pharmacy
Weather.com
VEVO
Land’s End
A larger list can be found at MadeWithAngular.
Why is AngularJS so great?
The Angular platform is great for creating dynamic web pages that users can interact with in real time. SPAs are very effective at decreasing bounce rate and keeping visitors on the site longer, which can increase conversion. AngularJS is very useful in situations where there are a variety of data, images, or other elements that need to change quickly within the web page.
Optimizing how users interact with your products can lead to more conversions and increased revenue. An example of where this could impact user experience significantly would be a clothing retail website. This type of site sells a variety of products that come in many different options: shirts, pants, shoes and other items in different colors, sizes, patterns, and other variations that need a visual presentation on the website to sell effectively. Retail clothing consumers want to see what they are buying before they complete the purchase.
Using static HTML or standard template-driven dynamic sites would require serving a different web page or reloading the page content to render each option. With AngularJS, all of these options can be preloaded and toggled with ng-show and ng-hide, allowing the customer to change the visual representation just by hovering over each option. The customer can hover over the desired color and style and see the actual product in real time, which helps them make a decision.
For this reason, it’s easy to see how increasing the speed in which the user can interact with products on the website could lead to much higher conversion rates and increased revenue.
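As a rough sketch of that pattern (the module name, image paths and option values below are invented purely for illustration), an AngularJS directive might preload every variant and let ng-show toggle which one is visible:

angular.module('productDemo', []).directive('productOptions', function () {
  return {
    template:
      '<div>' +
        // Every color variant is already in the page; ng-show merely toggles visibility,
        // so hovering swaps the image without a reload or an extra request.
        '<img ng-repeat="c in state.colors" ng-src="/img/shirt-{{c}}.jpg" ng-show="c === state.selected">' +
        '<button ng-repeat="c in state.colors" ng-mouseover="state.selected = c">{{ c }}</button>' +
      '</div>',
    controller: function ($scope) {
      // Keeping the selection on an object avoids ng-repeat child-scope shadowing.
      $scope.state = { colors: ['red', 'blue', 'green'], selected: 'red' };
    }
  };
});

The catch, as the next section explains, is that none of this markup exists in the initial HTML a crawler downloads.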
What is the problem with AngularJS?
As mentioned, the major challenge with any javascript programming always comes back to indexable content. Search engines have historically been challenged by sites that serve content via javascript. Google has become much more proficient with their ability to crawl and index javascript, and have even deprecated their previous recommendation to make AJAX crawlable. However, they are still not always reliable. The other search engines are even further behind with this ability.
Google still has trouble indexing certain aspects of javascript, especially Angular. Bartosz Góralewicz conducted an interesting study that tested Google’s capabilities with indexing various javascript-based platforms. His team created webpages using several popular frameworks:
AngularJS 1
AngularJS 2
React
Ember
jQuery
Vue
Javascript
They concluded that although Google has made significant progress, they still have challenges with indexing certain aspects of javascript, especially Angular.
How bad can it be?
A well-known healthcare company rolled out Angular on the e-commerce section of its website in 2015 for a significant number of product pages. The goal was to consolidate the many similar pages that varied only in color or size into a single product page, using Angular to show the various product options. The cached versions of these pages revealed what was actually rendered in the HTML when Google indexed them.
The result was a 40% drop in organic search traffic from the previous year. For a site this size, that can equate to hundreds of thousands of dollars or more in lost revenue.
How to overcome SEO issues with AngularJS
Search engines still need to see the content and elements of the page in the source code to guarantee that it will be indexed correctly. One current solution is to consider using a pre-rendering platform, such as Prerender.io. This is middleware that will crawl your web page, execute all javascript files, and host a cached version of your Angular pages on their content delivery network (CDN). When a request is received from a search engine bot it will show them the cached version, while the non-bot visitor will be shown the Angular page.
This may sound like cloaking, but Google has confirmed that it is not. It has also been stated by Google that as long as your intent is to improve user experience, and the content that’s available to the visitors is the same as what’s presented to Googlebot, you will not be penalized.
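As a rough illustration of how that middleware is typically wired in (assuming a Node/Express server in front of the Angular app and a self-hosted Prerender server on its default port 3000; the ports and paths here are assumptions, not a definitive configuration):

const express = require('express');
const prerender = require('prerender-node');

const app = express();

// prerender-node inspects the User-Agent (and _escaped_fragment_) of each request;
// known crawlers are proxied to the Prerender service, which returns fully rendered HTML,
// while regular visitors receive the normal Angular application.
app.use(prerender.set('prerenderServiceUrl', 'http://localhost:3000/'));

// Serve the built single page application for everyone else.
app.use(express.static('dist'));

app.listen(8080, function () {
  console.log('App listening on port 8080');
});

If you use the hosted Prerender.io service instead of a self-hosted instance, you would configure a prerenderToken rather than a service URL.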
Never leave the indexing of your site in the hands of the search engines.
Check the rendering of your web pages
Be aware of how your web pages render for search engines. Here are a few tools that can help:
Browseo – This is a great tool that not only renders the elements of the page, but also lists out total word count, internal and external links, and important <head> content such as HTML title, meta description and keywords, Twitter and Facebook tags, and a SERP preview.
Fetch as Google – From Search Console, you can fetch and render any page on your website to see what Google sees.
Search engine index – Check the most recent cached version of a web page by doing a “site:[domain]” query in Google or Bing. In the search results, locate the drop-down caret at the end of the URL, click it, and select “Cached” in Google or “Cached page” in Bing. This shows what the bot found on its last crawl of your web page. A quick scripted check is also sketched after this list.
Angular v4 – The latest version of Angular has been released, and it looks very promising for technical search engine optimization purposes. This version includes Angular Universal, which provides the functionality to generate all of the HTML of a page at a given point and can be deployed to static HTML or a CDN.
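For the scripted sanity check mentioned above, a rough sketch (assuming Node 18+ with the built-in fetch; the URL and the phrase being searched for are placeholders) is to request the same page with a normal and a Googlebot user agent and compare what comes back:

// Usage: node check-render.js https://www.example.com/products/42
const url = process.argv[2] || 'https://www.example.com/';

async function fetchAs(userAgent) {
  const response = await fetch(url, { headers: { 'User-Agent': userAgent } });
  return response.text();
}

(async () => {
  const browserHtml = await fetchAs('Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36');
  const botHtml = await fetchAs('Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)');

  // If pre-rendering is working, the bot response should contain the real page copy
  // and will usually be noticeably larger than the empty application shell.
  console.log('Browser HTML length:', browserHtml.length);
  console.log('Bot HTML length:    ', botHtml.length);
  console.log('Bot HTML mentions product copy:', botHtml.includes('Add to cart'));
})();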
Final thoughts…
Google continues to advance with the ability to crawl and index javascript, and the fact that they are “all in” with Angular shows that they will overcome it at some point. However, they explicitly state “… our rendering engine might not support all of the technologies a page uses.” They recommend adhering to “progressive enhancement” principles, which emphasizes focus on core web page content first, and other layers and features secondarily. This shows that they are not quite there yet.
The most important lesson we have learned in our years of providing technical SEO services is to never leave the indexing of your site in the hands of the search engines. Be in control of how your web pages render. We are very confident that, with Google 100% committed to Angular, it will be a very valuable platform for making websites more user-friendly; just make sure you don’t sacrifice your valuable organic search traffic while it is still evolving.
Ref:http://www.verticalmeasures.com/search-optimization/overcoming-angular-seo-issues-associated-with-angularjs-framework/
Link
Best Pre-rendering Testing Tool - Angular, Blazor, React Vue
Getthit.com, a pre-rendering testing tool for single-page application (SPA) websites. Supports Angular, React, Vue, Blazor, Prerender.io and more.
Photo
Alternative to https://prerender.io/ for Shopify? https://www.reddit.com/r/SEO/comments/f5kwhg/alternative_to_httpsprerenderio_for_shopify/
Are there any alternatives that will allow for prerendering pages on Shopify?
submitted by /u/seo4lyf, February 18, 2020 at 08:32 AM