Cracking the SEO Code for React SPAs: Why We Built a Custom Prerenderer at HashQ
Look, anyone who has been in the web architecture space for a while knows the drill. You start a new project, you pick React because the developer experience is unmatched, and UI development goes incredibly smoothly. Then comes the classic bottleneck: "Why aren't our dynamic pages indexing properly on Google?"
When we were architecting the new hashq.in, we faced this exact dilemma. We had deliberately moved to a React-only Single Page Application (SPA) powered by Vite. The bundle sizes were lean, and the client-side routing was lightning fast. But SPAs and web crawlers still have a bit of a love-hate relationship, especially when you are trying to rank for highly technical services.
You might be thinking, "Why not just use Next.js and call it a day?" It’s a fair question. Server-Side Rendering (SSR) frameworks are great, but they also bring significant overhead. For our specific use case at HashQ Technologies, we didn't want the infrastructural baggage of running Node servers just to serve static marketing and portfolio pages. We wanted the pure, decoupled elegance of a Vite SPA, but with the SEO muscle of a statically generated site.
Here is a dive into how we engineered a custom, ground-up solution.
The Architecture: Puppeteer to the Rescue
Instead of migrating our entire codebase to a heavy framework, we decided to handle the SEO requirement at the deployment pipeline level.
We built a custom Node.js script utilizing Puppeteer. If you aren't familiar, Puppeteer is a Node library that provides a high-level API to control headless Chrome.
Here is the architectural flow of how we do it:
- The Build Phase: We run our standard Vite build.
- The Headless Crawl: Our custom Node script spins up a headless browser and serves the built production files locally.
- Dynamic Prerendering: Puppeteer navigates through our critical routes (our services, the blog, the team pages). It waits for the React DOM to fully mount, ensuring all API calls are resolved and the UI is painted.
- HTML Extraction: Once the page is fully rendered, we scrape the final HTML, including all the injected `<meta>` tags and complex JSON-LD schemas.
- Static Generation: We save this fully-baked HTML into static files in our `dist` folder.
Injecting Structured Data (JSON-LD)
A visually rendered page is only half the battle. To really communicate with search engines, your technical SEO needs to be flawless.
Because we control the entire prerendering phase, we inject dynamic JSON-LD schema right before Puppeteer snaps the HTML. Whether it’s an article schema for this very blog, or an intricate LocalBusiness schema detailing our Thrissur headquarters, it gets baked directly into the static source code. To Googlebot, hashq.in looks exactly like a traditional, lightning-fast static website. To the user, once the initial HTML loads, React hydrates in the background and takes over the routing seamlessly.
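To make the injection step concrete, here is a hedged sketch of how a schema can be baked into the rendered HTML string before it is written to disk. The `injectJsonLd` helper and the `LocalBusiness` object below are illustrative examples, not our actual production markup:

```javascript
// Sketch: bake a JSON-LD block into the prerendered HTML string
// right before saving it. Schema contents are illustrative only.
function injectJsonLd(html, schema) {
  const tag =
    `<script type="application/ld+json">${JSON.stringify(schema)}</script>`;
  // Insert just before </head> so crawlers see it in the static source
  return html.replace('</head>', `${tag}</head>`);
}

// Example LocalBusiness schema (simplified for illustration)
const schema = {
  '@context': 'https://schema.org',
  '@type': 'LocalBusiness',
  name: 'HashQ Technologies',
  address: { '@type': 'PostalAddress', addressLocality: 'Thrissur' },
};

const page =
  '<html><head><title>HashQ</title></head><body></body></html>';
const baked = injectJsonLd(page, schema);
```

The same idea also works from inside the browser context via Puppeteer's `page.evaluate`, appending the script element to `document.head` before the snapshot is taken.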
The Trade-offs and the Wins
As an architect, every decision involves trade-offs.
- The Cost: Our CI/CD pipeline takes a couple of minutes longer to build. Puppeteer is resource-heavy.
- The Benefit: Zero server overhead. We host the entire platform on a standard CDN. The Time to First Byte (TTFB) is virtually nonexistent, and our Lighthouse scores are sitting comfortably in the green.
More importantly, it keeps our frontend stack completely decoupled from our backend ecosystem. Our Node.js, Express, and PostgreSQL microservices can focus entirely on crunching data and serving APIs, without worrying about rendering frontend views.
Looking Ahead
Building resilient, high-performance systems is what we do day in and day out at HashQ. This custom prerendering engine is just one small module we've built to optimize our workflow. In fact, we are currently abstracting a lot of these internal tools into a modular ecosystem we are calling the HashQ Stack—designed to make enterprise-grade Node.js and React development a lot less painful.
Stay tuned to this space. We’ll be open-sourcing some of these utilities soon. Until then, keep building robust systems.