This is a somewhat longer text, since the issues are not strictly black and white, and since I wanted to provide as much evidence and insight as possible into the information I consumed to arrive at the report below.
First of all, this is partly related to issue #90, so some points here may overlap with that issue.
The report below was researched and put together 100% by a human. However, the formatting and phrasing were adjusted by an AI into a format similar to the one I regularly use for custom reports in my private CyberSec activities (a small group of CyberSec enthusiasts, experts, and CTF/general hacking buddies) to make it more readable and usable. The content itself is still entirely my own research. To make it even easier, I additionally include, alongside the report, some of my research summaries with links to where the information can be found and derived from.
I also need to clarify that, while I develop in different languages, I have almost no experience with web development and the insane iceberg behind front- and backend work. Keep that in mind if any inaccuracies occur; I tried my best to grasp the key elements and sources to give a solid understanding of the key issues ficsit.app might face.
Below is the report. Following it are my notes, which served as the basis for the AI-optimized summary (including some derived suggestions).
Click to expand/collapse the Report
SEO Audit / Report for Ficsit.app
Date: 17 September 2025
Reviewed by: External auditor (based on public HTML & current Google / SPA / JavaScript SEO best practices)
Findings
| Item | Observed Behavior | Problem / Impact |
|---|---|---|
| HTML shell only / client-side rendering (CSR) | The initial HTML (for the home page and main routes) delivers only an app shell: navigation, headers, buttons, etc. The actual content (e.g. lists of mods, descriptions) is loaded via JavaScript / GraphQL after page load. | Googlebot and other search engines see very little content on crawl. This leads to poor ranking and almost no keyword visibility, because there is no static content to index. |
| Generic metadata on all pages | The `<title>` / `<meta description>` are generic (“Home - SMR”, “Satisfactory Mod Repository”, etc.) and appear identical across routes. | When mod pages or guide pages have no unique title/description in the static HTML, search engines can’t distinguish them; duplicates hurt ranking and lower click-through rates. |
| Lack of server-side rendering or prerendered static content | No prerender or SSR evident. No fallback content or `<noscript>` sections. | Google sometimes delays or only partially renders JavaScript; content may not be indexed in time. |
| “Empty” cache / no archive | `cache:ficsit.app` returns nothing; likely no or only a limited cached snapshot by Google. | If Google deems pages not worth caching, visibility suffers. Also suggests that rendering is incomplete or unknown to Google. |
| Navigation & UI heavy; content light | The HTML includes banners, sidebars, and common UI elements, but minimal or no semantic content in the source. | Search engines favor pages with substantial text content, headings, and semantic structure. UI chrome doesn’t help ranking. |
| SPA routing issues | All routes appear to use the same shell until hydration; likely a similar structure for `/mods`, `/guides`, etc., but the static HTML doesn’t include route-specific content. | Search engines treat pages with near-identical content as duplicates or collapse them; low crawl depth, low route discovery. |
Cross-Reference: Best Practices & Current Standards (2025)
Based on recent, up-to-date sources:
- Google’s doc “Understand JavaScript SEO Basics” explains how Google crawls → renders → indexes JavaScript-based content and warns that reliance on CSR alone can hide content or delay indexing. (developers.google.com)
- Guides for SPAs in 2024-2025 (e.g. from Prerender.io, WeDoWebApps) emphasize enabling server-side rendering (SSR) or prerendering for content-heavy routes to ensure important content is available to crawlers without JavaScript execution. (prerender.io)
- Metadata (title, description, Open Graph tags) needs to be unique per page/route to help search engines and users. Duplicate or generic metadata across routes is a well-known SEO anti-pattern. (magnolia-cms.com)
- Structured data (schema.org / JSON-LD) helps search engines understand content types (mods, guides, tools) and can help with rich search results. (wedowebapps.com)
Recommendations
Here are specific, technical suggestions for improving SEO / search visibility.
- Enable SSR or Prerendering for Core Content Routes
  - For all major routes (`/mods`, `/mods/<mod-id>`, `/guides`, `/tools`), deliver fully rendered HTML with mod names, descriptions, authors, etc., either server-side or via a build-time prerender.
  - Use SvelteKit’s SSR or static prerender option (see the sketch below).
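A minimal sketch of what this could look like in the SvelteKit frontend, assuming a route layout such as `src/routes/mods/[modId]/`; the route name, the `/api/mods/...` endpoint, and the returned fields are assumptions for illustration, not the actual ficsit.app code:

```ts
// src/routes/mods/[modId]/+page.ts (hypothetical path)
import type { PageLoad } from './$types';

// Render this route on the server instead of shipping an empty shell;
// switch to `prerender = true` if the page can be built statically instead.
export const ssr = true;
export const prerender = false;

export const load: PageLoad = async ({ params, fetch }) => {
  // Fetching inside load() means the mod data ends up in the server-rendered
  // HTML that crawlers see, not only in the client-hydrated app.
  const res = await fetch(`/api/mods/${params.modId}`); // placeholder endpoint
  if (!res.ok) {
    throw new Error(`Failed to load mod ${params.modId}: ${res.status}`);
  }
  return { mod: await res.json() };
};
```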
- Unique Metadata Per Page / Route
  - Each mod page should have a `<title>` like “<mod name> – Satisfactory Mod Repository” rather than “Home – SMR”.
  - Add a unique `<meta name="description">` summarizing the mod content, versions, etc.
  - Also set Open Graph and Twitter card meta tags per page (images, summary); see the sketch below.
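As a sketch of the per-route metadata idea (again assuming a hypothetical `[modId]` route, endpoint, and field names), the values could be returned from `load()` and rendered into `<svelte:head>` by the page component:

```ts
// src/routes/mods/[modId]/+page.ts (hypothetical path)
import type { PageLoad } from './$types';

interface Mod {
  name: string;
  shortDescription?: string;
  logoUrl?: string;
}

export const load: PageLoad = async ({ params, fetch }) => {
  const res = await fetch(`/api/mods/${params.modId}`); // placeholder endpoint
  const mod: Mod = await res.json();

  return {
    mod,
    // The +page.svelte component would render these inside <svelte:head>, e.g.
    //   <title>{data.meta.title}</title>
    //   <meta name="description" content={data.meta.description} />
    //   <meta property="og:title" content={data.meta.title} />
    meta: {
      title: `${mod.name} – Satisfactory Mod Repository`,
      description: mod.shortDescription ?? `Download ${mod.name} for Satisfactory.`,
      image: mod.logoUrl,
    },
  };
};
```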
- Structured Data / Schema Markup
  - Use JSON-LD to describe mods (e.g. `SoftwareApplication`, `CreativeWork`), guides, and tools; include version, author, etc. (see the sketch below).
  - Include breadcrumbs, if applicable, so search results show navigation paths.
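A rough sketch of how the JSON-LD for a mod page could be built; using `SoftwareApplication` follows the suggestion above, but the exact property mapping (and the free-text `applicationCategory` value) is an assumption:

```ts
// Build a JSON-LD string for a mod page; the input shape is hypothetical.
interface ModInfo {
  name: string;
  description: string;
  author: string;
  version?: string;
}

export function buildModJsonLd(mod: ModInfo): string {
  const data = {
    '@context': 'https://schema.org',
    '@type': 'SoftwareApplication',
    name: mod.name,
    description: mod.description,
    author: { '@type': 'Person', name: mod.author },
    softwareVersion: mod.version,
    applicationCategory: 'GameModification', // free text, not a fixed schema.org enum
    operatingSystem: 'Windows',
  };
  // The page component would embed this in the document head as
  // <script type="application/ld+json">…</script>.
  return JSON.stringify(data);
}
```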
- Provide Fallback Content / `<noscript>`
  - It can help to include key information in a `<noscript>` section so that non-JS renderers and crawlers see something meaningful even before JS hydration.
- Ensure All Critical JS / CSS / API Endpoints Are Crawlable
  - Check `robots.txt` to ensure you are not blocking essential JS, CSS, or API endpoints used to render content.
  - Verify headers and status codes: no accidental `noindex`, `nofollow`, or 403/404 responses on content resources.
- XML Sitemap + Canonical Tags
  - Maintain an up-to-date XML sitemap listing all content routes (mod pages, guides, etc.) and submit it via Google Search Console (see the sketch below).
  - Use `<link rel="canonical">` appropriately to avoid duplicate-content issues.
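A minimal sitemap endpoint sketch for SvelteKit (`src/routes/sitemap.xml/+server.ts`); the mod-listing fetch is a placeholder for whatever the real GraphQL API exposes:

```ts
// src/routes/sitemap.xml/+server.ts (hypothetical path)
import type { RequestHandler } from './$types';

const SITE = 'https://ficsit.app';

export const GET: RequestHandler = async ({ fetch }) => {
  // Placeholder endpoint returning all mod ids/slugs.
  const res = await fetch('/api/mods?fields=id');
  const mods: { id: string }[] = await res.json();

  const staticRoutes = ['/', '/mods', '/guides', '/tools'];
  const urls = [
    ...staticRoutes.map((p) => `${SITE}${p}`),
    ...mods.map((m) => `${SITE}/mod/${m.id}`),
  ];

  const xml =
    '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    urls.map((u) => `  <url><loc>${u}</loc></url>`).join('\n') +
    '\n</urlset>';

  return new Response(xml, {
    headers: { 'Content-Type': 'application/xml' },
  });
};
```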
- Optimize JS Payload / Load Performance
  - While fixing content rendering, also work on Core Web Vitals: First Contentful Paint, Time to Interactive, Cumulative Layout Shift.
  - Use code splitting, lazy loading of non-critical JS, font preloading, and asset compression.
- Dynamic Metadata & Route-Based Titles in CSR
  - If CSR must be used, make sure route transitions update `<title>` and `<meta>` tags dynamically on the client, but also ensure that the server-side output includes them for crawlers.
- Test How Google Sees the Site
  - Use the URL Inspection tool in Google Search Console to see the rendered HTML for various routes.
  - Use tools like Lighthouse / WebPageTest to monitor what content is visible on first load.
  - Use Google’s Rich Results Test and the Schema Markup Validator to check structured data.
- Monitor Over Time
  - After making changes, monitor Search Console: coverage, search performance, keyword rankings.
  - Track clicks/impressions for content pages to see whether visibility improves.
Possible Order of Implementation (Suggested)
To reduce risk & get improvements earlier:
- Prerender the “top” mod pages and key content (most popular mods) so those have full static HTML quickly (see the sketch after this list).
- Add unique metadata + structured data to those pages.
- Iterate to expand prerender / SSR across all content routes.
- Confirm that `robots.txt` and caching are properly configured.
- Monitor via Search Console and adjust.
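For the first step, SvelteKit’s `entries` export can prerender a curated set of pages at build time while everything else stays server-rendered on demand; the “top mods” endpoint below is a placeholder and the route path is assumed:

```ts
// src/routes/mod/[modId]/+page.ts (hypothetical path)
import type { EntryGenerator, PageLoad } from './$types';

// 'auto': prerender the entries listed below, server-render everything else.
export const prerender = 'auto';

export const entries: EntryGenerator = async () => {
  // Placeholder: fetch the most popular mods at build time.
  const res = await fetch('https://api.ficsit.app/top-mods');
  const topMods: { id: string }[] = await res.json();
  return topMods.map((m) => ({ modId: m.id }));
};

export const load: PageLoad = async ({ params, fetch }) => {
  const res = await fetch(`/api/mods/${params.modId}`); // placeholder endpoint
  return { mod: await res.json() };
};
```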
Summary
If nothing else changes, ficsit.app is likely to remain mostly invisible in Google searches for keywords (like “satisfactory mods” or a specific mod name), despite technically being indexed via `site:ficsit.app`. The root cause is reliance on client-side rendering and the lack of static, crawlable content in the delivered HTML.
With moderate work (SSR/prerender + improved metadata + structured data), the site should gain significantly in search visibility.
Additionally, here are my original Notes with links and summaries from my research.
Click to expand/collapse the original Notes
1. HTML Content (CSR vs SSR)
- When you open the page source or curl the site, you mostly see the menu/navigation but not the actual mod texts. That’s basically the “app shell” thing, so it’s rendered on the client side (CSR) | SOURCE 1
- If you turn off JavaScript in Chrome, the page is basically empty. That confirms it.
- ASSUMPTION: Google tools (Search Console → URL Inspection) would most likely show that Google only sees an empty shell until JS loads. SOURCE 2 | SOURCE 3
So yeah, the site doesn’t give Google much in the raw HTML.
Additional Sources:
2. Titles and Meta Stuff
- If you look at `<title>` and `<meta name="description">`, a lot of pages have the same text.
- A crawler like Screaming Frog shows this pretty quickly – many pages don’t have unique titles or descriptions.
- That’s bad for SEO, because Google wants each page to have unique info there.
3. Server-Side Rendering / Prerender
- I checked the open-source frontend code, and it looks like some pages (such as mods, users, and guides) have prerender set to false, which, from what I read, means they are rendered dynamically on the server instead of being fully CSR (since they still rely on SSR as long as SSR is not disabled globally).
- Tools like Lighthouse also show a delay before content shows up, BUT this only happens for routes that are CSR or have dynamic hydration (I hope I understood that term and concept correctly).
- The Rich Results Test shows no results, as seen in this test using the official Google tool; however, a full analysis can only be done via Google Search Console | SOURCE 1
- Basically: only pages that are explicitly CSR or rely on heavy client-only rendering may be delayed for bots, which means they are worse for SEO.
- While search engines try to compensate for this by running scripts (with unknown limits), it is not fully reliable and more of a “hit and miss”.
4. Google Cache / Indexing
- Search Console is the best way to see if stuff is indexed. If you do `site:ficsit.app <mod name>` on Google and nothing shows up, then Google hasn’t really indexed it. Google does index our mod pages and guides, but from what I have looked through, those are the only pages it indexes; I have not seen the homepage (which is the most important page to have indexed, since Google, from my understanding, uses it as a sort of “hub” for internal link authority – not being able to index, or rank highly, the homepage hurts the page rank of the subpages as well. It may also treat the dynamic query URLs as duplicates or low-value, as described later).
- The old `cache:` operator doesn’t show anything either, so either no snapshot exists or Google has already phased it out fully. SOURCE
- Addition: I checked with schema.org and found nothing regarding structured data, as seen here.
5. Content vs Just UI
- In the raw HTML, there’s barely any real text (like descriptions or headings, which mainly show up in guides or mod descriptions).
- Crawlers like `lynx -dump` or Screaming Frog confirm that too.
- That means search engines see mostly buttons, menus, and empty divs. Not good. (Google does run JS, but since a lot of the site relies on CSR, the homepage likely looks empty to crawlers, even Google; they will only see the navigation, like a button that leads somewhere, but not the actual content behind it.)
6. SPA Routing
- Each route (like `/mod/xxx`, `/mod?p=xxx`, or `/guides/`) just loads the same HTML shell. The actual content comes in later via JS. (I hope I got that right; I only went through the frontend partially, since it’s a lot and understanding the entire frontend is beyond me. I just hope the actual devs can use this info in some way and understand what I mean.)
- Google has said before that if every route is just one shell, then it can’t properly crawl or index unique content per page.
- So unless the site is prerendered or SSR is used, Google won’t really know the difference between pages. SOURCE 1
- Addition: It seems ficsit.app uses dynamic query parameters like `?p=0` in `/mods/?p=0`. Technically, I think Google recognizes these as different URLs, and it could confuse Google if there is no canonical tag pointing to a preferred URL (but I’m not sure; see the sketch below).
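On that canonical-tag point, a small sketch of how a preferred URL could be derived for the paginated listing routes, assuming `p` is purely a pagination parameter and `p=0` is the first page (whether that is the right canonical policy is up to the devs):

```ts
// Derive a canonical URL for paginated listing routes such as /mods/?p=2.
// Assumes `p` is only used for pagination and that p=0 means the first page.
export function canonicalFor(url: URL): string {
  const canonical = new URL(url.pathname, 'https://ficsit.app');
  const page = url.searchParams.get('p');
  // Strip the parameter on the first page so "/mods" and "/mods/?p=0" don't
  // compete with each other; keep it on later pages so they stay indexable.
  if (page !== null && Number(page) > 0) {
    canonical.searchParams.set('p', page);
  }
  return canonical.toString();
}

// The page component would then emit, inside <svelte:head>:
//   <link rel="canonical" href={canonicalFor($page.url)} />
```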
7. Tools I Used / That Work
- Google Search Console (URL Inspection, Coverage reports).
- Browser DevTools → View Source, disable JS, check network.
- Screaming Frog (in JS mode).
- Lynx Dump
- Lighthouse / PageSpeed Insights for performance (though only as accurate as you trust it, since it can’t evaluate everything and is prone to inaccuracies, as they note on their own page).
- robots.txt check (to see if important files are blocked).
- https://schema.org/ to check for Structured Data
- https://search.google.com/test/rich-results to check if we have any Rich Results for the Website
- https://svelte.dev/docs/kit/seo for the seo docs
So yeah, the site looks fine for humans but for SEO it’s kind of invisible until JS runs, which is exactly what Google warns about with SPAs.
The only parts of the site where SEO really has something to work with are the guides and mod pages, where actual headings and so on exist. These also provide structured data, which is a great addition for better SEO results.
Maybe prerendering the homepage and the `/mods?p=` pages (the “key listing pages” – the ones that list mods and other stuff) will help, in addition to some other things I describe in the report.