This repository was archived by the owner on Apr 7, 2025. It is now read-only.

Conversation

@josh-wong
Member

Description

This PR adds a robots.txt file to prevent search engines from crawling the docs site. We need this because pages from this old Community docs site still appear in search results. Although those links redirect visitors to the ScalarDB docs site, having links to the Community docs site in search results is confusing.

Related issues and/or PRs

N/A

Changes made

  • Created a robots.txt file that tells web-crawling robots not to crawl the site.
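
The PR doesn't show the file's contents, but a robots.txt that blocks all compliant crawlers from the entire site would typically look like this (a sketch, not the actual committed file):

```txt
# Disallow all user agents from crawling any path on the site.
# "*" matches every crawler that honors the Robots Exclusion Protocol.
User-agent: *
Disallow: /
```

Note that robots.txt is advisory: well-behaved crawlers such as Googlebot respect it, but it does not remove pages already indexed; those typically drop out of results over time once crawling stops.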

Checklist

The following is a best-effort checklist. If any items in this checklist are not applicable to this PR or are dependent on other, unmerged PRs, please still mark the checkboxes after you have read and understood each item.

  • I have updated the side navigation as necessary.
  • I have commented my code, particularly in hard-to-understand areas.
  • I have updated the documentation to reflect the changes.
  • Any remaining open issues linked to this PR are documented and up-to-date (Jira, GitHub, etc.).
  • My changes generate no new warnings.
  • Any dependent changes in other PRs have been merged and published.

Additional notes (optional)

N/A

@josh-wong josh-wong added the enhancement New feature or request label Apr 7, 2025
@josh-wong josh-wong self-assigned this Apr 7, 2025
@josh-wong josh-wong merged commit 71d199c into main Apr 7, 2025
1 check passed
@josh-wong josh-wong deleted the add-robots.txt-file branch April 7, 2025 02:32
