Customize robots.txt in dockerized Dataverse #10593
Labels: Component: Containers (anything related to cloudy Dataverse, shipped in containers) and SPA (these changes are required for the Dataverse SPA), added by ekraffmiller on May 28, 2024.
The default robots.txt installed in the Dataverse Docker image disallows all access to the site. It would be helpful to be able to configure the image with a customized robots.txt similar to the one we use in production.
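For context, "disallows all access" means a robots.txt like the following, which is the standard pattern that tells every crawler to fetch nothing:

```
# Block all crawlers from the entire site.
User-agent: *
Disallow: /
```

A production-style file would instead allow crawling of public pages and only disallow specific paths. The variant below is purely an illustrative sketch, not the actual production rules:

```
# Hypothetical permissive variant: allow everything except the API.
User-agent: *
Disallow: /api/
```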
This issue was encountered while testing the SPA: we need to verify that the JSON-LD added to the page header is accessible to web crawlers.
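One way to sanity-check this, sketched here with Python's standard-library robots.txt parser (the host and dataset page URL are hypothetical placeholders), is to ask whether the deployed robots.txt permits a crawler to fetch a page whose header carries the JSON-LD:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical URLs: substitute the instance and page under test.
ROBOTS_URL = "https://beta.dataverse.org/robots.txt"
PAGE_URL = "https://beta.dataverse.org/dataset.xhtml?persistentId=doi:10.5072/FK2/EXAMPLE"

rp = RobotFileParser()
rp.set_url(ROBOTS_URL)
rp.read()  # fetch and parse the live robots.txt

# can_fetch() applies the parsed rules for a given user agent.
for agent in ("Googlebot", "*"):
    status = "allowed" if rp.can_fetch(agent, PAGE_URL) else "blocked"
    print(f"{agent}: {status} for {PAGE_URL}")
```

If the default disallow-all file is in place, both agents will report "blocked", so crawlers would never see the JSON-LD regardless of whether it is rendered correctly.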
SPA issue: IQSS/dataverse-frontend#350
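Until the image supports this directly, one container-level workaround might be to bind-mount a custom robots.txt over the default at startup. The sketch below assumes a docker-compose setup, and the in-container target path (a Payara docroot) is only a guess at where the image serves robots.txt from, not a documented location:

```yaml
# Sketch of a docker-compose override that bind-mounts a custom
# robots.txt over the image default. The target path is an
# assumption; verify it against the actual image layout first.
services:
  dataverse:
    volumes:
      - ./robots.txt:/opt/payara/appserver/glassfish/domains/domain1/docroot/robots.txt:ro
```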
Comments

The SPA issue that initiated this PR is related to testing beta.dataverse.org, which, as @pdurbin explained, does not use containers (https://dataverse.zulipchat.com/#narrow/stream/375812-containers/topic/how.20to.20configure.20robots.2Etxt.3F/near/441064337). So although this may be a nice-to-have for container configuration, it is not needed for the SPA, and I'm removing the label.