404 error on accessing robots.txt in Adobe Commerce on cloud infrastructure
If you get a 404 error when accessing the robots.txt file in Adobe Commerce on cloud infrastructure, disable the Nginx rule that redirects /robots.txt requests to /media/robots.txt.
Description
Environment
Adobe Commerce on cloud infrastructure (all versions)
Issue
Requests to the robots.txt file return a 404 error from Nginx. The file is generated dynamically "on the fly" and is not accessible via the /robots.txt URL because Nginx has a rewrite rule that forcibly redirects all /robots.txt requests to the /media/robots.txt file, which does not exist.
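For illustration, the rewrite described above might look like the following in an Nginx configuration. This is a hypothetical sketch, not the exact rule from the managed configuration:

```nginx
# Hypothetical example of the problematic rule: every request for
# /robots.txt is rewritten to a static file under /media.
location = /robots.txt {
    # If media/robots.txt does not exist on disk, Nginx returns a 404
    # instead of letting the application generate robots.txt dynamically.
    rewrite ^/robots\.txt$ /media/robots.txt;
}
```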
Cause
This occurs when the Nginx configuration contains a rewrite rule that redirects /robots.txt requests to a static /media/robots.txt file instead of letting the application serve the dynamically generated content.
Resolution
To resolve the issue, disable the Nginx rule that redirects /robots.txt requests to the /media/robots.txt file.
- If self-service is not enabled (or if you're unsure whether it is), submit an Adobe Commerce Support ticket requesting the removal of the Nginx redirect rule from /robots.txt to /media/robots.txt.
- If self-service is enabled, upgrade ECE-Tools to version 2002.0.12 or later. Then, remove the Nginx redirect rule from your .magento.app.yaml file.
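When self-service is enabled, the redirect typically lives under the web locations rules in .magento.app.yaml. The fragment below is a hypothetical sketch of what such a rule might look like; the exact keys and values in your project may differ, so compare against your own file before editing:

```yaml
# Hypothetical .magento.app.yaml fragment. The nested rule shown here
# is the kind of redirect to remove so that /robots.txt is handled
# dynamically by the application rather than served as a static file.
web:
  locations:
    "/":
      passthru: "/index.php"
      rules:
        ^/robots\.txt$:
          passthru: "/media/robots.txt"   # remove this rule
```

After removing the rule and redeploying, a request to /robots.txt should reach the application and return the dynamically generated content instead of a 404.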
For detailed guidance, refer to Add site map and search engine robots in the Adobe Commerce developer documentation.
Related reading
- Block malicious traffic for Magento Commerce Cloud on Fastly level in our support knowledge base
- Search Engine Robots in our user guide