Resolve robots.txt not updating or displaying default settings
This article provides a solution for cases where the robots.txt file in ۶Ƶ Commerce is configured correctly but still displays the default settings or fails to update. To resolve the issue, enable indexing by search engines.
Description
Environment
۶Ƶ Commerce on cloud infrastructure 2.4.x
Issue
Unable to change the default robots.txt settings in ۶Ƶ Commerce.
Steps to reproduce:
- Access the Admin panel.
- Navigate to Content > Design > Configuration, then edit the Custom instruction of robots.txt field.
- Add custom content (for example, the text "hello") and save the changes.
- Visit the robots.txt URL.
Expected result:
The robots.txt file displays the saved custom content.
Actual result:
The robots.txt file does not change and continues to show the default content.
Cause
Indexing by search engines is disabled, which prevents the custom robots.txt content from being applied.
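To confirm this is the cause, you can inspect what the environment actually serves before changing any settings. A minimal sketch, assuming your storefront is reachable at example.com (a placeholder; use your own domain); when indexing is restricted, the environment typically serves a default robots.txt rather than your custom content:

```shell
# Fetch the robots.txt that is actually being served.
# example.com is a placeholder; replace it with your storefront domain.
curl -s https://example.com/robots.txt

# Inspect the response headers; a restricted environment may also send an
# X-Robots-Tag header telling crawlers not to index the site.
curl -sI https://example.com/robots.txt | grep -i 'x-robots-tag'
```

If the body still shows the default rules after you saved custom content in the Admin panel, the restriction described above is the likely cause.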
Resolution
Method 1: On the ۶Ƶ Commerce Cloud Console, configure indexing by search engines as described in our developer documentation.
Method 2: Using the magento-cloud CLI, run the following command:
magento-cloud environment:info -p <cluster> -e production restrict_robots false
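The command above can be wrapped in a short check-and-update sequence. This is a sketch, assuming placeholder project and environment IDs; with `environment:info`, passing only a property name prints its current value, while passing a property and a value updates it:

```shell
# Placeholders; substitute your own project (cluster) ID and environment name.
PROJECT="your-project-id"
ENVIRONMENT="production"

# Show the current value of restrict_robots (true means indexing is disabled).
magento-cloud environment:info -p "$PROJECT" -e "$ENVIRONMENT" restrict_robots

# Allow indexing so the custom robots.txt content is applied.
magento-cloud environment:info -p "$PROJECT" -e "$ENVIRONMENT" restrict_robots false
```

After the property changes, the environment typically redeploys; once it finishes, revisit the robots.txt URL to verify that the custom content is now served.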