How to add a robots.txt in DX container environments
Applies to
HCL Digital Experience 8.5 and higher
Introduction
This article describes how to add a robots.txt file in your HCL Digital Experience (DX) container environment to control how search engines crawl and index your site. A robots.txt file contains instructions that search engine crawlers use to determine which parts of your site to crawl and include in their search index. For more information, refer to Introduction to robots.txt.
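For illustration, a minimal robots.txt might look like the following. The paths and sitemap URL are placeholders only; adjust them to match your own site structure and crawling policy.

```
# Illustrative robots.txt: allow all crawlers, but keep a hypothetical
# /private/ path out of search results
User-agent: *
Disallow: /private/

# Optional: point crawlers to your sitemap (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml
```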
Instructions
Refer to the following steps to add a robots.txt file in DX container environments:
- Deploy DX on any supported cloud provider, such as Amazon Elastic Kubernetes Service (EKS), Azure Kubernetes Service (AKS), Google Kubernetes Engine (GKE), and Red Hat OpenShift.
- Build your web application (.war file) with the robots.txt file included in the root of the application's web content (see the packaging sketch after this list).
- Configure the application's context root to / (an example deployment descriptor follows the list).
- Deploy the application to WebSphere Application Server (WAS).
- Access your robots.txt using the load balancer host name or IP, for example: https://<loadbalancer hostname or IP>/robots.txt (see the verification example after this list).
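As a sketch of the packaging step, the robots.txt file sits directly in the root of the WAR's web content, alongside WEB-INF. The module name below (robots-web.war) is hypothetical.

```
robots-web.war
├── robots.txt        # served at https://<host>/robots.txt once the context root is /
└── WEB-INF/
    └── web.xml
```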
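The context root can be set when you deploy the module through the WAS administrative console, or in a deployment descriptor. The snippet below is a hypothetical fragment of an EAR's META-INF/application.xml showing one common way to map the web module to the root context; the module name is an assumption.

```xml
<!-- Hypothetical application.xml fragment: map the web module to "/"
     so robots.txt resolves at the top level of the host -->
<module>
  <web>
    <web-uri>robots-web.war</web-uri>
    <context-root>/</context-root>
  </web>
</module>
```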
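After deployment, you can confirm that the file is reachable through the load balancer, for example with curl. The host name is a placeholder; replace it with your load balancer host name or IP.

```
# Verify the file is served at the root of the site
curl https://<loadbalancer hostname or IP>/robots.txt

# Add -k if the environment uses a self-signed certificate
curl -k https://<loadbalancer hostname or IP>/robots.txt
```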