# WCM Content AI Analysis

Learn how to configure the AI analysis feature for WCM Content in a traditional, on-premises deployment, and how to set up a content AI provider. This feature is available from HCL Digital Experience (DX) 9.5 Container Update CF213 onward. The following table outlines AI capabilities by release:
| Release | Feature updates |
|---|---|
| CF224 | AI Workflows and AI Translation are available. |
| CF221 | The default AI model is updated to gpt-4o. |
| CF214 | Custom AI implementations are supported. |
| CF213 | OpenAI ChatGPT is the supported content AI provider. |
## Content AI provider: OpenAI's ChatGPT
OpenAI is the AI research and deployment company that offers ChatGPT. When you sign up at https://platform.openai.com/playground, you can create a personal account with limited access or a corporate account, and OpenAI provides API access through an API key. You can also use the playground to experiment with the API. A highlight of the API is that it accepts natural language commands, similar to the ChatGPT chatbot.
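To make the API-key flow concrete, the following sketch builds (but does not send) the kind of HTTP request a client issues against OpenAI's chat completions endpoint, passing the key as a Bearer token. The key and prompt are placeholders:

```java
import java.net.URI;
import java.net.http.HttpRequest;
import java.net.http.HttpRequest.BodyPublishers;

public class OpenAiRequestSketch {

    // Builds (but does not send) a request against OpenAI's chat completions
    // endpoint; the API key is passed in the Authorization header.
    static HttpRequest buildRequest(String apiKey, String prompt) {
        String body = "{\"model\":\"gpt-4o\",\"messages\":"
                + "[{\"role\":\"user\",\"content\":\"" + prompt + "\"}]}";
        return HttpRequest.newBuilder()
                .uri(URI.create("https://api.openai.com/v1/chat/completions"))
                .header("Authorization", "Bearer " + apiKey)
                .header("Content-Type", "application/json")
                .POST(BodyPublishers.ofString(body))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = buildRequest("sk-placeholder", "Summarize this article.");
        System.out.println(req.uri()); // https://api.openai.com/v1/chat/completions
    }
}
```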
For privacy, API availability, and other conditions, see the OpenAI website or contact the OpenAI team.
## Configure the engine task for enabling content AI analysis

To enable content AI analysis:

1. Connect to DX Core and run the following ConfigEngine task:

   ```
   /opt/HCL/wp_profile/ConfigEngine/ConfigEngine.sh action-configure-wcm-content-ai-service -DContentAIProvider=CUSTOM -DCustomAIClassName={CustomerAIClass} -DContentAIProviderAPIKey={APIKey} -DWasPassword=wpsadmin -DPortalAdminPwd=wpsadmin
   ```

   Note:

   - Possible values for the `ContentAIProvider` parameter are `OPEN_AI` or `CUSTOM`.
   - If the `ContentAIProvider` value is set to `OPEN_AI`, the value set for the `CustomAIClassName` parameter is ignored.
   - If the `ContentAIProvider` value is set to `CUSTOM`, set the custom content AI provider implementation class in the `CustomAIClassName` parameter. For example, enter `com.ai.sample.CustomerAI`. Refer to Configuring an AI class for a custom content AI provider for more information about how to implement a custom content AI provider class.
   - Depending on the `ContentAIProvider` value, set the correct API key of the respective provider in the `ContentAIProviderAPIKey` parameter.
2. Validate that all the required configurations are added.

   1. Log in to the WebSphere Application Server console.
   2. Verify that the `AI_CLASS` configuration property is added in the WCM Config Service by going to Custom Properties: click Resource > Resource Environment > Resource Environment Providers > WCM_WCMConfigService > Custom Properties. Possible values for `AI_CLASS` are:
      - `com.hcl.workplace.wcm.restv2.ai.ChatGPTAnalyzerService` (if the `ContentAIProvider` value is set to `OPEN_AI`)
      - The value set in the `CustomAIClassName` parameter (if the `ContentAIProvider` value is set to `CUSTOM`)
   3. Log in to the DX Portal.
   4. Verify that the Credential Vault with the vault slot name `ai.auth` is configured with the content AI provider's API key by clicking Administration > Security > Credential Vault > Manage System Vault Slot.
## Configuring an AI class for a custom content AI provider

Only administrators can configure an AI class to use a custom content AI provider.

Note: If you wish to connect to an AI provider (such as LiteLLM) that is API-compatible with OpenAI models, you may be able to use the default provider class and change only its configuration parameters (see Custom configurations for AI analysis below).
1. Write the custom content AI provider class by implementing the `com.hcl.workplace.wcm.restv2.ai.IAIGeneration` interface. Optionally, starting with CF224, you can also implement the `com.hcl.workplace.wcm.restv2.ai.IAITranslation` interface.
2. Create the JAR file.
3. Put the JAR file either in a custom shared library or in the `/opt/HCL/wp_profile/PortalServer/sharedLibrary` folder.
4. Restart the JVM.

   The following example of a custom content AI provider class can be used to call custom AI services for AI analysis:

   ```java
   package com.ai.sample;

   import java.util.ArrayList;
   import java.util.List;
   import java.util.Locale;

   import com.hcl.workplace.wcm.restv2.ai.IAIGeneration;
   import com.hcl.workplace.wcm.restv2.ai.IAITranslation;
   import com.ibm.workplace.wcm.rest.exception.AIGenerationException;

   public class CustomerAI implements IAIGeneration, IAITranslation {

       @Override
       public String generateSummary(List<String> values) throws AIGenerationException {
           // Call the custom AI service to get the custom AI-generated summary
           return "AIAnalysisSummary";
       }

       @Override
       public List<String> generateKeywords(List<String> values) throws AIGenerationException {
           // Call the custom AI service to get the custom AI-generated keywords
           List<String> keyWordList = new ArrayList<String>();
           keyWordList.add("keyword1");
           return keyWordList;
       }

       @Override
       public Sentiment generateSentiment(List<String> values) throws AIGenerationException {
           // Call the custom AI service to get the custom AI-generated sentiment
           return Sentiment.POSITIVE;
       }

       public String translate(String value, Locale language) throws AIGenerationException {
           // Call the custom AI service to translate the value
           return "translated";
       }
   }
   ```
5. Run the following ConfigEngine task:

   ```
   /opt/HCL/wp_profile/ConfigEngine/ConfigEngine.sh action-configure-wcm-content-ai-service -DContentAIProvider=CUSTOM -DCustomAIClassName={CustomerAIClass} -DContentAIProviderAPIKey={APIKey} -DWasPassword=wpsadmin -DPortalAdminPwd=wpsadmin
   ```
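The sample class above returns fixed values; in practice each method wraps a call to your AI service. As a runnable illustration of the `generateKeywords` contract (a list of field values in, a keyword list out), here is a toy frequency-based picker with no HCL dependencies (purely illustrative, not product code):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class KeywordSketch {

    // Naive stand-in for an AI call: pick the most frequent words longer
    // than three characters across all supplied field values.
    static List<String> topKeywords(List<String> values, int limit) {
        Map<String, Long> counts = values.stream()
                .flatMap(v -> Arrays.stream(v.toLowerCase().split("\\W+")))
                .filter(w -> w.length() > 3)
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
        return counts.entrySet().stream()
                .sorted(Map.Entry.<String, Long>comparingByValue().reversed())
                .limit(limit)
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> fields = List.of("Cloud content management", "Manage cloud content easily");
        System.out.println(topKeywords(fields, 2)); // e.g. [cloud, content]
    }
}
```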
## Config engine task for disabling content AI analysis

To disable content AI analysis:

1. Connect to DX Core and run the following ConfigEngine task:

   ```
   /opt/HCL/wp_profile/ConfigEngine/ConfigEngine.sh action-remove-wcm-content-ai-service -DWasPassword=wpsadmin -DPortalAdminPwd=wpsadmin
   ```
2. Validate that all the required configurations are deleted.

   1. Log in to the WebSphere Application Server console.
   2. Verify that the `AI_CLASS` configuration property is deleted from the WCM Config Service.
   3. Log in to the DX Portal.
   4. Verify that the Credential Vault with the vault slot name `ai.auth` is deleted.
## Custom configurations for AI analysis

If AI analysis-related configurations require customization, log in to the WebSphere Application Server console and customize the relevant custom properties in the WCM Config Service: click Resource > Resource Environment > Resource Environment Providers > WCM_WCMConfigService > Custom Properties.
### OpenAI ChatGPT-specific custom configurations

- `OPENAI_MODEL`: The currently supported AI model is `gpt-4o`. However, the AI model can be overridden by setting this property.
- `OPENAI_MAX_TOKENS`: Set a positive integer value for GPT-3 models like `text-davinci-003`. It specifies the maximum number of tokens that the model can output in its response and defaults to `256`.
- `OPENAI_TEMPERATURE`: Set a positive float value ranging from `0.0` to `1.0`. This parameter controls the randomness and creativity of the generated text: higher values produce more diverse and random output; lower values produce more focused and deterministic output.
- `OPENAI_HOST`: The host to connect to for AI calls; defaults to `api.openai.com`. Configuring this property allows you to connect to a different service that offers an OpenAI-compatible API, such as LiteLLM.
- `OPENAI_SCHEME`: The scheme that AI calls use; defaults to `https`.
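Assuming the provider issues OpenAI-style completion requests, these properties correspond to the `model`, `max_tokens`, and `temperature` fields of the request body. The mapping below is an illustrative sketch, not the product's internal code:

```java
import java.util.Locale;

public class OpenAiParamsSketch {

    // Illustrative mapping of the WCM custom properties onto the body of an
    // OpenAI-style completion request; the product performs this internally.
    static String requestBody(String model, int maxTokens, double temperature) {
        return String.format(Locale.ROOT,
                "{\"model\":\"%s\",\"max_tokens\":%d,\"temperature\":%.1f}",
                model, maxTokens, temperature);
    }

    public static void main(String[] args) {
        // Documented defaults: gpt-4o and 256 max tokens.
        System.out.println(requestBody("gpt-4o", 256, 0.7));
        // -> {"model":"gpt-4o","max_tokens":256,"temperature":0.7}
    }
}
```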
After enabling content AI analysis in the DX deployment, use the WCM REST V2 AI Analysis API to call the AI analyzer APIs of the configured content AI provider.
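For example, a client might call the analysis endpoint as follows. The resource path shown is a hypothetical placeholder; take the actual URI and authentication scheme from the WCM REST V2 AI Analysis API documentation:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class AiAnalysisCallSketch {

    // Builds (but does not send) a GET against a hypothetical AI analysis
    // resource; "/wcm/api/v2/ai/analysis/{id}" is a placeholder path, not
    // the documented one.
    static HttpRequest buildAnalysisRequest(String dxHost, String contentId) {
        return HttpRequest.newBuilder()
                .uri(URI.create("https://" + dxHost + "/wcm/api/v2/ai/analysis/" + contentId))
                .header("Accept", "application/json")
                .GET()
                .build();
    }

    public static void main(String[] args) {
        System.out.println(buildAnalysisRequest("dx.example.com", "abc-123").uri());
    }
}
```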