WCM Content AI Analysis
Learn how to configure the AI analysis feature for WCM Content in a traditional, on-premises deployment. This topic also describes how to configure a content AI provider to be used for AI analysis. The AI analysis for WCM Content feature is available in HCL Digital Experience 9.5 Container Update CF213 and later.
Note
OpenAI ChatGPT is the supported content AI provider in CF213 and later. Custom AI implementation is supported in CF214 and later.
Content AI provider overview
OpenAI ChatGPT overview
OpenAI is the AI research and deployment company that offers ChatGPT. When you sign up with OpenAI, you receive API access through an API key. After signing up at https://platform.openai.com/playground, you can create a personal account with limited access or a corporate account. You can also use the playground to experiment with the API. A highlight of the API is that it accepts natural language prompts, similar to the ChatGPT chatbot.
For privacy, API availability, and other conditions, see the OpenAI website or contact the OpenAI team.
Config engine task for enabling content AI analysis
To enable content AI analysis:
- Connect to DX Core and run the following config engine task:
/opt/HCL/wp_profile/ConfigEngine/ConfigEngine.sh action-configure-wcm-content-ai-service -DContentAIProvider=CUSTOM -DCustomAIClassName={CustomerAIClass} -DContentAIProviderAPIKey={APIKey} -DWasPassword=wpsadmin -DPortalAdminPwd=wpsadmin
Note
- Possible values for the ContentAIProvider parameter are OPEN_AI or CUSTOM.
- If the ContentAIProvider value is set to OPEN_AI, the value set for the CustomAIClassName parameter is ignored. See the example command after this procedure.
- If the ContentAIProvider value is set to CUSTOM, set the custom content AI provider implementation class in the CustomAIClassName parameter. For example, enter com.ai.sample.CustomerAI. Refer to Configuring an AI class for a custom content AI provider for more information about how to implement a custom content AI provider class.
- Depending on the ContentAIProvider value, set the correct API key of the respective provider in the ContentAIProviderAPIKey parameter.
- Validate that all the required configurations are added.
- Log in to the WebSphere Application Server console.
- Verify that the AI_CLASS configuration property is added in WCM Config Service. To view the custom properties, click Resource > Resource Environment > Resource Environment Providers > WCM_WCMConfigService > Custom Properties. Possible values for AI_CLASS are:
  - com.hcl.workplace.wcm.restv2.ai.ChatGPTAnalyzerService (if the ContentAIProvider value is set to OPEN_AI)
  - The value set in the CustomAIClassName parameter (if the ContentAIProvider value is set to CUSTOM)
- Log in to the DX Portal.
- Verify that the Credential Vault slot named ai.auth is configured with the content AI provider's API key by clicking Administration > Security > Credential Vault > Manage System Vault Slot.
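For example, if OpenAI ChatGPT is the content AI provider, the task might be run as follows. The API key value is a placeholder for your own OpenAI key, and the CustomAIClassName parameter can be omitted because its value is ignored when ContentAIProvider is set to OPEN_AI:
/opt/HCL/wp_profile/ConfigEngine/ConfigEngine.sh action-configure-wcm-content-ai-service -DContentAIProvider=OPEN_AI -DContentAIProviderAPIKey={OpenAIAPIKey} -DWasPassword=wpsadmin -DPortalAdminPwd=wpsadmin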
Configuring an AI class for a custom content AI provider
Only administrators can configure an AI class to use a custom content AI provider.
- Write the custom content AI provider class by implementing the com.hcl.workplace.wcm.restv2.ai.IAIGeneration interface.
- Create the JAR file.
- Put the JAR file either in a custom shared library or in the /opt/HCL/wp_profile/PortalServer/sharedLibrary folder.
- Restart the JVM.
The following example of a custom content AI provider class can be used to call custom AI services for AI analysis. A sketch of one possible way to call an external AI service from such a class is shown after this procedure.
package com.ai.sample;

import java.util.ArrayList;
import java.util.List;

import com.hcl.workplace.wcm.restv2.ai.IAIGeneration;
import com.ibm.workplace.wcm.rest.exception.AIGenerationException;

public class CustomerAI implements IAIGeneration {

    @Override
    public String generateSummary(List<String> values) throws AIGenerationException {
        // Call the custom AI Service to get the custom AI generated summary
        return "AIAnalysisSummary";
    }

    @Override
    public List<String> generateKeywords(List<String> values) throws AIGenerationException {
        // Call the custom AI Service to get the custom AI generated keywords
        List<String> keyWordList = new ArrayList<String>();
        keyWordList.add("keyword1");
        return keyWordList;
    }

    @Override
    public Sentiment generateSentiment(List<String> values) throws AIGenerationException {
        // Call the custom AI Service to get the custom AI generated sentiment
        return Sentiment.POSITIVE;
    }
}
- Run the following config engine task:
/opt/HCL/wp_profile/ConfigEngine/ConfigEngine.sh action-configure-wcm-content-ai-service -DContentAIProvider=CUSTOM -DCustomAIClassName={CustomerAIClass} -DContentAIProviderAPIKey={APIKey} -DWasPassword=wpsadmin -DPortalAdminPwd=wpsadmin
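As referenced in the example above, the comments in CustomerAI only mark where a custom AI service is called. The following is a minimal sketch of one way such a call could be made over HTTP; the endpoint URL and the plain-text request and response format are assumptions, and in your provider class any checked exception should be rethrown as AIGenerationException to satisfy the IAIGeneration interface.

package com.ai.sample;

import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.List;

// Minimal sketch of an HTTP helper that a provider class such as CustomerAI
// could delegate to from generateSummary(). The endpoint URL and the
// plain-text request/response contract are hypothetical placeholders.
public class RemoteSummaryClient {

    // Hypothetical custom AI service endpoint; replace with your own service URL.
    private static final String SUMMARY_ENDPOINT = "https://ai.example.com/summarize";

    private final HttpClient httpClient = HttpClient.newHttpClient();

    public String summarize(List<String> values) throws IOException, InterruptedException {
        // Join the WCM field values passed by the analyzer into one text payload.
        String payload = String.join("\n", values);

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(SUMMARY_ENDPOINT))
                .header("Content-Type", "text/plain")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        HttpResponse<String> response = httpClient.send(request, HttpResponse.BodyHandlers.ofString());
        if (response.statusCode() != 200) {
            // The caller can translate this into an AIGenerationException.
            throw new IOException("AI service returned HTTP " + response.statusCode());
        }
        // Assumes the service returns the generated summary as the response body.
        return response.body();
    }
}

A provider class could then call new RemoteSummaryClient().summarize(values) from generateSummary and return the result, wrapping any failure in AIGenerationException.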
Config engine task for disabling content AI analysis
To disable content AI analysis:
- Connect to DX Core and run the following config engine task:
/opt/HCL/wp_profile/ConfigEngine/ConfigEngine.sh action-remove-wcm-content-ai-service -DWasPassword=wpsadmin -DPortalAdminPwd=wpsadmin
- Validate that all the required configurations are deleted.
- Log in to the WebSphere Application Server console.
- Verify that the AI_CLASS configuration property is deleted from WCM Config Service.
- Log in to the DX Portal.
- Verify that the Credential Vault slot named ai.auth is deleted.
Custom configurations for AI analysis
If AI analysis-related configurations require customization, log in to the WebSphere Application Server console to customize the custom properties in the WCM Config Service. Click Resource > Resource Environment > Resource Environment Providers > WCM_WCMConfigService > Custom Properties.
OpenAI ChatGPT-specific custom configurations
- OPENAI_MODEL: The currently supported AI model is text-davinci-003. However, the AI model can be overridden by setting this property.
- OPENAI_MAX_TOKENS: Set a positive integer value between 1 and 2048 for GPT-3 models like text-davinci-003. It specifies the maximum number of tokens that the model can output in its response.
- OPENAI_TEMPERATURE: Set a float value ranging from 0.0 to 1.0. This parameter in OpenAI's GPT-3 API controls the randomness and creativity of the generated text. Higher values produce more diverse and random output. Lower values produce more focused and deterministic output.
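For context, these custom properties correspond to fields of the OpenAI completions API used by models such as text-davinci-003. The following minimal sketch only illustrates that mapping; it does not show how HCL DX calls the provider internally, and the prompt text and environment variable name are placeholders.

package com.ai.sample;

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Illustration only: how OPENAI_MODEL, OPENAI_MAX_TOKENS, and OPENAI_TEMPERATURE
// map to fields of OpenAI's completions API.
public class OpenAICompletionExample {

    public static void main(String[] args) throws Exception {
        // The same key that is stored in the ai.auth vault slot; read from an
        // environment variable here purely for illustration.
        String apiKey = System.getenv("OPENAI_API_KEY");

        // Request body fields correspond to the custom properties described above.
        String body = "{"
                + "\"model\": \"text-davinci-003\","   // OPENAI_MODEL
                + "\"max_tokens\": 256,"                // OPENAI_MAX_TOKENS
                + "\"temperature\": 0.7,"               // OPENAI_TEMPERATURE
                + "\"prompt\": \"Summarize: HCL DX helps teams build digital experiences.\""
                + "}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.openai.com/v1/completions"))
                .header("Content-Type", "application/json")
                .header("Authorization", "Bearer " + apiKey)
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}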
After enabling content AI analysis in the DX deployment, use the WCM REST V2 AI Analysis API to call the AI analyzer APIs of the configured content AI provider.