

Install multiple clusters in a single cell on HP-UX

Create a new, independent WebSphere Portal cluster in a cell where a WebSphere Portal cluster already exists.

In the following steps, Cluster A refers to the existing cluster, and Portal B refers to the new server profile that will serve as the basis for the new cluster definition, Cluster B.

Perform the following steps to install multiple clusters in a single cell:

  1. Upgrade Cluster A, including the Deployment Manager node, to the current, supported hardware and software levels and to the current version of WebSphere Portal.

  2. Configure Portal B; see the "Preparing the primary node" topic for the appropriate operating system for details.

    Maintain the same number of data sources, with names identical to the Cluster A data sources, so that the data source bindings in the applications can be resolved on every cluster in which they run. If you are implementing database sharing across the clusters, this applies to both the shared and non-shared domains; all domains should use the same data source names.
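
    For example, if the Cluster A release and JCR domains use data sources named wpdbDS and wpjcrDS (illustrative names; use the names actually defined in Cluster A, and assuming the <domain>.DataSourceName key convention in wkplc_comp.properties), the Portal B entries would be:

      # Illustrative values; the names must match the Cluster A data sources exactly
      release.DataSourceName=wpdbDS
      jcr.DataSourceName=wpjcrDS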

  3. Optional: Using the same database user ID and password for each identically named domain/data source allows the existing JAAS Authentication Aliases to remain functional. If unique database user IDs and passwords are required, additional manual configuration is needed to create new JAAS Authentication Aliases for each data source and to map them accordingly. Run...

      ./ConfigEngine.sh create-alias-multiple-cluster -DauthDomainList=release,jcr -DWasPassword=dmgr_password

    from...

      profile_root/ConfigEngine

    ...to create the new JAAS Authentication Aliases.

    where authDomainList is set to a comma-separated list of the domains that use unique database user IDs and passwords. The properties for those domains, including user ID and password, must be set correctly in the wkplc_comp.properties file.
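
    For illustration, assuming the <domain>.DbUser and <domain>.DbPassword key convention in wkplc_comp.properties, unique credentials for the release and jcr domains would be set like this (hypothetical user IDs and passwords):

      # Illustrative unique credentials for each domain
      release.DbUser=releaseusr
      release.DbPassword=releasepwd
      jcr.DbUser=jcrusr
      jcr.DbPassword=jcrpwd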

  4. Optional: If necessary, upgrade Portal B to the current cumulative fix.

  5. Run...

      ./ConfigEngine.sh mapped-app-list-create -DWasPassword=password

    from...

      profile_root/ConfigEngine

    ...to build an inventory list of Portal B enterprise applications and portlets.

  6. Stop the server1 and WebSphere_Portal servers on Portal B and ensure that the parameters for connecting to the Deployment Manager are set correctly in wkplc.properties, located in...

      profile_root/ConfigEngine/properties
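
    The exact parameter list depends on your topology; as a minimal sketch, the Deployment Manager connection entries typically include values such as the following (illustrative host name, port, and credentials):

      # Illustrative Deployment Manager connection settings
      WasRemoteHostName=dmgr01.example.com
      WasSoapPort=8879
      WasUserid=wasadmin
      WasPassword=dmgr_password
      PrimaryNode=true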

  7. If the Deployment Manager is configured to use a stand-alone LDAP user registry, update wkplc.properties, located in...

      profile_root/ConfigEngine/properties

    on the primary node with the stand-alone LDAP user registry property values from the Deployment Manager. You can find these settings under the VMM Stand-alone LDAP configuration heading.

    Ensure that you set WasUserid and WasPassword to the Deployment Manager user ID and password.
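
    As an illustrative sketch, assuming the standalone.ldap.* keys under that heading (substitute the values from the Deployment Manager's LDAP configuration):

      # Illustrative stand-alone LDAP settings copied from the Deployment Manager
      standalone.ldap.host=ldap.example.com
      standalone.ldap.port=389
      standalone.ldap.bindDN=cn=bind,dc=example,dc=com
      standalone.ldap.bindPassword=bind_password
      standalone.ldap.baseDN=dc=example,dc=com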

  8. Run...

      ./ConfigEngine.sh cluster-node-config-pre-federation -DWasPassword=password

    from the profile_root/ConfigEngine directory of the primary node.

  9. Run...

      ./ConfigEngine.sh map-apps-to-server -DWasPassword=password

    to determine which applications from the inventory list are no longer mapped to Portal B.

    The task uses the application profiles already in the cell to restore the mappings. Wait 30 minutes after running this task to allow all EAR files to expand before proceeding to the next step.

  10. To federate Portal B into the deployment manager cell:

    1. Ensure that all database parameters are correctly set, including passwords, in the wkplc_comp.properties and wkplc_dbtype.properties files.
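
      As an illustrative sketch for a DB2 environment, assuming the db2.* key convention in wkplc_dbtype.properties (adjust the driver class and library path for your database type and installation):

        # Illustrative JDBC driver settings in wkplc_dbtype.properties
        db2.DbDriver=com.ibm.db2.jcc.DB2Driver
        db2.DbLibrary=/opt/ibm/db2/java/db2jcc4.jar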

    2. Run...

        ./ConfigEngine.sh cluster-node-config-post-federation -DWasPassword=dmgr_password

    3. After running the cluster-node-config-post-federation task, wait at least 30 minutes to allow all EAR files to expand.

    4. Since the WebSphere Portal node is now using security settings from the Deployment Manager cell, update the WebSphere Portal administrative user ID and password to match an administrative user defined in the cell's user registry. Run...

        ./ConfigEngine.sh wp-change-portal-admin-user -DWasPassword=password -DnewAdminId=newadminid -DnewAdminPw=newpassword -DnewAdminGroupId=newadmingroup

      ...from...

        profile_root/ConfigEngine

      ...to update the WebSphere Portal administrative user ID if the Deployment Manager cell is using a different user registry.

      This task verifies the user against a running server instance.

      If the server is stopped, add the -Dskip.ldap.validation=true parameter to the task to skip the validation.
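
      For example, with the server stopped (illustrative IDs and passwords):

        ./ConfigEngine.sh wp-change-portal-admin-user -DWasPassword=password -DnewAdminId=newadminid -DnewAdminPw=newpassword -Dskip.ldap.validation=true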

      If stand-alone LDAP security is already enabled on the Deployment Manager cell, delay running the wp-change-portal-admin-user task until after the cluster-node-config-cluster-setup (static cluster) or cluster-node-config-dynamic-cluster-setup (dynamic cluster) task completes. After running the wp-change-portal-admin-user task, start or restart the WebSphere_Portal server so that the updated administrator user ID takes effect.

      The WebSphere Portal administrative user ID and administrative group must exist in the Deployment Manager cell's user registry before you run the wp-change-portal-admin-user task. -DnewAdminPw is an optional parameter that updates the administrative password in wkplc.properties if required.

    5. From the administrative console, click System Administration > Node Agents.

    6. Check the box next to the required node agent and then click Restart.

    7. Stop and restart the deployment manager.

    8. Stop and restart the WebSphere_Portal server on Portal B.

  11. Restart the WebSphere_Portal server on Cluster A. Verify that Cluster A is functionally intact by spot checking pages and portlets and then verify that Portal B is functionally intact by spot checking pages and portlets that you deployed into Portal B before it was federated. Any discrepancies or errors should be corrected now before continuing.

    If Portal B uses a non-default Portal administrative ID (that is, not wpsadmin), the server will not be functional until the cluster configuration is complete and the Portal administrative ID has been configured to match the cell's security settings.

  12. Choose one of the following options to define a cluster using Portal B as the basis:

    Static cluster: Perform the following steps to define a static cluster using Portal B as the basis:

    1. Run the ./ConfigEngine.sh cluster-node-config-cluster-setup -DWasPassword=dmgr_password task.

    2. Configure the cluster to use an external Web server to take advantage of features such as workload management. Start with the step about launching the Plug-ins installation wizard.

    3. Access the Web Content Management content through an external Web server:

      1. Log on to the deployment manager administrative console.

      2. Select Environment > WebSphere Variables.

      3. From the Scope drop-down menu, select the Node=nodename, Server=servername option to narrow the scope of the listed variables, where Node=nodename is the node that contains the application server.

      4. Update the WCM_HOST variable with the fully qualified host name used to access the WebSphere Portal server through the Web server or On Demand Router.

      5. Update the WCM_PORT variable with the port number used to access the WebSphere Portal server through the Web server or On Demand Router.
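
      For example, if users reach the portal through a Web server at www.example.com on port 80 (illustrative values):

        WCM_HOST=www.example.com
        WCM_PORT=80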

    4. Save changes and then restart the deployment manager, the node agent(s), server1, and the WebSphere_Portal servers.

    5. Change the task list settings to point to the cluster instead of a single server if using Portal business process integration. See one of the following sections in the Information Center for information:

      • Cross cell: Adding a BPI-enabled portal server to a managed cell in a cross-cell setup

      • Single cell: Adding a BPI-enabled portal server to a managed cell in a single-cell setup

    Dynamic cluster: Perform the following steps to define a dynamic cluster using Portal B as the basis:

    1. Log on to the deployment manager administrative console.

    2. Create a node group:

      1. Click New.

      2. Type the node group Name.

      3. Type any information about the node group in the Description text box.

      4. Click OK.

      5. Click the Save link to save your changes to the master configuration.

    3. To add members to the node group:

      1. Click System administration > Node groups.

      2. Click on the name of the node group that you want to add members to.

      3. Click Node group members under Additional Properties.

      4. Click Add.

      5. Select the primary node and then click Add.

      6. Click the Save link to save your changes to the master configuration.

    4. Create a dynamic cluster in the node group:

      1. Click Servers > Dynamic clusters.

      2. Click New.

      3. Select WAS from the Server Type pull-down menu and then click Next.

      4. Type the cluster name in the Dynamic cluster name text box and then click Next. Type the same value that you provided for the ClusterName parameter in wkplc.properties of the primary node.

      5. Remove all default membership policies and then click Subexpression builder.

      6. Enter the following information in the Subexpression builder window:

        1. Select and from the Logical operator pull-down menu.

        2. Select Nodegroup from the Select operand pull-down menu.

        3. Select Equals (=) from the Operator pull-down menu.

        4. Type the nodegroup name created in the previous step in the Value text box.

        5. Click Generate subexpression.

        6. Click Append.

      7. Click Preview membership to verify that all nodes included in the node group are displayed and then click Next.

      8. Click the Create the cluster member using an existing server as a template radio button and then select the WebSphere_Portal server for the primary node from the pull-down menu.

      9. Click Next.

      10. Specify the dynamic cluster properties and then click Next.

      11. Review the summary page to verify your actions and then click Finish.

    5. Define or verify the following parameters in wkplc.properties, located in...

        profile_root/ConfigEngine/properties

      1. Set ClusterName to the name of the new dynamic cluster.

      2. Verify that CellName is set to the deployment manager cell.

      3. Verify that NodeName is set to the local WebSphere Portal node.

      4. Set ServerName to the server that will be used for the dynamic cluster member on this node.

      5. Verify that PrimaryNode is set to true.
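
      For example, the relevant wkplc.properties entries might look like this (illustrative cell, node, and cluster names):

        # Illustrative values; use your own cell, node, and cluster names
        ClusterName=PortalClusterB
        CellName=dmgrCell01
        NodeName=portalNodeB
        ServerName=WebSphere_Portal
        PrimaryNode=true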

    6. Run...

        ./ConfigEngine.sh cluster-node-config-dynamic-cluster-setup -DWasPassword=dmgr_password

      from...

        profile_root/ConfigEngine

      ...to create the dynamic cluster.

    7. Access the Web Content Management content through an external Web server:

      1. Log on to the deployment manager administrative console.

      2. Select Environment > WebSphere Variables.

      3. From the Scope drop-down menu, select the Node=nodename, Server=servername option to narrow the scope of the listed variables, where Node=nodename is the node that contains the application server.

      4. Update the WCM_HOST variable with the fully qualified host name used to access the WebSphere Portal server through the Web server or On Demand Router.

      5. Update the WCM_PORT variable with the port number used to access the WebSphere Portal server through the Web server or On Demand Router.

    8. Save changes and then restart the deployment manager, the node agent(s), server1, and the WebSphere_Portal servers.

    9. For a Web server to connect to the On Demand Router (ODR), configure the Web server as a trusted proxy on the ODR. Refer to Configure a Web server as a trusted proxy server for instructions.

      Tip: You can also configure the ODR to dynamically update the Web server configuration when changes occur. Refer to Configure an on demand router to dynamically update the Web server plug-in configuration for instructions.

    10. Change the task list settings to point to the cluster instead of a single server if using Portal business process integration. See one of the following sections in the Information Center for information:

      • Cross cell: Adding a BPI-enabled portal server to a managed cell in a cross-cell setup

      • Single cell: Adding a BPI-enabled portal server to a managed cell in a single-cell setup

  13. Install any additional nodes to the cell to support additional cluster members for Cluster B identically to the primary node, and then federate them as secondary nodes and define cluster members on these nodes. For information about adding additional nodes, navigate to Installing WebSphere Portal > Setting up WebSphere Portal > Setting up a clustered production environment. Select the appropriate operating system and navigate to Preparing additional nodes. You can then choose to add an additional static or dynamic cluster and/or a vertical static or dynamic cluster.

  14. Restart the server1 and WebSphere_Portal servers on Cluster A and Cluster B.
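
    For example, a minimal sketch using the standard WebSphere Application Server commands from the profile_root/bin directory on each node (illustrative credentials):

      ./stopServer.sh WebSphere_Portal -username wpsadmin -password password
      ./startServer.sh WebSphere_Portal
      ./stopServer.sh server1 -username wpsadmin -password password
      ./startServer.sh server1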

Installation of Cluster B is complete. It is now independent of Cluster A, which means that Cluster B can have its own configuration, set of end-user portlets, and target community. Any applications that are common between Cluster A and Cluster B are most likely infrastructure or administration related; take special care to keep them common between the clusters and at the correct maintenance levels.


Parent topic:

Set up multiple clusters on HP-UX


Next topic:

Routing requests across clusters on HP-UX


Related tasks


Add a BPI-enabled portal server to a managed cell in a cross-cell setup
Add a BPI-enabled portal server to a managed cell in a single-cell setup