Set up farm instances using a shared configuration

Choose this option to set up the portal farm from a shared file system. With this option you maintain only one image, which simplifies farm maintenance: multiple copies of this one image can be used to update farm instances on a server-by-server basis, there are fewer resources to manage, and the environment stays simple. The disadvantage is that you cannot make changes on a per-server basis, and most changes require a server restart.

The first machine where WebSphere Portal is installed serves as the basis for the portal farm and is termed the original server. To set up farm instances using a shared configuration:

  1. Create an empty file system on a central file server with enough capacity for a full installation and then mount it on the original server as a writable file system in the location where WebSphere Portal should be installed; for example:

    • UNIX™: /opt/IBM

    • AIX®: /usr/IBM

    • IBM i: /QIBM/IBM
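
    For illustration only, the shared file system might be an NFS export mounted read-write on the original server. In this sketch the file server name and export path are assumptions, not values from this procedure:

      # On the original server (UNIX), create the mount point and mount the share read-write:
      mkdir -p /opt/IBM
      mount -t nfs -o rw fileserver.example.com:/export/portalfarm /opt/IBM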

  2. Install WebSphere Portal on the original server into the mounted file system, which results in a set of product binaries (AppServer and PortalServer directories) and a default configuration profile (wp_profile). WebSphere Portal will run off the mounted file system on the original server.

  3. Reconfigure the local instance to use a different TCP/IP hostname. This hostname should be either localhost or a hostname that the local hosts file defines as an alias of localhost, so that the local instance always references itself through the loopback address. To reconfigure the hostname (a complete worked example follows these substeps):

    1. Change to the WP_PROFILE/bin directory.

    2. Run the following task, where node_name is the node name assigned to the local node:

      • UNIX: ./wsadmin.sh -c "\$AdminTask changeHostName {-nodeName node_name -hostName localhost}; \$AdminConfig save" -conntype NONE

      • IBM i: wsadmin.sh -c "$AdminTask changeHostName {-nodeName node_name -hostName localhost}; $AdminConfig save" -conntype NONE

    3. Change to the WP_PROFILE/ConfigEngine directory.

    4. Run the following task to propagate the profile changes to the WebSphere Portal configuration:

      • UNIX: ./ConfigEngine.sh localize-clone -DWasPassword=foo

      • IBM i: ConfigEngine.sh localize-clone -DWasPassword=foo
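
    For example, on UNIX with a node named wpNode01 and the profile installed under /opt/IBM/wp_profile (both names are assumptions used only for illustration), the complete sequence for this step is:

      cd /opt/IBM/wp_profile/bin
      ./wsadmin.sh -c "\$AdminTask changeHostName {-nodeName wpNode01 -hostName localhost}; \$AdminConfig save" -conntype NONE
      cd /opt/IBM/wp_profile/ConfigEngine
      ./ConfigEngine.sh localize-clone -DWasPassword=foo

    The backslashes keep the shell from expanding $AdminTask and $AdminConfig before wsadmin receives them.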

  4. Configure this instance to represent the baseline configuration for the entire farm, including configuring the databases and the user registry. See the appropriate database and user registry topics under Setting up a stand-alone server in the Related tasks section below; a brief sketch of a typical database transfer follows.
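
    As a sketch only (the linked database topics are authoritative), transferring the portal data to an external database typically means editing the database properties files in the profile and then running the validate-database and database-transfer ConfigEngine tasks; the profile path here is an assumption:

      cd /opt/IBM/wp_profile/ConfigEngine
      ./ConfigEngine.sh validate-database -DWasPassword=foo
      ./ConfigEngine.sh database-transfer -DWasPassword=foo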

  5. Optional. If you are using IBM Web Content Manager, run the following task from the WP_PROFILE/ConfigEngine directory to set up the local messaging bus and queue:

      To avoid every farm instance receiving all content updates from the authoring system, one server outside the farm must be identified as the subscriber. This server also requires a message queue where content update messages are posted for the members of the farm; all farm servers listen for these messages to update their own content caches.

      Perform this step only once when setting up the portal farm, and only on the server identified as the WCM subscriber for the farm; do not perform it on each server in the farm.

      • UNIX: ./ConfigEngine.sh create-wcm-jms-resources -DWasPassword=foo

      • IBM i: ConfigEngine.sh create-wcm-jms-resources -DWasPassword=foo

      If you are using a remote content environment, see "Work with syndicators and subscribers" under Related concepts below.

  6. Optional. If you are using Web Content Manager, perform the following steps on the original server so that all farm instances will listen for content update messages:

      This step is performed only once on the original server.

      1. Open the prereq.wcm.properties file, located in the WP_PROFILE/PortalServer/wcm/config/properties/ directory, and update the following properties with the appropriate information for the farm servers (example values are sketched after this step):

          See the prereq.wcm.properties file for specific information about the required parameters.

            remoteJMSHost
            remoteJMSBootstrapPort
            remoteJMSNodeName
            remoteJMSServerName

      2. Run the following task to set up the remote messaging bus and queue:

        • UNIX: ./ConfigEngine.sh create-wcm-jms-resources-remote -DWasPassword=foo

        • IBM i: ConfigEngine.sh create-wcm-jms-resources-remote -DWasPassword=foo
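
      As an illustration, the four properties might be set as follows, where the host, bootstrap port, node name, and server name all describe the subscriber server identified in the previous step (all values are hypothetical; 2809 is the usual WebSphere bootstrap default):

        remoteJMSHost=subscriber.example.com
        remoteJMSBootstrapPort=2809
        remoteJMSNodeName=subscriberNode
        remoteJMSServerName=WebSphere_Portal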

  7. Enable the server to run in farm mode. The systemTemp parameter specifies where the server-specific directory is located; this directory contains all directories and files that the running portal instance writes to, such as logs and compiled pages:

    1. Create the target directory path. For example:

      • UNIX: /var/log/was_tmp

      • IBM i: /var/log/was_tmp

    2. Run the following task to enable the server to run in farm mode:

      • UNIX: ./ConfigEngine.sh enable-farm-mode -DsystemTemp=/var/log/was_tmp -DWasPassword=foo

      • IBM i: ConfigEngine.sh enable-farm-mode -DsystemTemp=/var/log/was_tmp -DWasPassword=foo
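
    Because the installation tree is shared, each farm server needs its own local, writable systemTemp directory. A minimal UNIX sketch of this step, assuming the portal runs as user wasadmin and the profile is under /opt/IBM/wp_profile (both assumptions):

      # Create the local temp directory and make it writable by the portal user:
      mkdir -p /var/log/was_tmp
      chown wasadmin /var/log/was_tmp
      cd /opt/IBM/wp_profile/ConfigEngine
      ./ConfigEngine.sh enable-farm-mode -DsystemTemp=/var/log/was_tmp -DWasPassword=foo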

  8. On each server you plan to add to the portal farm, mount the network accessible file system on a new system in the same location as on the original server to preserve the installation path configuration.
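
    For example, a farm server might mount the share automatically at boot with an /etc/fstab entry such as the following (the file server and export path are the same assumptions used in the sketch for step 1):

      fileserver.example.com:/export/portalfarm  /opt/IBM  nfs  defaults  0 0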

  9. To start or stop an instance of WebSphere Portal on any server in the farm, change to the WP_PROFILE/PortalServer/bin directory and run the appropriate script for your operating system:

    • Windows: start_WebSphere_Portal.bat and stop_WebSphere_Portal.bat

    • UNIX: ./start_WebSphere_Portal.sh and ./stop_WebSphere_Portal.sh

    • IBM i: start_WebSphere_Portal.sh and stop_WebSphere_Portal.sh


    Disable farm mode

After setting up a farm using a shared configuration, you might need to disable farm mode, which returns the original IBM WebSphere Portal instance that manages the shared file system to a regular, stand-alone server instance. You can then make system updates (for example, change the systemTemp value) and run the enable-farm-mode task again to re-enable the farm, or use the instance for a different purpose.
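
A sketch of the disable sequence, assuming the disable-farm-mode ConfigEngine task that complements enable-farm-mode, run from the ConfigEngine directory of the original server (the profile path is an assumption):

  cd /opt/IBM/wp_profile/ConfigEngine
  ./ConfigEngine.sh disable-farm-mode -DWasPassword=foo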


Parent

Choose the type of portal farm to create


Related concepts

Work with syndicators and subscribers


Related tasks

Set up a stand-alone production server

