Set up farm instances using a non-GPFS shared file system configuration
Choose this option to set up the portal farm from a shared file system. This option allows us to maintain only one image, which makes farm maintenance easier because multiple copies of that one image can be used to update farm instances on a server-by-server basis. There are fewer resources to manage and the environment is simpler. The disadvantage to this option is that changes cannot be applied on a per-server basis, and most changes require a server restart.
The first machine where IBM WebSphere Portal is installed is used as a basis for the portal farm and is termed the original server.
- Create an empty file system on a central file server with enough capacity for a full installation and then mount it on the original server as a writable file system in the location where WebSphere Portal should be installed; for example:
/usr/IBM
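A minimal sketch of this mount, assuming the central file server exports the installation file system over NFS; the host and export path (fileserver.example.com:/export/portal) are hypothetical:
# Run as root on the original server; the mount point must match the installation path.
mkdir -p /usr/IBM
mount -t nfs fileserver.example.com:/export/portal /usr/IBM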
- Install WebSphere Portal on the original server into the mounted file system, which results in a set of product binaries (PortalServer directory) and a default configuration profile (wp_profile). WebSphere Portal runs off the mounted file system on the original server.
- Re-configure the local instance to use a different TCP/IP hostname. This hostname should be either localhost or a hostname that the local hosts file aliases to localhost. This allows the local instance to always reference itself using the loopback address. To re-configure the hostname:
- cd WP_PROFILE/bin
- Run the following task, where node_name is the node name assigned to the local node:
wsadmin.sh -conntype NONE
$AdminTask changeHostName {-nodeName node_name -hostName localhost};
$AdminConfig save
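For example, assuming the local node is named portalNode01 (a hypothetical name; the actual node name appears as a directory under WP_PROFILE/config/cells/<cell_name>/nodes/), the session looks like this:
./wsadmin.sh -conntype NONE
wsadmin>$AdminTask changeHostName {-nodeName portalNode01 -hostName localhost}
wsadmin>$AdminConfig save
wsadmin>quit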
- cd WP_PROFILE/ConfigEngine
- Propagate the profile changes to the WebSphere Portal configuration:
./ConfigEngine.sh localize-clone -DWasPassword=foo
- Configure this instance to represent the baseline configuration for the entire farm, including configuring the databases and the user registry.
See the appropriate database and user registry topics under the Setting up a stand-alone server topic in the Related tasks section.
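As a hedged illustration of the database step, the documented validate-database and database-transfer ConfigEngine tasks are typically run from WP_PROFILE/ConfigEngine after the database properties files are completed; the password value is a placeholder:
./ConfigEngine.sh validate-database -DWasPassword=foo
./ConfigEngine.sh database-transfer -DWasPassword=foo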
- For IBM Web Content Manager, run the following task from WP_PROFILE/ConfigEngine to set up the local messaging bus and queue:
To avoid every farm server receiving all content updates directly from the authoring system, one server outside the farm needs to be identified as the subscriber. This server also requires a message queue where content update messages are posted for members of the farm. All farm servers listen for these messages to update their own content caches.
This step is completed only once when setting up the portal farm, not on each server in the farm; run it only on the server identified as the WCM subscriber for the portal farm.
./ConfigEngine.sh create-wcm-jms-resources -DWasPassword=foo
If we are using a remote content environment, see the "Work with syndicators and subscribers" link.
- If we are using Web Content Manager, complete the following steps on the original server so that all farm instances will listen for content update messages:
This step is performed only once on the original server.
- Edit WP_PROFILE/PortalServer/wcm/config/prepreq.wcm.properties and update the following properties with the appropriate information for the farm servers.
See the prepreq.wcm.properties file for specific information about the required parameters; an example with hypothetical values follows the list.
- remoteJMSHost
- remoteJMSBootstrapPort
- remoteJMSNodeName
- remoteJMSServerName
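For example, with hypothetical values (the subscriber host, node name, and server name are invented; 2809 is the default WebSphere bootstrap port, although portal profiles often use a shifted port):
remoteJMSHost=wcmsub.example.com
remoteJMSBootstrapPort=2809
remoteJMSNodeName=wcmsubNode01
remoteJMSServerName=WebSphere_Portal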
- Set up the remote messaging bus and queue:
./ConfigEngine.sh create-wcm-jms-resources-remote -DWasPassword=foo
- Enable the server to run in farm mode. The systemTemp parameter specifies where the server-specific directory is located; this directory will contain all directories and files that the running portal instance writes to, such as logs and compiled pages:
- Create the target directory path.
For example:
/var/log/was_tmp
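A minimal sketch of creating this directory, assuming the example path above and a hypothetical wpadmin user that runs the portal server; because the directory is server-specific, create it on each farm server's local disk:
mkdir -p /var/log/was_tmp
chown wpadmin /var/log/was_tmp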
- Run the following task to enable the server to run in farm mode:
./ConfigEngine.sh enable-farm-mode -DsystemTemp=/var/log/was_tmp -DWasPassword=foo
- On each server that you plan to add to the portal farm, mount the network-accessible file system in the same location as on the original server to preserve the installation path configuration; a sketch follows.
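A minimal sketch, reusing the hypothetical NFS export from the first step. Mounting read-only is an assumption here, on the grounds that farm mode redirects server-specific writes to the systemTemp directory; confirm the mount options that your version requires:
mkdir -p /usr/IBM
mount -t nfs -o ro fileserver.example.com:/export/portal /usr/IBM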
- On the farm client, change to the following directory:
cd WP_PROFILE/PortalServer/bin
- Start or stop an instance of WebSphere Portal from a farm server:
Operating system   Start task                    Stop task
Windows            start_WebSphere_Portal.bat    stop_WebSphere_Portal.bat
UNIX               ./start_WebSphere_Portal.sh   ./stop_WebSphere_Portal.sh
IBM i              start_WebSphere_Portal.sh     stop_WebSphere_Portal.sh
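For example, to start the farm instance on a UNIX farm server:
cd WP_PROFILE/PortalServer/bin
./start_WebSphere_Portal.sh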
- To use a Web server for load balancing, complete "Set up the HTTP server plug-in on a portal farm" next.