Configuring geographic mirroring with dedicated independent disk pools

 

To configure geographic mirroring, first configure your cross-site mirroring (XSM) environment and create the independent disk pool that you want to mirror.

This includes defining your primary and backup nodes within the recovery domain. Before using iSeries™ Navigator, you should also define at least one, and up to four, data port TCP/IP addresses. These addresses form one or more one-to-one bidirectional routes between the production copy nodes and the mirror copy nodes. Geographic mirroring allows you to maintain an exact copy of the independent disk pool on a system at a different location for protection and availability.
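For example, a data port route on a production copy node can be defined with the ADDTCPRTE (Add TCP/IP Route) command. The addresses below are hypothetical; substitute the data port addresses planned for your sites:

```
/* Add a route from this node to the mirror copy node's data port     */
/* subnet (10.2.2.0/24, hypothetical) through a local gateway         */
/* (10.1.1.254, hypothetical).                                        */
ADDTCPRTE RTEDEST('10.2.2.0') SUBNETMASK('255.255.255.0') NEXTHOP('10.1.1.254')
```

A corresponding route back to the production copy node's data port subnet should be defined on the mirror copy node so that each route is bidirectional.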

The following figure shows an example configuration for geographic mirroring. Primary Node A in New York City is the source system for the production copy of the independent disk pool that is dedicated to Node A. Node B is the backup system in Boston that is the target node for the mirror copy of the independent disk pool that is dedicated to Node B.

To configure geographic mirroring with iSeries Navigator, follow these steps:

  1. Plan and configure your data port TCP/IP routes.

  2. Create a cluster.

  3. Create the independent disk pool that you want to mirror.

  4. Create the device cluster resource group, also known as a switchable hardware group, for the independent disk pool that you just created:

    1. In iSeries Navigator, expand Management Central.

    2. Expand Clusters.

    3. Expand the cluster for which you need to add a switchable hardware group.

    4. Right-click Switchable Hardware, and select New Group.

    5. On the Create new or add existing disk pool dialog box, select No, add an existing switchable disk pool to the switchable hardware group.

  5. Define your geographic mirroring sites in the recovery domain:
    1. Right-click the newly created switchable hardware group and select Properties.
    2. Select the Recovery Domain tab.
    3. Select the primary node and click Edit.
    4. In the Site name field, specify the primary site for the production copy. Because this example contains two nodes, each node must have a different site name, with one node per site. In environments with more than two nodes, the site with more than one node must have an independent disk pool in either a hardware tower or an IOP that can be switched between nodes within the same site. In that case, the nodes within that site have the same site name.

      The site name for the production and mirror copy cannot be the same.

    5. Click Add to specify the data port IP addresses of the primary node.
    6. On the Edit Node dialog, specify the data port IP addresses for the primary node that you set up in step 1, and click OK. You can configure up to four data port IP addresses in a one-to-one manner (one port to one port), with each port independent of the others. Consider configuring multiple communication lines for redundancy and the highest throughput. Configure the same number of ports on the mirror copy node.
    7. On the General tab, click OK.
    8. Repeat the previous steps to specify the site name and IP addresses for the mirror copy node.

  6. After you have completed the XSM prerequisites, follow these steps to configure geographic mirroring:
    1. In iSeries Navigator, expand My Connections (or your active environment).
    2. Expand the primary node on the source system.
    3. Expand Configuration and Service > Hardware > Disk Units > Disk Pools.
    4. If the geographic mirroring columns are not displayed, click the disk pool that you want to mirror, select View > Customize this view > Columns, and then select the columns with the suffix Geographic Mirroring from the Columns available to display list.
    5. Right-click the Disk Pool you want to mirror, and select Geographic Mirroring > Configure Geographic Mirroring.
    6. Follow the wizard's instructions to configure geographic mirroring.

      The disk pools you select to geographically mirror must be in the same switchable hardware group. If you want to geographically mirror disk pools in more than one switchable hardware group, you need to complete the wizard one time for each switchable hardware group.

You have now configured geographic mirroring. Start the switchable hardware group to prepare the independent disk pool for use in this environment. Print your disk configuration in case of a recovery situation, and record the relationship between the independent disk pool name and number.

To start the switchable hardware group, follow these steps:

  1. Start the switchable hardware group to enable device resiliency for the switchable hardware group.

  2. Make the disk pool available. To access the disk units in an independent disk pool, you must make the disk pool available (vary it on).

  3. Wait for resynchronization to complete.

  4. Perform a test switchover. Before you add data to the disk pool, perform a test switchover to ensure the backup node can become the primary node and the primary node can become the backup node.

If you remove a node from a device domain after you configure geographic mirroring, the removed node takes with it any production copies or mirror copies that it owns. These copies are changed to non-geographically mirrored disk pools.

You can also use CL commands and APIs to create a switchable independent disk pool; however, some tasks still require iSeries Navigator. To configure geographic mirroring with CL commands and APIs, follow these steps:

  1. Plan and configure your data port TCP/IP routes on all nodes. The recommended configuration is as follows:

    • Node A should have routes to C and D.

    • Node B should have routes to C and D.

    • Node C should have routes to A and B.

    • Node D should have routes to A and B.

  2. Create the cluster with the nodes that you want using the CRTCLU (Create Cluster) command.

  3. Start the nodes in the cluster using the STRCLUNOD (Start Cluster Node) command.

  4. Create the device domain. You must create the device domain for all nodes involved in switching an independent disk pool using the ADDDEVDMNE (Add Device Domain Entry) command.

  5. Create the device descriptions. Device descriptions must be created on each node that will be in the cluster resource group (CRG). Use the CRTDEVASP (Create Device Description (ASP)) command. On the command line in the character-based interface, enter CRTDEVASP. In the Resource Name and the Device Description fields, enter the name of the independent disk pool you plan to create.

  6. Create the cluster resource group. Create the device CRG with the nodes, their roles in the recovery domain, and the independent disk pool device descriptions. You must also specify a site name and data port IP addresses for each node in the recovery domain. Use the CRTCRG (Create Cluster Resource Group) command.

  7. Create a disk pool on the node that owns the disk units by using the New Disk Pool wizard after the system is fully restarted. Make sure clustering is active before you start. Name the independent disk pool to match the device description resource name. As you add disk units, it is best to localize disk units in the same expansion unit or IOP, and do not spread the disk pool across parity sets. (This step requires iSeries Navigator.)

  8. Follow these steps to configure geographic mirroring:

    1. In iSeries Navigator, expand My Connections (or your active environment).

    2. Expand the System i™ that you want to examine, then expand Configuration and Service > Hardware > Disk Units > Disk Pools.

    3. Right-click the Disk Pool you want to mirror and select Geographic Mirroring > Configure Geographic Mirroring.

    4. Follow the wizard's instructions to configure geographic mirroring.

      The disk pools you select to geographically mirror must be in the same switchable hardware group. If you want to geographically mirror disk pools in more than one switchable hardware group, you will need to complete the wizard one time for each switchable hardware group.

  9. Print your disk configuration in case of a recovery situation. See how to display your disk configuration in Backup and Recovery. Also, record the relationship between the independent disk pool name and number.

  10. Start the cluster resource group to enable device resiliency using the STRCRG (Start Cluster Resource Group) command.

  11. To access the disk units in an independent disk pool, vary on the disk pool using the VRYCFG (Vary Configuration) command. Varying on the disk pool also reconnects connections, so that any new route definitions can take effect.

  12. Before you add data to the disk pool, use the CHGCRGPRI (Change CRG Primary) command to perform a test switchover. Use the test switchover to ensure that the backup node can become the primary node and the primary node can become the backup node.
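The CL steps above can be sketched as one command sequence. All names and addresses here are hypothetical (cluster MYCLU, nodes NODEA and NODEB, device domain MYDMN, disk pool GEOPOOL, CRG GEOCRG), and the exact recovery domain element layout for CRTCRG, including site names and data port IP addresses, should be checked against the command help on your release:

```
/* Steps 2-3: create the cluster and start both nodes                */
/* (node interface addresses are hypothetical).                      */
CRTCLU CLUSTER(MYCLU) NODE((NODEA ('10.1.1.1')) (NODEB ('10.2.2.1')))
STRCLUNOD CLUSTER(MYCLU) NODE(NODEA)
STRCLUNOD CLUSTER(MYCLU) NODE(NODEB)

/* Step 4: add every node that will switch the disk pool to the      */
/* same device domain.                                               */
ADDDEVDMNE CLUSTER(MYCLU) DEVDMN(MYDMN) NODE(NODEA)
ADDDEVDMNE CLUSTER(MYCLU) DEVDMN(MYDMN) NODE(NODEB)

/* Step 5: create a matching device description on each node.        */
CRTDEVASP DEVD(GEOPOOL) RSRCNAME(GEOPOOL)

/* Step 6: create the device CRG; the recovery domain entries carry  */
/* the node roles, site names, and data port IP addresses            */
/* (parameter layout omitted here; it is release-dependent).         */
CRTCRG CLUSTER(MYCLU) CRG(GEOCRG) CRGTYPE(*DEV) ...

/* Steps 10-12: after the disk pool exists and geographic mirroring  */
/* is configured, start the CRG, vary on the pool, and test a        */
/* switchover.                                                       */
STRCRG CLUSTER(MYCLU) CRG(GEOCRG)
VRYCFG CFGOBJ(GEOPOOL) CFGTYPE(*DEV) STATUS(*ON)
CHGCRGPRI CLUSTER(MYCLU) CRG(GEOCRG)
```

Running CHGCRGPRI once swaps the primary and backup roles; running it a second time switches the roles back after the test.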

 

Parent topic:

Configuring geographic mirroring with switchable independent disk pools

Related concepts
How geographic mirroring works
Communications requirements