Dell EqualLogic PS4100 Hardware Manual

Dell EqualLogic PS Series iSCSI Storage Arrays With
Microsoft Windows Server Failover Clusters
Hardware Installation and Troubleshooting Guide


Summary of Contents for Dell EqualLogic PS4100

  • Page 1 Dell EqualLogic PS Series iSCSI Storage Arrays With Microsoft Windows Server Failover Clusters Hardware Installation and Troubleshooting Guide...
  • Page 2: Notes, Cautions, And Warnings

    Information in this publication is subject to change without notice. © 2012 Dell Inc. All rights reserved. Reproduction of these materials in any manner whatsoever without the written permission of Dell Inc. is strictly forbidden. Trademarks used in this text: Dell...
  • Page 3: Table Of Contents

    Contents
    Notes, Cautions, and Warnings .......... 2
    1 Introduction .......... 5
        Cluster Solution .......... 5
        Cluster Hardware Requirements .......... 6
        Cluster Nodes .......... 6
        Cluster Storage .......... 6
        Network Configuration Recommendations .......... 8
        Supported Cluster Configurations .......... 9
        iSCSI SAN-Attached Cluster .......... 9
        Other Documents You May Need .......... 9
    2 Cluster Hardware Cabling .......... 11
        Mouse, Keyboard, And Monitor Cabling Information .......... 11...
  • Page 4
    4 Troubleshooting .......... 43
    5 Cluster Data Form .......... 47
    6 iSCSI Configuration Worksheet .......... 49...
  • Page 5: Introduction

    Introduction A Dell Failover Cluster combines specific hardware and software components to provide enhanced availability for applications and services that run on your cluster. A Failover Cluster reduces the possibility of any single point of failure within the system that can cause the clustered applications or services to become unavailable. It is recommended that...
  • Page 6: Cluster Hardware Requirements

    Component: Cluster nodes
    Minimum Requirement: A minimum of two identical Dell PowerEdge systems is required. The maximum number of nodes supported depends on the variant of the Windows Server operating system used in your cluster. The variant of the Windows Server operating system installed on your cluster nodes determines the minimum required amount of system RAM.
  • Page 7 Clusters and stand-alone systems can share one or more supported storage arrays. A Dell EqualLogic PS series storage array includes redundant, hot-swappable disks, fans, power supplies, and control modules. A failure in one of these components does not cause the array to go offline; the failed component can be replaced without bringing the storage array down.
  • Page 8: Network Configuration Recommendations

    Enable Flow Control on switches and NICs: Enable Flow Control on each switch port and NIC that handles iSCSI traffic. Dell EqualLogic PS Series arrays correctly respond to Flow Control.
  • Page 9: Supported Cluster Configurations

    • The Getting Started Guide provides an overview of initially setting up your system.
    • The Dell Failover Clusters with Microsoft Windows Server 2003 Installation and Troubleshooting Guide provides more information on deploying your cluster with the Windows Server 2003 operating system.
  • Page 10 Release Notes — Provides the latest information about the Dell EqualLogic PS Series arrays and/or Host Integration Tools. – QuickStart — Describes how to set up the array hardware and create a Dell EqualLogic PS Series group. – Group Administration — Describes how to use the Group Manager graphical user interface (GUI) to manage a Dell EqualLogic PS Series group.
  • Page 11: Cluster Hardware Cabling

    For some environments, consider having backup generators and power from separate electrical substations. The following figures illustrate recommended methods for power cabling a cluster solution consisting of two Dell PowerEdge systems and one storage array. To ensure redundancy, the primary power supplies of all the components are grouped into one or two circuits and the redundant power supplies are grouped into a different circuit.
  • Page 12: Cluster Cabling Information For Public And Private Networks

    5. redundant power supplies on one AC power strip

    Figure 3. Power Cabling Example With Two Power Supplies in the PowerEdge Systems
    1. cluster node 1
    2. cluster node 2
    3. primary power supplies on one AC power strip
    4. EqualLogic PS series storage array
    5.
  • Page 13: For Public Network

    Figure 4. Example of Network Cabling Connection
    1. public network
    2. cluster node 1
    3. public network adapter
    4. private network adapter
    5. private network
    6. cluster node 2

    For Public Network
    Any network adapter supported by a system running TCP/IP may be used to connect to the public network segments. You can install additional network adapters to support additional public network segments or to provide redundancy in the event of a faulty primary network adapter or switch port.
  • Page 14: Nic Teaming

    Method: Optical
    Hardware Components: Gigabit or 10 Gigabit Ethernet network adapters with LC connectors in both nodes.
    Connection: Connect a multi-mode optical cable between the network adapters in both nodes.

    Dual-Port Network Adapters Usage
    You can configure your cluster to use the public network as a failover for private network communications. If dual-port network adapters are used, do not use both ports simultaneously to support both the public and private networks.
  • Page 15 Figure 5. Two-Node iSCSI SAN-Attached Cluster
    1. public network
    2. cluster node
    3. private network
    4. iSCSI connections
    5. Gigabit or 10 Gigabit Ethernet switches
    6. storage system

    Gigabit NICs can access the 10 Gigabit iSCSI ports on the EqualLogic PS4110/PS6010/PS6110/PS6510 storage systems if any one of the following conditions exist: •...
  • Page 16
    1. Connect a network cable from network switch 0 to Ethernet 0 on control module 0.
    2. Connect a network cable from network switch 1 to Ethernet 0 on control module 1.
    3. Repeat steps 1 and 2 to connect the additional Dell EqualLogic PS4110/PS6110 storage array(s) to the iSCSI switches.
  • Page 17 (left Ethernet 0 port), use CAT6 or better cable. With the SFP+ port (right Ethernet 0 port), use fiber optic cable acceptable for 10GBASE-SR or twinax cable. For more information, see the figures below.
    Figure 7. Cabling an iSCSI SAN-Attached Cluster to a Dell EqualLogic PS4110 Storage Array
    1. cluster node 1
    5. Dell EqualLogic PS4110 storage system
    2.
  • Page 18 NOTE: For the PS4100 storage array, connecting all four cables in steps 1 through 4 provides the highest level of cable redundancy. The array also works correctly with only two cables; in that case, you can skip either step 1 or step 2, and either step 3 or step 4.
  • Page 19 Figure 10. Cabling an iSCSI SAN-Attached Cluster to a Dell EqualLogic PS4100 Storage Array
    1. cluster node 1
    2. cluster node 2
    3. switch 0
    4. switch 1
    5. Dell EqualLogic PS4100 storage system
    6. control module 0
    7. control module 1

    Figure 11.
  • Page 20
    Connect a network cable from network switch 0 to Ethernet 2 on control module 0.
    Repeat steps 1 to 6 to connect the additional Dell EqualLogic PS5000/PS5500 storage array(s) to the iSCSI switches. For more information, see the figures below.
  • Page 21 Figure 13. Cabling an iSCSI SAN-Attached Cluster to a Dell EqualLogic PS5000 Storage Array
    1. cluster node 1
    2. cluster node 2
    3. switch 0
    4. switch 1
    5. Dell EqualLogic PS5000 storage system
    6. control module 1
    7. control module 0

    Figure 14.
  • Page 22
    Connect a network cable from network switch 0 to Ethernet 2 on control module 0.
    Connect a network cable from network switch 0 to Ethernet 3 on control module 0.
    Repeat steps 1 to 8 to connect the additional Dell EqualLogic PS6000/PS6100/PS6500 storage array(s) to the iSCSI switches.
  • Page 23 Figure 16. Cabling an iSCSI SAN-Attached Cluster to a Dell EqualLogic PS6100 Storage Array
    1. cluster node 1
    2. cluster node 2
    3. switch 0
    4. switch 1
    5. Dell EqualLogic PS6100 storage system
    6. control module 0
    7. control module 1

    Figure 17.
  • Page 24 NOTE: For the PS4100 storage array, connecting all four cables in steps 1 through 4 provides the highest level of cable redundancy. The array also works correctly with only two cables; in that case, you can skip either step 1 or step 2, and either step 3 or step 4.
  • Page 25
    Connect a network cable from network switch 0 to Ethernet 2 on control module 0.
    Connect a network cable from network switch 0 to Ethernet 3 on control module 0.
    Repeat steps 1 to 8 to connect the additional Dell EqualLogic PS6000/PS6100/PS6500 storage array(s) to the iSCSI switches.
  • Page 27: Preparing Your Systems For Clustering

    Damage due to servicing that is not authorized by Dell is not covered by your warranty. Read and follow the safety instructions that came with the product.
  • Page 28: Installation Overview

    Failover Cluster successfully. It is recommended that you establish server roles prior to configuring a Failover Cluster, depending on the operating system configured on your cluster. For a list of supported Dell PowerEdge systems, iSCSI NICs, operating system variants, and specific driver and firmware versions, see the Dell Cluster Configuration Support Matrices at dell.com/ha.
  • Page 29 Running The Remote Setup Wizard
    The Remote Setup Wizard simplifies Dell EqualLogic PS Series group (SAN) and Windows system setup. After you install the Host Integration Tools, you can choose to launch the Remote Setup Wizard automatically, or you can run it later by clicking Start → Programs →...
  • Page 30
    Prompt: IP address
    Description: Network address for the Ethernet 0 network interface.

    Prompt: Netmask
    Description: Combines with the IP address to identify the subnet on which the Ethernet 0 network interface resides.

    Prompt: Default gateway
    Description: Network address for the device used to connect subnets and forward network traffic beyond the local network.
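To make the prompts concrete, the following is a hedged sketch of what a first-time serial-console setup session might look like; every value shown (member name, addresses) is a placeholder, not taken from this guide:

```
Enter the network configuration for the array:
  Member name []: ps-member01            <- placeholder
  Network interface [eth0]: eth0
  IP address for network interface []: 192.168.100.10
  Netmask [255.255.255.0]: 255.255.255.0
  Default gateway [192.168.100.1]: 192.168.100.1
```

The IP address and netmask entered here correspond to the Ethernet 0 values described in the prompt table above.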
  • Page 31 Initializing An Array And Expanding A Group Using the Remote Setup Wizard, you can initialize a Dell EqualLogic PS Series array and add the array to an existing group. In addition, the wizard configures the group IP address as an iSCSI discovery address on the computer.
  • Page 32 Computer Access To A Group You can use the Remote Setup Wizard to enable Windows computer access to a Dell EqualLogic PS Series group. You can also use the wizard to modify existing group access information on a computer (for example, if the group IP address changed).
  • Page 33: Installing The Microsoft Iscsi Software Initiator

    For information on the supported iSCSI Software Initiator version, see the Dell Cluster Configuration Support Matrices at dell.com/ha. If the version in the Host Integration Tools is not listed in the Support Matrix, download and install the Microsoft iSCSI Software Initiator using the following steps:...
  • Page 34: Modifying The Registry Settings

    1. Use a web browser and go to the Microsoft Download Center website at microsoft.com/downloads.
    2. Search for iscsi initiator.
    3. Select and download the latest supported initiator software and related documentation for your operating system.
    4. Double-click the executable file. The installation wizard launches.
    5. In the Welcome screen, click Next.
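The registry changes that the "Modifying The Registry Settings" heading refers to are not reproduced in this excerpt. As an illustrative, unconfirmed example of the kind of change made in iSCSI cluster setups (not a value confirmed by this guide; verify against the Support Matrix before applying), the disk I/O time-out can be raised from an elevated command prompt:

```
reg add HKLM\SYSTEM\CurrentControlSet\Services\Disk /v TimeOutValue /t REG_DWORD /d 60 /f
```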
  • Page 35 • Creating Access Control Records. • Connecting Hosts to Volumes. • Advanced Storage Features. Running The Group Manager GUI You can use the Group Manager graphical user interface (GUI) to configure the storage array(s) and perform other group administration tasks using one of the following methods: •...
  • Page 36 iSCSI target name that is generated for the volume. Host access to the volume is always through the iSCSI target name, not the volume name. – Description — The volume description is optional. – Storage pool — All volume data is restricted to the members that make up the pool. By default, the volume is assigned to the default pool.
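The same volume attributes can also be supplied through the Group Manager CLI rather than the GUI. A hedged sketch follows; the volume name, size, description, and initiator name are placeholders, and the exact syntax should be checked against the PS Series CLI Reference:

```
> volume create cluster-vol1 100GB description "Cluster shared volume"
> volume select cluster-vol1 access create initiator iqn.1991-05.com.microsoft:node1.example.local
```

The second command creates an access control record so that a specific host initiator can reach the volume's iSCSI target.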
  • Page 37: Advanced Storage Features

    – Select the Advanced tab. – Ensure that the Enable shared access to the iSCSI target from multiple initiators check box is selected. Connecting Hosts To Volumes This section discusses how to make the proper connection to a PS Series SAN including adding the Target Portal and connecting to volumes from the host.
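On a host where you add the target portal manually instead of through the Remote Setup Wizard, the sequence can be sketched with the Microsoft iscsicli utility; the group IP address below is a placeholder, and the target name must be taken from the ListTargets output:

```
iscsicli QAddTargetPortal 192.168.100.20
iscsicli ListTargets
iscsicli QLoginTarget <target-iqn-reported-by-ListTargets>
```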
  • Page 38: Snapshots

    CAUTION: If you want to mount the volume of a snapshot, clone, or replica using the Group Manager GUI, mount it to a standalone node or a cluster node in a different cluster. Do not mount the snapshot, clone, or replica of a clustered disk to a node in the same cluster because it has the same disk signature as the original clustered disk.
  • Page 39: Volumes

    Volumes Cloning a volume creates a new volume with a new name and iSCSI target, having the same size, contents, and Thin Provisioning setting as the original volume. The new volume is located in the same pool as the original volume and is available immediately.
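As with volume creation, cloning is also available from the Group Manager CLI. A hedged one-line sketch (names and description are placeholders; confirm the syntax in the PS Series CLI Reference):

```
> volume select cluster-vol1 clone cluster-vol1-copy description "Clone for test mount"
```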
  • Page 40: Replication

    Replication Replication enables you to copy volume data across groups, physically located in the same building or separated by some distance. Replication protects the data from failures ranging from destruction of a volume to a complete site disaster, with no impact on data availability or performance. Similar to a snapshot, a replica represents the contents of a volume at a specific point in time.
  • Page 41: Volume Collections

    For more information on deploying your cluster with Windows Server 2003 operating systems, see the Dell Failover Clusters with Microsoft Windows Server 2003 Installation and Troubleshooting Guide at support.dell.com/manuals. For more information on deploying your cluster with Windows Server 2008 operating systems, see the Dell Failover...
  • Page 43: Troubleshooting

    Troubleshooting
    The following section describes general cluster problems you may encounter and the probable causes and solutions for each problem.

    Problem: The nodes cannot access the storage system, or the cluster software is not functioning with the storage system.
    Probable Cause: The storage system is not cabled properly to the nodes, or the cabling between the storage components is incorrect.
    Corrective Action: Ensure that the cables are connected properly from the node to the storage...
  • Page 44
    Probable Cause: The system has just been booted and services are still starting.
    Corrective Action: Use the Event Viewer and look for the following events logged by the Cluster Service: Microsoft Cluster Service successfully formed a cluster on this node.
  • Page 45
    Probable Cause: One or more nodes may have the Internet Connection Firewall enabled, blocking RPC communications between the nodes.
    Corrective Action: Configure the Internet Connection Firewall to allow communications that are required by the MSCS and the clustered applications or services. For more information, see the article KB883398 at support.microsoft.com.
  • Page 46
    Problem: The storage array firmware upgrade process using Telnet exits without allowing you to enter y to the following message: ...
    Probable Cause: The Telnet program sends an extra line after you press <Enter>.
    Corrective Action: Use a serial connection for the array firmware upgrade. To clear the extra linefeed in Windows Telnet: ...
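The truncated procedure above most likely adjusts the Windows Telnet client's new-line mode. As an unconfirmed sketch, the extra linefeed can be disabled from the Telnet command prompt, which is reached by pressing <Ctrl+]>:

```
Microsoft Telnet> unset crlf
```

With new-line mode off, pressing <Enter> sends a single carriage return, so the y confirmation is no longer skipped.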
  • Page 47 Cluster Data Form
    You can attach the following form in a convenient location near each cluster node or rack to record information about the cluster. Use the form when you call for technical support.

    Table 1. Cluster Configuration Information
    Cluster Information:
    - Cluster name and IP address
    - Server type
    Cluster Solution:
    ...
  • Page 49 iSCSI Configuration Worksheet...