no data is being passed. The activity LED blinks when data is being passed. If the LEDs are not
active, either the physical connection, the logical connection, or both have not been established.

Port 1: Physical Link - Green; Data Activity - Yellow
Port 2: Physical Link - Green; Data Activity - Yellow
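The link states these LEDs indicate can also be read in software through the Verbs API. The following minimal sketch is not part of this guide; it assumes a Linux host with the libibverbs development package installed, and queries port 1 of the first HCA found:

    #include <stdio.h>
    #include <infiniband/verbs.h>

    int main(void)
    {
        int num = 0;
        struct ibv_device **devices = ibv_get_device_list(&num);
        if (!devices || num == 0) {
            fprintf(stderr, "no InfiniBand devices found\n");
            return 1;
        }

        struct ibv_context *ctx = ibv_open_device(devices[0]);
        if (!ctx)
            return 1;

        struct ibv_port_attr port;
        /* InfiniBand port numbers are 1-based. */
        if (ibv_query_port(ctx, 1, &port) == 0) {
            /* port.state is the logical link state (DOWN/INIT/ARMED/ACTIVE);
             * port.phys_state is the physical link state. */
            printf("logical link:  %s\n", ibv_port_state_str(port.state));
            printf("physical link: %u\n", port.phys_state);
        }

        ibv_close_device(ctx);
        ibv_free_device_list(devices);
        return 0;
    }

Build with gcc -o check_port check_port.c -libverbs. A port whose LEDs are dark should report a DOWN logical state here.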
8.7 Mellanox Memory Free PCI-Express HCA (SDR)

The Mellanox Memory Free PCI-Express HCA supports InfiniBand protocols. It is a single data
rate (SDR) card with one 4x InfiniBand 10 Gb/s port. Figure 8-7 shows the Mellanox Memory
Free PCI-Express HCA.

Figure 8-7 Mellanox Memory Free PCI-Express HCA (SDR)
Features of the Mellanox Mem-Free PCI-Express HCA include:
•   PCI-Express x8 version 1.0a compatible card
•   Single 4X InfiniBand port, Version 1.2 compatible Host Channel Adapter
•   InfiniBand-compatible Verbs API interface for both Linux and Windows operating systems (see the query sketch after this list)
•   4X (10 Gb/s) InfiniBand port with standard copper connector
•   Hardware support for up to 16 million QPs, EEs, and CQs
•   Memory protection and translation tables fully implemented in hardware
•   IB native layer 4 DMA hardware acceleration
•   Multicast support
•   Programmable MTU size from 256 bytes to 2 KB
•   Four virtual lanes supported, plus a management lane
•   Support for InfiniBand transport mechanisms (UC, UD, RC, RAW)
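As a companion to the feature list, the following minimal sketch (again an illustration, not part of this guide, and assuming libibverbs on a Linux host) queries the device limits the list advertises; the numbers printed come from the HCA firmware:

    #include <stdio.h>
    #include <infiniband/verbs.h>

    int main(void)
    {
        struct ibv_device **devices = ibv_get_device_list(NULL);
        if (!devices || !devices[0])
            return 1;

        struct ibv_context *ctx = ibv_open_device(devices[0]);
        if (!ctx)
            return 1;

        struct ibv_device_attr attr;
        if (ibv_query_device(ctx, &attr) == 0) {
            /* Hardware upper bounds; the feature list above claims up to
             * 16 million QPs and CQs for this card. */
            printf("device:  %s\n", ibv_get_device_name(devices[0]));
            printf("max QPs: %d\n", attr.max_qp);
            printf("max CQs: %d\n", attr.max_cq);
            printf("ports:   %d\n", attr.phys_port_cnt);
        }

        ibv_close_device(ctx);
        ibv_free_device_list(devices);
        return 0;
    }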
