InfiniHost Dual Port 10Gb InfiniBand to PCI-X HCA [128MB, RoHS (R5), Low Profile] Mellanox MHET2X-1TC

These second-generation cards unleash performance with dual 10Gb/s InfiniBand links. Built on an architecture that enables 10Gb/s database and High Performance Computing (HPC) clustering, they provide the high throughput and low CPU utilization those applications require. The cards ship with either 128 or 256MB of DDR memory.

Key Features
InfiniBand HCA Application Support

The InfiniBand architecture defines and supports many applications, most with remote direct memory access (RDMA) capabilities. This flexibility enables high-performance clustering, communication, and storage traffic to run over a single InfiniBand fabric. These HCA cards provide hardware support for the following protocols: MPI (for HPC clusters), DAT (for databases), SDP (for legacy applications), IPoIB (Internet Protocol over InfiniBand), NFS over RDMA (for network-attached storage), and SRP (for block storage), as well as many embedded applications such as video streaming, aerospace, military, and electronic controls.

High Performance Mellanox Silicon

InfiniHost is a single-chip, dual-port 10Gb/s InfiniBand host channel adapter with a PCI-X interface and an integrated physical layer. The device features an HCA core capable of full wire-speed transmission over a 10Gb/s InfiniBand link. The core is a full implementation of the InfiniBand architecture with hardware transport, supporting RDMA transfers that drastically reduce CPU overhead and let the host processors spend their cycles on applications rather than on communications.

Software Support

All InfiniHost HCA cards include a verbs interface and device drivers for both Windows and Linux operating systems. In addition, the cards include an internal Subnet Management Agent (SMA) and General Service Agents, eliminating the need for an external management-agent CPU.