Mellanox MHGH28-XTC

Price: 97,210 RUB
Availability: made to order
Part number: MHGH28-XTC
Manufacturer: Mellanox
ConnectX IB Dual Port 20Gb InfiniBand to PCIe x8 2.5 GT/s HCA [No Memory, ROHS (R5), Low Profile]


Mellanox ConnectX IB InfiniBand Host Channel Adapter (HCA) cards deliver low latency and high bandwidth for performance-driven server and storage clustering applications in Enterprise Data Centers, High-Performance Computing, and Embedded environments. Clustered databases, parallelized applications, transactional services, and high-performance embedded I/O applications achieve significant performance improvements, resulting in reduced completion time and lower cost per operation. ConnectX IB simplifies network deployment by consolidating clustering, communications, storage, and management I/O, and it provides enhanced performance in virtualized server environments.

Key Features

  • 1.2us MPI ping latency
  • 10, 20, or 40Gb/s InfiniBand ports
  • PCI Express 2.0 (up to 5GT/s)
  • CPU offload of transport operations
  • End-to-end QoS and congestion control
  • Hardware-based I/O virtualization
  • TCP/UDP/IP stateless offload

Specifications

  • Dual 4X InfiniBand ports
  • microGiGaCN or QSFP connectors
  • Supports active cables & fiber adapters
  • PCI Express 2.0 x8 (1.1 compatible)
  • Single chip architecture
  • Link status LED indicators
  • Low profile, small form factor (13.6cm x 6.4cm without bracket)
  • RoHS-5 compliant
  • 1-year warranty
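As a back-of-the-envelope check on the rates listed above: both the InfiniBand link speeds (10/20/40 Gb/s are signalling rates) and PCIe 2.0 at 5 GT/s use 8b/10b line encoding, so usable data bandwidth is 80% of the raw rate. A small illustrative sketch (not from the datasheet):

```python
def effective_gbps(signal_gbps: float, enc_data: int = 8, enc_total: int = 10) -> float:
    """Usable data rate after line encoding (8b/10b for IB SDR/DDR/QDR and PCIe 2.0)."""
    return signal_gbps * enc_data / enc_total

# InfiniBand 4X link rates: signalling rate -> data rate
for name, rate in [("SDR 4X", 10.0), ("DDR 4X", 20.0), ("QDR 4X", 40.0)]:
    print(f"{name}: {rate} Gb/s signalling -> {effective_gbps(rate)} Gb/s data")

# PCIe 2.0 x8: 5 GT/s per lane, 8b/10b encoding -> 4 Gb/s data per lane
pcie_gbps = effective_gbps(5.0) * 8
print(f"PCIe 2.0 x8: {pcie_gbps} Gb/s data")  # 32.0 Gb/s
```

This is why the card needs a PCIe 2.0 x8 slot for the 40 Gb/s port: QDR 4X carries 32 Gb/s of data, which exactly matches the 32 Gb/s that a 5 GT/s x8 link can move.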

World Class Performance and Scalability

Clustered applications running on multi-socket servers using multi-core processors will benefit from the reliable transport connections and advanced multicast support offered by ConnectX IB. Servers supporting PCI Express 2.0 with 5GT/s will be able to take advantage of 40Gb/s InfiniBand, balancing the I/O requirement of these high-end servers. End-to-end Quality of Service (QoS) enables partitioning and guaranteed service levels while hardware-based congestion control prevents network hot spots from degrading the effective throughput. ConnectX is capable of scaling to tens-of-thousands of server and storage nodes.

Hardware Offload Architecture

Clustered and client/server applications achieve maximum performance over ConnectX IB because CPU cycles are available to focus on critical application processing instead of networking functions. Network protocol processing and data movement overhead such as RDMA and Send/Receive semantics are completed in the adapter without CPU intervention. Applications utilizing TCP/UDP/IP transport can achieve industry-leading throughput when run over ConnectX IB and its hardware-based stateless offload engines.
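Among the stateless operations such offload engines take off the CPU is the Internet checksum carried in IP, TCP, and UDP headers. As an illustration only (this is the standard RFC 1071 algorithm, not the adapter's internal implementation), the ones'-complement sum the hardware computes per packet looks like this in Python:

```python
def inet_checksum(data: bytes) -> int:
    """RFC 1071 Internet checksum: 16-bit ones'-complement sum, complemented."""
    if len(data) % 2:
        data += b"\x00"  # pad odd-length data with a zero byte
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]       # add next 16-bit word
        total = (total & 0xFFFF) + (total >> 16)    # fold carry back in
    return ~total & 0xFFFF
```

Running this for every packet of a 20 Gb/s stream would consume a meaningful share of host CPU cycles, which is exactly the work the stateless offload engines absorb.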

I/O Virtualization

ConnectX IB support for hardware-based I/O virtualization complements Intel and AMD virtualization technologies. Virtual machines (VMs) within the server receive dedicated I/O adapter resources with guaranteed isolation and protection. Hypervisor offload features remove software-based virtualization overhead and free up CPU cycles, enabling native OS performance for VMs and higher server utilization by supporting more VMs per physical server.

Storage Accelerated

A unified InfiniBand cluster for computing and storage achieves significant cost-performance advantages over multi-fabric networks. Standard block and file access protocols leveraging InfiniBand RDMA result in high-performance storage access. Data reliability is improved through the use of T10-compliant Data Integrity Field (DIF). Fibre Channel (FC) over InfiniBand (FCoIB) features enable the use of cost-effective bridges for connecting to FC SANs.

Software Support

All Mellanox adapter cards are compatible with TCP/IP and OpenFabrics-based RDMA protocols and software. They are also compatible with InfiniBand and cluster management software available from OEMs. The adapter cards are supported with major operating system distributions.
