system_details:slurm_hardware

This page gives an overview of the Grid.UP computers and details the various types of file systems, nodes, and system services available to end-users.

Grid.UP is a general-purpose capability system, designed to be well balanced and to handle tasks that require:

  • many cores
  • large symmetric multi-processing nodes
  • high memory
  • a fast interconnect
  • a lot of work space on disk
  • a fast I/O subsystem

The set of nodes available to end-users comprises a large number of compute nodes, or “worker nodes”. We distinguish the following node flavours:

  • batch : CPU-only “thin” compute nodes
  • big : CPU- and GPU-enhanced “fat” compute nodes, which have more memory than the default worker nodes
  • ceft : CPU-only compute nodes and storage exclusive to ceft registered users
  • lsrelcm : CPU-only compute nodes exclusive to lsre/lcm registered users
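Assuming the flavours above map one-to-one onto Slurm partitions of the same name (batch, big, ceft, lsrelcm), a quick way to check which partitions are visible to your account, and their limits, is:

```shell
# Summarise all partitions: node counts, time limits and states.
sinfo -s

# List the nodes of one flavour only, e.g. the "big" partition,
# one long-format line per node.
sinfo -p big -N -l
```

The per-node core and memory figures printed by `sinfo -N -l` should match the node table below.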

The table below lists the currently available node types.

| # Nodes | Node name | Flavour / partition | CPU SKU | Cores per node | Accelerator(s) | Memory (DIMMs) | Total memory per node | Local storage | Network connectivity |
| 22 | ava01-22 | batch | 2x Intel E5-2450 | 16 | N/A | 68 GB DDR3 @ 1600 MHz | 66 GB | 1x 500 GB 7.2K SATA HDD (not user accessible) | 1x Broadcom NetXtreme BCM5720 NIC; 1x InfiniBand |
| 1 | avafat01 | big | 2x Intel E5-2650 | 32 | NVIDIA Tesla 20-Series M2075 | 128 GB DDR3 @ 1333 MHz | 128 GB | 1x 500 GB 7.2K SATA HDD (not user accessible) | 1x Broadcom NetXtreme BCM5720 NIC; 1x InfiniBand |
| 1 | ceft01 | ceft | 2x Intel E5-2683 v4 | 32 | N/A | 128 GB DDR4 @ 2400 MHz | 128 GB | 1x 126 GB SuperMicro SSD (not user accessible); LSI2208 RAID 5, 55 TB (exclusive to ceft registered users) | 1x Intel I350 Gigabit NIC |
| 1 | ceft02 | ceft, big | 2x Intel E5-2650 v4 | 24 | N/A | 64 GB DDR4 @ 2133 MHz | 64 GB | 1x 126 GB SuperMicro SSD (not user accessible); 6x 6 TB LSI2208 RAID 5, 28 TB (exclusive to ceft registered users) | 1x Intel I350 Gigabit NIC |
| 6 | cfp01-06 | big | 2x Intel E5-2650 | 16 | N/A | 64 GB DDR3 @ 1600 MHz | 64 GB | 1x 500 GB 7.2K SATA HDD (not user accessible) | 1x Intel I350 Gigabit NIC; 1x InfiniBand |
| 1 | cfp07 | big | 2x Intel E5-2640 | 20 | N/A | 192 GB DDR4 | 192 GB | 1x 1 TB 7.2K SATA HDD (not user accessible) | 2x Intel I350 Gigabit NICs |
| 1 | cfp08 | big | 2x AMD EPYC 7301 16-core | 32 | N/A | 256 GB DDR4 | 256 GB | 1x 240 GB SATA HDD (not user accessible) | 2x Intel I350 Gigabit NICs |
| 1 | cfp09 | big | 2x Intel Xeon Gold 6226R 16-core | 32 | N/A | <tbd> | 512 GB | 1x 252 GB (not user accessible) | Gigabit NICs |
| 1 | cfp10 | big | 1x AMD EPYC 7443 24-core | 24 | N/A | <tbd> | 512 GB | 1x 252 GB (not user accessible) | Gigabit NICs |
| 1 | cfp11 | cfp | 2x AMD EPYC 7443P 24-core | 48 | ASPEED Graphics Family | <tbd> | 514 GB | 1x 1.92 TB (not user accessible) | BCM57416 NetXtreme-E dual-media 10G RDMA Ethernet |
| 10 | cfpsmall01-10 | batch | 1x Intel Xeon E5430 | 8 | N/A | 8 GB DDR3 @ 1600 MHz | 8 GB | 1x 250 GB (not user accessible) | Intel I350 Gigabit NIC |
| 1 | cristalflow | big | 2x Intel E5-2640 v4 | 40 | N/A | 4x 16 GB | 64 GB | 1x 128 GB SSD (not user accessible); 4x 6 TB RAID 5, 18 TB (exclusive to cristalflow registered users) | 2x Gigabit NICs |
| 6 | demec01-06 | big | 2x Intel Xeon E5-2640 v2 | 32 | N/A | 64 GB DDR3 @ 1600 MHz | 64 GB | 1x 500 GB 7.2K SATA HDD (not user accessible) | 2x Intel I350 Gigabit NICs |
| 2 | inegi01-02 | big | 2x Intel E5-2680 v2 | 20 | N/A | 64 GB DDR3 @ 1866 MHz | 64 GB | 1x 500 GB 7.2K SATA HDD (not user accessible) | 1x Intel I350 Gigabit NIC; 1x QDR Mellanox MT27500 IB HCA |
| 2 | ventos01-02 | big | <tbd> | 64 / 32 | N/A | 96 | 96 GB / 128 GB | 27 TB (exclusive to ventos registered users) | 2x Gigabit NICs |
| 5 | lsrelcm01-05 (newest) | lsrelcm | 2x Intel Xeon Silver 4316 | 40 | N/A | 64 | 64 GB | 1x 500 GB (not user accessible) | 2x NetXtreme BCM5719 Gigabit Ethernet |
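The per-node figures above can be cross-checked against what Slurm itself reports; a minimal sketch, using a node name from the table:

```shell
# Detailed Slurm view of one node: sockets, cores, real memory,
# and the partitions it belongs to.
scontrol show node avafat01

# Compact per-node summary: name, partition, CPUs, memory (MB).
sinfo -N -o "%N %P %c %m"
```

Where the reported CPU count differs from the physical core count in the table, hyper-threading is usually the reason.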

Some cluster nodes are interconnected by a single QLogic 12200 InfiniBand switch.
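A minimal batch-job sketch targeting one of the flavours above (the partition name comes from the table; the job name, resource figures, time limit and executable are placeholders to adapt):

```shell
#!/bin/bash
#SBATCH --job-name=example        # placeholder job name
#SBATCH --partition=big           # one of: batch, big, ceft, lsrelcm
#SBATCH --nodes=1
#SBATCH --ntasks=16               # stay within the cores-per-node column
#SBATCH --mem=32G                 # stay within the memory column
#SBATCH --time=01:00:00           # placeholder wall-time limit

srun ./my_program                 # placeholder executable
```

Submit with `sbatch job.sh`. Note that the nodes with group-exclusive storage (ceft, cristalflow, ventos) and the lsrelcm nodes are only usable by the registered users named in the table.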

  • system_details/slurm_hardware.txt
  • Last modified: 2024/03/01 16:05
  • by ptsilva