High Performance and Parallel Computing Office of Information Technology

Easley Cluster (2020)

Dell PowerEdge HPC Cluster

10,000 Cores | 50 TB RAM | 4 PB Disk | ~500 TFlops


Easley User's Guide

Cluster Components

HEAD AND LOGIN NODES

The head and login nodes are Dell PowerEdge R640 servers, with the following specifications:

  • 2x Intel(R) Xeon(R) Gold 6248R "Cascade Lake" CPUs (24 cores/CPU)
  • 192 GB of memory
  • 1 TB local SSD

Operating System: CentOS 7

COMPUTE NODES

186 Compute Nodes

There are multiple classes of compute nodes in this cluster:

  • 132 "Standard" nodes, each with 192 GB of memory and 2x Intel(R) Xeon(R) Gold 6248R CPUs @ 3.00 GHz (24 cores/CPU)
  • 21 "Type I Large Memory" nodes with 384 GB RAM
  • 9 "Type II Large Memory" nodes with 768 GB RAM
  • 9 "Type I GPU" nodes with 2x NVIDIA Tesla T4 GPUs
  • 2 "Type II GPU" nodes with 4x NVIDIA Tesla T4 GPUs
  • 13 "AMD" nodes with 2x AMD EPYC 7662 CPUs (64 cores/CPU)

Operating System: CentOS 7
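The node inventory above accounts for the headline figures on the cover page. A minimal sketch of the arithmetic, assuming the large-memory and GPU nodes use the same 2x 24-core Intel CPUs as the standard nodes (the guide lists memory and GPUs, but not CPUs, for those classes):

```python
# Cores per node, by CPU type (from the compute-node inventory above).
# Assumption: large-memory and GPU nodes have the same 2x 24-core Intel
# CPUs as the "Standard" nodes.
intel_cores = 2 * 24   # 2x Intel Xeon Gold 6248R, 24 cores each
amd_cores = 2 * 64     # 2x AMD EPYC 7662, 64 cores each

# (node count, cores per node) for each compute-node class
node_classes = {
    "standard":      (132, intel_cores),
    "large_mem_I":   (21,  intel_cores),
    "large_mem_II":  (9,   intel_cores),
    "gpu_I":         (9,   intel_cores),
    "gpu_II":        (2,   intel_cores),
    "amd":           (13,  amd_cores),
}

total_nodes = sum(count for count, _ in node_classes.values())
total_cores = sum(count * cores for count, cores in node_classes.values())
print(total_nodes, total_cores)  # 186 nodes, 9968 cores (~10,000)
```

This reproduces the 186 compute nodes listed above and roughly the 10,000 cores quoted on the cover page.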

STORAGE NODES

4x Dell PowerEdge R440 servers

  • 2x management nodes
  • 2x storage nodes for GPFS
  • Intel(R) Xeon(R) Gold 5220 CPU @ 2.20GHz
  • Each node has 192 GB RAM

Dell SAN

Dell ME 4084 Storage Controllers

  • 4 PB of disk presented to all nodes via GPFS
  • All nodes are InfiniBand EDR connected
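Since the GPFS filesystem is mounted on every node, a quick way to confirm its capacity after logging in is `df`; the guide does not name the mount point, so the one mentioned below is only an assumption.

```shell
# List mounted filesystems with human-readable sizes. The GPFS storage
# should appear as a multi-petabyte mount; its exact mount point
# (e.g. /scratch or /home) is an assumption -- check with hpcadmin.
df -h
```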

HPC LINKS

For more information, send your email request to hpcadmin@auburn.edu

Last Updated: December 7th, 2020