Compute Systems – Pascal and Family


In 2017, NVIDIA’s Educator GPU Grant program made specialized hardware available to several groups at the University. Two cards, one donated directly by NVIDIA to CIS and one provided through the same program and generously transferred from Electrical and Computer Engineering, were combined with assets provided by centralized IT and by Steve Vittitoe, a local University benefactor. This collection allowed students and staff to build a new specialized server named Pascal.

It provides MST/CIS faculty and other collaborators with a locally administered asset for complex numerical work.



Compute hardware:

  • Two NVIDIA Titan X Pascal GPUs (3,584 CUDA cores each; ~7,000 total)
  • Two Intel Xeon E5-2620 v4 (x86-64) 2.1 GHz 8-core processors (16 physical cores total)
  • NVIDIA SLI bridge [‘4 slot’] linking the two GPU cards (added after the initial build)
  • Motherboard – Supermicro MBD-X10DAX EATX dual-CPU LGA2011-3


Storage:

  • 120 GB SSD (“/” – OS/application storage)
  • 120 GB SSD (/media/ssdstorage – ~70 GB; swap partition – ~39 GB)
  • 3 TB 7200 RPM standard HDD (/media/storage)
  • 3 TB variable-speed standard HDD [Western Digital ‘green’] (/media/storage2)


Memory:

  • System: 128 GB registered DDR4-2133 SDRAM (upgraded from the initial 64 GB configuration on 7/10/2018)
  • Titan X Pascal: 12 GB RAM per card (24 GB total)


Power:

  • EVGA SuperNOVA G2 1300W 80+ Gold fully modular ATX PSU, attached to an external UPS unit – a CyberPower Systems LX1500GU (1500 VA AVR)


Cooling:

  • Corsair H100i Liquid CPU Cooler
  • Corsair H80i Liquid CPU Cooler
  • Large case and power-supply fans

Current software:

  • OS: GNU/Linux Ubuntu 18.04 (upgraded Fall 2019; originally GNU/Linux Ubuntu 16.04.3 LTS)
  • Libraries: CUDA suites v9.2, 10.0, 10.1 (/usr/local/CUDA) [Driver Version: 418.67]
  • Languages: Oracle/Sun Java 11, GNU Fortran, GNU C, GNU C++, Python v2 (python); Python v3 (python3)

(Pascal has access to /mnt/sharedFiles over a standard network connection; see below.)
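With three CUDA toolkits installed side by side, a user typically points each shell session at one of them. A minimal sketch follows; the versioned directory name is an assumption (common CUDA installs use paths such as /usr/local/cuda-10.1), so verify the actual layout under /usr/local on Pascal before relying on it:

```shell
# Sketch: select one of the installed CUDA toolkits for this shell
# session. The versioned directory below is an assumption -- check
# the actual install path on Pascal.
export CUDA_HOME=/usr/local/cuda-10.1
export PATH="$CUDA_HOME/bin:$PATH"
export LD_LIBRARY_PATH="$CUDA_HOME/lib64:${LD_LIBRARY_PATH:-}"

# Report which nvcc (if any) is now first on the PATH.
command -v nvcc >/dev/null && nvcc --version || echo "nvcc not on PATH"
```

Placing these lines in ~/.bashrc makes the selection persistent per user, which keeps multiple toolkit versions from interfering with one another.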

Particular thanks to Charles Morris, the first principal student sysadmin, who helped build the system.

“The Mathematicians”

A quartet of additional machines was brought online in stages beginning in Spring 2019, all running GNU/Linux Ubuntu 18.04.

These are:

  • (AMD Ryzen 5 1600X @ 3.60 GHz / 32 GB RAM; GeForce GTX 1060)
  • (Intel Core i3-6100 2-core @ 3.70 GHz / 20 GB RAM; GeForce GTX 1060)
  • (Intel Celeron G3930 2-core @ 2.90 GHz / 16 GB RAM; GeForce GTX 1060)
  • (Intel Core i5-4570 4-core @ 3.20 GHz / 32 GB RAM; GeForce GTX 660)

A private network links the systems via secondary Ethernet cards. An NFS mount (/mnt/sharedFiles) is provided by noether to the other systems over that private network.
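On each client, the share can be mounted at boot with an /etc/fstab entry along these lines. This is a sketch only: the export path and mount options here are assumptions, so check the actual export list first (e.g. with `showmount -e noether`):

```
# /etc/fstab on a client node (sketch; export path and options are assumptions)
noether:/mnt/sharedFiles  /mnt/sharedFiles  nfs  defaults,_netdev  0  0
```

The `_netdev` option delays the mount until networking is up, which avoids boot-time failures when the private network comes up late.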

Other Systems
Several other systems can be used as auxiliary compute nodes when they are not supporting courses as front ends to the small-form-factor parallel nodes:


These are not linked to the private network, but do have access to the same NFS shares via standard University connections. Two systems also play supporting roles, and are linked to the private network:

  • (Intel Celeron J1900 on an ASRock IMB-151D Bay Trail board / 8 GB RAM) – similar to the LittleFe nodes – and
  • a Synology DS418 network-attached storage (NAS) system (four 4 TB disks in RAID 10, providing ~8 TB of usable storage), offering locally controlled large-capacity storage to both Windows and Linux systems

Additional Software

See the Cluster Use page for more info.

Thanks also to Dane Towner and Erik Kispert for system build and software help.