Research Computing Services
IIHR maintains a diverse set of computing resources and facilities. The primary computing platform is a High Performance Computing (HPC) cluster called Helium. It is a shared system built from Hewlett-Packard DL160 nodes that features 3,304 total cores, 10.6 TB of memory, more than 500 TB of storage, a 40 Gbps Voltaire InfiniBand QDR message-passing fabric for MPI communications, and three Ethernet networks for management and NFS storage.
The cluster queuing system, Sun Grid Engine, provides access for very large jobs, well beyond the limits of any individual user's dedicated hardware. The programming environment includes OpenMP, MPI, and the Intel and GNU compiler and tool suites. The cluster was acquired with funding from the NIH, AFOSR, and a number of individual researcher-led contributions, in addition to funds from the College of Engineering and the University of Iowa Office of the Provost. It is operated by IIHR—Hydroscience & Engineering in conjunction with ITS and a group of collaborating researchers.
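As an illustration of the Sun Grid Engine environment described above, a minimal MPI submission script might look like the following sketch. The queue name, parallel environment name, and module name here are assumptions for illustration only, not Helium-specific values:

```shell
#!/bin/bash
# Hypothetical SGE batch script for an MPI job.
#$ -N mpi_example          # job name
#$ -cwd                    # run from the submission directory
#$ -pe orte 64             # request 64 slots (PE name is an assumption)
#$ -q all.q                # target queue (name is an assumption)
#$ -l h_rt=04:00:00        # hard wall-clock limit of four hours

module load openmpi        # environment-modules convention; site-specific
mpirun -np $NSLOTS ./my_mpi_program
```

The script would be submitted with `qsub`, after which SGE schedules the job onto free cores and sets `$NSLOTS` to the number of granted slots.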
IIHR also operates an HPC cluster designed around the SGI Altix 450 shared-memory system to run the institute's own premier ship hydrodynamics code, CFDShip-Iowa. This system uses SGI's NUMAflex shared-memory architecture, tying all memory into one extremely low-latency address space, and presents a single system image (SuSE SLES Linux) across the cluster. The SGI Altix 450 comprises 36 1.6 GHz Intel Itanium2 64-bit cores, each with 8 MB of L2 cache and 2 GB of local RAM, for a total of 72 GB of extremely low-latency shared memory. The system is packaged as 18 single-socket, dual-core nodes, each with two 300 GB serial attached SCSI (SAS) drives for local scratch space. Working space is provided by 5.4 TB of SAS storage.
Smaller clusters are dedicated to a set of specialized applications. A dedicated PIV render cluster consisting of three dual-socket, dual-core Xeon Mac Pro systems is used for image processing of PIV data. An additional three dual-socket, dual-core Xeon Mac Pro systems are dedicated to Fluent jobs running under 64-bit Windows.
HPC at IIHR is augmented by 10 Silicon Mechanics storage units providing 500 TB of storage in a RAID 60 configuration. This storage space is replicated to an offsite location, with hourly snapshots taken for user-invoked file recovery. IIHR also operates dedicated project storage arrays based on the Apple Xserve/Xserve RAID architecture, with close to 15 TB of additional storage.
Very-large-scale computations are done at national and international computation centers, accessed through longstanding IIHR-center relationships. In addition to the NSF and DOD/DOE centers, IIHR has developed a continuing collaboration with the National Center for High Performance Computing (NCHC) in Taiwan.
Supporting the local centralized facilities are 35 Linux workstations and more than 240 individual PCs running MS Windows 7. There are 24 PC-based servers handling web, FTP, security, and specialized database services. In addition, a number of Blu-ray mass-storage devices, publication-quality color printers, scanners, cameras, and other peripherals are in use.
This hardware is complemented by a carefully selected set of public domain, commercial, and proprietary software packages that include Tecplot, Gridgen, Fluent, FlowLab, Matlab, Origin, ERDAS, ERMapper, ESRI, Skyview, and the core GNU utilities. Additionally, software such as AutoCAD, MS Windows, MS Office, OS X, Mathematica, IDL, SigmaPlot, and SAS is used under university-wide site licenses.