Physical Sciences, with support from IAT-NACS, has assembled a high-performance computing cluster for climate modeling and other computation-intensive research.
Called “Greenplanet,” the cluster comprises nodes purchased by faculty in Earth Systems Sciences (ESS), Chemistry, and Physics, and Math faculty are expected to participate as well. Greenplanet currently includes almost 900 CPUs and is still growing.
IAT provides secure, climate-controlled space in the Academic Data Center, system administration services in partnership with Physical Sciences IT staff, and consultation on code parallelization and optimization.
According to Assistant Professor Keith Moore of ESS, Greenplanet is “a flexible cluster, suitable for massively parallel complex computations (such as climate simulations), and for smaller-scale use on a single node as a workstation.”
A typical node features eight 64-bit Intel CPUs. Greenplanet runs the Load Sharing Facility (LSF) for job management and the Lustre parallel file system for extremely high-performance access to the large datasets typical of climate modeling. Two parallelization techniques are available: OpenMP for shared-memory parallelism among the CPUs within a node, and MPI for message passing between CPUs on different nodes. Nodes are linked by a high-speed InfiniBand interconnect, and extensive instrumentation is available for tuning jobs for optimal execution speed and full use of the cluster's computational capacity.
Software includes the Climate Systems Modeling package, parallel MATLAB, and quantum chemistry packages such as Gaussian and Turbomole.