Overview

Aug 2016

Kernel upgrades completed: RHEL6u8 on all systems! See the timeline here

Jul 2016

Please welcome our Summer Intern: Riana Freedman! Read about her here

Python 2.7 & Python 3.5 are now available on DEAC as module files (see the usage sketch below)
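
A minimal usage sketch with the module system (the module names below are assumptions; run "module avail python" to see the versions actually installed on DEAC):

    # list the Python module files available on the cluster
    module avail python
    # load one of the new builds (module name is an assumption; match it to the list above)
    module load python/3.5
    python --version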

Python 2.6 packages have been pushed to the UCS nodes: ASE, Cython, SciPy & Nose

R 3.3.0 "Supposedly Educational" is now on DEAC!

Oracle Java 1.8.0_101 is available as a module file now on DEAC!

Jun 2016

Kernel upgrades: Head Node 3 has been upgraded and is serving as the test head node. Full details here

A new Notification tab has been added to the DEAC website.

May 2016

MEME-Suite 4.9.0 is now available on DEAC! Learn more

STATA 14 is now available on DEAC! Learn more here

Mathematica 10.4.1 is now available on DEAC! Learn more here

MATLAB2016a is now available on DEAC! Learn more here

All BC11 nodes have moved to the SLURM scheduler.

All BC14 nodes have moved to the SLURM scheduler.

All UCS Chassis 8 nodes have moved to the SLURM scheduler.

SLURM jobs can be submitted from Head Nodes 1, 2, 3 & 4 (see the submission sketch below)
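
A minimal submission sketch from one of the head nodes (the partition name, resource requests, and script name below are assumptions; check the DEAC Wiki and the output of "sinfo" for the partitions actually defined on the cluster):

    #!/bin/bash
    #SBATCH --job-name=hello-deac     # job name shown by squeue
    #SBATCH --nodes=1                 # request a single node
    #SBATCH --ntasks=1                # single task
    #SBATCH --time=00:10:00           # ten-minute wall-clock limit
    #SBATCH --partition=small         # partition name is an assumption; pick one listed by sinfo

    echo "Hello from $(hostname)"

Save the script (for example as hello.slurm) and submit it from a head node with "sbatch hello.slurm"; monitor it with "squeue -u $USER".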

ABySS 1.9.0 is now available on DEAC! Thanks to Dr. Setaro from Biology! Learn more here

All BC10 nodes have moved to the SLURM scheduler.

All BC13 nodes have moved to the SLURM scheduler.

All UCS Chassis 7 nodes have moved to the SLURM scheduler.

April 2016

How to reference the DEAC cluster: Citation page

All BC06 nodes have moved to the SLURM scheduler.

All UCS Chassis 6 nodes have moved to the SLURM scheduler.

All BC09 nodes have moved to the SLURM scheduler.

All BC12 nodes have moved to the SLURM scheduler.

New /deac research group paths are ready for use! Check the Cluster Resources page for more info!

All BC07 nodes have moved to the SLURM scheduler.

All BC08 nodes have moved to the SLURM scheduler. Check the Torque 2 SLURM list

March 2016

All UCS Chassis 5 nodes have moved to the SLURM scheduler.

All BC05 nodes have moved to the SLURM scheduler.

All UCS Chassis 4 nodes have moved to the SLURM scheduler.

All BC01 nodes have moved to the SLURM scheduler. Check the Torque 2 SLURM list

Quantum Espresso 5.3.0 is now available on DEAC

February 2016

All UCS Chassis 2 & 3 nodes have moved to the SLURM scheduler

All BC02 & BC04 nodes have moved to the SLURM scheduler

Check the Torque 2 SLURM list

January 2016

The DEAC YouTube Channel is now open, with tutorial videos on how to use DEAC HPC services.

XCrySDen-1.5.60 is now available on DEAC

December 2015

RHEL6 update 7 has been implemented for all nodes

November 2015

Mathematica 10.3 is now available on DEAC! Check the new features for Mathematica 10.3

MATLAB2015b is now available on DEAC! See what's new for MATLAB2015b

October 2015

You can now follow us on Twitter: @WakeHPC.

Maple 18 is now available on DEAC! Check what is new with Maple 18

September 2015

MAGMA library for CUDA 6.5 is now available in the DEAC Cluster. Check MAGMA's main page.

August 2015

Mathematica 10.2 is now available in the DEAC Cluster. Check their featured videos & screencasts at Wolfram's site.

CUDA 6.5 is now available in the DEAC Cluster. You can see industry and research case studies from NVIDIA

July 2015

MATLAB 2015a is now available in the DEAC Cluster. Check the Cluster's Wiki.

June 2015

Shiny is now available in the RStudio IDE on the three head-nodes

May 2015

SOAPdeNovo-2.04 is now a load-module for the cluster

April 2015

VCF Tools is now a load-module for the cluster

March 2015

NAMD 2.10 - Serial and Infiniband modules are now available.

February 2015

Bioconductor in R is now available.
Trinity is now a load-module for the cluster.
SAMTools is now a load-module for the cluster.
BWA (Burrows-Wheeler Aligner) is now a load-module for the cluster.

January 2015

RHEL6 update 6 has been applied to all of the head & compute-nodes.

September 2014

27 additional UCS Cisco B-series blades are now online: please check the Wiki Support page for job submission instructions for these blades here

August 2014

Mathematica 9.0.1 & 10.0.0 are now available

July 2014

29 new UCS Cisco B-series blades are now online: please check the Wiki Support page for job submission instructions for these blades here.

June 2014

Wake Forest graduate student Cody Stevens has joined the DEAC HPC System Administration team as a Summer Intern.
RStudio IDE has been installed for all three head-nodes.

May 2014

BladeCenter08 (clan08) now has Infiniband capabilities
BladeCenter07 (clan07) now has Infiniband capabilities
RHEL6 update 5 has been applied to all compute-nodes

April 2014

Adam Carlson has joined the DEAC HPC System Administration team. You can read his BIO on the Staff page

March 2014

RHEL 6 Update 5 implementation has started on the compute-nodes

January 2014

Damian Valles has joined the DEAC HPC System Administration team. You can read his BIO on the Staff page

April 2010

New Support Wiki goes live! Site requires login using your cluster credentials.

December 2009

FY10 Annual Purchase brings 28 new Nehalem-based HS22 blades with 48GB of RAM to the cluster, 14 of which are Infiniband capable. Check out the Resources page for more information.

November 2008

The FY09 Annual Purchase upgrades the entire cluster to 2GB RAM/core and adds an additional 42 HS21XM blades with 8 processing cores/blade. Check out the Resources page for more information.

July 2008

Carl Langefeld of Public Health Sciences joins the cluster, expanding the cluster by 35 HS21XM blades, each with 8 processing cores/blade and 2GB RAM/core.

October 2007

FY08 Hardware Purchase Online

IBM SUR Grant

December 2006

PTRP A1A and FY07 Hardware Purchase