IT Service Status

Genomics Compute Cluster on Quest

Northwestern IT provides the Genomics Compute Cluster (GCC), a 7,824-core allocation on Quest consisting of 188 compute nodes, 3 high-memory nodes, 2 GPU nodes, and 655 TB of dedicated scratch space. The GCC can be used to align sequencing data; run RNA-Seq, ChIP-seq, single-cell, and whole-genome analyses; execute custom code; use GPUs for machine learning and model training; and more.

The GCC is available for all computational genomics researchers at Northwestern and is funded by the Feinberg School of Medicine and the Weinberg College of Arts & Sciences to foster and empower computational genomics research at Northwestern.

How can I apply to use the Genomics Compute Cluster?

Northwestern researchers may apply to use the Genomics Compute Cluster by completing the Genomics Compute Cluster User Registration form. Prospective users will be prompted to provide a brief statement of the research to be performed on the GCC.

Once requests are approved, researchers will be added to the GCC buy-in group on Quest, allocation b1042, and may begin using the GCC after attending an orientation.  

Genomics Compute Cluster Compute and Storage Resources

Compute Nodes

7,824 cores across 188 compute nodes

  • 20 Quest12 compute nodes with 64 cores and 256GB RAM each (1,280 cores total) 
  • 37 Quest11 compute nodes with 64 cores and 256GB RAM each (2,368 cores total)  
  • 70 Quest10 compute nodes with 52 cores and 192GB RAM each (3,640 cores total)  
  • 6 Quest9 compute nodes with 40 cores and 192GB RAM each (240 cores total)
  • 2 Quest10 GPU nodes, with 4 NVIDIA A100 cards each (104 cores total)  
  • 3 High Memory Quest11 nodes with 64 cores and 2TB RAM each (192 cores total)  

For specific information about Quest architectures, see Quest Technical Specifications  

Partitions for GCC Compute Nodes

Job priority on the GCC is assigned by partition and depends on whether the researcher is affiliated with the Feinberg School of Medicine or the Weinberg College of Arts & Sciences. The distribution of nodes across partitions prioritizes resource availability for Feinberg and Weinberg researchers and for jobs of 48 hours or less.

Feinberg and Weinberg Researchers

Partition Name    Nodes/Cores    Maximum Job Duration
genomics          133/7,528      2 days
genomicslong      107/5,280      10 days
genomics-gpu      2/104          2 days; see GPUs on Quest for job submission guidelines
genomics-himem    3/192          7 days
genomics-burst    76/3,880       project-based access, email 
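As a sketch, a minimal Slurm batch script targeting the genomics partition might look like the following. The account (b1042) and partition names come from this page; the module name, thread count, and file paths are illustrative placeholders, not a prescribed workflow.

```shell
#!/bin/bash
#SBATCH --account=b1042          # GCC allocation
#SBATCH --partition=genomics     # 2-day maximum walltime partition
#SBATCH --time=24:00:00          # request 24 hours
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=8
#SBATCH --mem=32G
#SBATCH --job-name=align_sample

# The module and input files below are hypothetical placeholders;
# substitute your own tools and data.
module load bwa
bwa mem -t 8 ref.fa sample_R1.fastq sample_R2.fastq > sample.sam
```

Save the script (e.g., as align.sh) and submit it with sbatch align.sh.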

All Other Researchers

Partition Name         Nodes/Cores    Maximum Job Duration
genomicsguest          60/3,048       2 days
genomicsguestex        10/448         10 days
genomicsguest-gpu      2/104          2 days; see GPUs on Quest for job submission guidelines
genomicsguest-himem    3/192          7 days
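GPU partitions are typically requested with Slurm's generic-resource syntax. The sketch below assumes the common --gres form; the exact GPU type string on Quest is an assumption here, so consult the GPUs on Quest page for the authoritative flags.

```shell
#!/bin/bash
#SBATCH --account=b1042
#SBATCH --partition=genomicsguest-gpu   # or genomics-gpu for Feinberg/Weinberg researchers
#SBATCH --gres=gpu:a100:1               # one A100; the "a100" type name is an assumption
#SBATCH --time=12:00:00
#SBATCH --mem=64G

# Confirm the GPU is visible to the job before launching real work.
nvidia-smi
```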

Scratch Storage Space


The Genomics Compute Cluster provides researchers with 655 TB of shared scratch storage space in /projects/b1042 for short-term storage. For maximum flexibility, this scratch space is open to all users of the Genomics Compute Cluster, with a 40 TB quota per researcher.

To see how much storage you are using in /projects/b1042, use the command “b1042check” at the Quest command line:

[quest_demo@quser24 ~]$ b1042check
quest_demo currently has 183380 files & directories using 3954 GB of their 40960 GB quota in /projects/b1042

To keep this shared GCC scratch storage available for everyone's use, files that have not been accessed for at least 30 days are deleted through a monthly purge process, and may be purged earlier if /projects/b1042 is getting full. Researchers receive an email reminder of this deletion process every month, and expired files that remain in b1042 are deleted.
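To preview which files are approaching the 30-day purge window, a find command over access time can help. The path below assumes your files live in a directory named after your NetID under /projects/b1042; adjust it to your actual layout.

```shell
# List files under your b1042 directory not accessed in the last 25 days,
# i.e. candidates for the next purge. The directory path is an assumption.
find /projects/b1042/$USER -type f -atime +25 -printf '%A+  %p\n' | sort
```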

Researchers may not run “touch” commands or similar commands for the purpose of altering their files' timestamps to circumvent this file deletion policy. Research Computing will impose strict storage limits on researchers who violate this policy.   

Any files that need to be kept longer than 30 days should be moved to the researcher's own storage in their project or home directory, or transferred to other storage services such as RDSS or FSMResfiles.
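One way to move results off scratch is rsync with its --remove-source-files option, which copies each file and then deletes the scratch copy. Both paths below are illustrative placeholders for your own scratch and project directories.

```shell
# Copy results from GCC scratch to a project allocation, then remove the
# scratch copies once transferred. Paths are hypothetical examples.
rsync -av --remove-source-files /projects/b1042/myresults/ /projects/p12345/results/
```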

Researchers are asked to use the GCC scratch storage conscientiously: delete files in /projects/b1042 as soon as they are no longer needed, rather than letting them occupy storage for the full 30 days.

Shared Library Space

Shared reference files are available to the genomics research community in the 20 TB, read-only storage space in /projects/genomicsshare. To see a list of files, look in /projects/genomicsshare/README. To request that files be added to the shared genomics reference library, email   and include "genomicsshare" in the subject line.

Genomics Compute Cluster Community

Genomics researchers at Northwestern are invited to join the GCC Slack channel at , a way to connect with other GCC researchers and stay informed about the GCC.

To request help, email .