

In the HPC center provided for DEEP by the NPS ITACS department, we have a single high-performance server, Domex, dedicated to our exclusive use, as well as the ability to run processes on the Hamming supercomputer.



Domex is a 64-core node with 512 GB of memory dedicated to DEEP Lab use. If you need access to Domex, please ask Michael McCarrin.


After we have created the user accounts, you can access the server by connecting to its hostname with an SSH client. Users can upload files with an SFTP/SCP client.


Storage on Domex is limited and provisioned according to need. Use the command "df -h" to check the status of each filesystem.
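For example (the single-path form reports only the filesystem containing that path):

```shell
# Show usage for all mounted filesystems in human-readable units.
df -h

# Show usage only for the filesystem that contains the current directory,
# e.g. run from inside /work or /scratch before staging large files.
df -h .
```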

Home Directories

Users of Domex have limited capacity in their home directories. Instead, use /work or /scratch to deposit files.


Users will be given access to a directory under the /work filesystem where they should upload their data files for staging. The /work filesystem has 30TB available.


The /scratch filesystem is a RAID set of high-performance SSDs and can sustain high I/O rates if necessary. It is available for users who need better read/write performance; however, storage there is also very limited. The filesystem is 1.1 TB in size.


The Real Data Corpus is hosted on Domex and available to all users who are members of the appropriate group, corpus. IRB group access will only be given to users who complete the CITI IRB training.




The Hamming supercomputer cluster is a resource available to any student at the Naval Postgraduate School. Its resources are provisioned and managed by the ITACS HPC group. Please contact the ITACS helpdesk for an account on the Hamming supercomputer.


After the account is granted, users can use the standard SSH and SFTP clients to access it at:


Unlike stand-alone servers, Hamming (like any supercomputer) requires that you submit jobs to a batch scheduler for processing. These jobs are scripts, written in any one of a number of scripting languages but most commonly Bash, called "submit scripts". The user then "submits" these jobs to the cluster with the qsub command, where they are prioritized for execution.
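As a sketch, a minimal submit script might look like the following. The job name and resource values here are illustrative, not site requirements; the "#PBS" directives are read by the scheduler and ignored by Bash:

```shell
#!/bin/bash
#PBS -N deep-example          # job name (illustrative)
#PBS -l nodes=1:ppn=4         # request one node with four cores
#PBS -l walltime=01:00:00     # request a one-hour run-time limit

# qsub starts jobs in the home directory; change to the directory the
# job was submitted from ("." as a fallback when run outside the scheduler).
cd "${PBS_O_WORKDIR:-.}"

echo "Job started on $(hostname)"
# ... the actual processing commands go here ...
```

If this were saved as example.pbs, it would be submitted with "qsub example.pbs", and "qstat" would show its place in the queue.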

Usage Guidelines

Running standard Linux commands on the login node is strictly prohibited. The login node is only the cluster entry point; however, users may use an interactive session to test and model their submission scripts.

Interactive Session

Use the command "qsub -I" to acquire an interactive session where you are free to work on a submission script. Please be aware that interactive sessions are killed after 24 hours. They should not be used for the actual work of your research, only to author and test a submit script for submission to the cluster with qsub.



On Hamming, the DEEP Lab has a group directory under the /work filesystem available there. Please note that this is not the same as the /work filesystem on Domex. This directory is also available for DEEP Lab users to store files they are actively working on.


Hamming also stores a separate copy of the Real Data Corpus, synchronized with Domex. Jobs run on Hamming can access the corpus at the path /work/deep/corp.
