There are many ways to transfer data to and from HPC. The best choice depends on the size of your files, their source location, and convenience. A secure protocol (SSH or SFTP) is required when connecting to an HPC head node to transfer data.
HPC has a dedicated data transfer node, hpc-transfer.usc.edu, that is described on the HPC Data Transfer Server page.
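As a sketch, connecting to the transfer node over SFTP from your local machine looks like this (the username `ttrojan` and the remote path are placeholders; substitute your own):

```shell
# Connect to the dedicated transfer node over SFTP
# (replace "ttrojan" with your own HPC username).
sftp ttrojan@hpc-transfer.usc.edu

# Once connected, typical SFTP commands include:
#   cd /staging/ttrojan      change the remote working directory
#   put localfile.tar.gz     upload a file to the remote directory
#   get results.tar.gz       download a file from the remote directory
#   bye                      close the session
```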
For the fastest possible data transfer rates, use hpc-transfer.usc.edu and transfer files directly to your /staging directory. Keep in mind that data on /staging is not backed up, so it is good practice to save a copy of the data to your project directory (assuming it fits within your project's disk quota), where it will be backed up nightly.
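For example, a copy from /staging into a project directory might be made with rsync, run from a session on the transfer node (both paths below are hypothetical; substitute your own /staging and project directories):

```shell
# Copy results from un-backed-up /staging into the nightly-backed-up
# project directory. -a preserves permissions and timestamps; -v is verbose.
# The trailing slash on the source copies the directory's contents.
rsync -av /staging/ttrojan/results/ ~/projectdir/results/
```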
Options for Data Transfer
HPC supports large file transfers directly, through command-line UNIX utilities such as rsync and sftp, and through the commercial applications Aspera and Globus Connect. Additionally, you are welcome to install your own file transfer applications in your project directory.
- Transferring files between your computer and HPC: Commercial software with a graphical user interface makes it easy to transfer files between HPC and your personal laptop or desktop. See Transferring Files between your Laptop and HPC.
- Transferring files from the Command Line: Linux and macOS provide transfer utilities that can be run from a command line. These include rsync, sftp, and scp. See Transferring Files from a Command Line.
- Transferring very large files to HPC: HPC maintains two commercial applications that support very large file transfers, IBM's Aspera and Globus' Globus Connect. See Transferring Very Large Files to HPC.
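To illustrate the command-line options above, a transfer from a laptop to HPC might look like the following (the username `ttrojan`, the local `./dataset` directory, and the remote path are placeholders):

```shell
# Upload a directory from your laptop to /staging with rsync.
# --progress shows per-file transfer status; re-running the same
# command resumes by sending only files that changed.
rsync -av --progress ./dataset/ ttrojan@hpc-transfer.usc.edu:/staging/ttrojan/dataset/

# Equivalent one-off copy with scp (-r recurses into directories):
scp -r ./dataset ttrojan@hpc-transfer.usc.edu:/staging/ttrojan/
```

rsync is generally preferable for large or repeated transfers because interrupted copies can be resumed rather than restarted.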
Anonymous Data Transfers
All transfers to and from HPC must be made by authenticated users. HPC does not support anonymous transfers. If you are working with collaborators and they need to transfer data, one option is to sponsor them for temporary USC iVIP accounts. See ITS’s iVIP page for more information.
Getting Help: For questions about transferring data, please contact firstname.lastname@example.org.