SCIAMA
High Performance Compute Cluster
Transferring Data to and from SCIAMA
From a user perspective there are two data areas on SCIAMA: your home area, which should not exceed 20 GB (although this limit is not currently enforced), and a project area, which can be several TB. The project area can be accessed from /mnt/lustre/; an area corresponding to your account name will be created upon request. It should be stressed that NO DATA ON SCIAMA IS BACKED UP. The size of the project data area will be monitored.
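Since usage is monitored, it is worth checking how much space you occupy; a minimal sketch using standard Linux tools (the project path assumes your area matches your account name):

du -sh ~                     # total size of your home area
du -sh /mnt/lustre/$USER     # total size of your project area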
There are a few methods for transferring data to and from SCIAMA from the command line:
SCP:
For single-file transfers. You can either push data from your local machine to SCIAMA or pull it back; the SCP syntax for both is as follows:-
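A minimal sketch, assuming the login node is sciama.icg.port.ac.uk and that your project area matches your account name (substitute your own username, hostname and paths):

scp /path/to/local/file username@sciama.icg.port.ac.uk:/mnt/lustre/username/        # push
scp username@sciama.icg.port.ac.uk:/mnt/lustre/username/file /path/to/local/        # pull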
SFTP:
For transferring small directories between your local PC and SCIAMA, open an SFTP session and use the following commands (a complete example session is sketched after them):-
put local-path       # push a file to SCIAMA (use put -r for a directory)
exit

get remote-path      # pull a file from SCIAMA (use get -r for a directory)
exit
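For example, a complete push session might look like this (a sketch, assuming the login node is sciama.icg.port.ac.uk; substitute your own username and paths):

sftp username@sciama.icg.port.ac.uk
put -r my-local-directory       # -r transfers the directory recursively
exit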
Rsync:
Rsync can resume at the last complete file if it is interrupted during the transfer. For that reason, it is best to send a number of smaller files rather than a single large file, as rsync can only restart at the beginning of whole files. Again, you can push or pull the data. The rsync push syntax is:-
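A minimal sketch, again assuming the login node is sciama.icg.port.ac.uk (-a preserves permissions and timestamps, -v is verbose, and -P shows progress and keeps partial files so interrupted transfers can resume):

rsync -avP /path/to/local/dir/ username@sciama.icg.port.ac.uk:/mnt/lustre/username/dir/    # push
rsync -avP username@sciama.icg.port.ac.uk:/mnt/lustre/username/dir/ /path/to/local/dir/    # pull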
Rclone:
Rclone is an rsync-like tool that can be configured for many different cloud storage providers. Run module load rclone to use it. On first use you will need to configure your cloud storage: run rclone config and follow the documentation at https://rclone.org/docs/ (see https://rclone.org/drive/ for Google Drive setup; note that you will need to create your own client ID for Google, as described at https://rclone.org/drive/#making-your-own-client-id).
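For example, after configuring a Google Drive remote (the remote name gdrive and the paths below are illustrative; use whatever name you chose during rclone config):

module load rclone
rclone config                                                   # one-time setup of a remote
rclone copy gdrive:my-folder /mnt/lustre/username/my-folder     # pull from the cloud
rclone copy /mnt/lustre/username/my-folder gdrive:my-folder     # push to the cloud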
See also the Rclone Knowledge Article.
Globus:
Globus is the preferred method for transferring very large amounts of data between institutes. You will need a Globus account, which is free: go to globus.org and log in with your @port.ac.uk account. The University Globus endpoint server has 73 TB of storage available in /data, and two collections are available: "University of Portsmouth ICG" and "SCIAMA Lustre ICG".
To transfer from Lustre to another institute you can connect to the "SCIAMA Lustre ICG" collection, which is a direct mount of our Lustre storage. To transfer to SCIAMA you will need to use the "University of Portsmouth ICG" collection: transfer the data to the Globus server, then move it across to SCIAMA.
To transfer your data to SCIAMA's Lustre storage from the Globus server, use:
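One possible sketch using rsync over SSH, assuming your data landed in /data/username on the Globus server and the SCIAMA login node is sciama.icg.port.ac.uk (if Lustre is mounted directly on the Globus server, a local rsync or cp to the mount point would work equally well):

rsync -avP /data/username/mydata/ username@sciama.icg.port.ac.uk:/mnt/lustre/username/mydata/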
Note that the 73 TB on the Globus server is shared, so please remove any data from the server once your transfer is complete!
Important: the server will be updated and rebooted on the 10th of each month at 6 am. Globus transfers should continue after the reboot.