SCIAMA
High Performance Compute Cluster
Transferring Data to and from SCIAMA
From a user perspective there are two data areas on Sciama: your home area, which (although the limit is not currently enforced) should not exceed 10 GB, and a project area that can be several TB. The project area is accessed under /mnt/lustre/; an area corresponding to your account name will be created upon request. It should be stressed that NO DATA ON SCIAMA IS BACKED UP. The size of the project data area will be monitored.
There are a few command-line methods for transferring data to and from Sciama:
SCP:
For single file transfers. You can either push the data to Sciama or pull it back to your local machine. The SCP syntax is as follows:-
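(A minimal sketch; the login address sciama.icg.port.ac.uk is illustrative, so substitute the login node you normally ssh to, and replace username and the paths with your own.)

# push a single file from your local machine to your project area
scp myfile.dat username@sciama.icg.port.ac.uk:/mnt/lustre/username/

# pull a single file from Sciama back to the current local directory
scp username@sciama.icg.port.ac.uk:/mnt/lustre/username/myfile.dat .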
SFTP:
For transferring small directories. Once connected, the main SFTP commands are as follows:-
put local-path      (upload a file from your local machine to Sciama)
get remote-path     (download a file from Sciama to your local machine)
exit                (end the SFTP session)
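A sketch of a complete session (the login address, account name and paths are illustrative, as above):

sftp username@sciama.icg.port.ac.uk
cd /mnt/lustre/username
put results.tar
get inputs.tar
exit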
Rsync:
Rsync can resume a transfer at the last complete file if it is interrupted. For that reason it is best to send a number of smaller files rather than a single large file, as it can only restart at the beginning of whole files. Again, you can push or pull the data. The Rsync push syntax is:-
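(A sketch; -a preserves permissions and timestamps, -v gives verbose output, and the login address and paths are illustrative.)

rsync -av mydata/ username@sciama.icg.port.ac.uk:/mnt/lustre/username/mydata/

The corresponding pull simply reverses the source and destination:

rsync -av username@sciama.icg.port.ac.uk:/mnt/lustre/username/mydata/ mydata/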
Rclone:
Rclone provides rsync-like transfers and can be configured for many different cloud storage providers. Load the module with 'module load rclone' to use it. On first use you will need to configure your cloud storage: run 'rclone config' and follow the documentation at https://rclone.org/docs/ and, for Google Drive setup, https://rclone.org/drive/. Note that you will need to create your own client ID for Google: https://rclone.org/drive/#making-your-own-client-id.
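A typical sequence, assuming you have named your Google Drive remote 'gdrive' during configuration (the remote name and paths are illustrative):

module load rclone
rclone config                                              # one-off interactive setup of the cloud remote
rclone copy /mnt/lustre/username/results gdrive:results    # push a directory from Sciama to the cloud
rclone copy gdrive:inputs /mnt/lustre/username/inputs      # pull a directory back to Sciama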
See also the Rclone Knowledge Article.