Artemis Cheat Sheet: Phase 3

This document provides information about logging into and using the HPC cluster Artemis at the University of Sydney, including available storage, job queues, module commands, and examples of PBS scripts for serial, parallel, GPU, and data transfer jobs. It also lists some useful Linux commands for managing files, running programs, and transferring data between local and remote systems.


Logging In:
ssh [email protected]
ssh -X [email protected]   (with X11 forwarding, for graphical programs)

Data Storage:
On Artemis:
/home/abcd1234 – 10 GB (per-user limit)
/project/MyProj – 1 TB (per-project limit)
/scratch/MyProj – 378 TB
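To check how much of these allocations you are using, the standard Linux tools below suffice; a minimal sketch, using the example paths above:
du -sh /project/MyProj    # total size of your project directory
df -h /scratch            # usage of the shared scratch filesystem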
On RCOS:
/rds/PRJ-MyProj

Module commands:
module avail
module load matlab
module avail matlab
module load matlab/R2017a
module unload matlab

Job Queues (all users):
defaultQ (default if unspecified)
small-express
scavenger
dtq (data transfer queue, for data transfer only)
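A common pattern is to load the exact version your job needs, then run it. A minimal sketch using the R2017a module listed above (the script name is a placeholder):
module load matlab/R2017a
matlab -nodisplay -r "run('myscript.m'); exit"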
Queue Resource limits:

Queue           Max Walltime   Max Cores       Memory          Memory
                               per Job/User    per node        per core
Small           1 day          24/128          <123 GB         <20 GB
Normal          7 days         120/128         <123 GB         <20 GB
Large           21 days        288/288         <123 GB         <20 GB
High Memory     21 days        192/192         123 GB to 6 TB  >20 GB
GPU             7 days         252/252         <185 GB         N/A
small-express   12 hours       4/24            <123 GB         N/A
scavenger       2 days         288/288         <123 GB         N/A
dtq             10 days        2/8             <16 GB          N/A
Interactive     4 hours        4/4             <123 GB         N/A

Small, Normal, Large, High Memory, and GPU (shaded grey in the original table) are sub-queues of defaultQ.
Interactive: access from the command line only, via qsub -I.
Max cores per user: 600
Max array job elements: 1000
Max simultaneous jobs per user: 200
N/A = Not Applicable
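Jobs run in defaultQ unless a queue is named explicitly; to target any other queue from the table, add a -q directive to your script, as the data-transfer example below does for dtq:
#PBS -q small-express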
Minimal PBS Scripts:

Serial job:
#!/bin/bash
#PBS -P MyProj
#PBS -l select=1:ncpus=1:mem=4GB
#PBS -l walltime=1:00:00
cd "$PBS_O_WORKDIR"
<Commands to run computation>

Parallel Job:
#!/bin/bash
#PBS -P MyProj
#PBS -l select=5:ncpus=4:mem=4GB:mpiprocs=4
#PBS -l walltime=1:00:00
cd "$PBS_O_WORKDIR"
<Commands to run computation>
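The parallel script's select line requests 5 chunks of 4 cores each, for 20 MPI ranks in total. A minimal sketch of what <Commands to run computation> might look like for an MPI program (the module and executable names are assumptions; check module avail for the real ones):
module load openmpi-gcc          # assumed module name
mpirun -np 20 ./my_mpi_program   # 5 chunks x 4 mpiprocs = 20 ranks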
GPU Job:
#!/bin/bash
#PBS -P MyProj
#PBS -l select=1:ncpus=1:mem=4GB:ngpus=1
#PBS -l walltime=1:00:00
cd "$PBS_O_WORKDIR"
module load cuda
<Commands to run computation>
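Inside a GPU job it is worth confirming the job can actually see its GPU before the main computation starts; a minimal sketch (my_cuda_program is a placeholder for your own executable):
nvidia-smi          # list the visible GPU(s)
./my_cuda_program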
Data Transfer Job:
#!/bin/bash
#PBS -P MyProj
#PBS -l select=1:ncpus=1:mem=4GB
#PBS -l walltime=1:00:00
#PBS -q dtq
cd "$PBS_O_WORKDIR"
rsync -axP /scratch/MyProj/MyData /rds/PRJ-MyProj/MyArchive/
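rsync's -P flag shows progress and keeps partial transfers, so an interrupted copy can be re-run safely. To preview what a transfer would copy without moving anything, a dry run works (same paths as above):
rsync -axPn /scratch/MyProj/MyData /rds/PRJ-MyProj/MyArchive/   # -n = dry run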
Useful, but optional, PBS directives:
#PBS -j oe                      # join stderr into stdout (one output file)
#PBS -W umask=022               # make job output readable by group and others
#PBS -M [email protected]    # address for job notification emails
#PBS -m abe                     # email when the job aborts, begins, and ends

Useful PBS commands:
jobstat                # summarise queue and job status
qsub MyPbsScript       # submit a job script
qdel 1234567           # delete job 1234567
qstat -u abcd1234      # list your queued and running jobs
qstat -f 1234567       # full details of a queued or running job
qstat -xf 1234567      # full details of a finished job

Interactive Access (all on one line):
$ qsub -IXP MyProj -l select=1:ncpus=1:mem=4GB,walltime=1:00:00

User guides, training, and support:
Software installation requests and Artemis troubleshooting: https://sydney.edu.au/trackit
(ICT Services -> Research -> High-Performance Computing Request)
HPC Website: https://informatics.sydney.edu.au/services/artemis/
Training: https://informatics.sydney.edu.au/services/training/
[email protected]

Transferring data to/from Artemis:
sftp [email protected]
For GUI sftp, try CyberDuck or FileZilla.
rsync -axP /scratch/MyProj/data /rds/PRJ-MyProj/MyArchive/
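A minimal sketch of an sftp session, using the get/put commands from the table below (filenames and paths are placeholders):
sftp [email protected]
sftp> cd /project/MyProj
sftp> put MyData.tar.gz      # upload a local file
sftp> get results.tar.gz     # download a remote file
sftp> quit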



Linux Commands:

Manage Files:
ls                  list the contents of the current directory
ls -a               list all files, including hidden ("dot") files
ls -lh              list files in long format, showing permissions, size, and date last modified
vi filename         open filename in vim
nano filename       open filename in nano
cat filename        display the contents of filename
less filename       display the contents of filename
cp file1 file2      copy file1 to file2; file1 remains unchanged; file2 is overwritten
mv old new          rename a file from old to new, removing the old name
rm filename         remove (delete) filename

Manage Directories:
pwd                 show the present working directory
cd                  change back to your home directory
cd ..               change to the parent directory (up one level)
cd dirname          change to the directory named dirname
mkdir dirname       make a new directory named dirname
rmdir dirname       remove the directory named dirname, which must be empty

Manage running programs:
<Ctrl+s>            stop the screen from scrolling
<Ctrl+q>            resume scrolling
<Ctrl+z>            suspend a program
<Ctrl+c>            terminate a program

Miscellaneous Commands:
!!                                 repeat the last command
history                            list previous commands
sort filename                      sort filename in numerical and/or alphabetical order and display the result on the screen
which command                      show the path of the directory where a particular command is located
grep "string" filename             find lines containing "string" in filename and display them on the screen
tar -zcf dirname.tar.gz dirname    create a .tar.gz archive called dirname.tar.gz containing the directory dirname
tar -zxf dirname.tar.gz            extract the .tar.gz archive called dirname.tar.gz
tar -ztvf dirname.tar.gz           list the files in the archive dirname.tar.gz without extracting them
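As a worked example combining several of the commands above (directory and file names are placeholders):
tar -zcf results.tar.gz results     # pack the directory results into results.tar.gz
tar -ztvf results.tar.gz            # check the archive's contents without extracting
grep "ERROR" results/run.log        # show lines containing ERROR in a log file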
sftp commands:
get filename        download a file from the remote computer
put filename        upload a file to the remote computer
cd dirname          change directory on the remote computer
lcd dirname         change directory on the local computer

smbclient commands:
get filename        download a file from the remote computer
put filename        upload a file to the remote computer
mget                download multiple files from the remote computer
mput                upload multiple files to the remote computer
cd dirname          change directory on the remote computer
recurse             toggle recursive file transfer on/off
prompt              toggle confirmation prompts for file transfers on/off
!cd dirname         change directory on the local computer
!ls                 list files on the local computer
!<shell command>    execute any shell command on the local computer, via the exclamation mark
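A minimal sketch of an smbclient session using the commands above (the server and share names are placeholders, not Artemis-specific values):
smbclient //fileserver/share -U abcd1234
smb: \> cd data
smb: \> prompt           # turn off per-file confirmation prompts
smb: \> mget *.csv       # download multiple files
smb: \> quit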

Date published: 13 Jun 2018. © The University of Sydney
