GET STARTED

Log in on your course laptop:

$ cd ~/crg-course
$ vagrant up
$ vagrant ssh

Once in the virtual machine:

$ cd ~/nextflow-tutorial
$ git pull
$ nextflow info
PROCESS INPUTS

The input block declares the values a process consumes from its channels:

process procName {

    input:
    val x from ch_1
    file y from ch_2
    file 'data.fa' from ch_3
    stdin from ch_4
    set (x, 'file.txt') from ch_5

    """
    <your script>
    """
}
PIPELINE PARAMETERS

params.p1 = 'alpha'
params.p2 = 'beta'

Pipeline parameters are simply declared as variables prefixed by params. When launching your script you can override the default values:

$ nextflow run <script.nf> --p1 'delta' --p2 'gamma'
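A minimal sketch (the parameter name and file name are made up) showing a parameter with a default value used inside a process script:

params.greeting = 'Hello'

process sayHello {
    output:
    stdout into out

    """
    echo '${params.greeting} world!'
    """
}

out.subscribe { print it }

Run it with the default, or override it on the command line:

$ nextflow run hello.nf --greeting 'Bonjour'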
COLLECT FILE

The collectFile operator gathers the items produced by upstream processes and groups them into files:

my_items.collectFile(storeDir: 'path/name') {
    def key     = get_a_key_from_the_item(it)
    def content = get_the_item_value(it)
    [ key, content ]
}
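A concrete sketch (the channel contents and file names are invented) that groups words into one file per language code; the closure returns a pair whose first element is the target file name and whose second element is the content to append:

Channel
    .from( ['en', 'hello'], ['en', 'hi'], ['it', 'ciao'] )
    .collectFile(storeDir: 'results') { item ->
        def (lang, word) = item
        [ "${lang}.txt", word + '\n' ]
    }
    .subscribe { file -> println "Saved ${file.name}" }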
HOW TO USE DOCKER

Specify in the config file the Docker image to use:

process {
    container = 'your/image'
}

Add the -with-docker flag when launching the pipeline:

$ nextflow run <script.nf> -with-docker
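As an alternative to the command line flag, Docker can also be enabled directly in the configuration file; a minimal nextflow.config sketch (the image name is only an example, not from the slide):

process {
    container = 'ubuntu:16.04'
}

docker {
    enabled = true
}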
HOW TO USE THE CLUSTER

Define the CRG executor in nextflow.config:

// default properties for any process
process {
    executor = 'crg'
    queue    = 'short'
    cpus     = 2
    memory   = '4GB'
    scratch  = true
}
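Individual processes can still override these defaults with directives declared in the script itself; a sketch with a hypothetical process (queue name and resource values are examples):

process heavyAlignment {
    cpus 8
    memory '16GB'
    queue 'long'

    // placeholder standing in for a resource-hungry command
    """
    echo "running the heavy step here"
    """
}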
EXAMPLE 5

Log in on ANT-LOGIN:

$ ssh username@ant-login.linux.crg.es

If you have module configured:

$ module avail
$ module purge
$ module load nextflow/0.12.3-goolf-1.4.10-no-OFED-Java-1.7.0_21

Otherwise, install it by downloading it from the internet:

$ curl -fsSL get.nextflow.io | bash
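The curl installer drops a nextflow launcher in the current directory; as a quick check (not part of the slide) you can print its version:

$ ./nextflow -version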
EXAMPLE 5

Create the following nextflow.config file:

process {
    executor = 'crg'
    queue    = 'course'
    scratch  = true
}

Launch the pipeline execution:

$ nextflow run rnatoy -with-docker -with-trace
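The -with-trace option writes a tab-separated execution report (trace.txt by default) in the launch directory; a generic way to skim it after the run (not from the slide):

$ column -t trace.txt | less -S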