GATK BASERECALIBRATORSPARK
Run gatk BaseRecalibratorSpark.
URL: https://gatk.broadinstitute.org/hc/en-us/articles/9570309302171-BaseRecalibratorSpark-BETA-
Example
This wrapper can be used in the following way:
rule gatk_baserecalibratorspark:
    input:
        bam="mapped/{sample}.bam",
        ref="genome.fasta",
        dict="genome.dict",
        known="dbsnp.vcf.gz",  # optional known sites
    output:
        recal_table="recal/{sample}.grp",
    log:
        "logs/gatk/baserecalibrator/{sample}.log",
    params:
        extra="",  # optional
        java_opts="",  # optional
        # spark_runner="",  # optional, local by default
        # spark_master="",  # optional
        # spark_extra="",  # optional
    resources:
        mem_mb=1024,
    threads: 8
    wrapper:
        "v5.5.2-17-g33d5b76/bio/gatk/baserecalibratorspark"
Note that input, output and log file paths can be chosen freely.
When running with snakemake --use-conda, the software dependencies will be automatically deployed into an isolated environment before execution.
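For example, to build the recalibration table for a hypothetical sample named A with the rule above, an invocation could look like:

snakemake --use-conda --cores 8 recal/A.grp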
Notes
The java_opts param allows for additional arguments to be passed to the Java virtual machine, e.g. "-XX:ParallelGCThreads=10" (not for -Xmx or -Djava.io.tmpdir, since they are handled automatically).
The extra param allows for additional program arguments for BaseRecalibratorSpark.
The spark_runner param ("LOCAL", "SPARK", or "GCS") sets the Spark runner. Set it to "LOCAL" or leave it unset to run on the local machine.
The spark_master param sets the URL of the Spark master to which the job is submitted. Set it to "local[number_of_cores]" for local execution, or leave it unset to run locally with the number of cores determined by Snakemake.
The spark_extra param allows for additional Spark arguments.
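As an illustrative sketch only (the cluster URL and Spark settings below are hypothetical placeholders, not defaults of this wrapper), the commented-out params in the example could be filled in to submit the job to a standalone Spark cluster instead of running locally:

    params:
        extra="",  # optional program arguments
        java_opts="-XX:ParallelGCThreads=10",
        spark_runner="SPARK",  # submit to a Spark cluster instead of the local machine
        spark_master="spark://spark-master.example.org:7077",  # hypothetical master URL
        spark_extra="--conf spark.executor.memory=4g",  # illustrative extra Spark settings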
Software dependencies
gatk4=4.6.1.0
snakemake-wrapper-utils=0.6.2
Input/Output
Input:
BAM file
FASTA reference (with sequence dictionary)
VCF.gz of known variants (optional)
Output:
recalibration table for the BAM file
Code
__author__ = "Christopher Schröder"
__copyright__ = "Copyright 2020, Christopher Schröder"
__email__ = "christopher.schroeder@tu-dortmund.de"
__license__ = "MIT"
import tempfile

from snakemake.shell import shell
from snakemake_wrapper_utils.java import get_java_opts

extra = snakemake.params.get("extra", "")
spark_runner = snakemake.params.get("spark_runner", "LOCAL")
spark_master = snakemake.params.get(
    "spark_master", "local[{}]".format(snakemake.threads)
)
spark_extra = snakemake.params.get("spark_extra", "")
java_opts = get_java_opts(snakemake)

log = snakemake.log_fmt_shell(stdout=True, stderr=True)

# The known-sites VCF is optional; only add the flag if it was provided.
known = snakemake.input.get("known", "")
if known:
    known = "--known-sites {}".format(known)

# Run BaseRecalibratorSpark; Spark-specific arguments follow the "--" separator.
with tempfile.TemporaryDirectory() as tmpdir:
    shell(
        "gatk --java-options '{java_opts}' BaseRecalibratorSpark"
        " --input {snakemake.input.bam}"
        " --reference {snakemake.input.ref}"
        " {extra}"
        " --tmp-dir {tmpdir}"
        " --output {snakemake.output.recal_table} {known}"
        " -- --spark-runner {spark_runner} --spark-master {spark_master} {spark_extra}"
        " {log}"
    )
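For a hypothetical sample A run with the defaults above (local runner, 8 threads), the rendered command would look roughly like this, with the Java options and temporary directory filled in by the wrapper:

gatk --java-options '<java_opts>' BaseRecalibratorSpark --input mapped/A.bam --reference genome.fasta --tmp-dir <tmpdir> --output recal/A.grp --known-sites dbsnp.vcf.gz -- --spark-runner LOCAL --spark-master local[8] > logs/gatk/baserecalibrator/A.log 2>&1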