GATK APPLYBQSRSPARK

ApplyBQSRSpark: Apply base quality score recalibration on Spark; uses output of the BaseRecalibrator tool.

URL: https://gatk.broadinstitute.org/hc/en-us/articles/9570424849051-ApplyBQSRSpark-BETA-

Example

This wrapper can be used in the following way:

rule gatk_applybqsr_spark:
    input:
        bam="mapped/{sample}.bam",
        ref="genome.fasta",
        dict="genome.dict",
        recal_table="recal/{sample}.grp",
    output:
        bam="recal/{sample}.bam",
    log:
        "logs/gatk/gatk_applybqsr_spark/{sample}.log",
    params:
        extra="",  # optional
        java_opts="",  # optional
        #spark_runner="",  # optional, local by default
        #spark_master="",  # optional
        #spark_extra="", # optional
        embed_ref=True,  # embed reference in cram output
        exceed_thread_limit=True,  # samtools is also parallelized, so the overall thread limit is no longer guaranteed
    resources:
        mem_mb=1024,
    wrapper:
        "v3.8.0-49-g6f33607/bio/gatk/applybqsrspark"

Note that input, output and log file paths can be chosen freely.
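
The recal_table input is produced upstream by the BaseRecalibrator tool. A minimal sketch of such a producing rule, assuming the bio/gatk/baserecalibrator wrapper and a placeholder known-sites file (neither is specified on this page), could look like:

rule gatk_baserecalibrator:
    input:
        bam="mapped/{sample}.bam",
        ref="genome.fasta",
        dict="genome.dict",
        known="known_sites.vcf.gz",  # placeholder known-sites resource
    output:
        recal_table="recal/{sample}.grp",
    log:
        "logs/gatk/baserecalibrator/{sample}.log",
    resources:
        mem_mb=1024,
    wrapper:
        "v3.8.0-49-g6f33607/bio/gatk/baserecalibrator"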

When running with

snakemake --use-conda

the software dependencies will be automatically deployed into an isolated environment before execution.

Notes

  • The java_opts param allows for additional arguments to be passed to the Java virtual machine, e.g. “-XX:ParallelGCThreads=10” (not for -Xmx or -Djava.io.tmpdir, since they are handled automatically).

  • The extra param allows for additional program arguments for ApplyBQSRSpark.

  • The spark_runner param (“LOCAL”, “SPARK”, or “GCS”) selects the Spark runner. Set it to “LOCAL”, or leave it unset, to run on the local machine.

  • The spark_master param sets the URL of the Spark master the job is submitted to. Set it to “local[number_of_cores]” for local execution, or leave it unset to run locally with the number of cores determined by Snakemake (see the sketch after this list).

  • The spark_extra param allows for additional Spark arguments.
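
As a minimal sketch of how these parameters fit into the rule's params section (all values below are illustrative assumptions, e.g. the eight-core master URL and the GC-thread flag):

    params:
        extra="",  # no additional ApplyBQSRSpark arguments
        java_opts="-XX:ParallelGCThreads=10",  # assumed JVM tuning flag
        spark_runner="LOCAL",  # run Spark on the local machine
        spark_master="local[8]",  # assumed: use 8 local cores
        spark_extra="",  # no additional Spark arguments
    threads: 8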

Software dependencies

  • gatk4=4.5.0.0

  • snakemake-wrapper-utils=0.6.2

  • samtools=1.19.2
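
These are pinned in the wrapper's conda environment; a sketch of an equivalent environment.yaml (the channel configuration is an assumption, not taken from this page):

channels:
  - conda-forge
  - bioconda
dependencies:
  - gatk4 =4.5.0.0
  - snakemake-wrapper-utils =0.6.2
  - samtools =1.19.2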

Input/Output

Input:

  • bam file

  • fasta reference

  • recalibration table for the bam

Output:

  • recalibrated bam file

Authors

  • Filipe G. Vieira

  • Christopher Schröder

Code

__author__ = "Filipe G. Vieira, Christopher Schröder"
__copyright__ = "Copyright 2021, Filipe G. Vieira"
__license__ = "MIT"

import tempfile
import random
from pathlib import Path

from snakemake.shell import shell
from snakemake_wrapper_utils.java import get_java_opts

extra = snakemake.params.get("extra", "")
spark_runner = snakemake.params.get("spark_runner", "LOCAL")
spark_master = snakemake.params.get(
    "spark_master", "local[{}]".format(snakemake.threads)
)
spark_extra = snakemake.params.get("spark_extra", "")
reference = snakemake.input.get("ref")
embed_ref = snakemake.params.get("embed_ref", False)
exceed_thread_limit = snakemake.params.get("exceed_thread_limit", False)
java_opts = get_java_opts(snakemake)

log = snakemake.log_fmt_shell(stdout=True, stderr=True)

# samtools also runs threaded in the pipe below, so granting it the full
# thread count can exceed the number of threads requested from Snakemake
if exceed_thread_limit:
    samtools_threads = snakemake.threads
else:
    samtools_threads = 1

if snakemake.output.bam.endswith(".cram") and embed_ref:
    # Stream GATK output to stdout and pipe it through samtools to embed
    # the reference sequence in the final CRAM
    output = "/dev/stdout --create-output-bam-splitting-index false"
    pipe_cmd = " | samtools view -h -O cram,embed_ref -T {reference} -o {snakemake.output.bam} -@ {samtools_threads} -"
else:
    output = snakemake.output.bam
    pipe_cmd = ""


with tempfile.TemporaryDirectory() as tmpdir:
    # This folder must not exist; it is created by GATK
    tmpdir_shards = Path(tmpdir) / "shards_{:06d}".format(random.randrange(10**6))

    shell(
        "(gatk --java-options '{java_opts}' ApplyBQSRSpark"
        " --input {snakemake.input.bam}"
        " --bqsr-recal-file {snakemake.input.recal_table}"
        " --reference {snakemake.input.ref}"
        " {extra}"
        " --tmp-dir {tmpdir}"
        " --output-shard-tmp-dir {tmpdir_shards}"
        " --output {output}"
        " -- --spark-runner {spark_runner} --spark-master {spark_master} {spark_extra}"
        + pipe_cmd
        + ") {log}"
    )
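
For orientation, with default params, one thread, and a BAM output, the shell() template above composes to roughly the following command (the -Xmx value, sample name, and tmp-dir paths are illustrative; get_java_opts derives the actual heap size from resources):

(gatk --java-options '-Xmx1024m' ApplyBQSRSpark \
    --input mapped/A.bam \
    --bqsr-recal-file recal/A.grp \
    --reference genome.fasta \
    --tmp-dir /tmp/tmp1a2b3c \
    --output-shard-tmp-dir /tmp/tmp1a2b3c/shards_042817 \
    --output recal/A.bam \
    -- --spark-runner LOCAL --spark-master local[1]) \
    > logs/gatk/gatk_applybqsr_spark/A.log 2>&1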