doepipeline: a systematic approach to optimizing multi-level and multi-step data processing workflows

About this item

Full title

doepipeline: a systematic approach to optimizing multi-level and multi-step data processing workflows

Publisher

England: BioMed Central Ltd

Journal title

BMC bioinformatics, 2019-10, Vol.20 (1), p.498-13, Article 498

Language

English

More information

Scope and Contents

Selecting the proper parameter settings for bioinformatic software tools is challenging. Not only does each parameter have an individual effect on the outcome, but there are also potential interaction effects between parameters. Both kinds of effects may be difficult to predict. To make the situation even more complex, multiple tools may be run in a sequential pipeline where the final output depends on the parameter configuration of each tool in the pipeline. Because of this complexity and the difficulty of predicting outcomes, in practice parameters are often left at their defaults or set based on personal or peer experience obtained in a trial-and-error fashion. To allow for the reliable and efficient selection of parameters for bioinformatic pipelines, a systematic approach is needed.
We present doepipeline, a novel approach to optimizing bioinformatic software parameters based on core concepts of Design of Experiments (DOE) methodology and recent advances in subset designs. Optimal parameter settings are first approximated in a screening phase, using a subset design that efficiently spans the entire search space, and then refined in a subsequent phase using response surface designs and ordinary least squares (OLS) modeling. We used doepipeline to optimize parameters in four use cases: (1) de novo assembly, (2) scaffolding of a fragmented genome assembly, (3) k-mer taxonomic classification of Oxford Nanopore Technologies MinION reads, and (4) genetic variant calling. In all four cases, doepipeline found parameter settings that produced a better outcome, with respect to the characteristic measured, than the default values. Our approach is implemented and available in the Python package doepipeline.
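The screening-then-response-surface idea described in the abstract can be sketched in a few lines of NumPy. Note that the objective function, the parameter names, and the point layouts below are illustrative stand-ins, not doepipeline's actual API or the designs it generates:

```python
import numpy as np

# Hypothetical objective: a quality score for a pipeline run as a function
# of two parameters (say, a k-mer size k and a coverage cutoff c), with its
# true optimum at (31, 5). In practice each evaluation would be a real
# pipeline execution scored on some characteristic of its output.
def pipeline_score(k, c):
    return -(k - 31) ** 2 / 50 - (c - 5) ** 2 / 4 + 0.01 * (k - 31) * (c - 5)

# Phase 1 (screening): evaluate a coarse subset that spans the search space
# and keep the best-scoring point as the center for the next phase.
grid = [(k, c) for k in (15, 23, 31, 39, 47) for c in (1, 3, 5, 7, 9)]
scores = [pipeline_score(k, c) for k, c in grid]
k0, c0 = grid[int(np.argmax(scores))]

# Phase 2 (response surface): evaluate points around the screening optimum,
# fit a full quadratic model by ordinary least squares, and take the
# fitted surface's stationary point as the refined parameter setting.
pts = [(k0 + dk, c0 + dc) for dk in (-4, 0, 4) for dc in (-2, 0, 2)]
X = np.array([[1, k, c, k * k, c * c, k * c] for k, c in pts], float)
y = np.array([pipeline_score(k, c) for k, c in pts])
b = np.linalg.lstsq(X, y, rcond=None)[0]

# Stationary point: solve grad f = 0 for the quadratic model
# f = b0 + b1*k + b2*c + b3*k^2 + b4*c^2 + b5*k*c.
H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
k_opt, c_opt = np.linalg.solve(H, -np.array([b[1], b[2]]))
```

Because the toy objective here is itself quadratic, the fitted surface recovers the optimum exactly; with a real pipeline the surface is only a local approximation, which is why the screening phase is needed to place it in a promising region first.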
Our proposed methodology provides a systematic and robust framework for optimizing software parameter settings, in contrast to labor- and time-intensive manual parameter tweaking. Its implementation in doepipeline makes the methodology accessible and user-friendly, and allows automatic optimization of tools in a wide range of cases. The source code of doepipeline is available at https://github.com/clicumu/doepipeline and can be installed through conda-forge.

Identifiers

Primary Identifiers

Record Identifier

TN_cdi_doaj_primary_oai_doaj_org_article_cf8fcb2f6f8d4bdea6c88d2f2d16fcad

Permalink

https://devfeature-collection.sl.nsw.gov.au/record/TN_cdi_doaj_primary_oai_doaj_org_article_cf8fcb2f6f8d4bdea6c88d2f2d16fcad

Other Identifiers

ISSN

1471-2105

E-ISSN

1471-2105

DOI

10.1186/s12859-019-3091-z
