Mercurial > repos > pjbriggs > amplicon_analysis_pipeline
changeset 3:3ab198df8f3f draft
planemo upload for repository https://github.com/pjbriggs/Amplicon_analysis-galaxy commit 15390f18b91d838880d952eb2714f689bbd8a042
author | pjbriggs |
---|---|
date | Thu, 18 Oct 2018 09:18:04 -0400 |
parents | 43d6f81bc667 |
children | 86a12d75ebe4 |
files | README.rst amplicon_analysis_pipeline.py amplicon_analysis_pipeline.xml install_amplicon_analysis.sh install_tool_deps.sh tool_dependencies.xml |
diffstat | 6 files changed, 483 insertions(+), 807 deletions(-)
--- a/README.rst Wed Jun 13 07:45:06 2018 -0400 +++ b/README.rst Thu Oct 18 09:18:04 2018 -0400 @@ -26,20 +26,8 @@ instance to detect the dependencies and reference data correctly at run time. -1. Install the dependencies ---------------------------- - -The ``install_tool_deps.sh`` script can be used to fetch and install the -dependencies locally, for example:: - - install_tool_deps.sh /path/to/local_tool_dependencies - -This can take some time to complete. When finished it should have -created a set of directories containing the dependencies under the -specified top level directory. - -2. Install the tool files -------------------------- +1. Install the tool from the toolshed +------------------------------------- The core tool is hosted on the Galaxy toolshed, so it can be installed directly from there (this is the recommended route): @@ -58,7 +46,7 @@ <tool file="Amplicon_analysis/amplicon_analysis_pipeline.xml" /> -3. Install the reference data +2. Install the reference data ----------------------------- The script ``References.sh`` from the pipeline package at @@ -72,33 +60,14 @@ will install the data in ``/path/to/pipeline/data``. **NB** The final amount of data downloaded and uncompressed will be -around 6GB. - -4. Configure dependencies and reference data in Galaxy ------------------------------------------------------- - -The final steps are to make your Galaxy installation aware of the -tool dependencies and reference data, so it can locate them both when -the tool is run. - -To target the tool dependencies installed previously, add the -following lines to the ``dependency_resolvers_conf.xml`` file in the -Galaxy ``config`` directory:: +around 9GB. - <dependency_resolvers> - ... - <galaxy_packages base_path="/path/to/local_tool_dependencies" /> - <galaxy_packages base_path="/path/to/local_tool_dependencies" versionless="true" /> - ... - </dependency_resolvers> +3. 
Configure reference data location in Galaxy +---------------------------------------------- -(NB it is recommended to place these *before* the ``<conda ... />`` -resolvers) - -(If you're not familiar with dependency resolvers in Galaxy then -see the documentation at -https://docs.galaxyproject.org/en/master/admin/dependency_resolvers.html -for more details.) +The final step is to make your Galaxy installation aware of the +location of the reference data, so it can locate it when the +tool is run. The tool locates the reference data via an environment variable called ``AMPLICON_ANALYSIS_REF_DATA_PATH``, which needs to be set to the parent @@ -108,7 +77,8 @@ installation is configured: * **For local instances:** add a line to set it in the - ``config/local_env.sh`` file of your Galaxy installation, e.g.:: + ``config/local_env.sh`` file of your Galaxy installation (you + may need to create a new empty file first), e.g.:: export AMPLICON_ANALYSIS_REF_DATA_PATH=/path/to/pipeline/data @@ -124,9 +94,9 @@ <tool id="amplicon_analysis_pipeline" destination="amplicon_analysis"/> (For more about job destinations see the Galaxy documentation at - https://galaxyproject.org/admin/config/jobs/#job-destinations) + https://docs.galaxyproject.org/en/master/admin/jobs.html#job-destinations) -5. Enable rendering of HTML outputs from pipeline +4. Enable rendering of HTML outputs from pipeline ------------------------------------------------- To ensure that HTML outputs are displayed correctly in Galaxy @@ -171,46 +141,32 @@ https://github.com/galaxyproject/galaxy/issues/4490 and https://github.com/galaxyproject/galaxy/issues/1676 -Appendix: availability of tool dependencies -=========================================== - -The tool takes its dependencies from the underlying pipeline script (see -https://github.com/MTutino/Amplicon_analysis/blob/master/README.md -for details). 
+Appendix: installing the dependencies manually +============================================== -As noted above, currently the ``install_tool_deps.sh`` script can be -used to manually install the dependencies for a local tool install. +If the tool is installed from the Galaxy toolshed (recommended) then +the dependencies should be installed automatically and this step can +be skipped. -In principle these should also be available if the tool were installed -from a toolshed. However it would be preferrable in this case to get as -many of the dependencies as possible via the ``conda`` dependency -resolver. +Otherwise the ``install_amplicon_analysis.sh`` script can be used +to fetch and install the dependencies locally, for example:: -The following are known to be available via conda, with the required -version: + install_amplicon_analysis.sh /path/to/local_tool_dependencies - - cutadapt 1.8.1 - - sickle-trim 1.33 - - bioawk 1.0 - - fastqc 0.11.3 - - R 3.2.0 - -Some dependencies are available but with the "wrong" versions: +(This is the same script as is used to install dependencies from the +toolshed.) This can take some time to complete, and when finished will +have created a directory called ``Amplicon_analysis-1.2.3`` containing +the dependencies under the specified top level directory. - - spades (need 3.5.0) - - qiime (need 1.8.0) - - blast (need 2.2.26) - - vsearch (need 1.1.3) - -The following dependencies are currently unavailable: +**NB** The installed dependencies will occupy around 2.6GB of disk +space. 
- - fasta_number (need 02jun2015) - - fasta-splitter (need 0.2.4) - - rdp_classifier (need 2.2) - - microbiomeutil (need r20110519) +You will need to make sure that the ``bin`` subdirectory of this +directory is on Galaxy's ``PATH`` at runtime, for the tool to be able +to access the dependencies - for example by adding a line to the +``local_env.sh`` file like:: -(NB usearch 6.1.544 and 8.0.1623 are special cases which must be -handled outside of Galaxy's dependency management systems.) + export PATH=/path/to/local_tool_dependencies/Amplicon_analysis-1.2.3/bin:$PATH History ======= @@ -218,6 +174,8 @@ ========== ====================================================================== Version Changes ---------- ---------------------------------------------------------------------- +1.2.3.0 Updated to Amplicon_Analysis_Pipeline version 1.2.3; install + dependencies via tool_dependencies.xml. 1.2.2.0 Updated to Amplicon_Analysis_Pipeline version 1.2.2 (removes jackknifed analysis which is not captured by Galaxy tool) 1.2.1.0 Updated to Amplicon_Analysis_Pipeline version 1.2.1 (adds
--- a/amplicon_analysis_pipeline.py Wed Jun 13 07:45:06 2018 -0400 +++ b/amplicon_analysis_pipeline.py Thu Oct 18 09:18:04 2018 -0400 @@ -60,9 +60,10 @@ sys.stderr.write("%s\n\n" % ('*'*width)) def clean_up_name(sample): - # Remove trailing "_L[0-9]+_001" from Fastq - # pair names - split_name = sample.split('_') + # Remove extensions and trailing "_L[0-9]+_001" from + # Fastq pair names + sample_name = '.'.join(sample.split('.')[:1]) + split_name = sample_name.split('_') if split_name[-1] == "001": split_name = split_name[:-1] if split_name[-1].startswith('L'): @@ -139,10 +140,12 @@ # Link to FASTQs and construct Final_name.txt file sample_names = [] + print "-- making Final_name.txt" with open("Final_name.txt",'w') as final_name: fastqs = iter(args.fastq_pairs) for sample_name,fqr1,fqr2 in zip(fastqs,fastqs,fastqs): sample_name = clean_up_name(sample_name) + print " %s" % sample_name r1 = "%s_R1_.fastq" % sample_name r2 = "%s_R2_.fastq" % sample_name os.symlink(fqr1,r1)
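The hunk above only shows the first few lines of the patched ``clean_up_name``; its overall behaviour can be sketched as follows (the handling of the trailing lane token such as ``L001`` is an assumption based on the visible fragment, not the full committed function):

```python
def clean_up_name(sample):
    """Strip extensions and trailing Illumina-style suffixes
    (e.g. '_L001_001') from a Fastq pair name."""
    # Drop any extensions (e.g. ".fastq.gz")
    sample_name = sample.split('.')[0]
    split_name = sample_name.split('_')
    # Drop a trailing "001" chunk
    if split_name[-1] == "001":
        split_name = split_name[:-1]
    # Drop a trailing lane token like "L001" (assumed behaviour)
    if split_name[-1].startswith('L') and split_name[-1][1:].isdigit():
        split_name = split_name[:-1]
    return '_'.join(split_name)
```

So a name like ``Sample1_S1_L001_001.fastq.gz`` reduces to ``Sample1_S1``, which matches the ``%s_R1_.fastq`` symlink naming used in the driver code above.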
--- a/amplicon_analysis_pipeline.xml Wed Jun 13 07:45:06 2018 -0400 +++ b/amplicon_analysis_pipeline.xml Thu Oct 18 09:18:04 2018 -0400 @@ -1,21 +1,7 @@ -<tool id="amplicon_analysis_pipeline" name="Amplicon Analysis Pipeline" version="1.2.2.0"> +<tool id="amplicon_analysis_pipeline" name="Amplicon Analysis Pipeline" version="1.2.3.0"> <description>analyse 16S rRNA data from Illumina Miseq paired-end reads</description> <requirements> - <requirement type="package" version="1.2.2">amplicon_analysis_pipeline</requirement> - <requirement type="package" version="1.11">cutadapt</requirement> - <requirement type="package" version="1.33">sickle</requirement> - <requirement type="package" version="27-08-2013">bioawk</requirement> - <requirement type="package" version="2.8.1">pandaseq</requirement> - <requirement type="package" version="3.5.0">spades</requirement> - <requirement type="package" version="0.11.3">fastqc</requirement> - <requirement type="package" version="1.8.0">qiime</requirement> - <requirement type="package" version="2.2.26">blast</requirement> - <requirement type="package" version="0.2.4">fasta-splitter</requirement> - <requirement type="package" version="2.2">rdp-classifier</requirement> - <requirement type="package" version="3.2.0">R</requirement> - <requirement type="package" version="1.1.3">vsearch</requirement> - <requirement type="package" version="2010-04-29">microbiomeutil</requirement> - <requirement type="package">fasta_number</requirement> + <requirement type="package" version="1.2.3">amplicon_analysis_pipeline</requirement> </requirements> <stdio> <exit_code range="1:" />
--- /dev/null Thu Jan 01 00:00:00 1970 +0000 +++ b/install_amplicon_analysis.sh Thu Oct 18 09:18:04 2018 -0400 @@ -0,0 +1,425 @@ +#!/bin/sh -e +# +# Prototype script to setup a conda environment with the +# dependencies needed for the Amplicon_analysis_pipeline +# script +# +# Handle command line +usage() +{ + echo "Usage: $(basename $0) [DIR]" + echo "" + echo "Installs the Amplicon_analysis_pipeline package plus" + echo "dependencies in directory DIR (or current directory " + echo "if DIR not supplied)" +} +if [ ! -z "$1" ] ; then + # Check if help was requested + case "$1" in + --help|-h) + usage + exit 0 + ;; + esac + # Assume it's the installation directory + cd $1 +fi +# Versions +PIPELINE_VERSION=1.2.3 +RDP_CLASSIFIER_VERSION=2.2 +# Directories +TOP_DIR=$(pwd)/Amplicon_analysis-${PIPELINE_VERSION} +BIN_DIR=${TOP_DIR}/bin +CONDA_DIR=${TOP_DIR}/conda +CONDA_BIN=${CONDA_DIR}/bin +CONDA_LIB=${CONDA_DIR}/lib +CONDA=${CONDA_BIN}/conda +ENV_NAME="amplicon_analysis_pipeline@${PIPELINE_VERSION}" +ENV_DIR=${CONDA_DIR}/envs/$ENV_NAME +# +# Functions +# +# Report failure and terminate script +fail() +{ + echo "" + echo ERROR $@ >&2 + echo "" + echo "$(basename $0): installation failed" + exit 1 +} +# +# Rewrite the shebangs in the installed conda scripts +# to remove the full path to conda 'bin' directory +rewrite_conda_shebangs() +{ + pattern="s,^#!${CONDA_BIN}/,#!/usr/bin/env ,g" + find ${CONDA_BIN} -type f -exec sed -i "$pattern" {} \; +} +# +# Install conda +install_conda() +{ + echo "++++++++++++++++" + echo "Installing conda" + echo "++++++++++++++++" + if [ -e ${CONDA_DIR} ] ; then + echo "*** $CONDA_DIR already exists ***" >&2 + return + fi + local cwd=$(pwd) + local wd=$(mktemp -d) + cd $wd + wget -q https://repo.continuum.io/miniconda/Miniconda2-latest-Linux-x86_64.sh + bash ./Miniconda2-latest-Linux-x86_64.sh -b -p ${CONDA_DIR} + echo Installed conda in ${CONDA_DIR} + # Update the installation files + # This is to avoid problems when the length of the installation + # directory path exceeds the limit for the shebang statement + # in the conda files + echo "" + echo -n "Rewriting conda shebangs..." + rewrite_conda_shebangs + echo "ok" + echo -n "Adding conda bin to PATH..." + PATH=${CONDA_BIN}:$PATH + echo "ok" + cd $cwd + rm -rf $wd/* + rmdir $wd +} +# +# Create conda environment +install_conda_packages() +{ + echo "+++++++++++++++++++++++++" + echo "Installing conda packages" + echo "+++++++++++++++++++++++++" + local cwd=$(pwd) + local wd=$(mktemp -d) + cd $wd + cat >environment.yml <<EOF +name: ${ENV_NAME} +channels: + - defaults + - conda-forge + - bioconda +dependencies: + - python=2.7 + - cutadapt=1.11 + - sickle-trim=1.33 + - bioawk=1.0 + - pandaseq=2.8.1 + - spades=3.5.0 + - fastqc=0.11.3 + - qiime=1.8.0 + - blast-legacy=2.2.26 + - fasta-splitter=0.2.4 + - rdp_classifier=$RDP_CLASSIFIER_VERSION + - vsearch=1.1.3 + # Need to explicitly specify libgfortran + # version (otherwise get version incompatible + # with numpy=1.7.1) + - libgfortran=1.0 + # Compilers needed to build R + - gcc_linux-64 + - gxx_linux-64 + - gfortran_linux-64 +EOF + ${CONDA} env create --name "${ENV_NAME}" -f environment.yml + echo Created conda environment in ${ENV_DIR} + cd $cwd + rm -rf $wd/* + rmdir $wd +} +# +# Install all the non-conda dependencies in a single +# function (invokes separate functions for each package) +install_non_conda_packages() +{ + echo "+++++++++++++++++++++++++++++" + echo "Installing non-conda packages" + echo "+++++++++++++++++++++++++++++" + # Temporary working directory + local cwd=$(pwd) + local wd=$(mktemp -d) + cd $wd + # Amplicon analysis pipeline + echo -n "Installing Amplicon_analysis_pipeline..." + if [ -e ${BIN_DIR}/Amplicon_analysis_pipeline.sh ] ; then + echo "already installed" + else + install_amplicon_analysis_pipeline + echo "ok" + fi + # ChimeraSlayer + echo -n "Installing ChimeraSlayer..." 
+ if [ -e ${BIN_DIR}/ChimeraSlayer.pl ] ; then + echo "already installed" + else + install_chimeraslayer + echo "ok" + fi + # Uclust + echo -n "Installing uclust for QIIME/pyNAST..." + if [ -e ${BIN_DIR}/uclust ] ; then + echo "already installed" + else + install_uclust + echo "ok" + fi + # R 3.2.1 + echo -n "Checking for R 3.2.1..." + if [ -e ${BIN_DIR}/R ] ; then + echo "R already installed" + else + echo "not found" + install_R_3_2_1 + fi +} +# +# Amplicon analysis pipeline +install_amplicon_analysis_pipeline() +{ + local cwd=$(pwd) + local wd=$(mktemp -d) + cd $wd + wget -q https://github.com/MTutino/Amplicon_analysis/archive/v${PIPELINE_VERSION}.tar.gz + tar zxf v${PIPELINE_VERSION}.tar.gz + cd Amplicon_analysis-${PIPELINE_VERSION} + INSTALL_DIR=${TOP_DIR}/share/amplicon_analysis_pipeline-${PIPELINE_VERSION} + mkdir -p $INSTALL_DIR + ln -s $INSTALL_DIR ${TOP_DIR}/share/amplicon_analysis_pipeline + for f in *.sh ; do + /bin/cp $f $INSTALL_DIR + done + /bin/cp -r uc2otutab $INSTALL_DIR + mkdir -p ${BIN_DIR} + cat >${BIN_DIR}/Amplicon_analysis_pipeline.sh <<EOF +#!/usr/bin/env bash +# +# Point to Qiime config +export QIIME_CONFIG_FP=${TOP_DIR}/qiime/qiime_config +# Set up the RDP jar file +export RDP_JAR_PATH=${TOP_DIR}/share/rdp_classifier/rdp_classifier-${RDP_CLASSIFIER_VERSION}.jar +# Put the scripts onto the PATH +export PATH=${BIN_DIR}:${INSTALL_DIR}:\$PATH +# Activate the conda environment +export PATH=${CONDA_BIN}:\$PATH +source ${CONDA_BIN}/activate ${ENV_NAME} +# Execute the driver script with the supplied arguments +$INSTALL_DIR/Amplicon_analysis_pipeline.sh "\$@" +exit \$? 
+EOF + chmod 0755 ${BIN_DIR}/Amplicon_analysis_pipeline.sh + cat >${BIN_DIR}/install_reference_data.sh <<EOF +#!/usr/bin/env bash +set -e +# +function usage() { + echo "Usage: \$(basename \$0) DIR" +} +if [ -z "\$1" ] ; then + usage + exit 0 +elif [ "\$1" == "--help" ] || [ "\$1" == "-h" ] ; then + usage + echo "" + echo "Install reference data into DIR" + exit 0 +fi +echo "==========================================" +echo "Installing Amplicon analysis pipeline data" +echo "==========================================" +if [ ! -e "\$1" ] ; then + echo "Making directory \$1" + mkdir -p \$1 +fi +cd \$1 +DATA_DIR=\$(pwd) +echo "Installing reference data under \$DATA_DIR" +$INSTALL_DIR/References.sh +echo "" +echo "Use '-r \$DATA_DIR' when running Amplicon_analysis_pipeline.sh" +echo "to use the reference data from this directory" +echo "" +echo "\$(basename \$0): finished" +EOF + chmod 0755 ${BIN_DIR}/install_reference_data.sh + cd $cwd + rm -rf $wd/* + rmdir $wd +} +# +# ChimeraSlayer +install_chimeraslayer() +{ + local cwd=$(pwd) + local wd=$(mktemp -d) + cd $wd + wget -q https://sourceforge.net/projects/microbiomeutil/files/__OLD_VERSIONS/microbiomeutil_2010-04-29.tar.gz + tar zxf microbiomeutil_2010-04-29.tar.gz + cd microbiomeutil_2010-04-29 + INSTALL_DIR=${TOP_DIR}/share/microbiome_chimeraslayer-2010-04-29 + mkdir -p $INSTALL_DIR + ln -s $INSTALL_DIR ${TOP_DIR}/share/microbiome_chimeraslayer + /bin/cp -r ChimeraSlayer $INSTALL_DIR + cat >${BIN_DIR}/ChimeraSlayer.pl <<EOF +#!/usr/bin/env bash +export PATH=$INSTALL_DIR:\$PATH +$INSTALL_DIR/ChimeraSlayer/ChimeraSlayer.pl "\$@" +EOF + chmod 0755 ${INSTALL_DIR}/ChimeraSlayer/ChimeraSlayer.pl + chmod 0755 ${BIN_DIR}/ChimeraSlayer.pl + cd $cwd + rm -rf $wd/* + rmdir $wd +} +# +# uclust required for QIIME/pyNAST +# License only allows this version to be used with those two packages +# See: http://drive5.com/uclust/downloads1_2_22q.html +install_uclust() +{ + local cwd=$(pwd) + local wd=$(mktemp -d) + cd 
$wd + wget -q http://drive5.com/uclust/uclustq1.2.22_i86linux64 + INSTALL_DIR=${TOP_DIR}/share/uclust-1.2.22 + mkdir -p $INSTALL_DIR + ln -s $INSTALL_DIR ${TOP_DIR}/share/uclust + /bin/mv uclustq1.2.22_i86linux64 ${INSTALL_DIR}/uclust + chmod 0755 ${INSTALL_DIR}/uclust + ln -s ${INSTALL_DIR}/uclust ${BIN_DIR} + cd $cwd + rm -rf $wd/* + rmdir $wd +} +# +# R 3.2.1 +# Can't use version from conda due to dependency conflicts +install_R_3_2_1() +{ + . ${CONDA_BIN}/activate ${ENV_NAME} + local cwd=$(pwd) + local wd=$(mktemp -d) + cd $wd + echo -n "Fetching R 3.2.1 source code..." + wget -q http://cran.r-project.org/src/base/R-3/R-3.2.1.tar.gz + echo "ok" + INSTALL_DIR=${TOP_DIR} + mkdir -p $INSTALL_DIR + echo -n "Unpacking source code..." + tar xzf R-3.2.1.tar.gz >INSTALL.log 2>&1 + echo "ok" + cd R-3.2.1 + echo -n "Running configure..." + ./configure --prefix=$INSTALL_DIR --with-x=no --with-readline=no >>INSTALL.log 2>&1 + echo "ok" + echo -n "Running make..." + make >>INSTALL.log 2>&1 + echo "ok" + echo -n "Running make install..." + make install >>INSTALL.log 2>&1 + echo "ok" + cd $cwd + rm -rf $wd/* + rmdir $wd + . ${CONDA_BIN}/deactivate +} +setup_pipeline_environment() +{ + echo "+++++++++++++++++++++++++++++++" + echo "Setting up pipeline environment" + echo "+++++++++++++++++++++++++++++++" + # vsearch113 + echo -n "Setting up vsearch113..." + if [ -e ${BIN_DIR}/vsearch113 ] ; then + echo "already exists" + elif [ ! -e ${ENV_DIR}/bin/vsearch ] ; then + echo "failed" + fail "vsearch not found" + else + ln -s ${ENV_DIR}/bin/vsearch ${BIN_DIR}/vsearch113 + echo "ok" + fi + # fasta_splitter.pl + echo -n "Setting up fasta_splitter.pl..." + if [ -e ${BIN_DIR}/fasta-splitter.pl ] ; then + echo "already exists" + elif [ ! 
-e ${ENV_DIR}/share/fasta-splitter/fasta-splitter.pl ] ; then + echo "failed" + fail "fasta-splitter.pl not found" + else + ln -s ${ENV_DIR}/share/fasta-splitter/fasta-splitter.pl ${BIN_DIR}/fasta-splitter.pl + echo "ok" + fi + # rdp_classifier.jar + local rdp_classifier_jar=rdp_classifier-${RDP_CLASSIFIER_VERSION}.jar + echo -n "Setting up rdp_classifier.jar..." + if [ -e ${TOP_DIR}/share/rdp_classifier/${rdp_classifier_jar} ] ; then + echo "already exists" + elif [ ! -e ${ENV_DIR}/share/rdp_classifier/rdp_classifier.jar ] ; then + echo "failed" + fail "rdp_classifier.jar not found" + else + mkdir -p ${TOP_DIR}/share/rdp_classifier + ln -s ${ENV_DIR}/share/rdp_classifier/rdp_classifier.jar ${TOP_DIR}/share/rdp_classifier/${rdp_classifier_jar} + echo "ok" + fi + # qiime_config + echo -n "Setting up qiime_config..." + if [ -e ${TOP_DIR}/qiime/qiime_config ] ; then + echo "already exists" + else + mkdir -p ${TOP_DIR}/qiime + cat >${TOP_DIR}/qiime/qiime_config <<EOF-qiime-config +qiime_scripts_dir ${ENV_DIR}/bin +EOF-qiime-config + echo "ok" + fi +} +# +# Remove the compilers from the conda environment +# Not sure if this step is necessary +remove_conda_compilers() +{ + echo "+++++++++++++++++++++++++++++++++++++++++" + echo "Removing compilers from conda environment" + echo "+++++++++++++++++++++++++++++++++++++++++" + ${CONDA} remove -y -n ${ENV_NAME} gcc_linux-64 gxx_linux-64 gfortran_linux-64 +} +# +# Top level script does the installation +echo "=======================================" +echo "Amplicon_analysis_pipeline installation" +echo "=======================================" +echo "Installing into ${TOP_DIR}" +if [ -e ${TOP_DIR} ] ; then + fail "Directory already exists" +fi +mkdir -p ${TOP_DIR} +install_conda +install_conda_packages +install_non_conda_packages +setup_pipeline_environment +remove_conda_compilers +echo "====================================" +echo "Amplicon_analysis_pipeline installed" +echo "====================================" +echo "" 
+echo "Install reference data using:" +echo "" +echo "\$ ${BIN_DIR}/install_reference_data.sh DIR" +echo "" +echo "Run pipeline scripts using:" +echo "" +echo "\$ ${BIN_DIR}/Amplicon_analysis_pipeline.sh ..." +echo "" +echo "(or add ${BIN_DIR} to your PATH)" +echo "" +echo "$(basename $0): finished" +## +#
--- a/install_tool_deps.sh Wed Jun 13 07:45:06 2018 -0400 +++ /dev/null Thu Jan 01 00:00:00 1970 +0000 @@ -1,712 +0,0 @@ -#!/bin/bash -e -# -# Install the tool dependencies for Amplicon_analysis_pipeline.sh for -# testing from command line -# -function install_python_package() { - echo Installing $2 $3 from $4 under $1 - local install_dir=$1 - local install_dirs="$install_dir $install_dir/bin $install_dir/lib/python2.7/site-packages" - for d in $install_dirs ; do - if [ ! -d $d ] ; then - mkdir -p $d - fi - done - wd=$(mktemp -d) - echo Moving to $wd - pushd $wd - wget -q $4 - if [ ! -f "$(basename $4)" ] ; then - echo "No archive $(basename $4)" - exit 1 - fi - tar xzf $(basename $4) - if [ ! -d "$5" ] ; then - echo "No directory $5" - exit 1 - fi - cd $5 - /bin/bash <<EOF -export PYTHONPATH=$install_dir:$PYTHONPATH && \ -export PYTHONPATH=$install_dir/lib/python2.7/site-packages:$PYTHONPATH && \ -python setup.py install --prefix=$install_dir --install-scripts=$install_dir/bin --install-lib=$install_dir/lib/python2.7/site-packages >>$INSTALL_DIR/INSTALLATION.log 2>&1 -EOF - popd - rm -rf $wd/* - rmdir $wd -} -function install_amplicon_analysis_pipeline_1_2_2() { - install_amplicon_analysis_pipeline $1 1.2.2 -} -function install_amplicon_analysis_pipeline_1_2_1() { - install_amplicon_analysis_pipeline $1 1.2.1 -} -function install_amplicon_analysis_pipeline_1_1() { - install_amplicon_analysis_pipeline $1 1.1 -} -function install_amplicon_analysis_pipeline_1_0() { - install_amplicon_analysis_pipeline $1 1.0 -} -function install_amplicon_analysis_pipeline() { - version=$2 - echo Installing Amplicon_analysis $version - install_dir=$1/amplicon_analysis_pipeline/$version - if [ -f $install_dir/env.sh ] ; then - return - fi - mkdir -p $install_dir - echo Moving to $install_dir - pushd $install_dir - wget -q https://github.com/MTutino/Amplicon_analysis/archive/v${version}.tar.gz - tar zxf v${version}.tar.gz - mv Amplicon_analysis-${version} Amplicon_analysis - rm -rf 
v${version}.tar.gz - popd - # Make setup file - cat > $install_dir/env.sh <<EOF -#!/bin/sh -# Source this to setup Amplicon_analysis/$version -echo Setting up Amplicon analysis pipeline $version -export PATH=$install_dir/Amplicon_analysis:\$PATH -## AMPLICON_ANALYSIS_REF_DATA_PATH should be set in -## config/local_env.sh or in the job_conf.xml file -## - see the README -##export AMPLICON_ANALYSIS_REF_DATA_PATH= -# -EOF -} -function install_amplicon_analysis_pipeline_1_0_patched() { - version="1.0-patched" - echo Installing Amplicon_analysis $version - install_dir=$1/amplicon_analysis_pipeline/$version - if [ -f $install_dir/env.sh ] ; then - return - fi - mkdir -p $install_dir - echo Moving to $install_dir - pushd $install_dir - # Clone and patch analysis pipeline scripts - git clone https://github.com/pjbriggs/Amplicon_analysis.git - cd Amplicon_analysis - git checkout -b $version - branches= - if [ ! -z "$branches" ] ; then - for branch in $branches ; do - git checkout -b $branch origin/$branch - git checkout $version - git merge -m "Merge $branch into $version" $branch - done - fi - cd .. 
- popd - # Make setup file - cat > $install_dir/env.sh <<EOF -#!/bin/sh -# Source this to setup Amplicon_analysis/$version -echo Setting up Amplicon analysis pipeline $version -export PATH=$install_dir/Amplicon_analysis:\$PATH -## AMPLICON_ANALYSIS_REF_DATA_PATH should be set in -## config/local_env.sh or in the job_conf.xml file -## - see the README -##export AMPLICON_ANALYSIS_REF_DATA_PATH= -# -EOF -} -function install_cutadapt_1_11() { - echo Installing cutadapt 1.11 - INSTALL_DIR=$1/cutadapt/1.11 - if [ -f $INSTALL_DIR/env.sh ] ; then - return - fi - mkdir -p $INSTALL_DIR - install_python_package $INSTALL_DIR cutadapt 1.11 \ - https://pypi.python.org/packages/47/bf/9045e90dac084a90aa2bb72c7d5aadefaea96a5776f445f5b5d9a7a2c78b/cutadapt-1.11.tar.gz \ - cutadapt-1.11 - # Make setup file - cat > $INSTALL_DIR/env.sh <<EOF -#!/bin/sh -# Source this to setup cutadapt/1.11 -echo Setting up cutadapt 1.11 -#if [ -f $1/python/2.7.10/env.sh ] ; then -# . $1/python/2.7.10/env.sh -#fi -export PATH=$INSTALL_DIR/bin:\$PATH -export PYTHONPATH=$INSTALL_DIR:\$PYTHONPATH -export PYTHONPATH=$INSTALL_DIR/lib:\$PYTHONPATH -export PYTHONPATH=$INSTALL_DIR/lib/python2.7:\$PYTHONPATH -export PYTHONPATH=$INSTALL_DIR/lib/python2.7/site-packages:\$PYTHONPATH -# -EOF -} -function install_sickle_1_33() { - echo Installing sickle 1.33 - INSTALL_DIR=$1/sickle/1.33 - if [ -f $INSTALL_DIR/env.sh ] ; then - return - fi - mkdir -p $INSTALL_DIR - mkdir -p $INSTALL_DIR/bin - wd=$(mktemp -d) - echo Moving to $wd - pushd $wd - wget -q https://github.com/najoshi/sickle/archive/v1.33.tar.gz - tar zxf v1.33.tar.gz - cd sickle-1.33 - make >$INSTALL_DIR/INSTALLATION.log 2>&1 - mv sickle $INSTALL_DIR/bin - popd - rm -rf $wd/* - rmdir $wd - # Make setup file - cat > $INSTALL_DIR/env.sh <<EOF -#!/bin/sh -# Source this to setup sickle/1.33 -echo Setting up sickle 1.33 -export PATH=$INSTALL_DIR/bin:\$PATH -# -EOF -} -function install_bioawk_27_08_2013() { - echo Installing bioawk 27-08-2013 - 
INSTALL_DIR=$1/bioawk/27-08-2013 - if [ -f $INSTALL_DIR/env.sh ] ; then - return - fi - mkdir -p $INSTALL_DIR - mkdir -p $INSTALL_DIR/bin - wd=$(mktemp -d) - echo Moving to $wd - pushd $wd - wget -q https://github.com/lh3/bioawk/archive/v1.0.tar.gz - tar zxf v1.0.tar.gz - cd bioawk-1.0 - make >$INSTALL_DIR/INSTALLATION.log 2>&1 - mv bioawk $INSTALL_DIR/bin - mv maketab $INSTALL_DIR/bin - popd - rm -rf $wd/* - rmdir $wd - # Make setup file - cat > $INSTALL_DIR/env.sh <<EOF -#!/bin/sh -# Source this to setup bioawk/2013-07-13 -echo Setting up bioawk 2013-07-13 -export PATH=$INSTALL_DIR/bin:\$PATH -# -EOF -} -function install_pandaseq_2_8_1() { - # Taken from https://github.com/fls-bioinformatics-core/galaxy-tools/blob/master/local_dependency_installers/pandaseq.sh - echo Installing pandaseq 2.8.1 - local install_dir=$1/pandaseq/2.8.1 - if [ -f $install_dir/env.sh ] ; then - return - fi - mkdir -p $install_dir - local wd=$(mktemp -d) - echo Moving to $wd - pushd $wd - wget -q https://github.com/neufeld/pandaseq/archive/v2.8.1.tar.gz - tar xzf v2.8.1.tar.gz - cd pandaseq-2.8.1 - ./autogen.sh >$install_dir/INSTALLATION.log 2>&1 - ./configure --prefix=$install_dir >>$install_dir/INSTALLATION.log 2>&1 - make; make install >>$install_dir/INSTALLATION.log 2>&1 - popd - rm -rf $wd/* - rmdir $wd - # Make setup file - cat > $1/pandaseq/2.8.1/env.sh <<EOF -#!/bin/sh -# Source this to setup pandaseq/2.8.1 -echo Setting up pandaseq 2.8.1 -export PATH=$install_dir/bin:\$PATH -export LD_LIBRARY_PATH=$install_dir/lib:\$LD_LIBRARY_PATH -# -EOF -} -function install_spades_3_5_0() { - # See http://spades.bioinf.spbau.ru/release3.5.0/manual.html - echo Installing spades 3.5.0 - local install_dir=$1/spades/3.5.0 - if [ -f $install_dir/env.sh ] ; then - return - fi - mkdir -p $install_dir - local wd=$(mktemp -d) - echo Moving to $wd - pushd $wd - wget -q http://spades.bioinf.spbau.ru/release3.5.0/SPAdes-3.5.0-Linux.tar.gz - tar zxf SPAdes-3.5.0-Linux.tar.gz - cd SPAdes-3.5.0-Linux - mv 
bin $install_dir - mv share $install_dir - popd - rm -rf $wd/* - rmdir $wd - # Make setup file - cat > $1/spades/3.5.0/env.sh <<EOF -#!/bin/sh -# Source this to setup spades/3.5.0 -echo Setting up spades 3.5.0 -export PATH=$install_dir/bin:\$PATH -# -EOF -} -function install_fastqc_0_11_3() { - echo Installing fastqc 0.11.3 - local install_dir=$1/fastqc/0.11.3 - if [ -f $install_dir/env.sh ] ; then - return - fi - mkdir -p $install_dir - local wd=$(mktemp -d) - echo Moving to $wd - pushd $wd - wget -q http://www.bioinformatics.babraham.ac.uk/projects/fastqc/fastqc_v0.11.3.zip - unzip -qq fastqc_v0.11.3.zip - cd FastQC - chmod 0755 fastqc - mv * $install_dir - popd - rm -rf $wd/* - rmdir $wd - # Make setup file - cat > $1/fastqc/0.11.3/env.sh <<EOF -#!/bin/sh -# Source this to setup fastqc/0.11.3 -echo Setting up fastqc 0.11.3 -export PATH=$install_dir:\$PATH -# -EOF -} -function install_qiime_1_8_0() { - # See http://qiime.org/1.8.0/install/install.html - echo Installing qiime 1.8.0 - INSTALL_DIR=$1/qiime/1.8.0 - if [ -f $INSTALL_DIR/env.sh ] ; then - return - fi - mkdir -p $INSTALL_DIR - # Atlas 3.10 (precompiled) - # NB this stolen from galaxyproject/iuc-tools - local wd=$(mktemp -d) - echo Moving to $wd - pushd $wd - wget -q https://depot.galaxyproject.org/software/atlas/atlas_3.10.2_linux_x64.tar.gz - tar zxvf atlas_3.10.2_linux_x64.tar.gz - mv lib $INSTALL_DIR - command -v gfortran || return 0 - BUNDLED_LGF_CANON=$INSTALL_DIR/lib/libgfortran.so.3.0.0 - BUNDLED_LGF_VERS=`objdump -p $BUNDLED_LGF_CANON | grep GFORTRAN_1 | sed -r 's/.*GFORTRAN_1\.([0-9])+/\1/' | sort -n | tail -1` - echo 'program test; end program test' > test.f90 - gfortran -o test test.f90 - LGF=`ldd test | grep libgfortran | awk '{print $3}'` - LGF_CANON=`readlink -f $LGF` - LGF_VERS=`objdump -p $LGF_CANON | grep GFORTRAN_1 | sed -r 's/.*GFORTRAN_1\.([0-9])+/\1/' | sort -n | tail -1` - if [ $LGF_VERS -gt $BUNDLED_LGF_VERS ]; then - cp -p $BUNDLED_LGF_CANON ${BUNDLED_LGF_CANON}.bundled - cp -p 
$LGF_CANON $BUNDLED_LGF_CANON
-    fi
-    popd
-    rm -rf $wd/*
-    rmdir $wd
-    # Atlas 3.10 (build from source)
-    # NB this stolen from galaxyproject/iuc-tools
-    ##local wd=$(mktemp -d)
-    ##echo Moving to $wd
-    ##pushd $wd
-    ##wget -q https://depot.galaxyproject.org/software/atlas/atlas_3.10.2+gx0_src_all.tar.bz2
-    ##wget -q https://depot.galaxyproject.org/software/lapack/lapack_3.5.0_src_all.tar.gz
-    ##wget -q https://depot.galaxyproject.org/software/atlas/atlas_patch-blas-lapack-1.0_src_all.diff
-    ##wget -q https://depot.galaxyproject.org/software/atlas/atlas_patch-shared-lib-1.0_src_all.diff
-    ##wget -q https://depot.galaxyproject.org/software/atlas/atlas_patch-cpu-throttle-1.0_src_all.diff
-    ##tar -jxvf atlas_3.10.2+gx0_src_all.tar.bz2
-    ##cd ATLAS
-    ##mkdir build
-    ##patch -p1 < ../atlas_patch-blas-lapack-1.0_src_all.diff
-    ##patch -p1 < ../atlas_patch-shared-lib-1.0_src_all.diff
-    ##patch -p1 < ../atlas_patch-cpu-throttle-1.0_src_all.diff
-    ##cd build
-    ##../configure --prefix="$INSTALL_DIR" -D c -DWALL -b 64 -Fa alg '-fPIC' --with-netlib-lapack-tarfile=../../lapack_3.5.0_src_all.tar.gz -v 2 -t 0 -Si cputhrchk 0
-    ##make
-    ##make install
-    ##popd
-    ##rm -rf $wd/*
-    ##rmdir $wd
-    export ATLAS_LIB_DIR=$INSTALL_DIR/lib
-    export ATLAS_INCLUDE_DIR=$INSTALL_DIR/include
-    export ATLAS_BLAS_LIB_DIR=$INSTALL_DIR/lib/atlas
-    export ATLAS_LAPACK_LIB_DIR=$INSTALL_DIR/lib/atlas
-    export ATLAS_ROOT_PATH=$INSTALL_DIR
-    export LD_LIBRARY_PATH=$INSTALL_DIR/lib:$LD_LIBRARY_PATH
-    export LD_LIBRARY_PATH=$INSTALL_DIR/lib/atlas:$LD_LIBRARY_PATH
-    # Numpy 1.7.1
-    local wd=$(mktemp -d)
-    echo Moving to $wd
-    pushd $wd
-    wget -q https://depot.galaxyproject.org/software/numpy/numpy_1.7_src_all.tar.gz
-    tar -zxvf numpy_1.7_src_all.tar.gz
-    cd numpy-1.7.1
-    cat > site.cfg <<EOF
-[DEFAULT]
-library_dirs = $ATLAS_LIB_DIR
-include_dirs = $ATLAS_INCLUDE_DIR
-[blas_opt]
-libraries = blas, atlas
-[lapack_opt]
-libraries = lapack, atlas
-EOF
-    export PYTHONPATH=$PYTHONPATH:$INSTALL_DIR/lib/python2.7
-    export ATLAS=$ATLAS_ROOT_PATH
-    python setup.py install --install-lib $INSTALL_DIR/lib/python2.7 --install-scripts $INSTALL_DIR/bin
-    popd
-    rm -rf $wd/*
-    rmdir $wd
-    # Python packages
-    ##install_python_package $INSTALL_DIR numpy 1.7.1 \
-    ## https://pypi.python.org/packages/84/fb/5e9dfeeb5d8909d659e6892c97c9aa66d3798fad50e1d3d66b3c614a9c35/numpy-1.7.1.tar.gz \
-    ## numpy-1.7.1
-    install_python_package $INSTALL_DIR matplotlib 1.3.1 \
-        https://pypi.python.org/packages/d4/d0/17f17792a4d50994397052220dbe3ac9850ecbde0297b7572933fa4a5c98/matplotlib-1.3.1.tar.gz \
-        matplotlib-1.3.1
-    install_python_package $INSTALL_DIR qiime 1.8.0 \
-        https://github.com/biocore/qiime/archive/1.8.0.tar.gz \
-        qiime-1.8.0
-    install_python_package $INSTALL_DIR pycogent 1.5.3 \
-        https://pypi.python.org/packages/1f/9f/c6f6afe09a3d62a6e809c7745413ffff0f1e8e04d88ab7b56faedf31fe28/cogent-1.5.3.tgz \
-        cogent-1.5.3
-    install_python_package $INSTALL_DIR pyqi 0.3.1 \
-        https://pypi.python.org/packages/60/f0/a7392f5f5caf59a50ccaddbb35a458514953512b7dd6053567cb02849c6e/pyqi-0.3.1.tar.gz \
-        pyqi-0.3.1
-    install_python_package $INSTALL_DIR biom-format 1.3.1 \
-        https://pypi.python.org/packages/98/3b/4e80a9a5c4a3c6764aa8c0c994973e7df71eee02fc6b8cc6e1d06a64ab7e/biom-format-1.3.1.tar.gz \
-        biom-format-1.3.1
-    install_python_package $INSTALL_DIR qcli 0.1.0 \
-        https://pypi.python.org/packages/9a/9a/9c634aed339a5f063e0c954ae439d03b33a7159aa50c6f21034fe2d48fe8/qcli-0.1.0.tar.gz \
-        qcli-0.1.0
-    install_python_package $INSTALL_DIR pynast 1.2.2 \
-        https://pypi.python.org/packages/a0/82/f381ff91afd7a2d92e74c7790823e256d87d5cd0a98c12eaac3d3ec64b8f/pynast-1.2.2.tar.gz \
-        pynast-1.2.2
-    install_python_package $INSTALL_DIR emperor 0.9.3 \
-        https://pypi.python.org/packages/cd/f1/5d502a16a348efe1af7a8d4f41b639c9a165bca0b2f9db36bce89ad1ab40/emperor-0.9.3.tar.gz \
-        emperor-0.9.3
-    # Update the acceptable Python version
-    sed -i 's/acceptable_version = (2,7,3)/acceptable_version = (2,7,6)/g' $INSTALL_DIR/bin/print_qiime_config.py
-    # Non-Python dependencies
-    local wd=$(mktemp -d)
-    echo Moving to $wd
-    pushd $wd
-    wget -q http://www.microbesonline.org/fasttree/FastTree
-    chmod 0755 FastTree
-    mv FastTree $INSTALL_DIR/bin
-    # Config file
-    sed -i 's,qiime_scripts_dir,qiime_scripts_dir\t'"$INSTALL_DIR\/bin"',g' $INSTALL_DIR/lib/python2.7/site-packages/qiime/support_files/qiime_config
-    popd
-    rm -rf $wd/*
-    rmdir $wd
-    # Make setup file
-    cat > $INSTALL_DIR/env.sh <<EOF
-#!/bin/sh
-# Source this to setup qiime/1.8.0
-echo Setting up qiime 1.8.0
-#if [ -f $1/python/2.7.10/env.sh ] ; then
-#   . $1/python/2.7.10/env.sh
-#fi
-export QIIME_CONFIG_FP=$INSTALL_DIR/lib/python2.7/site-packages/qiime/support_files/qiime_config
-export PATH=$INSTALL_DIR/bin:\$PATH
-export PYTHONPATH=$INSTALL_DIR:\$PYTHONPATH
-export PYTHONPATH=$INSTALL_DIR/lib:\$PYTHONPATH
-export PYTHONPATH=$INSTALL_DIR/lib/python2.7:\$PYTHONPATH
-export PYTHONPATH=$INSTALL_DIR/lib/python2.7/site-packages:\$PYTHONPATH
-export LD_LIBRARY_PATH=$ATLAS_LIB_DIR:\$LD_LIBRARY_PATH
-export LD_LIBRARY_PATH=$ATLAS_LIB_DIR/atlas::\$LD_LIBRARY_PATH
-#
-EOF
-}
-function install_vsearch_1_1_3() {
-    echo Installing vsearch 1.1.3
-    local install_dir=$1/vsearch/1.1.3
-    if [ -f $install_dir/env.sh ] ; then
-        return
-    fi
-    mkdir -p $install_dir/bin
-    local wd=$(mktemp -d)
-    echo Moving to $wd
-    pushd $wd
-    wget -q https://github.com/torognes/vsearch/releases/download/v1.1.3/vsearch-1.1.3-linux-x86_64
-    chmod 0755 vsearch-1.1.3-linux-x86_64
-    mv vsearch-1.1.3-linux-x86_64 $install_dir/bin/vsearch
-    ln -s $install_dir/bin/vsearch $install_dir/bin/vsearch113
-    popd
-    # Clean up
-    rm -rf $wd/*
-    rmdir $wd
-    # Make setup file
-cat > $install_dir/env.sh <<EOF
-#!/bin/sh
-# Source this to setup vsearch/1.1.3
-echo Setting up vsearch 1.1.3
-export PATH=$install_dir/bin:\$PATH
-#
-EOF
-}
-function install_microbiomeutil_2010_04_29() {
-    # Provides ChimeraSlayer
-    echo Installing microbiomeutil 2010-04-29
-    local install_dir=$1/microbiomeutil/2010-04-29
-    if [ -f $install_dir/env.sh ] ; then
-        return
-    fi
-    mkdir -p $install_dir
-    local wd=$(mktemp -d)
-    echo Moving to $wd
-    pushd $wd
-    wget -q https://sourceforge.net/projects/microbiomeutil/files/__OLD_VERSIONS/microbiomeutil_2010-04-29.tar.gz
-    tar zxf microbiomeutil_2010-04-29.tar.gz
-    cd microbiomeutil_2010-04-29
-    make >$install_dir/INSTALLATION.log 2>&1
-    mv * $install_dir
-    popd
-    # Clean up
-    rm -rf $wd/*
-    rmdir $wd
-    # Make setup file
-cat > $install_dir/env.sh <<EOF
-#!/bin/sh
-# Source this to setup microbiomeutil/2010-04-29
-echo Setting up microbiomeutil 2010-04-29
-export PATH=$install_dir/ChimeraSlayer:\$PATH
-#
-EOF
-}
-function install_blast_2_2_26() {
-    echo Installing blast 2.2.26
-    local install_dir=$1/blast/2.2.26
-    if [ -f $install_dir/env.sh ] ; then
-        return
-    fi
-    mkdir -p $install_dir
-    local wd=$(mktemp -d)
-    echo Moving to $wd
-    pushd $wd
-    wget -q ftp://ftp.ncbi.nlm.nih.gov/blast/executables/legacy/2.2.26/blast-2.2.26-x64-linux.tar.gz
-    tar zxf blast-2.2.26-x64-linux.tar.gz
-    cd blast-2.2.26
-    mv * $install_dir
-    popd
-    # Clean up
-    rm -rf $wd/*
-    rmdir $wd
-    # Make setup file
-cat > $install_dir/env.sh <<EOF
-#!/bin/sh
-# Source this to setup blast/2.2.26
-echo Setting up blast 2.2.26
-export PATH=$install_dir/bin:\$PATH
-#
-EOF
-}
-function install_fasta_number() {
-    # See http://drive5.com/python/fasta_number_py.html
-    echo Installing fasta_number
-    # Install to "default" version i.e. essentially a versionless
-    # installation (see Galaxy dependency resolver docs)
-    local install_dir=$1/fasta_number
-    local wd=$(mktemp -d)
-    echo Moving to $wd
-    pushd $wd
-    # Download and use MD5 as local version
-    wget -q http://drive5.com/python/python_scripts.tar.gz
-    local version=$(md5sum python_scripts.tar.gz | cut -d" " -f1)
-    # Check for existing installation
-    local default_dir=$install_dir/default
-    install_dir=$install_dir/$version
-    if [ -f $install_dir/env.sh ] ; then
-        return
-    fi
-    # Install scripts and make 'default' link
-    mkdir -p $install_dir/bin
-    mkdir -p $install_dir/lib
-    tar zxf python_scripts.tar.gz
-    mv fasta_number.py $install_dir/bin
-    mv die.py $install_dir/lib
-    ln -s $version $default_dir
-    popd
-    # Clean up
-    rm -rf $wd/*
-    rmdir $wd
-    # Make setup file
-cat > $install_dir/env.sh <<EOF
-#!/bin/sh
-# Source this to setup fasta_number/$version
-echo Setting up fasta_number $version
-export PATH=$install_dir/bin:\$PATH
-export PYTHONPATH=$install_dir/lib:\$PYTHONPATH
-#
-EOF
-}
-function install_fasta_splitter_0_2_4() {
-    echo Installing fasta-splitter 0.2.4
-    local install_dir=$1/fasta-splitter/0.2.4
-    if [ -f $install_dir/env.sh ] ; then
-        return
-    fi
-    mkdir -p $install_dir/bin
-    local wd=$(mktemp -d)
-    echo Moving to $wd
-    pushd $wd
-    # Install Perl packages using cpanm
-    mkdir -p $install_dir/lib/perl5
-    wget -q -L https://cpanmin.us/ -O cpanm
-    chmod +x cpanm
-    for package in "File::Util" ; do
-        /bin/bash <<EOF
-export PATH=$install_dir/bin:$PATH PERL5LIB=$install_dir/lib/perl5:$PERL5LIB && \
-./cpanm -l $install_dir $package >>$install_dir/INSTALLATION.log
-EOF
-    done
-    # Install fasta-splitter
-    wget -q http://kirill-kryukov.com/study/tools/fasta-splitter/files/fasta-splitter-0.2.4.zip
-    unzip -qq fasta-splitter-0.2.4.zip
-    chmod 0755 fasta-splitter.pl
-    mv fasta-splitter.pl $install_dir/bin
-    popd
-    # Clean up
-    rm -rf $wd/*
-    rmdir $wd
-    # Make setup file
-cat > $install_dir/env.sh <<EOF
-#!/bin/sh
-# Source this to setup fasta-splitter/0.2.4
-echo Setting up fasta-splitter 0.2.4
-export PATH=$install_dir/bin:\$PATH
-export PERL5LIB=$install_dir/lib/perl5:\$PERL5LIB
-#
-EOF
-}
-function install_rdp_classifier_2_2() {
-    echo Installing rdp-classifier 2.2
-    local install_dir=$1/rdp-classifier/2.2
-    if [ -f $install_dir/env.sh ] ; then
-        return
-    fi
-    mkdir -p $install_dir
-    local wd=$(mktemp -d)
-    echo Moving to $wd
-    pushd $wd
-    wget -q https://sourceforge.net/projects/rdp-classifier/files/rdp-classifier/rdp_classifier_2.2.zip
-    unzip -qq rdp_classifier_2.2.zip
-    cd rdp_classifier_2.2
-    mv * $install_dir
-    popd
-    # Clean up
-    rm -rf $wd/*
-    rmdir $wd
-    # Make setup file
-cat > $install_dir/env.sh <<EOF
-#!/bin/sh
-# Source this to setup rdp-classifier/2.2
-echo Setting up RDP classifier 2.2
-export RDP_JAR_PATH=$install_dir/rdp_classifier-2.2.jar
-#
-EOF
-}
-function install_R_3_2_0() {
-    # Adapted from https://github.com/fls-bioinformatics-core/galaxy-tools/blob/master/local_dependency_installers/R.sh
-    echo Installing R 3.2.0
-    local install_dir=$1/R/3.2.0
-    if [ -f $install_dir/env.sh ] ; then
-        return
-    fi
-    mkdir -p $install_dir
-    local wd=$(mktemp -d)
-    echo Moving to $wd
-    pushd $wd
-    wget -q http://cran.r-project.org/src/base/R-3/R-3.2.0.tar.gz
-    tar xzf R-3.2.0.tar.gz
-    cd R-3.2.0
-    ./configure --prefix=$install_dir
-    make
-    make install
-    popd
-    # Clean up
-    rm -rf $wd/*
-    rmdir $wd
-    # Make setup file
-cat > $install_dir/env.sh <<EOF
-#!/bin/sh
-# Source this to setup R/3.2.0
-echo Setting up R 3.2.0
-export PATH=$install_dir/bin:\$PATH
-export TCL_LIBRARY=$install_dir/lib/libtcl8.4.so
-export TK_LIBRARY=$install_dir/lib/libtk8.4.so
-#
-EOF
-}
-function install_uc2otutab() {
-    # See http://drive5.com/python/uc2otutab_py.html
-    echo Installing uc2otutab
-    # Install to "default" version i.e. essentially a versionless
-    # installation (see Galaxy dependency resolver docs)
-    local install_dir=$1/uc2otutab/default
-    if [ -f $install_dir/env.sh ] ; then
-        return
-    fi
-    mkdir -p $install_dir/bin
-    local wd=$(mktemp -d)
-    echo Moving to $wd
-    pushd $wd
-    wget -q http://drive5.com/python/python_scripts.tar.gz
-    tar zxf python_scripts.tar.gz
-    mv die.py fasta.py progress.py uc.py $install_dir/bin
-    echo "#!/usr/bin/env python" >$install_dir/bin/uc2otutab.py
-    cat uc2otutab.py >>$install_dir/bin/uc2otutab.py
-    chmod +x $install_dir/bin/uc2otutab.py
-    popd
-    # Clean up
-    rm -rf $wd/*
-    rmdir $wd
-    # Make setup file
-cat > $install_dir/env.sh <<EOF
-#!/bin/sh
-# Source this to setup uc2otutab/default
-echo Setting up uc2otutab \(default\)
-export PATH=$install_dir/bin:\$PATH
-#
-EOF
-}
-##########################################################
-# Main script starts here
-##########################################################
-# Fetch top-level installation directory from command line
-TOP_DIR=$1
-if [ -z "$TOP_DIR" ] ; then
-    echo Usage: $(basename $0) DIR
-    exit
-fi
-if [ -z "$(echo $TOP_DIR | grep ^/)" ] ; then
-    TOP_DIR=$(pwd)/$TOP_DIR
-fi
-if [ ! -d "$TOP_DIR" ] ; then
-    mkdir -p $TOP_DIR
-fi
-# Install dependencies
-install_amplicon_analysis_pipeline_1_2_2 $TOP_DIR
-install_cutadapt_1_11 $TOP_DIR
-install_sickle_1_33 $TOP_DIR
-install_bioawk_27_08_2013 $TOP_DIR
-install_pandaseq_2_8_1 $TOP_DIR
-install_spades_3_5_0 $TOP_DIR
-install_fastqc_0_11_3 $TOP_DIR
-install_qiime_1_8_0 $TOP_DIR
-install_vsearch_1_1_3 $TOP_DIR
-install_microbiomeutil_2010_04_29 $TOP_DIR
-install_blast_2_2_26 $TOP_DIR
-install_fasta_number $TOP_DIR
-install_fasta_splitter_0_2_4 $TOP_DIR
-install_rdp_classifier_2_2 $TOP_DIR
-install_R_3_2_0 $TOP_DIR
-install_uc2otutab $TOP_DIR
-##
-#
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/tool_dependencies.xml	Thu Oct 18 09:18:04 2018 -0400
@@ -0,0 +1,16 @@
+<?xml version="1.0"?>
+<tool_dependency>
+    <package name="amplicon_analysis_pipeline" version="1.2.3">
+        <install version="1.0">
+            <actions>
+                <action type="download_file">https://raw.githubusercontent.com/pjbriggs/Amplicon_analysis-galaxy/master/install_amplicon_analysis.sh</action>
+                <action type="shell_command">
+                    sh ./install_amplicon_analysis.sh $INSTALL_DIR
+                </action>
+                <action type="set_environment">
+                    <environment_variable name="PATH" action="prepend_to">$INSTALL_DIR/Amplicon_analysis-1.2.3/bin</environment_variable>
+                </action>
+            </actions>
+        </install>
+    </package>
+</tool_dependency>
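One detail of the removed `install_tool_deps.sh` worth noting is how its main section handles the single `DIR` argument: a relative path is made absolute against the current working directory before anything is installed into it. The logic can be sketched in isolation as follows (the wrapper name `normalize_top_dir` is introduced here purely for illustration; the script applies the same test inline to `TOP_DIR`):

```shell
#!/bin/sh
# Sketch of the TOP_DIR normalisation from the removed installer's
# main section: if the argument has no leading "/" it is treated as
# relative and prefixed with $PWD.
# (normalize_top_dir is an illustrative name, not part of the script.)
normalize_top_dir() {
    top_dir=$1
    # grep ^/ matches only absolute paths; empty output means relative
    if [ -z "$(echo $top_dir | grep ^/)" ] ; then
        top_dir=$(pwd)/$top_dir
    fi
    echo "$top_dir"
}

cd /tmp
normalize_top_dir local_tool_dependencies   # prints /tmp/local_tool_dependencies
normalize_top_dir /opt/deps                 # prints /opt/deps unchanged
```

The `grep ^/` test simply asks whether the path already begins with a slash; this matters because the installer later `pushd`s in and out of temporary build directories, where a relative target would silently point somewhere else.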