Changeset 3:3ab198df8f3f (previous changeset: 2:43d6f81bc667, 2018-06-13; next changeset: 4:86a12d75ebe4, 2019-12-20)
Commit message:
planemo upload for repository https://github.com/pjbriggs/Amplicon_analysis-galaxy commit 15390f18b91d838880d952eb2714f689bbd8a042

modified:
  README.rst
  amplicon_analysis_pipeline.py
  amplicon_analysis_pipeline.xml
added:
  install_amplicon_analysis.sh
  tool_dependencies.xml
removed:
  install_tool_deps.sh
diff -r 43d6f81bc667 -r 3ab198df8f3f README.rst
--- a/README.rst	Wed Jun 13 07:45:06 2018 -0400
+++ b/README.rst	Thu Oct 18 09:18:04 2018 -0400
@@ -26,20 +26,8 @@
 instance to detect the dependencies and reference data
 correctly at run time.
 
-1. Install the dependencies
----------------------------
-
-The ``install_tool_deps.sh`` script can be used to fetch and install the
-dependencies locally, for example::
-
-    install_tool_deps.sh /path/to/local_tool_dependencies
-
-This can take some time to complete. When finished it should have
-created a set of directories containing the dependencies under the
-specified top level directory.
-
-2. Install the tool files
--------------------------
+1. Install the tool from the toolshed
+-------------------------------------
 
 The core tool is hosted on the Galaxy toolshed, so it can be installed
 directly from there (this is the recommended route):
@@ -58,7 +46,7 @@
     <tool file="Amplicon_analysis/amplicon_analysis_pipeline.xml" />
 
-3. Install the reference data
+2. Install the reference data
 -----------------------------
 
 The script ``References.sh`` from the pipeline package at
@@ -72,33 +60,14 @@
 will install the data in ``/path/to/pipeline/data``.
 
 **NB** The final amount of data downloaded and uncompressed will be
-around 6GB.
-
-4. Configure dependencies and reference data in Galaxy
-------------------------------------------------------
-
-The final steps are to make your Galaxy installation aware of the
-tool dependencies and reference data, so it can locate them both when
-the tool is run.
-
-To target the tool dependencies installed previously, add the
-following lines to the ``dependency_resolvers_conf.xml`` file in the
-Galaxy ``config`` directory::
+around 9GB.
 
-    <dependency_resolvers>
-    ...
-    <galaxy_packages base_path="/path/to/local_tool_dependencies" />
-    <galaxy_packages base_path="/path/to/local_tool_dependencies" versionless="true" />
-    ...
-    </dependency_resolvers>
+3. Configure reference data location in Galaxy
+----------------------------------------------
 
-(NB it is recommended to place these *before* the ``<conda ... />``
-resolvers)
-
-(If you're not familiar with dependency resolvers in Galaxy then
-see the documentation at
-https://docs.galaxyproject.org/en/master/admin/dependency_resolvers.html
-for more details.)
+The final step is to make your Galaxy installation aware of the
+location of the reference data, so it can locate it when the
+tool is run.
 
 The tool locates the reference data via an environment variable
 called ``AMPLICON_ANALYSIS_REF_DATA_PATH``, which needs to be set to the parent
@@ -108,7 +77,8 @@
 installation is configured:
 
 * **For local instances:** add a line to set it in the
-  ``config/local_env.sh`` file of your Galaxy installation, e.g.::
+  ``config/local_env.sh`` file of your Galaxy installation (you
+  may need to create a new empty file first), e.g.::
 
       export AMPLICON_ANALYSIS_REF_DATA_PATH=/path/to/pipeline/data
@@ -124,9 +94,9 @@
       <tool id="amplicon_analysis_pipeline" destination="amplicon_analysis"/>
 
   (For more about job destinations see the Galaxy documentation at
-  https://galaxyproject.org/admin/config/jobs/#job-destinations)
+  https://docs.galaxyproject.org/en/master/admin/jobs.html#job-destinations)
 
-5. Enable rendering of HTML outputs from pipeline
+4. Enable rendering of HTML outputs from pipeline
 -------------------------------------------------
 
 To ensure that HTML outputs are displayed correctly in Galaxy
@@ -171,46 +141,32 @@
 https://github.com/galaxyproject/galaxy/issues/4490 and
 https://github.com/galaxyproject/galaxy/issues/1676
 
-Appendix: availability of tool dependencies
-===========================================
-
-The tool takes its dependencies from the underlying pipeline script (see
-https://github.com/MTutino/Amplicon_analysis/blob/master/README.md
-for details).
+Appendix: installing the dependencies manually
+==============================================
 
-As noted above, currently the ``install_tool_deps.sh`` script can be
-used to manually install the dependencies for a local tool install.
+If the tool is installed from the Galaxy toolshed (recommended) then
+the dependencies should be installed automatically and this step can
+be skipped.
 
-In principle these should also be available if the tool were installed
-from a toolshed. However it would be preferrable in this case to get as
-many of the dependencies as possible via the ``conda`` dependency
-resolver.
+Otherwise the ``install_amplicon_analysis.sh`` script can be used
+to fetch and install the dependencies locally, for example::
 
-The following are known to be available via conda, with the required
-version:
+    install_amplicon_analysis.sh /path/to/local_tool_dependencies
 
- - cutadapt 1.8.1
- - sickle-trim 1.33
- - bioawk 1.0
- - fastqc 0.11.3
- - R 3.2.0
-
-Some dependencies are available but with the "wrong" versions:
+(This is the same script as is used to install dependencies from the
+toolshed.) This can take some time to complete, and when completed will
+have created a directory called ``Amplicon_analysis-1.2.3`` containing
+the dependencies under the specified top level directory.
 
- - spades (need 3.5.0)
- - qiime (need 1.8.0)
- - blast (need 2.2.26)
- - vsearch (need 1.1.3)
-
-The following dependencies are currently unavailable:
+**NB** The installed dependencies will occupy around 2.6G of disk
+space.
 
- - fasta_number (need 02jun2015)
- - fasta-splitter (need 0.2.4)
- - rdp_classifier (need 2.2)
- - microbiomeutil (need r20110519)
+You will need to make sure that the ``bin`` subdirectory of this
+directory is on Galaxy's ``PATH`` at runtime, for the tool to be able
+to access the dependencies - for example by adding a line to the
+``local_env.sh`` file like::
 
-(NB usearch 6.1.544 and 8.0.1623 are special cases which must be
-handled outside of Galaxy's dependency management systems.)
+    export PATH=/path/to/local_tool_dependencies/Amplicon_analysis-1.2.3/bin:$PATH
 
 History
 =======
@@ -218,6 +174,8 @@
 ========== ======================================================================
 Version    Changes
 ---------- ----------------------------------------------------------------------
+1.2.3.0    Updated to Amplicon_Analysis_Pipeline version 1.2.3; install
+           dependencies via tool_dependencies.xml.
 1.2.2.0    Updated to Amplicon_Analysis_Pipeline version 1.2.2 (removes
            jackknifed analysis which is not captured by Galaxy tool)
 1.2.1.0    Updated to Amplicon_Analysis_Pipeline version 1.2.1 (adds
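The README changes above centre on the ``AMPLICON_ANALYSIS_REF_DATA_PATH`` environment variable set in ``config/local_env.sh``. A minimal sketch of how a wrapper might pick that variable up at run time (the variable name comes from the README; the lookup function and error message here are illustrative assumptions, not the pipeline's actual code):

```python
import os

def get_ref_data_dir():
    # Read the reference data location from the environment;
    # fail with a clear message if the variable is not set
    # (as would happen if local_env.sh was never configured).
    # NOTE: get_ref_data_dir is a hypothetical helper for
    # illustration only.
    try:
        return os.environ["AMPLICON_ANALYSIS_REF_DATA_PATH"]
    except KeyError:
        raise RuntimeError(
            "AMPLICON_ANALYSIS_REF_DATA_PATH not set: add it to "
            "config/local_env.sh (see README)")

# Simulate a configured environment (path is the README's example)
os.environ["AMPLICON_ANALYSIS_REF_DATA_PATH"] = "/path/to/pipeline/data"
print(get_ref_data_dir())
```

Exporting the variable in ``local_env.sh`` (or a job destination's environment) makes it visible to the job's shell, which is all the tool relies on.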
diff -r 43d6f81bc667 -r 3ab198df8f3f amplicon_analysis_pipeline.py
--- a/amplicon_analysis_pipeline.py	Wed Jun 13 07:45:06 2018 -0400
+++ b/amplicon_analysis_pipeline.py	Thu Oct 18 09:18:04 2018 -0400
@@ -60,9 +60,10 @@
     sys.stderr.write("%s\n\n" % ('*'*width))
 
 def clean_up_name(sample):
-    # Remove trailing "_L[0-9]+_001" from Fastq
-    # pair names
-    split_name = sample.split('_')
+    # Remove extensions and trailing "_L[0-9]+_001" from
+    # Fastq pair names
+    sample_name = '.'.join(sample.split('.')[:1])
+    split_name = sample_name.split('_')
     if split_name[-1] == "001":
         split_name = split_name[:-1]
     if split_name[-1].startswith('L'):
@@ -139,10 +140,12 @@
 
     # Link to FASTQs and construct Final_name.txt file
     sample_names = []
+    print "-- making Final_name.txt"
     with open("Final_name.txt",'w') as final_name:
         fastqs = iter(args.fastq_pairs)
         for sample_name,fqr1,fqr2 in zip(fastqs,fastqs,fastqs):
             sample_name = clean_up_name(sample_name)
+            print "  %s" % sample_name
             r1 = "%s_R1_.fastq" % sample_name
             r2 = "%s_R2_.fastq" % sample_name
             os.symlink(fqr1,r1)
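The change to ``clean_up_name`` makes it strip any file extensions (everything after the first ``.``) before removing the lane/read suffixes. A standalone sketch of the logic visible in the hunk, ported to Python 3 for illustration (the final re-join of the name parts is not shown in the hunk, so the ``'_'.join`` return here is an assumption):

```python
def clean_up_name(sample):
    # Remove extensions and trailing "_L[0-9]+_001" from
    # Fastq pair names, as in the diff hunk above
    sample_name = sample.split('.')[0]      # drop ".fastq.gz" etc.
    split_name = sample_name.split('_')
    if split_name[-1] == "001":             # drop trailing "001"
        split_name = split_name[:-1]
    if split_name[-1].startswith('L'):      # drop lane id e.g. "L001"
        split_name = split_name[:-1]
    # Assumed reassembly step (not visible in the hunk)
    return '_'.join(split_name)

print(clean_up_name("Sample1_L001_001.fastq.gz"))
```

Before this change a name like ``Sample1_L001_001.fastq.gz`` kept its extensions because the function only split on ``_``; stripping on ``.`` first is what the added ``sample_name`` line provides.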
diff -r 43d6f81bc667 -r 3ab198df8f3f amplicon_analysis_pipeline.xml
--- a/amplicon_analysis_pipeline.xml	Wed Jun 13 07:45:06 2018 -0400
+++ b/amplicon_analysis_pipeline.xml	Thu Oct 18 09:18:04 2018 -0400
@@ -1,21 +1,7 @@
-<tool id="amplicon_analysis_pipeline" name="Amplicon Analysis Pipeline" version="1.2.2.0">
+<tool id="amplicon_analysis_pipeline" name="Amplicon Analysis Pipeline" version="1.2.3.0">
   <description>analyse 16S rRNA data from Illumina Miseq paired-end reads</description>
   <requirements>
-    <requirement type="package" version="1.2.2">amplicon_analysis_pipeline</requirement>
-    <requirement type="package" version="1.11">cutadapt</requirement>
-    <requirement type="package" version="1.33">sickle</requirement>
-    <requirement type="package" version="27-08-2013">bioawk</requirement>
-    <requirement type="package" version="2.8.1">pandaseq</requirement>
-    <requirement type="package" version="3.5.0">spades</requirement>
-    <requirement type="package" version="0.11.3">fastqc</requirement>
-    <requirement type="package" version="1.8.0">qiime</requirement>
-    <requirement type="package" version="2.2.26">blast</requirement>
-    <requirement type="package" version="0.2.4">fasta-splitter</requirement>
-    <requirement type="package" version="2.2">rdp-classifier</requirement>
-    <requirement type="package" version="3.2.0">R</requirement>
-    <requirement type="package" version="1.1.3">vsearch</requirement>
-    <requirement type="package" version="2010-04-29">microbiomeutil</requirement>
-    <requirement type="package">fasta_number</requirement>
+    <requirement type="package" version="1.2.3">amplicon_analysis_pipeline</requirement>
   </requirements>
   <stdio>
     <exit_code range="1:" />
diff -r 43d6f81bc667 -r 3ab198df8f3f install_amplicon_analysis.sh
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/install_amplicon_analysis.sh	Thu Oct 18 09:18:04 2018 -0400
@@ -0,0 +1,425 @@
+#!/bin/sh -e
+#
+# Prototype script to setup a conda environment with the
+# dependencies needed for the Amplicon_analysis_pipeline
+# script
+#
+# Handle command line
+usage()
+{
+    echo "Usage: $(basename $0) [DIR]"
+    echo ""
+    echo "Installs the Amplicon_analysis_pipeline package plus"
+    echo "dependencies in directory DIR (or current directory "
+    echo "if DIR not supplied)"
+}
+if [ ! -z "$1" ] ; then
+    # Check if help was requested
+    case "$1" in
+        --help|-h)
+            usage
+            exit 0
+            ;;
+    esac
+    # Assume it's the installation directory
+    cd $1
+fi
+# Versions
+PIPELINE_VERSION=1.2.3
+RDP_CLASSIFIER_VERSION=2.2
+# Directories
+TOP_DIR=$(pwd)/Amplicon_analysis-${PIPELINE_VERSION}
+BIN_DIR=${TOP_DIR}/bin
+CONDA_DIR=${TOP_DIR}/conda
+CONDA_BIN=${CONDA_DIR}/bin
+CONDA_LIB=${CONDA_DIR}/lib
+CONDA=${CONDA_BIN}/conda
+ENV_NAME="amplicon_analysis_pipeline@${PIPELINE_VERSION}"
+ENV_DIR=${CONDA_DIR}/envs/$ENV_NAME
+#
+# Functions
+#
+# Report failure and terminate script
+fail()
+{
+    echo ""
+    echo ERROR $@ >&2
+    echo ""
+    echo "$(basename $0): installation failed"
+    exit 1
+}
+#
+# Rewrite the shebangs in the installed conda scripts
+# to remove the full path to conda 'bin' directory
+rewrite_conda_shebangs()
+{
+    pattern="s,^#!${CONDA_BIN}/,#!/usr/bin/env ,g"
+    find ${CONDA_BIN} -type f -exec sed -i "$pattern" {} \;
+}
+#
+# Install conda
+install_conda()
+{
+    echo "++++++++++++++++"
+    echo "Installing conda"
+    echo "++++++++++++++++"
+    if [ -e ${CONDA_DIR} ] ; then
+        echo "*** $CONDA_DIR already exists ***" >&2
+        return
+    fi
+    local cwd=$(pwd)
+    local wd=$(mktemp -d)
+    cd $wd
+    wget -q https://repo.continuum.io/miniconda/Miniconda2-latest-Linux-x86_64.sh
+    bash ./Miniconda2-latest-Linux-x86_64.sh -b -p ${CONDA_DIR}
+    echo Installed conda in ${CONDA_DIR}
+    # Update the installation files
+    # This is to avoid problems when the length the installation
+    # directory path exceeds the limit for the shebang statement
+    # in the conda files
+    echo ""
+    echo -n "Rewriting conda shebangs..."
+    rewrite_conda_shebangs
+    echo "ok"
+    echo -n "Adding conda bin to PATH..."
+    PATH=${CONDA_BIN}:$PATH
+    echo "ok"
+    cd $cwd
+    rm -rf $wd/*
+    rmdir $wd
+}
+#
+# Create conda environment
+install_conda_packages()
+{
+    echo "+++++++++++++++++++++++++"
+    echo "Installing conda packages"
+    echo "+++++++++++++++++++++++++"
+    local cwd=$(pwd)
+    local wd=$(mktemp -d)
+    cd $wd
+    cat >environment.yml <<EOF
+name: ${ENV_NAME}
+channels:
+  - defaults
+  - conda-forge
+  - bioconda
+dependencies:
+  - python=2.7
+  - cutadapt=1.11
+  - sickle-trim=1.33
+  - bioawk=1.0
+  - pandaseq=2.8.1
+  - spades=3.5.0
+  - fastqc=0.11.3
+  - qiime=1.8.0
+  - blast-legacy=2.2.26
+  - fasta-splitter=0.2.4
+  - rdp_classifier=$RDP_CLASSIFIER_VERSION
+  - vsearch=1.1.3
+  # Need to explicitly specify libgfortran
+  # version (otherwise get version incompatible
+  # with numpy=1.7.1)
+  - libgfortran=1.0
+  # Compilers needed to build R
+  - gcc_linux-64
+  - gxx_linux-64
+  - gfortran_linux-64
+EOF
+    ${CONDA} env create --name "${ENV_NAME}" -f environment.yml
+    echo Created conda environment in ${ENV_DIR}
+    cd $cwd
+    rm -rf $wd/*
+    rmdir $wd
+}
+#
+# Install all the non-conda dependencies in a single
+# function (invokes separate functions for each package)
+install_non_conda_packages()
+{
+    echo "+++++++++++++++++++++++++++++"
+    echo "Installing non-conda packages"
+    echo "+++++++++++++++++++++++++++++"
+    # Temporary working directory
+    local wd=$(mktemp -d)
+    local cwd=$(pwd)
+    local wd=$(mktemp -d)
+    cd $wd
+    # Amplicon analysis pipeline
+    echo -n "Installing Amplicon_analysis_pipeline..."
+    if [ -e ${BIN_DIR}/Amplicon_analysis_pipeline.sh ] ; then
+        echo "already installed"
+    else
+        install_amplicon_analysis_pipeline
+        echo "ok"
+    fi
+    # ChimeraSlaye
[...]
+    chmod 0755 ${INSTALL_DIR}/uclust
+    ln -s ${INSTALL_DIR}/uclust ${BIN_DIR}
+    cd $cwd
+    rm -rf $wd/*
+    rmdir $wd
+}
+#
+# R 3.2.1
+# Can't use version from conda due to dependency conflicts
+install_R_3_2_1()
+{
+    . ${CONDA_BIN}/activate ${ENV_NAME}
+    local cwd=$(pwd)
+    local wd=$(mktemp -d)
+    cd $wd
+    echo -n "Fetching R 3.2.1 source code..."
+    wget -q http://cran.r-project.org/src/base/R-3/R-3.2.1.tar.gz
+    echo "ok"
+    INSTALL_DIR=${TOP_DIR}
+    mkdir -p $INSTALL_DIR
+    echo -n "Unpacking source code..."
+    tar xzf R-3.2.1.tar.gz >INSTALL.log 2>&1
+    echo "ok"
+    cd R-3.2.1
+    echo -n "Running configure..."
+    ./configure --prefix=$INSTALL_DIR --with-x=no --with-readline=no >>INSTALL.log 2>&1
+    echo "ok"
+    echo -n "Running make..."
+    make >>INSTALL.log 2>&1
+    echo "ok"
+    echo -n "Running make install..."
+    make install >>INSTALL.log 2>&1
+    echo "ok"
+    cd $cwd
+    rm -rf $wd/*
+    rmdir $wd
+    . ${CONDA_BIN}/deactivate
+}
+setup_pipeline_environment()
+{
+    echo "+++++++++++++++++++++++++++++++"
+    echo "Setting up pipeline environment"
+    echo "+++++++++++++++++++++++++++++++"
+    # vsearch113
+    echo -n "Setting up vsearch113..."
+    if [ -e ${BIN_DIR}/vsearch113 ] ; then
+        echo "already exists"
+    elif [ ! -e ${ENV_DIR}/bin/vsearch ] ; then
+        echo "failed"
+        fail "vsearch not found"
+    else
+        ln -s ${ENV_DIR}/bin/vsearch ${BIN_DIR}/vsearch113
+        echo "ok"
+    fi
+    # fasta_splitter.pl
+    echo -n "Setting up fasta_splitter.pl..."
+    if [ -e ${BIN_DIR}/fasta-splitter.pl ] ; then
+        echo "already exists"
+    elif [ ! -e ${ENV_DIR}/share/fasta-splitter/fasta-splitter.pl ] ; then
+        echo "failed"
+        fail "fasta-splitter.pl not found"
+    else
+        ln -s ${ENV_DIR}/share/fasta-splitter/fasta-splitter.pl ${BIN_DIR}/fasta-splitter.pl
+        echo "ok"
+    fi
+    # rdp_classifier.jar
+    local rdp_classifier_jar=rdp_classifier-${RDP_CLASSIFIER_VERSION}.jar
+    echo -n "Setting up rdp_classifier.jar..."
+    if [ -e ${TOP_DIR}/share/rdp_classifier/${rdp_classifier_jar} ] ; then
+        echo "already exists"
+    elif [ ! -e ${ENV_DIR}/share/rdp_classifier/rdp_classifier.jar ] ; then
+        echo "failed"
+        fail "rdp_classifier.jar not found"
+    else
+        mkdir -p ${TOP_DIR}/share/rdp_classifier
+        ln -s ${ENV_DIR}/share/rdp_classifier/rdp_classifier.jar ${TOP_DIR}/share/rdp_classifier/${rdp_classifier_jar}
+        echo "ok"
+    fi
+    # qiime_config
+    echo -n "Setting up qiime_config..."
+    if [ -e ${TOP_DIR}/qiime/qiime_config ] ; then
+        echo "already exists"
+    else
+        mkdir -p ${TOP_DIR}/qiime
+        cat >${TOP_DIR}/qiime/qiime_config <<EOF-qiime-config
+qiime_scripts_dir	${ENV_DIR}/bin
+EOF-qiime-config
+        echo "ok"
+    fi
+}
+#
+# Remove the compilers from the conda environment
+# Not sure if this step is necessary
+remove_conda_compilers()
+{
+    echo "+++++++++++++++++++++++++++++++++++++++++"
+    echo "Removing compilers from conda environment"
+    echo "+++++++++++++++++++++++++++++++++++++++++"
+    ${CONDA} remove -y -n ${ENV_NAME} gcc_linux-64 gxx_linux-64 gfortran_linux-64
+}
+#
+# Top level script does the installation
+echo "======================================="
+echo "Amplicon_analysis_pipeline installation"
+echo "======================================="
+echo "Installing into ${TOP_DIR}"
+if [ -e ${TOP_DIR} ] ; then
+    fail "Directory already exists"
+fi
+mkdir -p ${TOP_DIR}
+install_conda
+install_conda_packages
+install_non_conda_packages
+setup_pipeline_environment
+remove_conda_compilers
+echo "===================================="
+echo "Amplicon_analysis_pipeline installed"
+echo "===================================="
+echo ""
+echo "Install reference data using:"
+echo ""
+echo "\$ ${BIN_DIR}/install_reference_data.sh DIR"
+echo ""
+echo "Run pipeline scripts using:"
+echo ""
+echo "\$ ${BIN_DIR}/Amplicon_analysis_pipeline.sh ..."
+echo ""
+echo "(or add ${BIN_DIR} to your PATH)"
+echo ""
+echo "$(basename $0): finished"
+##
+#
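The installer's ``rewrite_conda_shebangs`` function works around the kernel's shebang-length limit by replacing each script's absolute conda interpreter path with ``#!/usr/bin/env``. The effect of its ``sed`` substitution can be mimicked in Python to see what happens to a single script (the conda path below is a made-up example, not a path the installer actually uses):

```python
import re

# Hypothetical conda bin directory, standing in for ${CONDA_BIN}
conda_bin = "/home/galaxy/Amplicon_analysis-1.2.3/conda/bin"

# A script as conda would install it, with an absolute-path shebang
script = "#!%s/python\nprint('hello')\n" % conda_bin

# Equivalent of: sed -i "s,^#!${CONDA_BIN}/,#!/usr/bin/env ,g"
# applied to the first line of the file
fixed = re.sub(r"^#!%s/" % re.escape(conda_bin),
               "#!/usr/bin/env ", script)

print(fixed.splitlines()[0])
```

Because the replacement resolves the interpreter via ``PATH`` instead of an absolute path, the rewritten scripts keep working however deep the installation directory is nested.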
diff -r 43d6f81bc667 -r 3ab198df8f3f install_tool_deps.sh
--- a/install_tool_deps.sh	Wed Jun 13 07:45:06 2018 -0400
+++ /dev/null	Thu Jan 01 00:00:00 1970 +0000
@@ -1,712 +0,0 @@
-#!/bin/bash -e
-#
-# Install the tool dependencies for Amplicon_analysis_pipeline.sh for
-# testing from command line
-#
-function install_python_package() {
-    echo Installing $2 $3 from $4 under $1
-    local install_dir=$1
-    local install_dirs="$install_dir $install_dir/bin $install_dir/lib/python2.7/site-packages"
-    for d in $install_dirs ; do
-        if [ ! -d $d ] ; then
-            mkdir -p $d
-        fi
-    done
-    wd=$(mktemp -d)
-    echo Moving to $wd
-    pushd $wd
-    wget -q $4
-    if [ ! -f "$(basename $4)" ] ; then
-        echo "No archive $(basename $4)"
-        exit 1
-    fi
-    tar xzf $(basename $4)
-    if [ ! -d "$5" ] ; then
-        echo "No directory $5"
-        exit 1
-    fi
-    cd $5
-    /bin/bash <<EOF
-export PYTHONPATH=$install_dir:$PYTHONPATH && \
-export PYTHONPATH=$install_dir/lib/python2.7/site-packages:$PYTHONPATH && \
-python setup.py install --prefix=$install_dir --install-scripts=$install_dir/bin --install-lib=$install_dir/lib/python2.7/site-packages >>$INSTALL_DIR/INSTALLATION.log 2>&1
-EOF
-    popd
-    rm -rf $wd/*
-    rmdir $wd
-}
-function install_amplicon_analysis_pipeline_1_2_2() {
-    install_amplicon_analysis_pipeline $1 1.2.2
-}
-function install_amplicon_analysis_pipeline_1_2_1() {
-    install_amplicon_analysis_pipeline $1 1.2.1
-}
-function install_amplicon_analysis_pipeline_1_1() {
-    install_amplicon_analysis_pipeline $1 1.1
-}
-function install_amplicon_analysis_pipeline_1_0() {
-    install_amplicon_analysis_pipeline $1 1.0
-}
-function install_amplicon_analysis_pipeline() {
-    version=$2
-    echo Installing Amplicon_analysis $version
-    install_dir=$1/amplicon_analysis_pipeline/$version
-    if [ -f $install_dir/env.sh ] ; then
-        return
-    fi
-    mkdir -p $install_dir
-    echo Moving to $install_dir
-    pushd $install_dir
-    wget -q https://github.com/MTutino/Amplicon_analysis/archive/v${version}.tar.gz
-    tar zxf v${version}.tar.gz
-    mv Amplicon_analysis-${version} Amplicon_analysis
-    rm -rf v${version}.tar.gz
-    popd
-    # Make setup file
-    cat > $install_dir/env.sh <<EOF
-#!/bin/sh
-# Source this to setup Amplicon_analysis/$version
-echo Setting up Amplicon analysis pipeline $version
-export PATH=$install_dir/Amplicon_analysis:\$PATH
-## AMPLICON_ANALYSIS_REF_DATA_PATH should be set in
-## config/local_env.sh or in the job_conf.xml file
-## - see the README
-##export AMPLICON_ANALYSIS_REF_DATA_PATH=
-#
-EOF
-}
-function install_amplicon_analysis_pipeline_1_0_patched() {
-    version="1.0-patched"
-    echo Installing Amplicon_analysis $version
-    install_dir=$1/amplicon_analysis_pipeline/$version
-    if [ -f $install_dir/env.sh ] ; then
-        return
-    fi
-    mkdir -p $install_dir
-    echo Moving to $install_dir
-    pushd $install_dir
-    # Clone and patch analysis pipeline scripts
-    git clone https://github.com/pjbriggs/Amplicon_analysis.git
-    cd Amplicon_analysis
-    git checkout -b $version
-    branches=
-    if [ ! -z "$branches" ] ; then
-        for branch in $branches ; do
-            git checkout -b $branch origin/$branch
-            git checkout $version
-            git merge -m "Merge $branch into $version" $branch
-        done
-    fi
-    cd ..
-    popd
-    # Make setup file
-    cat > $install_dir/env.sh <<EOF
-#!/bin/sh
-# Source this to setup Amplicon_analysis/$version
-echo Setting up Amplicon analysis pipeline $version
-export PATH=$install_dir/Amplicon_analysis:\$PATH
-## AMPLICON_ANALYSIS_REF_DATA_PATH should be set in
-## config/local_env.sh or in the job_conf.xml file
-## - see the README
-##export AMPLICON_ANALYSIS_REF_DATA_PATH=
-#
-EOF
-}
-function install_cutadapt_1_11() {
-    echo Installing cutadapt 1.11
-    INSTALL_DIR=$1/cutadapt/1.11
-    if [ -f $INSTALL_DIR/env.sh ] ; then
-        return
-    fi
-    mkdir -p $INSTALL_DIR
-    install_python_package $INSTALL_DIR cutadapt 1.11 \
-        https://pypi.python.org/packages/47/bf/9045e90dac084a90aa2bb72c7d5aadefaea96a5776f445f5b5d9a7a2c78b/cutadapt-1.11.tar.gz \
-        cutadapt-1.11
-    # Make set
[...]
-    mv fasta-splitter.pl $install_dir/bin
-    popd
-    # Clean up
-    rm -rf $wd/*
-    rmdir $wd
-    # Make setup file
-cat > $install_dir/env.sh <<EOF
-#!/bin/sh
-# Source this to setup fasta-splitter/0.2.4
-echo Setting up fasta-splitter 0.2.4
-export PATH=$install_dir/bin:\$PATH
-export PERL5LIB=$install_dir/lib/perl5:\$PERL5LIB
-#
-EOF
-}
-function install_rdp_classifier_2_2() {
-    echo Installing rdp-classifier 2.2R
-    local install_dir=$1/rdp-classifier/2.2
-    if [ -f $install_dir/env.sh ] ; then
-        return
-    fi
-    mkdir -p $install_dir
-    local wd=$(mktemp -d)
-    echo Moving to $wd
-    pushd $wd
-    wget -q https://sourceforge.net/projects/rdp-classifier/files/rdp-classifier/rdp_classifier_2.2.zip
-    unzip -qq rdp_classifier_2.2.zip
-    cd rdp_classifier_2.2
-    mv * $install_dir
-    popd
-    # Clean up
-    rm -rf $wd/*
-    rmdir $wd
-    # Make setup file
-cat > $install_dir/env.sh <<EOF
-#!/bin/sh
-# Source this to setup rdp-classifier/2.2
-echo Setting up RDP classifier 2.2
-export RDP_JAR_PATH=$install_dir/rdp_classifier-2.2.jar
-#
-EOF
-}
-function install_R_3_2_0() {
-    # Adapted from https://github.com/fls-bioinformatics-core/galaxy-tools/blob/master/local_dependency_installers/R.sh
-    echo Installing R 3.2.0
-    local install_dir=$1/R/3.2.0
-    if [ -f $install_dir/env.sh ] ; then
-        return
-    fi
-    mkdir -p $install_dir
-    local wd=$(mktemp -d)
-    echo Moving to $wd
-    pushd $wd
-    wget -q http://cran.r-project.org/src/base/R-3/R-3.2.0.tar.gz
-    tar xzf R-3.2.0.tar.gz
-    cd R-3.2.0
-    ./configure --prefix=$install_dir
-    make
-    make install
-    popd
-    # Clean up
-    rm -rf $wd/*
-    rmdir $wd
-    # Make setup file
-cat > $install_dir/env.sh <<EOF
-#!/bin/sh
-# Source this to setup R/3.2.0
-echo Setting up R 3.2.0
-export PATH=$install_dir/bin:\$PATH
-export TCL_LIBRARY=$install_dir/lib/libtcl8.4.so
-export TK_LIBRARY=$install_dir/lib/libtk8.4.so
-#
-EOF
-}
-function install_uc2otutab() {
-    # See http://drive5.com/python/uc2otutab_py.html
-    echo Installing uc2otutab
-    # Install to "default" version i.e. essentially a versionless
-    # installation (see Galaxy dependency resolver docs)
-    local install_dir=$1/uc2otutab/default
-    if [ -f $install_dir/env.sh ] ; then
-        return
-    fi
-    mkdir -p $install_dir/bin
-    local wd=$(mktemp -d)
-    echo Moving to $wd
-    pushd $wd
-    wget -q http://drive5.com/python/python_scripts.tar.gz
-    tar zxf python_scripts.tar.gz
-    mv die.py fasta.py progress.py uc.py $install_dir/bin
-    echo "#!/usr/bin/env python" >$install_dir/bin/uc2otutab.py
-    cat uc2otutab.py >>$install_dir/bin/uc2otutab.py
-    chmod +x $install_dir/bin/uc2otutab.py
-    popd
-    # Clean up
-    rm -rf $wd/*
-    rmdir $wd
-    # Make setup file
-cat > $install_dir/env.sh <<EOF
-#!/bin/sh
-# Source this to setup uc2otutab/default
-echo Setting up uc2otutab \(default\)
-export PATH=$install_dir/bin:\$PATH
-#
-EOF
-}
-##########################################################
-# Main script starts here
-##########################################################
-# Fetch top-level installation directory from command line
-TOP_DIR=$1
-if [ -z "$TOP_DIR" ] ; then
-    echo Usage: $(basename $0) DIR
-    exit
-fi
-if [ -z "$(echo $TOP_DIR | grep ^/)" ] ; then
-    TOP_DIR=$(pwd)/$TOP_DIR
-fi
-if [ ! -d "$TOP_DIR" ] ; then
-    mkdir -p $TOP_DIR
-fi
-# Install dependencies
-install_amplicon_analysis_pipeline_1_2_2 $TOP_DIR
-install_cutadapt_1_11 $TOP_DIR
-install_sickle_1_33 $TOP_DIR
-install_bioawk_27_08_2013 $TOP_DIR
-install_pandaseq_2_8_1 $TOP_DIR
-install_spades_3_5_0 $TOP_DIR
-install_fastqc_0_11_3 $TOP_DIR
-install_qiime_1_8_0 $TOP_DIR
-install_vsearch_1_1_3 $TOP_DIR
-install_microbiomeutil_2010_04_29 $TOP_DIR
-install_blast_2_2_26 $TOP_DIR
-install_fasta_number $TOP_DIR
-install_fasta_splitter_0_2_4 $TOP_DIR
-install_rdp_classifier_2_2 $TOP_DIR
-install_R_3_2_0 $TOP_DIR
-install_uc2otutab $TOP_DIR
-##
-#
diff -r 43d6f81bc667 -r 3ab198df8f3f tool_dependencies.xml
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/tool_dependencies.xml	Thu Oct 18 09:18:04 2018 -0400
@@ -0,0 +1,16 @@
+<?xml version="1.0"?>
+<tool_dependency>
+    <package name="amplicon_analysis_pipeline" version="1.2.3">
+        <install version="1.0">
+            <actions>
+                <action type="download_file">https://raw.githubusercontent.com/pjbriggs/Amplicon_analysis-galaxy/master/install_amplicon_analysis.sh</action>
+                <action type="shell_command">
+                    sh ./install_amplicon_analysis.sh $INSTALL_DIR
+                </action>
+                <action type="set_environment">
+                    <environment_variable name="PATH" action="prepend_to">$INSTALL_DIR/Amplicon_analysis-1.2.3/bin</environment_variable>
+                </action>
+            </actions>
+        </install>
+    </package>
+</tool_dependency>