initial_commit
first commit version 1.0 for all scripts!
alexiswl committed Sep 14, 2016
1 parent 9928762 commit 27e198c
Showing 8 changed files with 872 additions and 0 deletions.
2 changes: 2 additions & 0 deletions .gitignore
@@ -0,0 +1,2 @@

.DS_Store
82 changes: 82 additions & 0 deletions README.md
@@ -0,0 +1,82 @@
# ONT Running Scripts
This is a repository of scripts associated with running the Oxford Nanopore MinION and analysing the data it produces.

I have written a total of six scripts (here to be praised and criticized).

Details and example usage for each script are provided below.

## fast5-transfer-realtime.py
Throughput from the MinION is increasing rapidly, so it is often important to move the data off to a server before doing any subsequent analysis.
This script should be run from a computer that can see both the server and the laptop that is running the MinION (this can often be the laptop itself).

The script will continue to search for fast5 files until no new fast5 files have been found for 800 seconds. This threshold can be adjusted with the --watch option and, for a high-quality run, will let the script last the entire 48 hours.

#### Dependencies:
If you're on Windows, I would recommend using Cygwin to run these commands.

#### Example
`fast5-transfer-realtime.py --run_name e_coli_R9 --reads_directory C:/data/reads --server_directory Z:/`
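
For reference, a fuller invocation using some of the optional arguments the script accepts (the paths, run name and values here are placeholders, not a tested command):

```bash
# Placeholder paths and values -- adjust to your own drive mapping and run name.
fast5-transfer-realtime.py \
    --run_name e_coli_R9 \
    --reads_directory C:/data/reads \
    --server_directory Z:/ \
    --watch 1200 \
    --logfile Z:/2016_09_14_e_coli_R9/log/e_coli_R9.transfer.log
```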


#### Output
A folder called YYYY\_MM\_DD\_\<RUN\_NAME> is created within the server directory.
For subsequent scripts this is often referred to as the 'run_directory'. Inside this directory another folder called 'dump' is created; this is where the fast5 files are placed.
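
As a rough sketch of the resulting layout (the date and run name here are illustrative):

```
Z:/                              <- server directory
└── 2016_09_14_e_coli_R9/        <- run directory (YYYY_MM_DD_<RUN_NAME>)
    ├── dump/                    <- fast5 files are moved here
    └── log/                     <- transfer log written by the script
```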

#### Future options
I hope to add ssh and ftp options to this command in the near future.

## nanonet-realtime.py

Depending on your default permission settings, the files in the dump directory may not be writable. Although it is possible to fiddle and tinker with these settings, it is a safe option to keep these files there as a back-up. nanonet-realtime copies these files into a folder called 'reads', which is also a sub-directory of the run_directory. Prior to doing so it places the reads into a tmp directory, performs a 1D nanonet analysis on them, and exports the result to a fasta file. The fasta file is placed in the fasta folder (a sub-directory of the run\_directory or working\_directory) with the following naming convention: \<RUN\_NAME>\_1D\_\<posix\_time>.fa

If you are already in the run\_directory when executing this script, the only argument you need is --run\_name.
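
Conceptually, each batch of fast5 files is handled roughly as follows. This is a hand-written sketch, not the script's actual code, and it assumes nanonet's `nanonetcall` 1D basecaller is on your PATH:

```bash
# Sketch only -- nanonet-realtime.py automates these steps for each batch of fast5 files.
RUN_NAME=e_coli_R9                                        # example run name
mkdir -p tmp reads fasta
cp dump/*.fast5 tmp/                                      # stage a batch; dump/ is kept as a back-up
nanonetcall tmp/ > "fasta/${RUN_NAME}_1D_$(date +%s).fa"  # 1D basecall, exported to fasta
mv tmp/*.fast5 reads/                                     # staged reads end up in reads/
```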

#### Dependencies
Nanonet (from ONT)

#### Examples
`nanonet-realtime.py --run_name e_coli_R9 --working_directory /data/2016_09_13_e_coli_R9`

#### Output
Multiple fasta files are created within the fasta directory, and the fast5 files are moved into the reads directory.

#### Future options
Yet to think of anything...


## Metrichor-cli-wrapper.py
The metrichor-cli can be very intimidating. Before running this script I would recommend running metrichor-cli-configuredependencies.sh. Even if you have run the configure-dependencies script in the past, you will still need to source the metrichor-cli-setpaths.sh script every time you log into the server. You may add this script to your .bash_profile; however, I cannot then guarantee that it won't interfere with other programs you may run in the future.

The metrichor-cli-wrapper requires two arguments: the run\_directory (or working\_directory) and the type of workflow you wish to run. I haven't included all of the workflows, but typing `metrichor-cli --list` will show all of the workflow numbers, and you can update the dictionary within the script accordingly.

A log file will be exported to \<run\_directory>\_log. If you do not specify the reads directory, it will assume the reads exist in \<run_directory>\_reads.

The reads will be returned into reads/downloads. You cannot change this.
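
Putting that together, a typical session on the server might look like the following (the working directory and workflow name mirror the example below; treat this as a sketch):

```bash
# One-off: install the metrichor-cli dependencies into your home directory.
bash metrichor-cli-configuredependencies.sh

# Every login: set the compiler, node and metrichor paths for this shell.
source metrichor-cli-setpaths.sh

# Start basecalling for a run directory.
metrichor-cli-wrapper.py --working_directory /data/2016_09_13_e_coli_R9 --workflow 2D_Basecalling
```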

#### Dependencies
Open up the configure-dependencies script to view what you will need. I had great difficulty
installing the hdf5 modules without root permissions and have done my best to replicate what I did to get it to work. You will also need a gcc compiler of 4.9 or greater.

#### Example
`metrichor-cli-wrapper.py --working_directory /data/2016_09_13_e_coli_R9 --workflow 2D_Basecalling`

#### Future options
Diagnosing bugs within the metrichor code or installation scripts.

## Onecodex-realtime.py
Onecodex is an online metagenomic profiler that uses a k-mer-based exact-alignment tool (possibly Kraken?).
This script uses the Onecodex search tool to obtain tax\_ids for a sample. It takes the fasta files within the fasta directory (generated by nanonet) and uploads them to Onecodex. The output is a tab-delimited file of read_name \<tab> tax_id.
Because the search requires exact alignment and 1D fasta reads are relatively inaccurate, the alignment rate is still quite poor. As of September 2016, Onecodex does not place any limits on using its search tool for research purposes.
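
The tab-delimited output looks something like this (the read names and tax\_ids below are invented for illustration; 562 is the NCBI tax\_id for Escherichia coli):

```
read_0001	562
read_0002	1280
```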

#### Dependencies
Python libraries: Biopython, requests and json.

#### Example
`onecodex-realtime.py --run_name outbreak_sputum --run_directory /2019_09_13_pandemics`





199 changes: 199 additions & 0 deletions fast5-transfer-realtime.py
@@ -0,0 +1,199 @@
#!/usr/bin/env python
import os
import shutil
import time
import argparse
import sys

# This script is designed to transfer the fast5 output produced by MinKNOW from the laptop to a server.
# This script will be required to be run from a computer that has access to both the laptop and the server.
# Hence it is possible to run this script on the laptop concurrently with MinKNOW if it can see the server.

# CONSTANTS
version = "%(prog)s 1.0"

# Configure arguments
help_descriptor = "This is a script designed to move fast5 files from a laptop onto a server. " + \
                  "There is not currently any support for FTP or scp commands. This script only " + \
                  "works in circumstances where you can map the network drive. You only need three arguments " + \
                  "for this command to run: 1 - Run name, 2 - Reads directory, 3 - Server directory. The reads " + \
                  "will then be placed into a folder YYYY_MM_DD_<RUN_NAME>/dump in the server directory."

parser = argparse.ArgumentParser(description=help_descriptor)

parser.add_argument('--version', action='version', version=version)
parser.add_argument("--run_name", nargs='?', dest="RUN_NAME", type=str,
help="This is a required argument. What is the name of your run as they appear on the fast5 " +
"files? User_Date_FlowcellID_MinIONID_sequencing_run_<RUNNAME>_5DigitBarcode_Channel_Read",
required=True)
parser.add_argument("--reads_directory", nargs='?', dest="READS_DIRECTORY", type=str,
help="This is the directory that contains the fast5 files produced by MinKNOW.",
required="True")
parser.add_argument("--server_directory", nargs='?', dest="SERVER_DIRECTORY", type=str,
help="This is the directory generally the parent directory of the run folder. If the run folder" +
" or its subdirectory <dump> folder do not exist, they will be created.",
required=True)
parser.add_argument("--run_directory", nargs='?', dest="RUN_DIRECTORY", type=str,
help="This is the parent folder of the dump folder. If not specified, this will become" +
"<server_directory>/<run_directory>")
parser.add_argument("--dump_directory", nargs='?', dest="DUMP_DIRECTORY", type=str,
help="This is the folder on the server that you would like to place the fast5 file in to." +
"If not specified, this becomes <run_directory>/dump")
parser.add_argument("--watch", nargs='?', dest="WATCH", type=int,
help="This time (seconds) allowed with no new fast5 files" +
"entering the reads folder before exiting the script. Default set at 800")
parser.add_argument("--logfile", nargs='?', dest="LOGFILE", type=str,
help="This is the file that some general notes are printed to. If not specified," +
"the file will be RUN_DIRECTORY/log/<run_name>.move.log")
args = parser.parse_args()

# Assign inputs
RUN_NAME = args.RUN_NAME
READS_DIRECTORY = args.READS_DIRECTORY
SERVER_DIRECTORY = args.SERVER_DIRECTORY
RUN_DIRECTORY = args.RUN_DIRECTORY
DUMP_DIRECTORY = args.DUMP_DIRECTORY
WATCH = args.WATCH
LOGFILE = args.LOGFILE

# Defaults
WATCH_DEFAULT = 800
INVALID_SYMBOLS = "~`!@#$%^&*()-+={}[]:>;',</?*-+"

# Set the time
initial_date_suffix = time.strftime("%Y-%m-%d-%H-%M-%S")
date = time.strftime("%Y_%m_%d")

# Does the RUN_NAME contain any 'bad' characters.
for s in RUN_NAME:
    if s in INVALID_SYMBOLS:
        error_message = "Error, invalid character in filename. Cannot have any of the following characters %s" \
                        % INVALID_SYMBOLS
        sys.exit(error_message)

# Checking to ensure that the reads directory exists
if not os.path.isdir(READS_DIRECTORY):
    error_message = "Error: cannot locate or find reads directory %s" % READS_DIRECTORY
    sys.exit(error_message)
READS_DIRECTORY = os.path.abspath(READS_DIRECTORY) + "/"

# Checking to ensure that the server directory exists
if not os.path.isdir(SERVER_DIRECTORY):
    error_message = "Error: cannot locate or find server directory %s" % SERVER_DIRECTORY
    sys.exit(error_message)
SERVER_DIRECTORY = os.path.abspath(SERVER_DIRECTORY) + "/"

# Checking to ensure that the run directory exists.
if RUN_DIRECTORY:
    if not os.path.isdir(RUN_DIRECTORY):
        error_message = "Error: run directory specified but does not exist %s" % RUN_DIRECTORY
        sys.exit(error_message)
else:
    RUN_DIRECTORY = SERVER_DIRECTORY + date + "_" + RUN_NAME + "/"
    general_message = "Run directory not specified. Using %s" % RUN_DIRECTORY
    print(general_message)
if not os.path.isdir(RUN_DIRECTORY):
    os.mkdir(RUN_DIRECTORY)

# Checking to ensure that the dump directory exists.
if DUMP_DIRECTORY:
    if not os.path.isdir(DUMP_DIRECTORY):
        error_message = "Error: dump directory specified but does not exist %s" % DUMP_DIRECTORY
        sys.exit(error_message)
else:
    DUMP_DIRECTORY = RUN_DIRECTORY + "dump/"
    general_message = "Dump directory not defined. Using %s" % DUMP_DIRECTORY
    print(general_message)
if not os.path.isdir(DUMP_DIRECTORY):
    os.mkdir(DUMP_DIRECTORY)

if not WATCH:
    WATCH = WATCH_DEFAULT
    general_message = "Watch option not defined. Using %s" % WATCH_DEFAULT
    print(general_message)

# Create the log file
if LOGFILE:
    if not os.path.isfile(LOGFILE):
        error_message = "Log file specified but does not exist."
        sys.exit(error_message)
else:
    log_directory = RUN_DIRECTORY + "log/"
    if not os.path.isdir(log_directory):
        os.makedirs(log_directory)
    LOGFILE = log_directory + date + "_" + RUN_NAME + ".transfer.log"
    general_message = "Log file not defined, using %s" % LOGFILE
    print(general_message)
# We now begin the process of moving reads across from the read_directory to the server directory
# to prevent the computer from filling up.
# We want to be careful to ensure that reads do not get moved across twice
# and that only fast5 files are moved across.

logger = open(LOGFILE, 'a+')
logger.write("The time is %s:\n" % time.strftime("%c"))
logger.write("Commencing transfer of reads from %s to %s" % (READS_DIRECTORY, DUMP_DIRECTORY))
logger.close()
start_time = time.time()

fast5_files = []
run_exhausted = False
files_moved = 0
patience_counter = 0

os.chdir(READS_DIRECTORY)
os.chmod(RUN_DIRECTORY, 0o777)  # Permission mode given in octal (rwx for all).

while not run_exhausted:
    while len(fast5_files) == 0:
        # Create an array of all the fast5 files in the directory the MinION is writing to.
        run_string = "sequencing_run_%s" % RUN_NAME
        fast5_files = [fast5 for fast5 in os.listdir(READS_DIRECTORY)
                       if fast5.endswith('.fast5') and run_string in fast5]

        # Important to transfer the oldest files first.
        fast5_files.sort(key=lambda x: os.path.getmtime(x))

        # Did we pick anything up?
        if len(fast5_files) != 0:
            break

        # Didn't pick anything up...
        # Have we exceeded the number of mini-sleeps?
        if patience_counter > WATCH:
            run_exhausted = True
            break

        # No? Take a one-minute sleep.
        if patience_counter != 0:
            abstinence_message = "No fast5 files found in the last %d seconds.\n" % patience_counter
            sleeping_message = "Waiting 60 seconds, breaking in %d seconds if no more reads are created.\n" \
                               % (WATCH - patience_counter)
            logger = open(LOGFILE, 'a+')
            print(abstinence_message)
            print(sleeping_message)
            logger.write(abstinence_message + "\n")
            logger.write(sleeping_message + "\n")
            logger.close()

        time.sleep(60)
        patience_counter += 60

    # Out of the inner while loop: either fast5 files have been found or the run is exhausted.
    patience_counter = 0  # Must be consecutive minute sleeps to exhaust the run.

    # Move the files from the MinION directory to the server directory.
    for read in fast5_files:
        if not os.path.isfile(DUMP_DIRECTORY + read):
            shutil.move(READS_DIRECTORY + read, DUMP_DIRECTORY)
            files_moved += 1
        else:
            print("Warning, %s already exists in dump directory. Deleting from laptop." % read)
            os.remove(READS_DIRECTORY + read)
    fast5_files = []  # To pass into the inner while loop.

# Run has been exhausted
end_time = time.time()

logger = open(LOGFILE, 'a+')
logger.write("No fast5 files found for %d seconds\n" % WATCH)
logger.write("Moved %d files\n" % files_moved)
logger.write("Process completed in %d seconds.\n" % (end_time - start_time))
logger.write("Exiting\n")

53 changes: 53 additions & 0 deletions metrichor-cli-configuredependencies.sh
@@ -0,0 +1,53 @@
#!/usr/bin/env bash

# Prior to running this script you will need to unzip the metrichor tarball into your home directory.
# You will need to have installed a gcc, I've found 5.4.0 works well.
# You can use easybuild to install a gcc compiler without root permissions see here:
# https://github.com/hpcugent/easybuild/wiki/Step-by-step-guide
# To install hdf5 check out this website here:
# http://nugridstars.org/work-packages/io-technologies/hdf5-useepp-format/installing-hdf5

# This script is designed to allow those without root permissions to install metrichor into their home-directory.
current_directory=$(pwd)

# Set versions
nvm_version="5.12.0"
metrichor_version="2.40.16"
gcc_version="5.4.0"

# Set and export compiler flags
export CXX=$HOME/.local/easybuild/software/GCC/$gcc_version/bin/g++
export CC=$HOME/.local/easybuild/software/GCC/$gcc_version/bin/gcc
export LD_LIBRARY_PATH=$HOME/.local/easybuild/software/GCC/$gcc_version/lib/:$HOME/.local/easybuild/software/GCC/$gcc_version/lib64/:$LD_LIBRARY_PATH

# Set hdf5 flag
HDF5_HOME=$HOME/src/hdf5/

# Install nvm
cd $HOME/nvm-master
./install.sh


export NVM_DIR=$HOME/.nvm
source $NVM_DIR/nvm.sh # This loads nvm

# Change to the metrichor directory and install node and npm via nvm
cd $HOME/metrichor-cli-$metrichor_version

nvm install $nvm_version

# npm needs to be installed globally, otherwise I get errors I haven't been able to fix.
# export path to npm and node
# export PATH=$PATH:$HOME/.nvm/versions/node/v5.12.0/bin/
npm install npm -g

npm install hdf5 -ws:verbose -g

# export path to metrichor commandline agent
export PATH=$PATH:$HOME/metrichor-cli-${metrichor_version}/bin/

# export node path to link to hdf5 module
export NODE_PATH=$HOME/.nvm/versions/node/v${nvm_version}/lib/node_modules/

# change back to current directory
cd $current_directory
31 changes: 31 additions & 0 deletions metrichor-cli-setpaths.sh
@@ -0,0 +1,31 @@
#!/usr/bin/env bash


# Set versions
nvm_version="5.12.0"
metrichor_version="2.40.16"
gcc_version="5.4.0"

# Set and export compiler flags
export CXX=$HOME/.local/easybuild/software/GCC/$gcc_version/bin/g++
export CC=$HOME/.local/easybuild/software/GCC/$gcc_version/bin/gcc
export LD_LIBRARY_PATH=$HOME/.local/easybuild/software/GCC/$gcc_version/lib/:$HOME/.local/easybuild/software/GCC/$gcc_version/lib64/:$LD_LIBRARY_PATH

# Set hdf5 flag
HDF5_HOME=$HOME/src/hdf5/

# Load nvm
export NVM_DIR=$HOME/.nvm
source $NVM_DIR/nvm.sh # This loads nvm

# source API_KEY
source $HOME/.met_apikey

# export path to metrichor commandline agent
export PATH=$PATH:$HOME/metrichor-cli-${metrichor_version}/bin/

# export node path to link to hdf5 module
export NODE_PATH=$HOME/.nvm/versions/node/v${nvm_version}/lib/node_modules/:$NODE_PATH


