1.1.0 Updates #42

Open
wants to merge 70 commits into base: master

Commits (70)
e2c8668
Updating dev versions to 1.1.0dev
atrull314 Oct 7, 2024
474ca48
Template update for nf-core/tools version 3.0.0
nf-core-bot Oct 8, 2024
d263368
Template update for nf-core/tools version 3.0.1
nf-core-bot Oct 9, 2024
6b0496a
Template update for nf-core/tools version 3.0.2
nf-core-bot Oct 11, 2024
ba39bb4
Merge pull request #2 from U-BDS/1.1.0_update
atrull314 Oct 11, 2024
c7ba955
Implementing chromosome splitting of Isoquant to assist with parallel…
atrull314 Oct 23, 2024
f0b1000
Modifications to reference file splitting
atrull314 Oct 24, 2024
7845de0
Changing mtx_merge to use barcodes as key to join on and making sure …
atrull314 Oct 31, 2024
6a4093f
Adjusting outputs and cleanup
atrull314 Nov 5, 2024
54984fb
Fixing BLAZE custom config in README
atrull314 Nov 5, 2024
5aba04a
Fixing nodes in metro diagram
atrull314 Nov 5, 2024
bbfd70c
Upgrade isoquant version
atrull314 Nov 6, 2024
8233fc7
Merge branch 'feature_isoquant_performance' into dev
atrull314 Nov 6, 2024
5629165
Allowing read counts to run if FastQC is disabled
atrull314 Nov 12, 2024
2b9251d
Increasing resources for seurat step
atrull314 Nov 15, 2024
5a73588
Fixing linting errors
atrull314 Nov 15, 2024
3a40ec7
Updating Documentation
atrull314 Nov 15, 2024
75e0240
Update CHANGELOG.md
atrull314 Nov 15, 2024
068781a
Merging Template Update
atrull314 Nov 27, 2024
a7a3129
Fixing linting issues
atrull314 Nov 27, 2024
9732fd6
Fixing Harshil alignment
atrull314 Nov 27, 2024
e4b681d
Merge pull request #40 from U-BDS/dev
atrull314 Dec 3, 2024
fd93533
Initial work to add in OARFISH and allow the pipeline to selectively …
atrull314 Dec 5, 2024
74e95fb
Fixing multiQC issues
atrull314 Dec 6, 2024
8d6a241
Removing comments and unused modules/subworkflows
atrull314 Dec 9, 2024
6a94f29
Increasing memory for tag_barcodes
atrull314 Dec 9, 2024
cf31ba3
Adding check for genome_fa when preparing references
atrull314 Dec 9, 2024
911087d
Fixing order of creating genome fai
atrull314 Dec 9, 2024
3a9bef2
Various Fixes for multiqc report and test configs
atrull314 Dec 10, 2024
7b81f3c
Fixing linting
atrull314 Dec 13, 2024
86df3a6
Fixing prettier
atrull314 Dec 13, 2024
48186b5
Update output.md
atrull314 Dec 13, 2024
6bdbae0
Update README.md
atrull314 Dec 13, 2024
fc09a47
Updating metro
atrull314 Dec 19, 2024
057c3ac
Fixing markdown linting
atrull314 Dec 19, 2024
674d689
Updating metro
atrull314 Dec 19, 2024
3023e3a
Fixing issue when skip_dedup is used
atrull314 Dec 19, 2024
c4db265
Adding recommended flags for oarfish
atrull314 Dec 19, 2024
ed8d926
Add transcript fasta to test_full config
atrull314 Dec 19, 2024
42c66de
Temporary addition of new params
atrull314 Dec 19, 2024
78f5c29
Correcting variable name
atrull314 Dec 19, 2024
461407c
Fixing umitools dedup issue
atrull314 Dec 19, 2024
9059a41
Fixing UMITOOLS_DEDUP issue
atrull314 Dec 20, 2024
3477e01
Splitting save_secondary parameter between genome and transcriptome
atrull314 Dec 20, 2024
c43897c
Fixing linting issues
atrull314 Dec 20, 2024
b0a8ddf
Update output.md
atrull314 Dec 20, 2024
f017f6f
Updating CHANGELOG
atrull314 Dec 20, 2024
7363e6a
Update usage.md
atrull314 Dec 20, 2024
7fa82b4
Update README.md
atrull314 Dec 20, 2024
5c2f8d4
Fixing dedup issue
atrull314 Dec 20, 2024
4ea8c39
Fixing alignment on tube map
atrull314 Dec 20, 2024
902f5b1
Update CITATIONS.md
atrull314 Dec 20, 2024
7fdfd14
documentation updates
lianov Dec 23, 2024
f1f1524
fix harshil alignment and removal of commented code
lianov Dec 23, 2024
5c964ec
Merge pull request #41 from U-BDS/dev
atrull314 Dec 24, 2024
fcdaab2
Updating version to 1.1.0
atrull314 Jan 7, 2025
3e25f05
Removing commented line
atrull314 Jan 7, 2025
eac0a73
Fixing blaze version
atrull314 Jan 7, 2025
fd584dd
Disabling RSeQC for transcriptome analysis
atrull314 Jan 7, 2025
c125fe1
Adding stringi package to seurat conda
atrull314 Jan 8, 2025
57258d7
Fixing harshil alignment and adding stubs for local modules
atrull314 Jan 8, 2025
6c6e7ec
Fixing harshil alignment
atrull314 Jan 8, 2025
d1f406e
Modifying quantifier param to make it more scalable
atrull314 Jan 10, 2025
5643cd7
Fixing linting issues
atrull314 Jan 13, 2025
080ab7a
Fixing linting
atrull314 Jan 13, 2025
67d0ab3
Making sure versions are used for all processes
atrull314 Jan 13, 2025
9951f69
Fixing null
atrull314 Jan 13, 2025
7eb96bc
Update usage.md
atrull314 Jan 13, 2025
b2708bc
enabling dedup in oarfish
lianov Jan 13, 2025
7feb7b8
updated output.md with dedup details
lianov Jan 13, 2025
12 changes: 6 additions & 6 deletions .github/CONTRIBUTING.md
@@ -19,7 +19,7 @@ If you'd like to write some code for nf-core/scnanoseq, the standard workflow is
1. Check that there isn't already an issue about your idea in the [nf-core/scnanoseq issues](https://github.com/nf-core/scnanoseq/issues) to avoid duplicating work. If there isn't one already, please create one so that others know you're working on this
2. [Fork](https://help.github.com/en/github/getting-started-with-github/fork-a-repo) the [nf-core/scnanoseq repository](https://github.com/nf-core/scnanoseq) to your GitHub account
3. Make the necessary changes / additions within your forked repository following [Pipeline conventions](#pipeline-contribution-conventions)
4. Use `nf-core schema build` and add any new parameters to the pipeline JSON schema (requires [nf-core tools](https://github.com/nf-core/tools) >= 1.10).
4. Use `nf-core pipelines schema build` and add any new parameters to the pipeline JSON schema (requires [nf-core tools](https://github.com/nf-core/tools) >= 1.10).
5. Submit a Pull Request against the `dev` branch and wait for the code to be reviewed and merged

If you're not used to this workflow with git, you can start with some [docs from GitHub](https://help.github.com/en/github/collaborating-with-issues-and-pull-requests) or even their [excellent `git` resources](https://try.github.io/).
@@ -40,7 +40,7 @@ There are typically two types of tests that run:
### Lint tests

`nf-core` has a [set of guidelines](https://nf-co.re/developers/guidelines) which all pipelines must adhere to.
To enforce these and ensure that all pipelines stay in sync, we have developed a helper tool which runs checks on the pipeline code. This is in the [nf-core/tools repository](https://github.com/nf-core/tools) and once installed can be run locally with the `nf-core lint <pipeline-directory>` command.
To enforce these and ensure that all pipelines stay in sync, we have developed a helper tool which runs checks on the pipeline code. This is in the [nf-core/tools repository](https://github.com/nf-core/tools) and once installed can be run locally with the `nf-core pipelines lint <pipeline-directory>` command.

If any failures or warnings are encountered, please follow the listed URL for more documentation.

@@ -75,7 +75,7 @@ If you wish to contribute a new step, please use the following coding standards:
2. Write the process block (see below).
3. Define the output channel if needed (see below).
4. Add any new parameters to `nextflow.config` with a default (see below).
5. Add any new parameters to `nextflow_schema.json` with help text (via the `nf-core schema build` tool).
5. Add any new parameters to `nextflow_schema.json` with help text (via the `nf-core pipelines schema build` tool).
6. Add sanity checks and validation for all relevant parameters.
7. Perform local tests to validate that the new code works as expected.
8. If applicable, add a new test command in `.github/workflow/ci.yml`.
@@ -86,11 +86,11 @@ If you wish to contribute a new step, please use the following coding standards:

Parameters should be initialised / defined with default values in `nextflow.config` under the `params` scope.

Once there, use `nf-core schema build` to add to `nextflow_schema.json`.
Once there, use `nf-core pipelines schema build` to add to `nextflow_schema.json`.
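For illustration, a minimal sketch of a new parameter declared with a default under the `params` scope (the option name `my_new_option` is a placeholder, not a real pipeline parameter):

```groovy
// nextflow.config (placeholder parameter, for illustration only)
params {
    // default used when the user does not pass --my_new_option on the command line
    my_new_option = false
}
```

Running `nf-core pipelines schema build` afterwards should pick up the new entry and prompt you to add its type and help text to `nextflow_schema.json`.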

### Default processes resource requirements

Sensible defaults for process resource requirements (CPUs / memory / time) for a process should be defined in `conf/base.config`. These should generally be specified generic with `withLabel:` selectors so they can be shared across multiple processes/steps of the pipeline. A nf-core standard set of labels that should be followed where possible can be seen in the [nf-core pipeline template](https://github.com/nf-core/tools/blob/master/nf_core/pipeline-template/conf/base.config), which has the default process as a single core-process, and then different levels of multi-core configurations for increasingly large memory requirements defined with standardised labels.
Sensible defaults for process resource requirements (CPUs / memory / time) for a process should be defined in `conf/base.config`. These should generally be specified generic with `withLabel:` selectors so they can be shared across multiple processes/steps of the pipeline. A nf-core standard set of labels that should be followed where possible can be seen in the [nf-core pipeline template](https://github.com/nf-core/tools/blob/main/nf_core/pipeline-template/conf/base.config), which has the default process as a single core-process, and then different levels of multi-core configurations for increasingly large memory requirements defined with standardised labels.
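As a rough sketch (the values below are illustrative, not this pipeline's actual defaults), such labelled defaults in `conf/base.config` take the form:

```groovy
// conf/base.config (illustrative defaults; the real values live in the pipeline's own base.config)
process {
    // fallback for any process without a more specific label
    cpus   = { 1    * task.attempt }
    memory = { 6.GB * task.attempt }
    time   = { 4.h  * task.attempt }

    // shared defaults for every process carrying this standard nf-core label
    withLabel:process_medium {
        cpus   = { 6     * task.attempt }
        memory = { 36.GB * task.attempt }
        time   = { 8.h   * task.attempt }
    }
}
```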

The process resources can be passed on to the tool dynamically within the process with the `${task.cpus}` and `${task.memory}` variables in the `script:` block.
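For example, a module might forward those values to its tool roughly like this (the process name and tool flags below are hypothetical):

```groovy
// Hypothetical process showing how ${task.cpus} and ${task.memory} reach the command line
process EXAMPLE_TOOL {
    label 'process_medium'

    input:
    path reads

    output:
    path "out/*"

    script:
    """
    example_tool \\
        --threads ${task.cpus} \\
        --max-mem ${task.memory.toGiga()}G \\
        --input ${reads} \\
        --outdir out
    """
}
```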

@@ -103,7 +103,7 @@ Please use the following naming schemes, to make it easy to understand what is g

### Nextflow version bumping

If you are using a new feature from core Nextflow, you may bump the minimum required version of nextflow in the pipeline with: `nf-core bump-version --nextflow . [min-nf-version]`
If you are using a new feature from core Nextflow, you may bump the minimum required version of nextflow in the pipeline with: `nf-core pipelines bump-version --nextflow . [min-nf-version]`

### Images and figures

2 changes: 1 addition & 1 deletion .github/PULL_REQUEST_TEMPLATE.md
@@ -17,7 +17,7 @@ Learn more about contributing: [CONTRIBUTING.md](https://github.com/nf-core/scna
- [ ] If you've fixed a bug or added code that should be tested, add tests!
- [ ] If you've added a new tool - have you followed the pipeline conventions in the [contribution docs](https://github.com/nf-core/scnanoseq/tree/master/.github/CONTRIBUTING.md)
- [ ] If necessary, also make a PR on the nf-core/scnanoseq _branch_ on the [nf-core/test-datasets](https://github.com/nf-core/test-datasets) repository.
- [ ] Make sure your code lints (`nf-core lint`).
- [ ] Make sure your code lints (`nf-core pipelines lint`).
- [ ] Ensure the test suite passes (`nextflow run . -profile test,docker --outdir <OUTDIR>`).
- [ ] Check for unexpected warnings in debug mode (`nextflow run . -profile debug,test,docker --outdir <OUTDIR>`).
- [ ] Usage Documentation in `docs/usage.md` is updated.
25 changes: 21 additions & 4 deletions .github/workflows/awsfulltest.yml
@@ -1,18 +1,35 @@
name: nf-core AWS full size tests
# This workflow is triggered on published releases.
# This workflow is triggered on PRs opened against the master branch.
# It can be additionally triggered manually with GitHub actions workflow dispatch button.
# It runs the -profile 'test_full' on AWS batch

on:
release:
types: [published]
pull_request:
branches:
- master
workflow_dispatch:
pull_request_review:
types: [submitted]

jobs:
run-platform:
name: Run AWS full tests
if: github.repository == 'nf-core/scnanoseq'
# run only if the PR is approved by at least 2 reviewers and against the master branch or manually triggered
if: github.repository == 'nf-core/scnanoseq' && github.event.review.state == 'approved' && github.event.pull_request.base.ref == 'master' || github.event_name == 'workflow_dispatch'
runs-on: ubuntu-latest
steps:
- uses: octokit/[email protected]
id: check_approvals
with:
route: GET /repos/${{ github.repository }}/pulls/${{ github.event.pull_request.number }}/reviews
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- id: test_variables
if: github.event_name != 'workflow_dispatch'
run: |
JSON_RESPONSE='${{ steps.check_approvals.outputs.data }}'
CURRENT_APPROVALS_COUNT=$(echo $JSON_RESPONSE | jq -c '[.[] | select(.state | contains("APPROVED")) ] | length')
test $CURRENT_APPROVALS_COUNT -ge 2 || exit 1 # At least 2 approvals are required
- name: Launch workflow via Seqera Platform
uses: seqeralabs/action-tower-launch@v2
with:
54 changes: 48 additions & 6 deletions .github/workflows/ci.yml
@@ -7,37 +7,79 @@ on:
pull_request:
release:
types: [published]
workflow_dispatch:

env:
NXF_ANSI_LOG: false
NXF_SINGULARITY_CACHEDIR: ${{ github.workspace }}/.singularity
NXF_SINGULARITY_LIBRARYDIR: ${{ github.workspace }}/.singularity

concurrency:
group: "${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}"
cancel-in-progress: true

jobs:
test:
name: Run pipeline with test data
name: "Run pipeline with test data (${{ matrix.NXF_VER }} | ${{ matrix.test_name }} | ${{ matrix.profile }})"
# Only run on push if this is the nf-core dev branch (merged PRs)
if: "${{ github.event_name != 'push' || (github.event_name == 'push' && github.repository == 'nf-core/scnanoseq') }}"
runs-on: ubuntu-latest
strategy:
matrix:
NXF_VER:
- "23.04.0"
- "24.04.2"
- "latest-everything"
profile:
- "conda"
- "docker"
- "singularity"
test_name:
- "test"
isMaster:
- ${{ github.base_ref == 'master' }}
# Exclude conda and singularity on dev
exclude:
- isMaster: false
profile: "conda"
- isMaster: false
profile: "singularity"
steps:
- name: Check out pipeline code
uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b # v4

- name: Install Nextflow
- name: Set up Nextflow
uses: nf-core/setup-nextflow@v2
with:
version: "${{ matrix.NXF_VER }}"

- name: Disk space cleanup
- name: Set up Apptainer
if: matrix.profile == 'singularity'
uses: eWaterCycle/setup-apptainer@main

- name: Set up Singularity
if: matrix.profile == 'singularity'
run: |
mkdir -p $NXF_SINGULARITY_CACHEDIR
mkdir -p $NXF_SINGULARITY_LIBRARYDIR

- name: Set up Miniconda
if: matrix.profile == 'conda'
uses: conda-incubator/setup-miniconda@a4260408e20b96e80095f42ff7f1a15b27dd94ca # v3
with:
miniconda-version: "latest"
auto-update-conda: true
conda-solver: libmamba
channels: conda-forge,bioconda

- name: Set up Conda
if: matrix.profile == 'conda'
run: |
echo $(realpath $CONDA)/condabin >> $GITHUB_PATH
echo $(realpath python) >> $GITHUB_PATH

- name: Clean up Disk space
uses: jlumbroso/free-disk-space@54081f138730dfa15788a46383842cd2f914a1be # v1.3.1

- name: Run pipeline with test data
- name: "Run pipeline with test data ${{ matrix.NXF_VER }} | ${{ matrix.test_name }} | ${{ matrix.profile }}"
run: |
nextflow run ${GITHUB_WORKSPACE} -profile test,docker --outdir ./results
nextflow run ${GITHUB_WORKSPACE} -profile ${{ matrix.test_name }},${{ matrix.profile }} --outdir ./results
53 changes: 43 additions & 10 deletions .github/workflows/download_pipeline.yml
@@ -1,4 +1,4 @@
name: Test successful pipeline download with 'nf-core download'
name: Test successful pipeline download with 'nf-core pipelines download'

# Run the workflow when:
# - dispatched manually
@@ -8,7 +8,7 @@ on:
workflow_dispatch:
inputs:
testbranch:
description: "The specific branch you wish to utilize for the test execution of nf-core download."
description: "The specific branch you wish to utilize for the test execution of nf-core pipelines download."
required: true
default: "dev"
pull_request:
@@ -39,9 +39,11 @@ jobs:
with:
python-version: "3.12"
architecture: "x64"
- uses: eWaterCycle/setup-singularity@931d4e31109e875b13309ae1d07c70ca8fbc8537 # v7

- name: Setup Apptainer
uses: eWaterCycle/setup-apptainer@4bb22c52d4f63406c49e94c804632975787312b3 # v2.0.0
with:
singularity-version: 3.8.3
apptainer-version: 1.3.4

- name: Install dependencies
run: |
@@ -54,33 +56,64 @@
echo "REPOTITLE_LOWERCASE=$(basename ${GITHUB_REPOSITORY,,})" >> ${GITHUB_ENV}
echo "REPO_BRANCH=${{ github.event.inputs.testbranch || 'dev' }}" >> ${GITHUB_ENV}

- name: Make a cache directory for the container images
run: |
mkdir -p ./singularity_container_images

- name: Download the pipeline
env:
NXF_SINGULARITY_CACHEDIR: ./
NXF_SINGULARITY_CACHEDIR: ./singularity_container_images
run: |
nf-core download ${{ env.REPO_LOWERCASE }} \
nf-core pipelines download ${{ env.REPO_LOWERCASE }} \
--revision ${{ env.REPO_BRANCH }} \
--outdir ./${{ env.REPOTITLE_LOWERCASE }} \
--compress "none" \
--container-system 'singularity' \
--container-library "quay.io" -l "docker.io" -l "ghcr.io" \
--container-library "quay.io" -l "docker.io" -l "community.wave.seqera.io" \
--container-cache-utilisation 'amend' \
--download-configuration
--download-configuration 'yes'

- name: Inspect download
run: tree ./${{ env.REPOTITLE_LOWERCASE }}

- name: Count the downloaded number of container images
id: count_initial
run: |
image_count=$(ls -1 ./singularity_container_images | wc -l | xargs)
echo "Initial container image count: $image_count"
echo "IMAGE_COUNT_INITIAL=$image_count" >> ${GITHUB_ENV}

- name: Run the downloaded pipeline (stub)
id: stub_run_pipeline
continue-on-error: true
env:
NXF_SINGULARITY_CACHEDIR: ./
NXF_SINGULARITY_CACHEDIR: ./singularity_container_images
NXF_SINGULARITY_HOME_MOUNT: true
run: nextflow run ./${{ env.REPOTITLE_LOWERCASE }}/$( sed 's/\W/_/g' <<< ${{ env.REPO_BRANCH }}) -stub -profile test,singularity --outdir ./results
- name: Run the downloaded pipeline (stub run not supported)
id: run_pipeline
if: ${{ job.steps.stub_run_pipeline.status == failure() }}
env:
NXF_SINGULARITY_CACHEDIR: ./
NXF_SINGULARITY_CACHEDIR: ./singularity_container_images
NXF_SINGULARITY_HOME_MOUNT: true
run: nextflow run ./${{ env.REPOTITLE_LOWERCASE }}/$( sed 's/\W/_/g' <<< ${{ env.REPO_BRANCH }}) -profile test,singularity --outdir ./results

- name: Count the downloaded number of container images
id: count_afterwards
run: |
image_count=$(ls -1 ./singularity_container_images | wc -l | xargs)
echo "Post-pipeline run container image count: $image_count"
echo "IMAGE_COUNT_AFTER=$image_count" >> ${GITHUB_ENV}

- name: Compare container image counts
run: |
if [ "${{ env.IMAGE_COUNT_INITIAL }}" -ne "${{ env.IMAGE_COUNT_AFTER }}" ]; then
initial_count=${{ env.IMAGE_COUNT_INITIAL }}
final_count=${{ env.IMAGE_COUNT_AFTER }}
difference=$((final_count - initial_count))
echo "$difference additional container images were \n downloaded at runtime . The pipeline has no support for offline runs!"
tree ./singularity_container_images
exit 1
else
echo "The pipeline can be downloaded successfully!"
fi
23 changes: 19 additions & 4 deletions .github/workflows/linting.yml
@@ -1,6 +1,6 @@
name: nf-core linting
# This workflow is triggered on pushes and PRs to the repository.
# It runs the `nf-core lint` and markdown lint tests to ensure
# It runs the `nf-core pipelines lint` and markdown lint tests to ensure
# that the code meets the nf-core guidelines.
on:
push:
@@ -42,17 +42,32 @@ jobs:
python-version: "3.12"
architecture: "x64"

- name: read .nf-core.yml
uses: pietrobolcato/[email protected]
id: read_yml
with:
config: ${{ github.workspace }}/.nf-core.yml

- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install nf-core
pip install nf-core==${{ steps.read_yml.outputs['nf_core_version'] }}

- name: Run nf-core pipelines lint
if: ${{ github.base_ref != 'master' }}
env:
GITHUB_COMMENTS_URL: ${{ github.event.pull_request.comments_url }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_PR_COMMIT: ${{ github.event.pull_request.head.sha }}
run: nf-core -l lint_log.txt pipelines lint --dir ${GITHUB_WORKSPACE} --markdown lint_results.md

- name: Run nf-core lint
- name: Run nf-core pipelines lint --release
if: ${{ github.base_ref == 'master' }}
env:
GITHUB_COMMENTS_URL: ${{ github.event.pull_request.comments_url }}
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_PR_COMMIT: ${{ github.event.pull_request.head.sha }}
run: nf-core -l lint_log.txt lint --dir ${GITHUB_WORKSPACE} --markdown lint_results.md
run: nf-core -l lint_log.txt pipelines lint --release --dir ${GITHUB_WORKSPACE} --markdown lint_results.md

- name: Save PR number
if: ${{ always() }}
2 changes: 1 addition & 1 deletion .github/workflows/linting_comment.yml
@@ -11,7 +11,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Download lint results
uses: dawidd6/action-download-artifact@09f2f74827fd3a8607589e5ad7f9398816f540fe # v3
uses: dawidd6/action-download-artifact@bf251b5aa9c2f7eeb574a96ee720e24f801b7c11 # v6
with:
workflow: linting.yml
workflow_conclusion: completed
2 changes: 1 addition & 1 deletion .github/workflows/release-announcements.yml
@@ -12,7 +12,7 @@ jobs:
- name: get topics and convert to hashtags
id: get_topics
run: |
echo "topics=$(curl -s https://nf-co.re/pipelines.json | jq -r '.remote_workflows[] | select(.full_name == "${{ github.repository }}") | .topics[]' | awk '{print "#"$0}' | tr '\n' ' ')" >> $GITHUB_OUTPUT
echo "topics=$(curl -s https://nf-co.re/pipelines.json | jq -r '.remote_workflows[] | select(.full_name == "${{ github.repository }}") | .topics[]' | awk '{print "#"$0}' | tr '\n' ' ')" | sed 's/-//g' >> $GITHUB_OUTPUT

- uses: rzr/fediverse-action@master
with:
46 changes: 46 additions & 0 deletions .github/workflows/template_version_comment.yml
@@ -0,0 +1,46 @@
name: nf-core template version comment
# This workflow is triggered on PRs to check if the pipeline template version matches the latest nf-core version.
# It posts a comment to the PR, even if it comes from a fork.

on: pull_request_target

jobs:
template_version:
runs-on: ubuntu-latest
steps:
- name: Check out pipeline code
uses: actions/checkout@0ad4b8fadaa221de15dcec353f45205ec38ea70b # v4
with:
ref: ${{ github.event.pull_request.head.sha }}

- name: Read template version from .nf-core.yml
uses: nichmor/[email protected]
id: read_yml
with:
config: ${{ github.workspace }}/.nf-core.yml

- name: Install nf-core
run: |
python -m pip install --upgrade pip
pip install nf-core==${{ steps.read_yml.outputs['nf_core_version'] }}

- name: Check nf-core outdated
id: nf_core_outdated
run: echo "OUTPUT=$(pip list --outdated | grep nf-core)" >> ${GITHUB_ENV}

- name: Post nf-core template version comment
uses: mshick/add-pr-comment@b8f338c590a895d50bcbfa6c5859251edc8952fc # v2
if: |
contains(env.OUTPUT, 'nf-core')
with:
repo-token: ${{ secrets.NF_CORE_BOT_AUTH_TOKEN }}
allow-repeats: false
message: |
> [!WARNING]
> Newer version of the nf-core template is available.
>
> Your pipeline is using an old version of the nf-core template: ${{ steps.read_yml.outputs['nf_core_version'] }}.
> Please update your pipeline to the latest version.
>
> For more documentation on how to update your pipeline, please see the [nf-core documentation](https://github.com/nf-core/tools?tab=readme-ov-file#sync-a-pipeline-with-the-template) and [Synchronisation documentation](https://nf-co.re/docs/contributing/sync).
#
1 change: 1 addition & 0 deletions .gitignore
@@ -11,3 +11,4 @@ params.yml
samplesheet.csv
*.swp
input*
null/