Merge pull request #331 from Australian-Imaging-Service/t1w-preproc
added t1w preprocess pipeline
tclose authored Dec 12, 2024
2 parents 9f7ae5d + 66d459c commit e2bba58
Showing 253 changed files with 1,274,845 additions and 711 deletions.
7 changes: 4 additions & 3 deletions .github/workflows/release.yml
@@ -59,6 +59,7 @@ jobs:
        env:
          PUSH: "${{ steps.deployable.outputs.PUSH }}"
        run: >
-         pydra2app make xnat ./australian-imaging-service
-         --registry ghcr.io --check-registry --clean-up --tag-latest --loglevel info
-         --release pipelines-metapackage $RELEASE $PUSH
+         pipeline2app make xnat ./specs/australian-imaging-service/mri/human/neuro/t1w/preprocess.yaml
+         --registry ghcr.io --check-registry --clean-up --loglevel info
+         --resources-dir ./resources --spec-root ./specs --source-package . $PUSH
+         # --release pipelines-metapackage $RELEASE --tag-latest
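The release workflow now builds the new T1w preprocess spec with the pipeline2app CLI rather than the previous pydra2app metapackage build. For orientation only, the same step could be reproduced locally roughly as sketched below; this assumes the pipeline2app CLI is installed in the current environment and the command is run from the repository root, and it omits the $PUSH/$RELEASE variables, which are supplied by the CI workflow.

```sh
# Sketch of running the updated build step outside CI (assumes the
# pipeline2app CLI with XNAT support is installed; flags are taken
# verbatim from the workflow above, minus the CI-provided $PUSH variable).
pipeline2app make xnat \
    ./specs/australian-imaging-service/mri/human/neuro/t1w/preprocess.yaml \
    --registry ghcr.io --check-registry --clean-up --loglevel info \
    --resources-dir ./resources --spec-root ./specs --source-package .
```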
1 change: 1 addition & 0 deletions .gitignore
@@ -12,3 +12,4 @@ __pycache__
test-data
*.build*
*.venv
+_version.py
2 changes: 2 additions & 0 deletions AUTHORS
@@ -2,3 +2,5 @@ Australian Imaging Service Analysis pipelines were developed by

Thomas G. Close (Sydney Imaging, The University of Sydney, Sydney, Australia
& Australian National Imaging Facility, Australia)
+Arkiev D'Souza (Sydney Imaging, The University of Sydney, Sydney, Australia
+& Australian National Imaging Facility, Australia)
6 changes: 1 addition & 5 deletions README.rst → README.md
@@ -1,5 +1,4 @@
-AIS Analysis Pipelines
-======================
+# AIS Analysis Pipelines

AIS Analysis pipelines is a collection of scripts for generating containerised
pipelines that can be run by the XNAT Container Service plugin.
@@ -8,6 +7,3 @@ pipelines that can be run by the XNAT Container Service plugin.
This work is licensed under a
`Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License <http://creativecommons.org/licenses/by-nc-sa/4.0/>`_

-.. image:: https://i.creativecommons.org/l/by-nc-sa/4.0/88x31.png
-   :target: http://creativecommons.org/licenses/by-nc-sa/4.0/
-   :alt: Creative Commons License: Attribution-NonCommercial-ShareAlike 4.0 International
175 changes: 0 additions & 175 deletions australian-imaging-service/mri/human/neuro/bidsapps/fmriprep.yaml

This file was deleted.

62 changes: 0 additions & 62 deletions australian-imaging-service/mri/human/neuro/bidsapps/mriqc.yaml

This file was deleted.

72 changes: 0 additions & 72 deletions australian-imaging-service/mri/human/neuro/bidsapps/smriprep.yaml

This file was deleted.

22 changes: 15 additions & 7 deletions conftest.py
@@ -5,7 +5,7 @@
import typing as ty
from datetime import datetime
from dataclasses import dataclass
-from arcana.core.utils.misc import varname2path
+from frametree.core.utils import varname2path
import pytest
from click.testing import CliRunner
import xnat4tests
@@ -57,11 +57,19 @@ class BidsAppTestBlueprint:


BIDS_APP_PARAMETERS = {
-    'fmriprep': {'json_edits': "func/.*bold \".SliceTiming[] /= 1000.0\""},
-    'qsiprep': {'qsiprep_flags': '--output-resolution 2.5'}}
-
-
-bids_apps_dir = Path(__file__).parent / "australianimagingservice" / "mri" / "human" / "neuro" / "bidsapps"
+    "fmriprep": {"json_edits": 'func/.*bold ".SliceTiming[] /= 1000.0"'},
+    "qsiprep": {"qsiprep_flags": "--output-resolution 2.5"},
+}
+
+
+bids_apps_dir = (
+    Path(__file__).parent
+    / "australianimagingservice"
+    / "mri"
+    / "human"
+    / "neuro"
+    / "bidsapps"
+)
test_bids_data_dir = (
Path(__file__).parent / "tests" / "data" / "mri" / "human" / "neuro" / "bidsapps"
)
@@ -160,4 +168,4 @@ def upload_test_dataset_to_xnat(project_id: str, source_data_dir: Path, xnat_con
xresource.upload_dir(resource_path, method="tar_file")

# Populate metadata from DICOM headers
-login.put(f'/data/experiments/{xsession.id}?pullDataFromHeaders=true')
+login.put(f"/data/experiments/{xsession.id}?pullDataFromHeaders=true")