Merge pull request #234 from alchem0x2A/master
phanish-suryanarayana authored Nov 23, 2024
2 parents 54fe2a8 + d013645 commit 38b6af5
Showing 11 changed files with 342 additions and 84 deletions.
12 changes: 5 additions & 7 deletions .conda/README.md
@@ -1,4 +1,9 @@
# Build conda recipe for sparc

**Note**: the official conda-forge package for SPARC can be found at
[`sparc-x`](https://github.com/conda-forge/sparc-x-feedstock). This
recipe is maintained for CI purposes only.

1. Install `conda-build` and (optionally) `boa`
```bash
conda install -c conda-forge "conda-build>=3.20" colorama pip ruamel ruamel.yaml rich mamba jsonschema
@@ -22,16 +27,9 @@ anaconda login
anaconda upload $CONDA_PREFIX/conda-bld/linux-64/sparc-<YYYY>.<MM>.<DD>-<i>.bz2
```

Once the SPARC package is finally in production, we need to merge it into the `conda-forge` channel (perhaps under a different name, `sparc-dft`?)


4. Using the package
```bash
conda install -c <your-channel> sparc
```
This will automatically install `sparc` with `openmpi` + `scalapack` + `openblas` support. No compilation is required afterwards.

5. TODOs
- [ ] Configure the mpi-aware fftw package
- [ ] Include mkl-variant?
- [ ] Complete conda-forge merge once in upstream main
32 changes: 30 additions & 2 deletions .github/workflows/build-test.yml
@@ -10,13 +10,41 @@ on:
workflow_dispatch:

jobs:
# Check if initialization.c is up-to-date with Changelog
package-date-check:
runs-on: ubuntu-latest
defaults:
run:
shell: bash -l {0}
steps:
- uses: actions/checkout@v4
- uses: conda-incubator/setup-miniconda@v3
with:
python-version: "3.11"
activate-environment: sparc-test
conda-build-version: "24.9.0"
miniforge-version: latest # Fix according to https://github.com/conda-incubator/setup-miniconda?tab=readme-ov-file#example-10-miniforge
channels: conda-forge,defaults
channel-priority: true
- name: Install SPARC-X-API stable version for docparser
run: |
mamba install -c conda-forge pip setuptools
pip install git+https://github.com/SPARC-X/[email protected]
- name: Convert parameters.json
run: |
python -m sparc.docparser --include-subdirs doc/.LaTeX
- name: Check package version and ChangeLog date
run: |
# Usage:
# python test-outdated-package.py <parameters.json> <changelog>
python .github/workflows/test-outdated-package.py \
./parameters.json ChangeLog
build-linux:
runs-on: ubuntu-latest
defaults:
run:
shell: bash -l {0}
strategy:
max-parallel: 5
needs: package-date-check

steps:
- uses: actions/checkout@v4
87 changes: 87 additions & 0 deletions .github/workflows/test-missing-parameters.py
@@ -0,0 +1,87 @@
"""Test script to check parameter coverage in examples."""
import json
import os
from pathlib import Path
from sparc.io import _read_inpt

def load_parameters_json(json_path):
"""
Load the parameters from the parameters.json file.
"""
with open(json_path, 'r') as f:
parameters = json.load(f)
if "parameters" not in parameters:
raise KeyError("The 'parameters' field is missing in parameters.json")
return parameters["parameters"]

def check_missing_parameters(test_dir, parameters_json_path):
"""
Check for missing parameters in the documentation.
test_dir must be structured in <name>/standard/<name>.inpt
"""
test_dir = Path(test_dir)
documented_parameters = load_parameters_json(parameters_json_path)

# Iterate through the .inpt files and check for missing parameters
report = {}
for match_file in test_dir.glob("*/standard/*.inpt"):
test_name = match_file.stem
try:
inpt_data = _read_inpt(match_file)
params_in_file = inpt_data["inpt"]["params"]
        except Exception:
            # Parsing may fail (e.g. due to a SPARC-X-API issue); skip this file
            continue

# Check missing or typo parameters
missing_params = [
param for param in params_in_file
if (param.upper() not in documented_parameters)
# TODO: Obsolete BOUNDARY_CONDITION keyword
and (param.upper() != "BOUNDARY_CONDITION")
]
if missing_params:
report[test_name] = missing_params

# Generate report and return error if missing parameters are found
if report:
print("Missing / Incorrect Parameters Report:")
print("-" * 60)
for file_path, missing_params in report.items():
print(f"Test name: {file_path}")
print(f"Missing Parameters: {', '.join(missing_params)}")
print("-" * 60)
return False
else:
print("All parameters are documented correctly.")
return True


def main():
import argparse
parser = argparse.ArgumentParser(
description="Check for missing / incorrect parameters in SPARC examples."
)
parser.add_argument(
"test_directory",
type=str,
help="Path to the directory containing test .inpt files."
)
parser.add_argument(
"parameters_json",
type=str,
help="Path to the parameters.json file."
)

args = parser.parse_args()

# Run the check
success = check_missing_parameters(args.test_directory,
args.parameters_json)
if not success:
exit(1)
else:
exit(0)


if __name__ == "__main__":
    main()
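The coverage check in `check_missing_parameters` boils down to a set-membership filter over the parameters parsed from each `.inpt` file. The sketch below isolates that core logic so it can be run without `sparc.io` installed; the parameter names and the `find_undocumented` helper are made up for illustration (the real script reads documented parameters from the docparser's `parameters.json`):

```python
# Hypothetical documented-parameter set, standing in for parameters.json
documented_parameters = {"MESH_SPACING": {}, "EXCHANGE_CORRELATION": {}, "KPOINT_GRID": {}}


def find_undocumented(params_in_file, documented, obsolete=("BOUNDARY_CONDITION",)):
    """Return parameters that are neither documented nor known-obsolete."""
    return [
        p for p in params_in_file
        if p.upper() not in documented and p.upper() not in obsolete
    ]


# A typo'd keyword is reported; obsolete keywords are silently ignored
params = ["MESH_SPACING", "KPONT_GRID", "BOUNDARY_CONDITION"]
print(find_undocumented(params, documented_parameters))  # ['KPONT_GRID']
```

The obsolete-keyword allowlist mirrors the `BOUNDARY_CONDITION` special case in the script above: legacy inputs should not fail CI even though the keyword is no longer documented.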
108 changes: 108 additions & 0 deletions .github/workflows/test-outdated-package.py
@@ -0,0 +1,108 @@
"""Test script to check whether the SPARC package version is older than
the latest ChangeLog entry.

We do not check the other way around, since a trivial bump of the SPARC
version may sometimes be necessary.
"""
import re
import json
from datetime import datetime
from pathlib import Path


def load_parameters_json(json_path):
"""
Load the parameters from the parameters.json file.
"""
with open(json_path, 'r') as f:
parameters = json.load(f)
if "sparc_version" not in parameters:
raise KeyError("The 'sparc_version' field is missing in parameters.json")
return parameters["sparc_version"]


def extract_latest_date_from_changelog(changelog_path):
"""
Extracts the latest date from the changelog file.
"""
date_patterns = [
r"(?P<date>\b(?:Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec) \d{1,2}, \d{4})",
r"(?P<date>\b(?:January|February|March|April|May|June|July|August|September|October|November|December) \d{1,2}, \d{4})",
]

latest_date = None
changelog_path = Path(changelog_path)

with changelog_path.open("r") as f:
content = f.read()

for pattern in date_patterns:
matches = re.findall(pattern, content)
for match in matches:
            try:
                # %b matches abbreviated month names and %B full names;
                # both formats contain a comma, so try each in turn
                try:
                    parsed_date = datetime.strptime(match, "%b %d, %Y")
                except ValueError:
                    parsed_date = datetime.strptime(match, "%B %d, %Y")
                if latest_date is None or parsed_date > latest_date:
                    latest_date = parsed_date
            except ValueError:
                continue  # Skip dates that match neither format

if latest_date is None:
raise ValueError("No valid date found in the changelog.")
return latest_date


def check_version_against_changelog(parameters_json_path, changelog_path):
"""
Check if the package version in parameters.json is older than the latest changelog date.
"""
# Load sparc_version from parameters.json
sparc_version = load_parameters_json(parameters_json_path)
version_date = datetime.strptime(sparc_version, "%Y.%m.%d")

# Extract the latest date from the changelog
latest_changelog_date = extract_latest_date_from_changelog(changelog_path)

if version_date < latest_changelog_date:
print("Version Check Report:")
print("-" * 60)
print(f"ERROR: SPARC version ({version_date.strftime('%Y.%m.%d')}) "
f"is older than the latest changelog date ({latest_changelog_date.strftime('%Y.%m.%d')}).")
print("Please update initialization.c!")
print("-" * 60)
return False
    else:
        print("Version Check Report:")
        print("-" * 60)
        print(f"SUCCESS: SPARC version ({version_date.strftime('%Y.%m.%d')}) "
              "is up to date with the latest changelog entry.")
        print("-" * 60)
        return True


def main():
import argparse
parser = argparse.ArgumentParser(
description="Check if package version is up-to-date with the changelog."
)
parser.add_argument(
"parameters_json",
type=str,
help="Path to the parameters.json file."
)
parser.add_argument(
"changelog",
type=str,
help="Path to the changelog file."
)

args = parser.parse_args()

# Run the version check
success = check_version_against_changelog(args.parameters_json, args.changelog)
if not success:
exit(1)
else:
exit(0)


if __name__ == "__main__":
main()
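The date-extraction step above can be exercised on an in-memory string. This sketch collapses the two month-name patterns into one regex and tries both `strptime` formats for each match; the changelog entries here are made up for illustration:

```python
import re
from datetime import datetime

# Sample changelog text (hypothetical entries)
sample = """
--------------
Nov 23, 2024
Name: Tian Tian
--------------
November 5, 2024
Name: Example Author
"""

# One pattern covers abbreviated and full month names
pattern = r"\b(?:Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec)[a-z]* \d{1,2}, \d{4}"

latest = None
for match in re.findall(pattern, sample):
    for fmt in ("%b %d, %Y", "%B %d, %Y"):  # abbreviated first, then full
        try:
            parsed = datetime.strptime(match, fmt)
            break
        except ValueError:
            continue
    else:
        continue  # neither format matched; skip this candidate
    if latest is None or parsed > latest:
        latest = parsed

print(latest.strftime("%Y.%m.%d"))  # 2024.11.23
```

Keeping the newest date across all matches is what lets the CI job compare it against the `sparc_version` string (formatted `YYYY.MM.DD`) from `parameters.json`.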
90 changes: 90 additions & 0 deletions .github/workflows/update-doc-pdf.yml
@@ -0,0 +1,90 @@
name: Check and render LaTeX doc into pdf

on:
push:
branches:
- master
paths:
- doc/**
- .github/workflows/update-doc-pdf.yml
pull_request:
branches:
- master
paths:
- doc/**
- .github/workflows/update-doc-pdf.yml
workflow_dispatch:

jobs:
check-parameters:
runs-on: ubuntu-latest
defaults:
run:
shell: bash -l {0}
steps:
- uses: actions/checkout@v4
- uses: conda-incubator/setup-miniconda@v3
with:
python-version: "3.11"
activate-environment: sparc-test
conda-build-version: "24.9.0"
miniforge-version: latest # Fix according to https://github.com/conda-incubator/setup-miniconda?tab=readme-ov-file#example-10-miniforge
channels: conda-forge,defaults
channel-priority: true
- name: Install SPARC-X-API stable version for docparser
run: |
mamba install -c conda-forge pip setuptools
pip install git+https://github.com/SPARC-X/[email protected]
- name: Convert parameters.json
run: |
python -m sparc.docparser --include-subdirs doc/.LaTeX
- name: Check missing parameters in test examples
run: |
# Usage:
# python test-missing-parameters.py <test-dir> <parameters.json>
# If test fails, a list of missing / typo params will
# be written in output
python .github/workflows/test-missing-parameters.py \
tests/ ./parameters.json
render-pdf-linux:
runs-on: ubuntu-latest
defaults:
run:
shell: bash -l {0}
needs: check-parameters
steps:
- uses: actions/checkout@v4
- uses: conda-incubator/setup-miniconda@v3
with:
python-version: "3.11"
activate-environment: sparc-test
conda-build-version: "24.9.0"
miniforge-version: latest # Fix according to https://github.com/conda-incubator/setup-miniconda?tab=readme-ov-file#example-10-miniforge
channels: conda-forge,defaults
channel-priority: true
- name: Install tectonic as latex rendering engine
run: |
mamba install -c conda-forge tectonic
- name: Make temp build dir
run: |
mkdir -p doc/_build
- name: Render main manual
run: |
tectonic -X compile doc/.LaTeX/Manual.tex \
--outdir doc/_build
ls -al doc/_build
- name: Render subdir manuals
run: |
for dir in doc/.LaTeX/*; do
if [ -d "$dir" ]; then
manual=$(find "$dir" -maxdepth 1 -name "*Manual.tex" | head -n 1)
if [ -n "$manual" ]; then
tectonic -X compile "$manual" \
--outdir doc/_build
echo "Rendered: $manual"
fi
fi
done
ls -al doc/_build
9 changes: 9 additions & 0 deletions ChangeLog
@@ -2,6 +2,15 @@
-Date
-Name
-changes
--------------
Nov 23, 2024
Name: Tian Tian
Changes: (doc, CI workflow)
1. Fix typo in SQ LaTeX doc
2. Add CI workflow to check missing parameters
3. Add CI workflow to validate and render LaTeX doc
4. Add CI workflow to check whether initialization.c (SPARC version) is older than the latest ChangeLog entry

--------------
Nov 18, 2024
Name: Tian Tian, Lucas Timmerman
10 changes: 10 additions & 0 deletions doc/.LaTeX/Introduction.tex
@@ -46,6 +46,8 @@
\begin{itemize}
\item \textbf{Benjamin Comer}: Code testing, Initial testing framework
\item \textbf{Sushree Jagriti Sahoo}: Code testing
\item \textbf{Tian Tian}: Socket communication layer, code testing, Python API
\item \textbf{Lucas R Timmerman}: Socket communication layer, code testing, Python API
\end{itemize}
\end{itemize}

@@ -416,6 +418,14 @@
\begin{block}{Orbital Free DFT}
\hyperlink{OFDFT_FLAG}{\texttt{OFDFT\_FLAG}} $\vert$ \hyperlink{TOL_OFDFT}{\texttt{TOL\_OFDFT}} $\vert$ \hyperlink{OFDFT_LAMBDA}{\texttt{OFDFT\_LAMBDA}}
\end{block}

\begin{block}{Socket communication}
\hyperlink{SOCKET_FLAG}{\texttt{SOCKET\_FLAG}} $\vert$
\hyperlink{SOCKET_HOST}{\texttt{SOCKET\_HOST}} $\vert$
\hyperlink{SOCKET_PORT}{\texttt{SOCKET\_PORT}} $\vert$
\hyperlink{SOCKET_INET}{\texttt{SOCKET\_INET}} $\vert$
\hyperlink{SOCKET_MAX_NITER}{\texttt{SOCKET\_MAX\_NITER}}
\end{block}

\end{frame}

