Fix NFS test setup failure #2022

Draft: wants to merge 17 commits into master
13 changes: 12 additions & 1 deletion .github/workflows/analysis_workflow.yml
@@ -110,7 +110,7 @@ jobs:
needs: [cibw_docker_image]
runs-on: "ubuntu-22.04"
container:
image: ${{needs.cibw_docker_image.outputs.tag}}
image: quay.io/pypa/manylinux_2_28_x86_64
services:
mongodb:
image: mongo:4.4
@@ -133,6 +133,17 @@ jobs:
uses: SimenB/[email protected]
id: cpu-cores

- name: Install deps
run: |
yum update -y
yum install -y zip flex bison krb5-devel cyrus-sasl-devel openssl-devel \
unzip tar epel-release jq wget libcurl-devel python3 \
python3-devel python3-pip perl-IPC-Cmd

yum install -y mono-complete

yum clean all

- name: Extra envs
run: |
. build_tooling/vcpkg_caching.sh # Linux follower needs another call in CIBW
2 changes: 1 addition & 1 deletion .github/workflows/build.yml
@@ -52,7 +52,7 @@ jobs:
mongodb:
image: "mongo:4.4"
container:
image: ${{needs.cibw_docker_image.outputs.tag}}
image: quay.io/pypa/manylinux_2_28_x86_64
volumes:
- /:/mnt
windows_matrix:
40 changes: 33 additions & 7 deletions .github/workflows/build_steps.yml
@@ -87,10 +87,18 @@ jobs:
maximum-size: 6GB
disk-root: "D:" # This is also the checkout directory. Total size 12GB.
continue-on-error: true

- name: Enable Windows compiler commands
if: matrix.os == 'windows'
uses: ilammy/[email protected]

- name: Install deps
if: matrix.os == 'linux' && inputs.job_type != 'build-python-wheels'
run: |
yum update -y
yum install -y zip flex bison krb5-devel cyrus-sasl-devel openssl-devel \
unzip tar epel-release jq wget libcurl-devel python3 \
python3-devel python3-pip perl-IPC-Cmd

yum install -y mono-complete

yum clean all

- name: Extra envs
# This has to come after msvc-dev-cmd to overwrite the bad VCPKG_ROOT it sets
@@ -122,6 +130,23 @@
if: inputs.job_type != 'build-python-wheels'
run: . build_tooling/prep_cpp_build.sh # Also applies to Windows

# When a GitHub Windows image gets updated, the MSVC compiler can also get updated. New compilers can introduce compilation errors in ArcticDB or in the VCPKG dependencies.
# We need to pin a particular MSVC so that runner updates don't affect us.
# When the MSVC version is updated, custom-triplets/x64-windows-static-msvc.cmake must also be updated with the correct toolset version.
- name: Install Required MSVC
if: matrix.os == 'windows'
run: |
choco install -y -f visualstudio2022buildtools --version=117.11.4 --params "--add Microsoft.VisualStudio.Component.VC.Tools.x86.x64 --installChannelUri https://aka.ms/vs/17/release/390666095_1317821361/channel"
choco install -y ninja

- name: Enable Windows compiler commands
if: matrix.os == 'windows'
uses: TheMrMilchmann/setup-msvc-dev@v3
with:
arch: x64
toolset: 14.41
vs-path: 'C:\\Program Files (x86)\\Microsoft Visual Studio\\2022\\BuildTools'

- name: CMake compile
if: inputs.job_type != 'build-python-wheels'
# We are pinning the version to 10.6 because >= 10.7 uses node20, which is not supported in the container
@@ -190,10 +215,10 @@ jobs:
run: |
if [ ${{inputs.python3}} -gt 6 ]
then
find python/tests/* -maxdepth 0 -type d ! -regex '.*\(__pycache__\|util\|nonreg\|scripts\)' -printf '"%f",' |
find python/tests/* -maxdepth 0 -type d ! -regex '.*\(__pycache__\|util\|nonreg\|scripts\|hypothesis\|compat\|stress\|unit\)' -printf '"%f",' |
sed 's/^/test_dirs=[/ ; s/"hypothesis"/"{hypothesis,nonreg,scripts}"/ ; s/,$/]/' | tee -a $GITHUB_ENV
else
find python/tests/* -maxdepth 0 -type d ! -regex '.*\(__pycache__\|util\|nonreg\|scripts\|compat\)' -printf '"%f",' |
find python/tests/* -maxdepth 0 -type d ! -regex '.*\(__pycache__\|util\|nonreg\|scripts\|compat\|hypothesis\|stress\|unit\)' -printf '"%f",' |
sed 's/^/test_dirs=[/ ; s/"hypothesis"/"{hypothesis,nonreg,scripts}"/ ; s/,$/]/' | tee -a $GITHUB_ENV
fi
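Note: the find/sed pipeline above is dense. It lists the top-level directories under python/tests, drops the excluded ones, and appends a `test_dirs=[...]` line to `$GITHUB_ENV` so later steps can fan the test run out per directory. A rough, illustrative Python equivalent (the workflow itself runs the shell pipeline, not this):

```python
# Illustrative sketch only -- the workflow uses find/sed, not Python.
import os

EXCLUDED = {"__pycache__", "util", "nonreg", "scripts",
            "hypothesis", "compat", "stress", "unit"}

def test_dirs_line(tests_root="python/tests"):
    dirs = [d for d in os.listdir(tests_root)
            if os.path.isdir(os.path.join(tests_root, d)) and d not in EXCLUDED]
    # The sed step also rewrites "hypothesis" to "{hypothesis,nonreg,scripts}";
    # with hypothesis now excluded by the find regex, that substitution no longer fires.
    return "test_dirs=[" + ",".join(f'"{d}"' for d in dirs) + "]"

print(test_dirs_line())  # e.g. test_dirs=["integration","version_store",...]
```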

@@ -255,7 +280,7 @@ jobs:
${{fromJSON(inputs.matrix)}}
name: ${{matrix.type}}${{matrix.python_deps_id}}
runs-on: ${{matrix.distro}}
container: ${{matrix.os == 'linux' && needs.compile.outputs.manylinux_image || null}}
container: ${{matrix.os == 'linux' && matrix.container || null}}
defaults:
run: {shell: bash}
services: ${{matrix.test_services}}
@@ -310,6 +335,7 @@ jobs:
python -m pip install --force-reinstall -r $GITHUB_WORKSPACE/build_tooling/${{matrix.python_deps}}
fi
python -m pip uninstall -y pytest-cpp || true # No longer works on 3.6
pip install --force-reinstall "boto3<=1.35.62" "botocore<=1.35.62" "cryptography<=43.0.3"
python -m pip list
echo -e "${{matrix.envs || ''}}" | tee -a $GITHUB_ENV
if [[ -n "$MSYSTEM" ]] ; then
2 changes: 1 addition & 1 deletion build_tooling/parallel_test.sh
@@ -18,7 +18,7 @@ cd $PARALLEL_TEST_ROOT

export ARCTICDB_RAND_SEED=$RANDOM

$catch python -m pytest --timeout=3600 $PYTEST_XDIST_MODE -v --log-file="$TEST_OUTPUT_DIR/pytest-logger.$group.log" \
$catch python -m pytest --timeout=3600 $PYTEST_XDIST_MODE -vs --log-file="$TEST_OUTPUT_DIR/pytest-logger.$group.log" \
--junitxml="$TEST_OUTPUT_DIR/pytest.$group.xml" \
--basetemp="$PARALLEL_TEST_ROOT/temp-pytest-output" \
"$@" 2>&1 | sed -ur "s#^(tests/.*/([^/]+\.py))?#\2#"
10 changes: 6 additions & 4 deletions cpp/CMakePresets.json
@@ -63,7 +63,9 @@
"generator": "Ninja",
"environment": { "cmakepreset_expected_host_system": "Windows" },
"cacheVariables": {
"ARCTICDB_USE_PCH": "ON"
"ARCTICDB_USE_PCH": "ON",
"VCPKG_OVERLAY_TRIPLETS": "custom-triplets",
"VCPKG_TARGET_TRIPLET": "x64-windows-static-msvc"
}
},
{
@@ -80,8 +82,7 @@
},
"cacheVariables": {
"CMAKE_C_COMPILER": "cl",
"CMAKE_CXX_COMPILER": "cl",
"VCPKG_TARGET_TRIPLET": "x64-windows-static"
"CMAKE_CXX_COMPILER": "cl"
}
},
{
@@ -97,7 +98,8 @@
"installDir": "${sourceDir}/out/install",
"cacheVariables": {
"CMAKE_CXX_FLAGS": "/MP",
"VCPKG_TARGET_TRIPLET": "x64-windows-static",
"VCPKG_OVERLAY_TRIPLETS": "custom-triplets",
"VCPKG_TARGET_TRIPLET": "x64-windows-static-msvc",
"ARCTICDB_PYTHON_EXPLICIT_LINK": "ON"
}
},
4 changes: 4 additions & 0 deletions cpp/custom-triplets/x64-windows-static-msvc.cmake
@@ -0,0 +1,4 @@
set(VCPKG_TARGET_ARCHITECTURE x64)
set(VCPKG_CRT_LINKAGE static)
set(VCPKG_LIBRARY_LINKAGE static)
set(VCPKG_PLATFORM_TOOLSET_VERSION 14.41)
7 changes: 3 additions & 4 deletions python/arcticdb/storage_fixtures/s3.py
@@ -12,10 +12,10 @@
import os
import re
import sys
import trustme
import subprocess
import platform
from tempfile import mkdtemp
import werkzeug
from moto.server import DomainDispatcherApplication, create_backend_app


import requests
@@ -181,6 +181,7 @@ def __str__(self):

def _boto(self, service: str, key: Key, api="client"):
import boto3
boto3.set_stream_logger('', logging.DEBUG)

ctor = getattr(boto3, api)
return ctor(
@@ -262,8 +263,6 @@ def __init__(self,

@staticmethod
def run_server(port, key_file, cert_file):
import werkzeug
from moto.server import DomainDispatcherApplication, create_backend_app

class _HostDispatcherApplication(DomainDispatcherApplication):
_reqs_till_rate_limit = -1
5 changes: 3 additions & 2 deletions python/arcticdb/storage_fixtures/utils.py
@@ -21,6 +21,7 @@
from contextlib import AbstractContextManager
from dataclasses import dataclass, field
import trustme
import random

_WINDOWS = platform.system() == "Windows"
_DEBUG = os.getenv("ACTIONS_RUNNER_DEBUG", default=None) in (1, "True")
@@ -31,7 +32,7 @@ def get_ephemeral_port(seed=0):
# https://stackoverflow.com/a/61685162/ and multiple test runners call this function at roughly the same time, they
# may get the same port! The more sophisticated implementation below uses the PID to avoid that:
pid = os.getpid()
port = (pid // 1000 + pid) % 1000 + seed * 1000 + 10000 # Crude hash
port = (pid // 1000 + pid) % 1000 + seed * 1000 + 10000 + random.randint(0, 999) # Crude hash
while port < 65535:
try:
with socketserver.TCPServer(("localhost", port), None):
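The added `random.randint(0, 999)` spreads out workers whose PID-derived values collide, so they no longer start probing from the same port. For orientation, a minimal sketch of the selection loop this formula feeds into (the retry-on-bind-failure behaviour is not visible in the diff and is assumed here):

```python
# Minimal sketch, assuming the function walks forward to the next port when
# binding fails -- not the library's actual implementation.
import os
import random
import socketserver

def pick_port(seed=0):
    pid = os.getpid()
    port = (pid // 1000 + pid) % 1000 + seed * 1000 + 10000 + random.randint(0, 999)
    while port < 65535:
        try:
            # Binding (and immediately releasing) the port proves it is currently free.
            with socketserver.TCPServer(("localhost", port), None):
                return port
        except OSError:
            port += 1  # assumed collision handling
    raise RuntimeError("no free ephemeral port found")
```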
@@ -90,7 +91,7 @@ def terminate(p: Union[multiprocessing.Process, subprocess.Popen]):
os.kill(p.pid, signal.SIGKILL) # TODO (python37): use Process.kill()


def wait_for_server_to_come_up(url: str, service: str, process: ProcessUnion, *, timeout=20, sleep=0.2, req_timeout=1):
def wait_for_server_to_come_up(url: str, service: str, process: ProcessUnion, *, timeout=20, sleep=1, req_timeout=1):
deadline = time.time() + timeout
if process is None:
alive = lambda: True
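The hunk above only bumps the default poll interval from 0.2s to 1s; the body of `wait_for_server_to_come_up` is mostly outside the diff. For orientation, a readiness poll of this shape typically looks like the sketch below (the request and error handling details are assumptions, not the fixture code itself):

```python
# Hypothetical readiness-poll sketch -- not the actual implementation.
import time
import requests

def wait_until_up(url, timeout=20, sleep=1, req_timeout=1):
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            # Any HTTP response, even an error status, proves the server accepts connections.
            requests.get(url, timeout=req_timeout, verify=False)
            return
        except requests.exceptions.RequestException:
            time.sleep(sleep)
    raise TimeoutError(f"{url} did not come up within {timeout}s")
```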
2 changes: 1 addition & 1 deletion python/tests/conftest.py
@@ -167,7 +167,7 @@ def s3_storage(s3_storage_factory) -> Iterator[S3Bucket]:
yield f


@pytest.fixture
@pytest.fixture()
def nfs_backed_s3_storage(nfs_backed_s3_storage_factory) -> Iterator[NfsS3Bucket]:
with nfs_backed_s3_storage_factory.create_fixture() as f:
yield f
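Since the PR title points at the NFS-backed S3 setup, a hypothetical test consuming the fixture above might look like the sketch below (`create_arctic()` and the library calls are assumptions based on how the other storage fixtures are used, not something confirmed by this diff):

```python
# Hypothetical usage sketch -- fixture helper and API names are assumed.
import pandas as pd

def test_nfs_backed_s3_roundtrip(nfs_backed_s3_storage):
    ac = nfs_backed_s3_storage.create_arctic()  # assumed fixture helper
    lib = ac.get_library("demo", create_if_missing=True)
    df = pd.DataFrame({"x": [1, 2, 3]})
    lib.write("sym", df)
    assert lib.read("sym").data.equals(df)
```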
2 changes: 1 addition & 1 deletion setup.cfg
@@ -117,7 +117,7 @@ Testing =
future
mock
boto3
moto
moto <5.0.21
flask # Used by moto
flask-cors
hypothesis <6.73