forked from davidsd/sdpb
[pull] master from davidsd:master #4
Open: pull wants to merge 460 commits into vasdommes:master from davidsd:master
Conversation
Refactor integration tests
We use boost::stacktrace::stacktrace() in the THROW() macro. The '-g' flag is now enabled to add debug symbols to the executable. Debug symbols do not affect runtime performance, but they make the executable larger; for example, the Docker image now takes ~600MB instead of ~160MB. At configure time, we try to link boost_stacktrace_backtrace, boost_stacktrace_addr2line, or boost_stacktrace_basic. The first two options allow printing the source code location for each frame. If linking fails, we fall back to using Boost.Stacktrace as a header-only library. See details at https://www.boost.org/doc/libs/1_84_0/doc/html/stacktrace/configuration_and_build.html TODO: the Docker image uses boost_stacktrace_addr2line, but fails to print source code locations.
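The mechanism can be sketched without Boost as follows. This is an illustrative stand-in, not sdpb's actual macro: the real THROW() appends boost::stacktrace::stacktrace() to the message, while this sketch only captures file and line via standard macros.

```cpp
#include <sstream>
#include <stdexcept>
#include <string>

// Illustrative sketch of a THROW()-style macro. sdpb's real version
// appends boost::stacktrace::stacktrace() to the message; with '-g'
// debug symbols (plus boost_stacktrace_backtrace or addr2line) each
// frame can then show a source location.
#define THROW(msg)                                                  \
  do                                                                \
    {                                                               \
      std::ostringstream os;                                        \
      os << (msg) << "\n  at " << __FILE__ << ":" << __LINE__;      \
      /* Real code: os << "\n" << boost::stacktrace::stacktrace(); */ \
      throw std::runtime_error(os.str());                           \
    }                                                               \
  while(0)

std::string fail_and_catch()
{
  try
    {
      THROW("matrix dimensions do not match");
    }
  catch(const std::runtime_error &e)
    {
      return e.what();
    }
  return "";
}
```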
Now WriteBootstrapSDP[] is an alias for a new function, WritePmpJson[]. WritePmpXml[] is equivalent to the old WriteBootstrapSDP[]. Performance and file sizes for the PMP example generated by Bootstrap2dExample.m:
- WritePmpXml: 3.4 seconds, 453 KB
- WritePmpJson: 0.06 seconds, 277 KB
In total, running a test from Bootstrap2dExample.m takes ~20 minutes for XML and ~10 minutes for JSON. WritePmpXml is slow for two reasons: mathematical calculations (ported to C++ for the JSON case) and inefficient token-by-token writing to the file. TODO: allow custom sample points in the JSON format to ensure backward compatibility with the old WriteBootstrapSDP[].
This is part of #179 Universal input format for Polynomial Matrix Program. TODO: write it from SDPB.m. TODO: update docs.
…ion. This restores old WriteBootstrapSDP[] functionality.
Added optional "sampleScalings" and "bilinearBasis" fields to the JSON format. Also fixed prefactor_or_default() in Polynomial_Vector_Matrix.cxx: use the correct MPFR precision for the exp_minus_one constant. Added tests:
- pmp-like-xml.json has the same data as pmp.xml: it doesn't contain "normalization" or "DampedRational", but contains "samplePoints", "sampleScalings", and "bilinearBasis".
- pmp-no-optional-fields.json contains only the required fields, "objective" and "polynomials".
TODO: update docs
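From the field names above, a minimal file like pmp-no-optional-fields.json presumably has the following shape. The numbers and, in particular, the nesting depth of "polynomials" are invented for illustration; consult the actual test files in the repository for the exact schema.

```json
{
  "objective": ["1", "-1"],
  "polynomials": [
    [
      [
        ["1", "0", "2"],
        ["0", "1", "0"]
      ]
    ]
  ]
}
```

Optional fields such as "samplePoints", "sampleScalings", "bilinearBasis", "normalization", and "DampedRational" would appear as additional top-level or per-block entries.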
JSON is now the officially recommended (and default) PMP format.
Universal JSON format for Polynomial Matrix Program + write JSON from SDPB.m + update SDPB Manual
Fix #115 Print exception stacktrace when SDPB fails
For some reason, I got the following compilation error:
../test/src/unit_tests/cases/json.test.cxx:291:1: error: redefinition of ‘const char _ZTSZN18Json_Vector_ParserI17Json_Float_ParserIN2El8BigFloatEEEC4EbRKSt8functionIFvOSt6vectorIS2_SaIS2_EEEERKS5_IFvvEEEd_UlvE_ []’
  291 | }
      | ^
../test/src/unit_tests/cases/json.test.cxx:291:1: note: ‘const char _ZTSZN18Json_Vector_ParserI17Json_Float_ParserIN2El8BigFloatEEEC4EbRKSt8functionIFvOSt6vectorIS2_SaIS2_EEEERKS5_IFvvEEEd_UlvE_ [123]’ previously defined here
Update docs for 2.7.0 + fix compilation on Imperial HPC
# Conflicts:
#   Dockerfile
#   docs/site_installs/Caltech.md
Save checkpoint, print message and exit with code=15 (=SIGTERM)
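A graceful-exit handler along these lines can be sketched as follows. This is a minimal standalone sketch, not sdpb's actual code; save_checkpoint() is a hypothetical placeholder.

```cpp
#include <atomic>
#include <csignal>

// Set asynchronously by the signal handler; polled by the solver loop.
std::atomic<bool> sigterm_received(false);

extern "C" void handle_sigterm(int) { sigterm_received = true; }

void install_handler() { std::signal(SIGTERM, handle_sigterm); }

// In the solver loop: after each iteration, check the flag, save a
// checkpoint, print a message, and exit with code 15 (SIGTERM's number).
int check_and_exit_code()
{
  if(sigterm_received)
    {
      // save_checkpoint();  // hypothetical placeholder
      return 15;
    }
  return 0;
}
```

Doing the checkpoint in the main loop, rather than in the handler itself, avoids calling non-async-signal-safe functions from a signal handler.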
…) from common issues.
…nding. If one rank throws an exception and another one doesn't, the first rank will hang forever on the window fence. Thus, we disable this fence and assume that the program will abort. NB: if the exception is caught after that and the program continues working, it will probably hang at the next synchronization point!
…edence
Before: ASSERT_EQUAL(a == b, c == d) -> a == b == c == d
After: ASSERT_EQUAL(a == b, c == d) -> (a == b) == (c == d)
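The precedence bug can be reproduced with simplified stand-ins for the macro (toy versions returning bool, not sdpb's real ASSERT_EQUAL):

```cpp
// Without parentheses, CHECK_EQUAL_BAD(a == b, c == d) expands to
// (a == b == c == d), which parses as ((a == b) == c) == d.
// With parentheses, the intended (a == b) == (c == d) is preserved.
#define CHECK_EQUAL_BAD(x, y) (x == y)
#define CHECK_EQUAL_GOOD(x, y) ((x) == (y))

bool bad(int a, int b, int c, int d)
{
  return CHECK_EQUAL_BAD(a == b, c == d);
}
bool good(int a, int b, int c, int d)
{
  return CHECK_EQUAL_GOOD(a == b, c == d);
}
```

For a=1, b=1, c=2, d=2 the intended result is true (both comparisons hold), but the unparenthesized expansion evaluates ((1 == 1) == 2) == 2, which is false.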
Fix #201 Graceful exit on SIGTERM + cosmetic fixes
# Conflicts:
#   src/outer_limits/compute_optimal/compute_optimal.cxx
#   src/sdpb/solve.cxx
Support optional suffixes:
- 100 or 100B -> 100 bytes
- 100K or 100KB -> 102400 bytes
- 100M or 100MB -> 104857600 bytes
- 100G or 100GB -> 107374182400 bytes
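A parser implementing the table above could look like this. This is a sketch, not sdpb's actual implementation; note the binary multipliers implied by the table (K = 1024, M = 1024², G = 1024³).

```cpp
#include <cstdint>
#include <stdexcept>
#include <string>

// Parse sizes like "100", "100B", "100K", "100KB", "100M", "100G".
// Sketch of the behavior described above; not sdpb's actual code.
std::uint64_t parse_size(const std::string &input)
{
  std::size_t pos = 0;
  std::uint64_t value = std::stoull(input, &pos);
  std::string suffix = input.substr(pos);
  if(!suffix.empty() && suffix.back() == 'B')
    suffix.pop_back(); // "KB" -> "K", "B" -> ""
  if(suffix.empty())
    return value;
  if(suffix == "K")
    return value * 1024;
  if(suffix == "M")
    return value * 1024 * 1024;
  if(suffix == "G")
    return value * 1024ULL * 1024 * 1024;
  throw std::invalid_argument("unknown size suffix: " + suffix);
}
```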
Previously, we detected syrk by checking I==J. This does not work when we are multiplying different matrices, C_IJ := A_I^T B_J (this will happen when we split the Q window and multiply different vertical bands of P).
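The distinction matters because syrk (C := A^T A) can exploit the symmetry C_IJ = C_JI^T and compute only blocks with I <= J, whereas a general product C_IJ := A_I^T B_J cannot. A minimal dense sketch with plain loops (not sdpb's shared-memory implementation):

```cpp
#include <vector>

// Plain C = A^T * B for row-major dense matrices.
// A is m x n, B is m x p, C is n x p.
// When B == A, C is symmetric and a syrk-style implementation may
// compute only one triangle; when A and B differ (e.g. different
// vertical bands of P), every block C_IJ must be computed explicitly.
std::vector<double> at_b(const std::vector<double> &A,
                         const std::vector<double> &B, int m, int n, int p)
{
  std::vector<double> C(n * p, 0.0);
  for(int i = 0; i < n; ++i)
    for(int j = 0; j < p; ++j)
      for(int k = 0; k < m; ++k)
        C[i * p + j] += A[k * n + i] * B[k * p + j];
  return C;
}
```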
Fix #251 pmp2sdp fails for 2 input files with objectives when running on 1 core
Minor fixes: less verbose warnings, compilation error for old libarchive
…plicitly. Before that, it didn't write iterations.json, instead printing:
Warning: Cannot write to "test/out/outer_limits/mpirun-1/out.json/iterations.json"
(because out.json is not a directory)
TODO: write to the parent directory?
Example: if there are two blocks, each having three elements, then the JSON output will look like:
{"c_minus_By": [["111","222","333"], ["44","55","66"]]}
At the end, the solver writes to out/c_minus_By/c_minus_By.json.
At each checkpoint, the solver writes to out/c_minus_By/c_minus_By.ITERATION.json, where ITERATION is the iteration number.
At the beginning, the solver moves the old out/c_minus_By/ folder to out/c_minus_By.INDEX/, where INDEX is the same as for iterations.INDEX.json.
This scheme allows matching a given c_minus_By.json file with an iteration in the corresponding iterations.json file.
TODO check c_minus_By.json files
For each block, write: block_path, reducedPrefactor, samplePoints, sampleScalings, reducedSampleScalings.
New PMP sampling algorithm
…and include pmp2sdp in SDPB example
Print (c - B.y) vector to SDPB output folder and pmp_info.json to SDP folder
…nd_reduce() fails due to inconsistent output split factors
…s due to inconsistent output split factors. Ensure that output_window_split_factor is the same for all nodes.
Fix #258 BigInt_Shared_Memory_Syrk_Context::restore_and_reduce() fails due to inconsistent output split factors
Added the field Polynomial_Vector_Matrix::prefactor
Boost_Float.cxx includes <mpf2mpfr.h>, which for some reason redefines the mpf_t type from gmp.h: #define mpf_t mpfr_t. When included in other files, this can lead to unpredictable errors in GMP-related code. Moving #include <mpf2mpfr.h> to the .cxx file prevents these errors.
Tests for pmp_info.json and related fixes
to be reused for testing c_minus_By
Currently, we simply unzip the sdp archive to a temporary directory. It may be better to read the archive directly, as SDPB itself does (this requires more coding).
Fix #261 Test c_minus_By.json in integration tests
See Commits and Changes for more details.
Created by pull[bot]