Pixel out of bounds for certain (most?) counterfactuals #1

Open · Dorin-D opened this issue Apr 19, 2024 · 4 comments

Dorin-D commented Apr 19, 2024

Greetings!

I would like to visualize the counterfactuals found by VeriX. In the VeriX class's get_explanation function, I set the plot_counterfactual argument to True. The resulting counterfactuals tend to contain a pixel that is out of bounds (OOB), with values outside the [0, 1] range, e.g. -5 or +20. Depending on the image and the traversal order, this OOB pixel also appears to stay the same across different counterfactuals, while some other pixels vary between counterfactual examples. This is despite epsilon being set to 0.05, so no pixel should deviate from its original value by more than that.

Reproduction of issue:

  1. Load the MNIST dataset, as done in mnist.py
  2. Initialize the VeriX object with the provided model and test image index 112, as below, and set plot_counterfactual=True:
verix = VeriX(dataset="MNIST",
              image=x_test[112],
              model_path="models/mnist-10x2.onnx")
verix.traversal_order(traverse="heuristic")
verix.get_explanation(epsilon=0.05, plot_counterfactual=True)

This generates ~300 counterfactuals; however, (1) all except one contain the same OOB pixel, which creates a stark contrast in the image, and (2) between them a single pixel varies by epsilon, which is the expected behaviour. I've attached some examples to show what I mean; note that only counterfactual-at-pixel-685 is different. ZIP attachment: counterfactual-at-pixel-.zip
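
For clarity, this is roughly the check I mean (hypothetical names; it assumes the original image and the counterfactual are available as NumPy arrays scaled to [0, 1]):

import numpy as np

def check_counterfactual(original, counterfactual, epsilon=0.05):
    # Flag pixels outside the valid [0, 1] range.
    oob = np.flatnonzero((counterfactual < 0.0) | (counterfactual > 1.0))
    # Flag pixels that moved further than epsilon from their original value.
    too_far = np.flatnonzero(np.abs(counterfactual - original) > epsilon + 1e-6)
    print("out-of-range pixels:", oob.tolist())
    print("pixels changed by more than epsilon:", too_far.tolist())

# e.g. check_counterfactual(x_test[112].flatten(), counterfactual.flatten())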

Is this a VeriX issue or a Marabou issue? I tried to look into the Marabou implementation, but it relies on a C++ function that I can't understand on my own. I can see that the bounds are set correctly in VeriX, both for the irrelevant pixels and for the pixel currently under consideration, so I don't understand why Marabou would return a value outside the input bounds.
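
For what it's worth, here is a minimal, standalone sketch (not VeriX's exact encoding) of how I understand bounds are supposed to be enforced through the Maraboupy API; the output inequality and the class indices are arbitrary assumptions just to make the query non-trivial:

from maraboupy import Marabou
import numpy as np

# Load the same ONNX model used by VeriX (path assumed relative to the VeriX repo).
network = Marabou.read_onnx("models/mnist-10x2.onnx")
input_vars = network.inputVars[0].flatten()
output_vars = network.outputVars[0].flatten()

image = x_test[112].flatten()   # assumes the MNIST test set is loaded as in mnist.py
epsilon = 0.05

# Constrain every input variable to the epsilon ball around the image, clipped to [0, 1].
for var, value in zip(input_vars, image):
    network.setLowerBound(int(var), max(0.0, float(value) - epsilon))
    network.setUpperBound(int(var), min(1.0, float(value) + epsilon))

# An output constraint to make the query non-trivial, e.g. logit 1 <= logit 0
# (class indices chosen arbitrarily here, not VeriX's actual encoding).
network.addInequality([int(output_vars[1]), int(output_vars[0])], [1.0, -1.0], 0.0)

exit_code, vals, stats = network.solve(options=Marabou.createOptions(verbosity=0))
if exit_code == "sat":
    assignment = np.array([vals[int(var)] for var in input_vars])
    print("min/max of returned input assignment:", assignment.min(), assignment.max())

If Marabou respects the input bounds, the printed min/max should stay within the clipped epsilon ball.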

I'd also like to note that similar behaviour occurs for multiple images, not just x_test[112]. Setting a sequential traversal order (rather than the provided heuristic) results in various pixels being OOB across the counterfactuals, but the question remains why they are returned out of bounds at all.

Dorin-D changed the title from "Pixel out of bounds for certain counterfactuals" to "Pixel out of bounds for certain (most?) counterfactuals" on Apr 19, 2024
minwu-cs (Collaborator) commented Jun 13, 2024

Hi @Dorin-D,

Thanks a lot for your interest in our work and for using our tool.

We have looked into the issue carefully and noticed that, when using the exact same model (mnist-10x2.onnx) and input image (x_test[112]), the computed explanation is different from the one you attached, and there are no out-of-bound pixels in any of the counterfactuals.

verix = VeriX(dataset="MNIST",
              image=x_test[112],
              model_path="models/mnist-10x2.onnx")
verix.traversal_order(traverse="heuristic")
verix.get_explanation(epsilon=0.05,
                      plot_counterfactual=True)

I'm attaching all the produced results for this particular case, including the original image, the sensitivity, the explanation, and all counterfactuals, in a zip file for reference: mnist-10x2-index-112-heuristic-linf0.05.zip

I also did a sanity check on a couple of other MNIST images and observed no out-of-bound pixels in the counterfactuals.

Could I confirm whether you are using the most up-to-date version of Marabou? If not, could you upgrade to the current version? If you are, and you are still encountering counterfactuals containing out-of-bound pixels, please don't hesitate to reach out to us.
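
For reference, one way to check which Maraboupy version is installed in your Python environment (any equivalent check, e.g. via pip, is fine):

import importlib.metadata

# Prints the version of the maraboupy distribution installed in the current environment.
print(importlib.metadata.version("maraboupy"))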

Thanks again for using our tool, and we wish you every success in your study/research goals.

Dorin-D (Author) commented Jun 21, 2024

Thanks @minwu-cs for your reply,

I am using maraboupy==2.0.0 (installed via pip). I assume this is the latest version?

I am still encountering the same issue.

minwu-cs (Collaborator) commented
Hi @Dorin-D, thanks a lot for getting back to us with more details.

I have passed your message to the Marabou development team regarding the specific Marabou version installed via pip.

In the meantime, any chance you could install Marabou as suggested on the VeriX webpage, i.e., build Marabou from source and configure it to use the Gurobi optimizer? This might not be as straightforward as using pip install, but it should give you the correct explanations and counterfactuals.

All you need to do is download the Marabou repository and build it from source, as shown in the following command lines:

git clone https://github.com/NeuralNetworkVerification/Marabou.git
cd Marabou
mkdir build
cd build
cmake .. -DENABLE_GUROBI=ON -DBUILD_PYTHON=ON
cmake --build . -j 12

Specifically, -DENABLE_GUROBI=ON in the above code block enables the Gurobi optimizer, for which you will need to install Gurobi in advance. Instructions for compiling Marabou with the Gurobi optimizer can be found in the Marabou documentation. Gurobi requires a license, but you might qualify for a free one. After Gurobi is set up, building Marabou with the commands above should give you correct counterfactuals as well as explanations.
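
As a quick sanity check that Gurobi and its license are visible before building Marabou, something like the following should run without raising a licensing error (this assumes the gurobipy Python package is installed; any equivalent check works too):

import gurobipy as gp

# Creating a model forces Gurobi to locate and validate the license;
# a GurobiError is raised if no valid license is found.
model = gp.Model("license-check")
print("Gurobi version:", gp.gurobi.version())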

Hope this helps, and if you encounter any further questions, please don't hesitate to let us know.

Dorin-D (Author) commented Jun 27, 2024

Hi @minwu-cs,

I have tried to build Marabou from source on my local machine, following your instructions:

  1. I installed Gurobi and activated the academic license

  2. I cloned the Marabou repository, created the build folder and executed cmake .. -DENABLE_GUROBI=ON -DBUILD_PYTHON=ON

  3. I edited the CMakeLists.txt file to use Gurobi 11.0 instead of Gurobi 9.5, i.e. changed set(GUROBI_LIB2 "gurobi95") to set(GUROBI_LIB2 "gurobi110") (the build fails with an error otherwise)

  4. I ran cmake --build . -j 8 ; here, several of the tests fail with SEGFAULT. Rebuilding multiple times causes different tests to fail each time. I am attaching 2 examples of failing test runs:
    Example 1:
    The following tests FAILED:
      2 - Test_BilinearConstraint (SEGFAULT)
      9 - Test_DisjunctionConstraint (SEGFAULT)
      27 - Test_SoftmaxConstraint (SEGFAULT)
      29 - Test_SumOfInfeasibilitiesManager (SEGFAULT)
      35 - Test_LUFactors (SEGFAULT)
      37 - Test_SparseFTFactorization (SEGFAULT)
      40 - Test_SparseLUFactors (SEGFAULT)
      41 - Test_SparseUnsortedArray (SEGFAULT)
      42 - Test_SparseUnsortedArrays (SEGFAULT)
      43 - Test_SparseUnsortedList (SEGFAULT)
      45 - Test_ConstSimpleData (SEGFAULT)
      46 - Test_Error (SEGFAULT)
      50 - Test_HashMap (SEGFAULT)
      62 - Test_Vector (SEGFAULT)
      75 - Test_QueryLoader (SEGFAULT)
      78 - Test_WsLayerElimination (SEGFAULT)
      79 - Test_ParallelSolver (SEGFAULT)
      85 - Test_IncrementalLinearization (SEGFAULT)
    Errors while running CTest
    gmake[2]: *** [CMakeFiles/build-tests.dir/build.make:70: build-tests] Error 8
    gmake[1]: *** [CMakeFiles/Makefile2:813: CMakeFiles/build-tests.dir/all] Error 2
    gmake: *** [Makefile:101: all] Error 2
    Example 2:
    The following tests FAILED:
      5 - Test_ConstraintMatrixAnalyzer (SEGFAULT)
      6 - Test_CostFunctionManager (SEGFAULT)
      9 - Test_DisjunctionConstraint (SEGFAULT)
      11 - Test_Engine (SEGFAULT)
      12 - Test_Equation (SEGFAULT)
      14 - Test_LargestIntervalDivider (SEGFAULT)
      15 - Test_LeakyReluConstraint (SEGFAULT)
      16 - Test_MaxConstraint (SEGFAULT)
      27 - Test_SoftmaxConstraint (SEGFAULT)
      34 - Test_LUFactorization (SEGFAULT)
      35 - Test_LUFactors (SEGFAULT)
      38 - Test_SparseGaussianEliminator (SEGFAULT)
      42 - Test_SparseUnsortedArrays (SEGFAULT)
      43 - Test_SparseUnsortedList (SEGFAULT)
      46 - Test_Error (SEGFAULT)
      49 - Test_GurobiWrapper (SEGFAULT)
      61 - Test_Stack (SEGFAULT)
      73 - Test_OnnxParser (SEGFAULT)
      75 - Test_QueryLoader (SEGFAULT)
      84 - Test_UnsatCertificateUtils (SEGFAULT)
      85 - Test_IncrementalLinearization (SEGFAULT)
    Errors while running CTest
    gmake[2]: *** [CMakeFiles/build-tests.dir/build.make:70: build-tests] Error 8
    gmake[1]: *** [CMakeFiles/Makefile2:813: CMakeFiles/build-tests.dir/all] Error 2
    gmake: *** [Makefile:101: all] Error 2

  5. Since Marabou fails to build cleanly, I created an environment with all the necessary Python modules for VeriX, including maraboupy. I then copied the Marabou binary generated by the failed build over the Marabou executable in that environment (which Marabou -> /path_to_miniforge3/envs/verix9/bin/Marabou); see also the sketch after this list.

  6. I ran the VeriX script to compute counterfactuals for an example: I still obtain the out-of-bounds pixels, and VeriX behaves the same as before.
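
For completeness, here is a small check that should show which Marabou bindings the environment actually loads at runtime (as far as I understand, maraboupy calls the solver through its compiled MarabouCore module rather than the standalone Marabou executable, so copying the binary in step 5 may not have any effect; please correct me if that is wrong):

import maraboupy
from maraboupy import MarabouCore

# Shows which installed copies of the Python package and the compiled bindings are in use.
print("maraboupy package:", maraboupy.__file__)
print("MarabouCore bindings:", MarabouCore.__file__)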


Let me know if you need more information about my setup, or if the description of my installation is not clear enough. Thank you!
