Create JUnit test cases to ensure transformers perform as expected #39

Open · GenericException opened this issue Sep 22, 2018 · 1 comment
Labels: suggestion (New feature or request)

@GenericException

You can probably prevent a lot of future issue reports by creating some test-cases and checking them regularly. Perhaps a format like this (pseudo-code) would work:

@Test
void stringsLight() throws Exception {
    // create radon instance and register the transformer under test
    CLI cli = // ...
    cli.addTransformer(new StringsLight());
    cli.obfuscate("test-cases.jar", "test-cases-obf.jar");

    // run the obfuscated jar and capture its console output (stderr merged into stdout)
    Process p = new ProcessBuilder("java", "-cp", "test-cases-obf.jar", "sample.SomeClass")
            .redirectErrorStream(true)
            .start();
    String output = new String(p.getInputStream().readAllBytes());
    p.waitFor();

    // clean-up
    Files.delete(Paths.get("test-cases-obf.jar"));

    // print stack-trace if it exists
    if (output.contains("Exception")) {
        System.out.println(output);
    }

    // validate results with JUnit (expected value first)
    assertEquals("Hello world, this is the expected output of System.out calls", output);
}

In this single test case, the CLI obfuscates a sample jar file and then executes the result. If everything works, the output should match the expected value; otherwise the output should be a good ol' stack trace. You can easily abstract all of this away so that each test method only specifies the transformer settings and the expected output.
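A minimal sketch of what that abstraction might look like, using the same hypothetical CLI/Transformer API as the pseudo-code above (none of these names are Radon's real API):

// Hypothetical harness: CLI, Transformer and StringsLight are assumed names, not Radon's real API.
public class TransformerTests {

    // Obfuscates the sample jar with one transformer, runs the result,
    // and asserts that the console output matches the expected string.
    private static void runCase(Transformer transformer, String expectedOutput) throws Exception {
        CLI cli = new CLI();
        cli.addTransformer(transformer);
        cli.obfuscate("test-cases.jar", "test-cases-obf.jar");

        Process p = new ProcessBuilder("java", "-cp", "test-cases-obf.jar", "sample.SomeClass")
                .redirectErrorStream(true)
                .start();
        String output = new String(p.getInputStream().readAllBytes());
        p.waitFor();
        Files.delete(Paths.get("test-cases-obf.jar"));

        assertEquals(expectedOutput, output);
    }

    @Test
    void stringsLight() throws Exception {
        runCase(new StringsLight(), "Hello world, this is the expected output of System.out calls");
    }
}

Each additional test method would then be a one-liner that names a transformer configuration and the expected console output.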

But what would test-cases.jar contain? It should involve a little bit of everything, as long as it is all console-based and does not require user input. I made a jar with a bunch of cases that can be tested this way (with the exception of one). I tested Radon 1.0.3 on it, went through each obfuscation feature one by one, and everything functioned as intended. This would just automate that process.
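For illustration only, a class of this kind could live in test-cases.jar; the contents are a guess at "a little bit of everything" that stays deterministic and console-only:

// Illustrative content for test-cases.jar (names and cases are made up for this sketch).
package sample;

public class SomeClass {
    public static void main(String[] args) {
        // string constants -> exercises string encryption transformers
        String greeting = "Hello world";

        // arithmetic and control flow -> exercises number/flow obfuscation
        int sum = 0;
        for (int i = 1; i <= 10; i++) {
            sum += i;
        }

        // exception handling -> exercises try/catch rewriting
        String parsed;
        try {
            parsed = String.valueOf(Integer.parseInt("42"));
        } catch (NumberFormatException e) {
            parsed = "error";
        }

        // deterministic output the JUnit assertions can compare against
        System.out.println(greeting + " sum=" + sum + " parsed=" + parsed);
    }
}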

@ItzSomebody added the suggestion (New feature or request) label on Oct 13, 2018
@Storyyeller

I did this kind of testing for the Krakatau decompiler. A couple of things to watch out for: JVM behavior can change between versions. I had a couple of tests that stopped working after a JVM upgrade, so I had to disable them.

Another thing to watch out for is infinite loops. It is possible for a transformer to accidentally turn your code into an infinite loop, so you need to add a timeout when executing the JVM. Lastly, it is useful to add caching, since invoking the JVM takes so long. The Krakatau test harness caches JVM executions, so if the bytecode output doesn't change, it doesn't rerun it.
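A rough sketch of how the runner from the earlier example could add both safeguards, assuming a 30-second timeout and a simple hash-keyed cache directory (both choices are arbitrary):

// Timed, cached execution of an obfuscated jar. Helper names, the timeout value,
// and the cache layout are assumptions for this sketch.
import java.nio.file.*;
import java.security.MessageDigest;
import java.util.concurrent.TimeUnit;

public class JarRunner {

    public static String run(Path jar, String mainClass) throws Exception {
        // cache key derived from the jar bytes: unchanged bytecode -> reuse the previous output
        String key = sha256Hex(Files.readAllBytes(jar));
        Path cached = Paths.get("test-cache", key + ".out");
        if (Files.exists(cached)) {
            return new String(Files.readAllBytes(cached));
        }

        // redirect output to a file so the child process can never block on a full pipe
        Path out = Files.createTempFile("radon-test", ".out");
        Process p = new ProcessBuilder("java", "-cp", jar.toString(), mainClass)
                .redirectErrorStream(true)
                .redirectOutput(out.toFile())
                .start();

        // timeout guards against a transformer that accidentally produces an infinite loop
        if (!p.waitFor(30, TimeUnit.SECONDS)) {
            p.destroyForcibly();
            throw new AssertionError("Timed out running " + jar);
        }

        String output = new String(Files.readAllBytes(out));
        Files.createDirectories(cached.getParent());
        Files.write(cached, output.getBytes());
        return output;
    }

    private static String sha256Hex(byte[] data) throws Exception {
        StringBuilder sb = new StringBuilder();
        for (byte b : MessageDigest.getInstance("SHA-256").digest(data)) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }
}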
