Conversation
Update update to remote master
- Added a feature that allows `inputTesting` to be supplied in the configuration, providing datasets as a JSON object in the form:

  ```
  {
    "a/path": {            // path
      "post": {            // operation
        "200": [           // response
          { "body": JSON_DATA }
        ],
        "400": [
          { "body": JSON_DATA }
        ]
      }
    }
  }
  ```

- Moved helpers to `/lib/helpers.js`
- Added a `printJSON` helper
- Modified `testGenResponse` to check whether `inputTesting` is defined for that particular `path:operation:response`; if so, it sets the special variable `inputs` and uses the templateFn for each dataset
- Modified the templates for put, post, and patch (the operations where request bodies are allowed) to check for `inputs`; if defined, the template runs once per dataset, otherwise it runs once as usual
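The `testGenResponse` change described above can be sketched as follows. This is a hypothetical illustration, not the project's actual code: `getInputs`, its signature, and the sample path are made up to show the `path:operation:response` lookup.

```javascript
// Hypothetical sketch of the inputTesting lookup described in this PR:
// resolve the configured datasets for a path:operation:response triple.
function getInputs(config, path, operation, response) {
  var it = config.inputTesting;
  if (it && it[path] && it[path][operation] && it[path][operation][response]) {
    return it[path][operation][response]; // array of datasets: [{ body: ... }, ...]
  }
  return undefined; // no inputTesting for this triple: run the template once as usual
}

// Illustrative config following the JSON shape above
var config = {
  inputTesting: {
    '/pets': {
      post: {
        '200': [{ body: { name: 'Rex' } }],
        '400': [{ body: {} }]
      }
    }
  }
};

console.log(getInputs(config, '/pets', 'post', '200').length); // 1
```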
Add input testing to configuration
I added an …
@noahdietz -- I'd be happy to check out and resolve the failures, but I did notice them when I opened the PR and wasn't quite sure how to mitigate. I'm not a Travis whiz :) I didn't remove any environment variables, so what environment variables need to be set? Thanks!
@mm-gmbd just click on the individual builds to see what the error messages are 😄 I couldn't really decipher the exception myself.
@mm-gmbd rather than continuously pushing to GitHub and triggering Travis, try just running the test suite locally. Should've mentioned that earlier, sorry. Also, the diff output of the tests can be really, really annoying to interpret, if you haven't already noticed.
@noahdietz -- the part that is currently failing the integration test is the use of the "length" helper. I took out the use of the length helper because I wanted to see the entire description when I was using the test data generator (it builds a description for each test data set). Personally, I think the "length" helper should be removed -- if there is a description, it's there for a reason, and it helps MUCH more to have the description when a test fails. Let me know what you think :)
@noahdietz -- I was just about to ask if I could run these tests locally :) As for the output, it definitely isn't helpful, but with a little replacing and diffing, I found out what the issues were. Mind taking a bit to think about the "length" helper point from my previous comment? Let me know what you think...
@mm-gmbd I understand your position on the `length` helper. I should clarify its purpose more. The description being used is derived from the Swagger spec. These documents can be very wordy. Thus, we opted to truncate the description by some length in an effort to maintain code style/format and prevent any weird bugs that may arise from having a paragraph-long description in your Swagger spec. So it's a safeguard for the code generator.
Do the tests really need to care about the length of the description, even from a style/format standpoint? Now, if using the description breaks the string and the code becomes invalid JavaScript -- that is a problem, but I'm not sure that the `length` helper is the right way to handle it.
The motivation for truncating it was to reduce the number of possible things that could invalidate the generated code and require the dev to manually seek out and edit the files post-generation. I know this tool doesn't 100% eliminate the need for manual work (far from it, in fact). Truncating the description to protect against damaging multi-line strings was one of the steps we took, especially since the description is derived from the Swagger spec (which should be verbose and descriptive imo). I'm less inclined to remove something that is preventing a potentially damaging edge case. Can I propose a compromise? We make the length truncation feature configurable: on by default, set to a default value, but it can be toggled off by changing it to -1 or something. The helper checks the configured length, and if it's -1, it simply returns the description untruncated.
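A minimal sketch of that compromise, written as a plain function rather than the project's actual Handlebars helper; the ellipsis suffix and the example limit of 10 are illustrative choices, not the real defaults.

```javascript
// Sketch of the proposed compromise: a length helper whose truncation limit
// comes from configuration; a limit of -1 disables truncation entirely.
function makeLengthHelper(maxLength) {
  return function length(str) {
    if (maxLength === -1 || str.length <= maxLength) {
      return str; // truncation toggled off, or string already short enough
    }
    return str.slice(0, maxLength) + '...';
  };
}

var truncate = makeLengthHelper(10);
console.log(truncate('a very long description of an operation')); // 'a very lon...'
console.log(makeLengthHelper(-1)('a very long description'));     // unchanged
```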
@mm-gmbd what do you think about my above suggestion? Think it could work? With all the configuration we are adding, we are going to need to make a …
@noahdietz - apologies for the late reply. I like the suggestion! I'll be quite busy today -- if you have time you're more than welcome to implement :) (should be quick). If not, I may be able to get to it tonight, but can't make any promises.
Also, it would be great if one of the … Obviously, we'd be making some assumptions for the user:
my 2 cents: I don't think generation of data should be done by this module. I really like the idea of the option of feeding data to this module to generate the tests. One of the reasons is that I would like the option to choose my own module for generating the data. So I think we would need a few things:
Another reason to keep it small is maintainability (is that even a word?). I like modules that do one thing very well, so I am flexible in re-using them. So, I make the assumptions, not the module. I could probably make a PR that makes it possible to generate tests for just certain status codes, if we want to go that way
@Remco75 - Maintainability is the word :) As for keeping the test generation and the testing module decoupled, I understand your concerns. To be clear, I didn't want this to be solely for use with … Today, the config allows the user to choose which test module to use.
hehe, thanks for the spell-check. The choice/config for a test framework makes a bit more sense to me, as this is a test-generation module. @noahdietz any thoughts? I will go with the majority of course :-)
Hmm -- I'm not as convinced that one has to maintain the generation of data. This module could pin a particular known version that works and be okay moving forward? I would argue that, as a test-generation module, test data is absolutely in scope for what this module could/should generate. I haven't used swagmock. As far as creating another project -- I'm not opposed to the idea, I guess, but I would still go back to my previous point that, as a test-generation module, data generation would be in scope.
great discussion, I enjoyed waking up and reading this. Let me cast my ballot first and explain: I think we should stick to providing an interface through which users can supply data for their tests however they'd like, and not auto-generate data for them. A separate, small mock-data-gen tool based on a Swagger spec would be pretty tight though... Explanation: I agree with @mm-gmbd 's sentiment that data generation is in scope for test-generating tools; however, I think we are a special case. This project's idea started as a tool that, given a Swagger API spec, would test its alignment with your backend. This idea was awesome in theory, but ultimately naive. This tool has no idea what is in the database backing the API, and a majority of operations would fail if it attempted to use randomly generated data. That is the Achilles heel of this project: it requires the user to provide the data they expect, because only they have the knowledge of what is in the database. tl;dr tool-generated data can't be used because the state of the database is unknown and it would be inaccurate. This all changes, however, if users are using mocking tools (like swagmock).
ok, nice rationale about the database (and funny that you were waking up to this, it was almost the end of the day for me ;-). Take a look at swagmock, mentioned above. I can take a closer look at this PR Wednesday to help with the tests if needed
Hey all - apologies for going dark. I have some thoughts about @noahdietz's comment, specifically about needing to know what is in the database to test an API. I'll have to take some time (that I don't have right now) to fully comment on this, but for starters...
Apologies if these don't make sense -- I had to write them quickly, so let me know if I can explain better. Let me know what you think! Great discussion so far everyone!
My tiny thoughts: the tests are supposed NOT to make any visible data change to the database, so maybe you can classify the APIs into 4 groups, in order: POST to add, GET to query, PUT to modify, and DELETE to remove. With that, you can first generate test data to test POST, then use the generated data to test GET and PUT, and finally remove all the generated data to test DELETE. Just have users fill in data for the 4 types of APIs they want to generate tests for. What do you think?
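A rough sketch of that ordering idea, under the assumption that generated suites could be tagged by HTTP method; `sortByMethod` and the suite objects are hypothetical names, not part of this module.

```javascript
// Order generated suites POST -> GET -> PUT -> DELETE so records are
// created first, exercised, and finally cleaned up.
var order = ['post', 'get', 'put', 'delete'];

function sortByMethod(suites) {
  return suites.slice().sort(function (a, b) {
    return order.indexOf(a.method) - order.indexOf(b.method);
  });
}

var suites = [
  { method: 'delete', path: '/pets/{id}' },
  { method: 'get', path: '/pets' },
  { method: 'post', path: '/pets' }
];
console.log(sortByMethod(suites).map(function (s) { return s.method; }));
// [ 'post', 'get', 'delete' ]
```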
@Maples7 - Why can't the tests make any change to the database? For a development database, depending on the developer's preference, if they want to give a tool the power to test by changing data, then they ought to be able to.
@mm-gmbd Of course you can add some data to the database in tests, but since we should also test the DELETE API, maybe it's better to remove all the data we generate in the test to retain consistency.
But, getting back to the point at hand: this PR is about the ability to just feed data to this module. Let's open a new issue/enhancement for data generation if we want, so the discussion will not get lost when we merge this ;-) @mm-gmbd do you need any help finishing this for the release?
@Maples7 - taking a step back, do we (those contributing to the STT tool) need to know what is in the database? As a user of the tool, (following the example) I know what is in the database. @Remco75 - Agreed that this should be another issue -- we can move there. As for completing this branch, I will not have time to do this today, but someone is welcome to complete my fork (I think it is just the use of the `length` helper that remains).
@mm-gmbd I will try to find some time tomorrow and will post when I find something. Can you give me rights to your repo, or shall I fork it?
I've opened a new issue #115 to continue this conversation. @Remco75 I already have a branch for @mm-gmbd 's repo that adds the description truncation toggle, I just can't push the branch/make a PR because I don't have privileges either :) So unless there is something else you wanted to add, I've got it covered as soon as I get contributor access to his repo.
@noahdietz - apologies, you have been invited!
add toggle for description truncation
Sorry, the tests failed because I was getting impatient; they were failing because of the `length` helper.
Hmm... I added back in the `length` helper and got the following test results. @noahdietz, can you take a peek and figure out what is happening? Just from looking at the first failure, it appears that `length` is being overzealous and truncating "200 OK" to "...".
Ok, this looks like a bigger issue with the comparison files just not playing nice with the `length` helper. I'm getting on a flight right now, but I'll take a look and submit another PR to your repo with a fix. Give me a day or two. Thanks everyone for the patience!
@noahdietz, can I do anything to help out, or is it already fixed?
```js
    ? swagger.basePath : '') + path;

// get inputTesting from config if defined for this path:operation:response
if (config.inputTesting &&
```
Wouldn't it make more sense to users to call the config entry something like `request-data` instead of `inputTesting`?
@Remco75 I'm trudging through the template changes. Turns out I had less time to work on this than I thought whilst traveling, apologies. Thanks for the encouragement :)
Guys, I'm having some issues with getting this code to work properly, and some remarks (see above). I took these ideas and expanded them (and tested them :-))
```
@@ -27,11 +27,15 @@
    .set('{{headerApiKey.type}}', process.env.{{headerApiKey.name}})
  {{/if}}
  {{#is contentType 'application/json'}}
  {{#if inputs.body}}
    .send({{printJSON inputs.body.data}})
```
In your explanation of your PR you do not use the `data` object. For me it works if I remove this.
The `printJSON` helper does not work properly with more complex structures. I would suggest using `JSON.stringify` from the code.
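A minimal version of what the reviewer suggests: implement `printJSON` as a thin wrapper over `JSON.stringify`, which handles arbitrarily nested structures; the two-space indent is an assumed formatting choice for readable generated tests.

```javascript
// printJSON reimplemented via JSON.stringify, as suggested: handles nested
// objects and arrays that a hand-rolled serializer might mangle.
function printJSON(data) {
  return JSON.stringify(data, null, 2); // 2-space indent for readability
}

var nested = { user: { name: 'Ada', tags: ['a', 'b'], address: { city: 'London' } } };
var out = printJSON(nested);
console.log(out);
// The output round-trips back to an equivalent object via JSON.parse.
```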
Added input testing for data generated with json-schema-test-data-generator.
I'm not sure how complete this is, but I got it working for my purpose and haven't looked at it in a while :)
The test data should be included in the `config` in the following manner. An example: