Generate yaml helper cannot generate all yaml #1173

Open
alex-harvey-z3q opened this issue Nov 28, 2021 · 2 comments

@alex-harvey-z3q
Contributor

Subject of the issue

While developing a feature, I have noticed that the _generate_yaml method in sceptre/cli/helpers.py cannot handle all of the streams passed to it.

Your environment

  • version of sceptre (sceptre --version): master
  • version of python (python --version): 3.6.9
  • which OS/distro: Ubuntu

Steps to reproduce

Given this stream:

{'DatalakeUser-iam-user-generic': {'StackResourceDrifts': [{'StackId': 'arn:aws:cloudformation:ap-southeast-2:111122223333:stack/DatalakeUser-iam-user-generic/77d13300-504f-11ec-987e-0a90a6c6d40e', 'LogicalResourceId': 'IAMUser', 'PhysicalResourceId': 'datalake_user', 'ResourceType': 'AWS::IAM::User', 'ExpectedProperties': '{"ManagedPolicyArns":["arn:aws:iam::aws:policy/AmazonAthenaFullAccess"],"Policies":[{"PolicyDocument":{"Statement":[{"Action":["lakeformation:SearchDatabasesByLFTags","glue:SearchTables","lakeformation:SearchTablesByLFTags","glue:GetPartitions","glue:GetDatabase","glue:GetTable","glue:GetTables","lakeformation:ListLFTags","lakeformation:GetResourceLFTags","lakeformation:GetLFTag","glue:GetDatabases","lakeformation:GetDataAccess"],"Effect":"Allow","Resource":"*"}],"Version":"2012-10-17"},"PolicyName":"DatalakeUserBasic"}],"Tags":[{"Key":"Creator","Value":"sceptre"},{"Key":"Environment","Value":"personal1"},{"Key":"Version","Value":"1.0.0"},{"Key":"Project","Value":"DataLake"},{"Key":"Department","Value":"Me"},{"Key":"Team","Value":"DevOps"},{"Key":"Contact","Value":"me@personal1"}],"UserName":"datalake_user"}', 'ActualProperties': '{"ManagedPolicyArns":["arn:aws:iam::aws:policy/AmazonAthenaFullAccess"],"Policies":[{"PolicyDocument":{"Statement":[{"Action":["lakeformation:SearchDatabasesByLFTags","glue:SearchTables","lakeformation:SearchTablesByLFTags","glue:GetPartitions","glue:GetDatabase","glue:GetTable","glue:GetTables","lakeformation:ListLFTags","lakeformation:GetResourceLFTags","lakeformation:GetLFTag","glue:GetDatabases","lakeformation:GetDataAccess"],"Effect":"Allow","Resource":"*"}],"Version":"2012-10-17"},"PolicyName":"DatalakeUserBasic"}],"Tags":[{"Key":"Creator","Value":"sceptre"},{"Key":"Environment","Value":"personal1"},{"Key":"Version","Value":"1.0.0"},{"Key":"Project","Value":"DataLake"},{"Key":"Department","Value":"Me"},{"Key":"Team","Value":"DevOps"},{"Key":"Contact","Value":"me@personal1"}],"UserName":"datalake_user"}', 'PropertyDifferences': [], 'StackResourceDriftStatus': 'IN_SYNC', 'Timestamp': datetime.datetime(2021, 11, 28, 14, 18, 46, 708000, tzinfo=tzutc())}], 'ResponseMetadata': {'RequestId': '7c6c1c0b-ffc1-4f92-a5fb-a57a6c1ec51a', 'HTTPStatusCode': 200, 'HTTPHeaders': {'x-amzn-requestid': '7c6c1c0b-ffc1-4f92-a5fb-a57a6c1ec51a', 'content-type': 'text/xml', 'content-length': '3756', 'date': 'Sun, 28 Nov 2021 14:18:53 GMT'}, 'RetryAttempts': 0}}}

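A minimal way to reproduce (a sketch only: an abbreviated version of the dict above is passed straight to the helper named in the subject, and calling _generate_yaml directly like this is an assumption about its interface):

import datetime

from sceptre.cli.helpers import _generate_yaml

# Abbreviated version of the describe_stack_resource_drifts response above.
# datetime.timezone.utc stands in for dateutil's tzutc() to keep this stdlib-only.
stream = {
    "DatalakeUser-iam-user-generic": {
        "StackResourceDrifts": [
            {
                "LogicalResourceId": "IAMUser",
                "PhysicalResourceId": "datalake_user",
                "ResourceType": "AWS::IAM::User",
                "StackResourceDriftStatus": "IN_SYNC",
                "Timestamp": datetime.datetime(
                    2021, 11, 28, 14, 18, 46, tzinfo=datetime.timezone.utc
                ),
            }
        ],
        "ResponseMetadata": {"HTTPStatusCode": 200, "RetryAttempts": 0},
    }
}

# Expected: YAML like the cfn-flip output below.
# Actual: the dict comes back unchanged (see "Actual behaviour").
print(_generate_yaml(stream))
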
Expected behaviour

The tool cfn-flip handles this fine and generates:

DatalakeUser-iam-user-generic:
  StackResourceDrifts:
    - StackId: arn:aws:cloudformation:ap-southeast-2:111122223333:stack/DatalakeUser-iam-user-generic/77d13300-504f-11ec-987e-0a90a6c6d40e
      LogicalResourceId: IAMUser
      PhysicalResourceId: datalake_user
      ResourceType: AWS::IAM::User
      ExpectedProperties: >-
        {"ManagedPolicyArns":["arn:aws:iam::aws:policy/AmazonAthenaFullAccess"],"Policies":[{"PolicyDocument":{"Statement":[{"Action":["lakeformation:SearchDatabasesByLFTags","glue:SearchTables","lakeformation:SearchTablesByLFTags","glue:GetPartitions","glue:GetDatabase","glue:GetTable","glue:GetTables","lakeformation:ListLFTags","lakeformation:GetResourceLFTags","lakeformation:GetLFTag","glue:GetDatabases","lakeformation:GetDataAccess"],"Effect":"Allow","Resource":"*"}],"Version":"2012-10-17"},"PolicyName":"DatalakeUserBasic"}],"Tags":[{"Key":"Creator","Value":"sceptre"},{"Key":"Environment","Value":"personal1"},{"Key":"Version","Value":"1.0.0"},{"Key":"Project","Value":"DataLake"},{"Key":"Department","Value":"Me"},{"Key":"Team","Value":"DevOps"},{"Key":"Contact","Value":"me@personal1"}],"UserName":"datalake_user"}
      ActualProperties: >-
        {"ManagedPolicyArns":["arn:aws:iam::aws:policy/AmazonAthenaFullAccess"],"Policies":[{"PolicyDocument":{"Statement":[{"Action":["lakeformation:SearchDatabasesByLFTags","glue:SearchTables","lakeformation:SearchTablesByLFTags","glue:GetPartitions","glue:GetDatabase","glue:GetTable","glue:GetTables","lakeformation:ListLFTags","lakeformation:GetResourceLFTags","lakeformation:GetLFTag","glue:GetDatabases","lakeformation:GetDataAccess"],"Effect":"Allow","Resource":"*"}],"Version":"2012-10-17"},"PolicyName":"DatalakeUserBasic"}],"Tags":[{"Key":"Creator","Value":"sceptre"},{"Key":"Environment","Value":"personal1"},{"Key":"Version","Value":"1.0.0"},{"Key":"Project","Value":"DataLake"},{"Key":"Department","Value":"Me"},{"Key":"Team","Value":"DevOps"},{"Key":"Contact","Value":"me@personal1"}],"UserName":"datalake_user"}
      PropertyDifferences: []
      StackResourceDriftStatus: IN_SYNC
      Timestamp: '2021-11-28 14:15:53.795000+00:00'
  ResponseMetadata:
    RequestId: '0fe2ecc6-fc2a-43a1-8edd-b8024046a8aa'
    HTTPStatusCode: 200
    HTTPHeaders:
      x-amzn-requestid: '0fe2ecc6-fc2a-43a1-8edd-b8024046a8aa'
      content-type: text/xml
      content-length: '3756'
      date: Sun, 28 Nov 2021 14:16:01 GMT
    RetryAttempts: 0

Actual behaviour

Because this stream is not a list, the code tries to call yaml.safe_loads on it, which raises AttributeError("module 'yaml' has no attribute 'safe_loads'"). The method then returns the stream back to the caller unchanged.
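
For comparison, a minimal sketch of what handling a plain dict could look like (this is not Sceptre's current code; dumping the whole structure with yaml.safe_dump is an assumption about one possible fix):

import yaml

def generate_yaml(stream):
    # Sketch only: dump dicts and lists directly rather than trying to
    # re-parse them. PyYAML's SafeDumper can represent nested dicts, lists,
    # strings, numbers and (timezone-aware) datetimes.
    if isinstance(stream, (dict, list)):
        return yaml.safe_dump(stream, default_flow_style=False)
    # Anything else is assumed to already be a YAML/JSON string.
    return yaml.safe_dump(yaml.safe_load(stream), default_flow_style=False)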

@alex-harvey-z3q
Contributor Author

alex-harvey-z3q commented Dec 1, 2021

Perhaps we could investigate simply copying the code from cfn-flip in here.
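
For example, something along these lines would cover the stream above (a sketch only: cfn_flip.to_yaml converting a JSON string is its documented entry point, and using json.dumps with default=str to stringify the boto3 Timestamp is an assumption):

import json

from cfn_flip import to_yaml

def generate_yaml(stream):
    # Serialise the boto3 response to JSON first (datetimes become strings),
    # then let cfn-flip convert the JSON document to YAML.
    return to_yaml(json.dumps(stream, default=str))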

@jfalkenstein
Contributor

@zaro0508, @alexharv074 I just dove into this a little to take an initial swing at fixing this bug and doing some refactoring in the outputting code we have. When I did, a few things became clear to me:

  1. There is almost no documentation or comments indicating how this code should behave or what interface it is expected to provide. There's a little, but it isn't sufficient to explain some particularly unusual bits. For example, it appears to have been initially written to accept a list of dicts, but was then expanded to do rather more than that.
  2. The test coverage is very sparse and insufficient. There are only three unit tests on write(), and they don't cover many situations, using only the most basic of test cases.
  3. It is pretty obvious that write() was mostly written for stack launch/creation, with a very specific use case in mind. It then started getting used nearly everywhere, but it doesn't really suit all the use cases it's now serving. As this issue points out, it plainly doesn't work in certain circumstances (and who knows how long it hasn't worked).
  4. Colorization appears to be a main feature of it, but it's only really meant for stack creation/updates; the rest of the time it just hangs on as an appendix to the method.
  5. Errors are "papered over": a generic message is printed and the exception is not re-raised, so the calling code assumes the invocation succeeded even when it failed.

The most egregious example of these problems is the _generate_text function:

  • It has a single comment, which appears not to be correct. It has no docstrings or type annotations, so it's not clear how it should operate.
  • Most of the variable names are so heavily abbreviated that it's hard to tell what they hold.
  • It hard-codes what appears to be a "Stack" column header (even when that's not actually a relevant header).
  • The actual output format it is meant to produce isn't clear. It appears to be some form of table, but that's hard to tell from the code.

I think what the application needs is a universal outputter that responds to the output format argument. It should be well documented and well tested. But overall, I think we really ought to look at completely replacing the "guts" of the write() method with a utility that can output almost anything, without the many assumptions that are implicit in the current write() function.
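
As a rough illustration of the shape such a utility might take (the name and the exact set of formats here are hypothetical, not a proposal for the final interface):

import json

import yaml

def render(data, output_format):
    # Hypothetical universal outputter: no assumptions about the shape of
    # `data` beyond it being JSON-serialisable once datetimes and other
    # boto3 types are coerced to strings.
    plain = json.loads(json.dumps(data, default=str))
    if output_format == "json":
        return json.dumps(plain, indent=2)
    if output_format == "yaml":
        return yaml.safe_dump(plain, default_flow_style=False)
    return str(plain)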
