Utils passthrough target (#636)
* added documentation for attest in the cli

Signed-off-by: Amndeep Singh Mann <[email protected]>

* created passthrough write

Signed-off-by: Amndeep Singh Mann <[email protected]>

* added passthrough read

Signed-off-by: Amndeep Singh Mann <[email protected]>

* added target read and write - regex replace passthrough -> target

Signed-off-by: Amndeep Singh Mann <[email protected]>

* debugging failing tests

Signed-off-by: Amndeep Singh Mann <[email protected]>

* debugging

Signed-off-by: Amndeep Singh Mann <[email protected]>

* run tests in parallel

Signed-off-by: Amndeep Singh Mann <[email protected]>

* don't capture stdout for no reason

Signed-off-by: Amndeep Singh Mann <[email protected]>

* updated name from utils to supplement and fixed documentation

Signed-off-by: Amndeep Singh Mann <[email protected]>

* put flag in the right place for mocha

Signed-off-by: Amndeep Singh Mann <[email protected]>

* the parallel shorthand for mocha is the same as the project shorthand for ts-mocha, and ts-mocha takes priority

Signed-off-by: Amndeep Singh Mann <[email protected]>

* giving up on parallelization since now it only checks the first test before moving on in github actions

Signed-off-by: Amndeep Singh Mann <[email protected]>

* execsync returns stdout

Signed-off-by: Amndeep Singh Mann <[email protected]>

* returns a buffer that needs to be converted into a string, also print out more debugging lines

Signed-off-by: Amndeep Singh Mann <[email protected]>

* tree one causes failures for some reason

Signed-off-by: Amndeep Singh Mann <[email protected]>

* debugging

Signed-off-by: Amndeep Singh Mann <[email protected]>

* debugging

Signed-off-by: Amndeep Singh Mann <[email protected]>

* more debugging - the mkdir command does not seem to be generating a file so let's see if an error is thrown

Signed-off-by: Amndeep Singh Mann <[email protected]>

* error handling didn't get triggered in mapper integration but the directory was definitely not created so now trying to create a directory by hand

Signed-off-by: Amndeep Singh Mann <[email protected]>

* mkdir direct was able to create the dir, what about fs.mkdirsync

Signed-off-by: Amndeep Singh Mann <[email protected]>

* fs.mkdirsync returns a string or undefined

Signed-off-by: Amndeep Singh Mann <[email protected]>

* check what's going on after the mkdir in the mapper integration

Signed-off-by: Amndeep Singh Mann <[email protected]>

* forgot to print out results of the buffer on the test side

Signed-off-by: Amndeep Singh Mann <[email protected]>

* forgot that the generic convert called the mappers directly itself (should it though?)

Signed-off-by: Amndeep Singh Mann <[email protected]>

* make it print out results the other way too

Signed-off-by: Amndeep Singh Mann <[email protected]>

* debugging

Signed-off-by: Amndeep Singh Mann <[email protected]>

* maybe it's the complaints about imports (which only show up there and not locally, btw) that are causing the issue?

Signed-off-by: Amndeep Singh Mann <[email protected]>

* why is it putting out the help text which is also wrong

Signed-off-by: Amndeep Singh Mann <[email protected]>

* it's just using completely the wrong command it seems.  gonna try killing the cache and seeing what happens

Signed-off-by: Amndeep Singh Mann <[email protected]>

* the saf cli uses npm not yarn

Signed-off-by: Amndeep Singh Mann <[email protected]>

* syntax differences between yarn and npm

Signed-off-by: Amndeep Singh Mann <[email protected]>

* fixed caching

Signed-off-by: Amndeep Singh Mann <[email protected]>

* remove debugging

Signed-off-by: Amndeep Singh Mann <[email protected]>

* remove more debugging

Signed-off-by: Amndeep Singh Mann <[email protected]>

* remove the last bit of debugging

Signed-off-by: Amndeep Singh Mann <[email protected]>

* added tests

Signed-off-by: Amndeep Singh Mann <[email protected]>

* readme

Signed-off-by: Amndeep Singh Mann <[email protected]>

* moved supplement so it's not cutting off the 'other notes' subsection from the 'generate' section

Signed-off-by: Amndeep Singh Mann <[email protected]>

* added one-liner example

Signed-off-by: Amndeep Singh Mann <[email protected]>

* added example using -f and -o for the supplement commands

Signed-off-by: Amndeep Singh Mann <[email protected]>

* added more examples to attest subcommands and put them in the readme

Signed-off-by: Amndeep Singh Mann <[email protected]>

* clarified "attest" description

"Attesting to 'Not Reviewed' control requirements (that can’t be tested automatically by security tools and hence require manual review), helping to account for all requirements."

* clarified "attest" description

"Attesting to 'Not Reviewed' controls can be done with the saf attest commands. Sometimes requirements can’t be tested automatically by security tools and hence require manual review, whereby someone interviews people and/or examines a system to confirm (i.e., attest as to) whether the control requirements have been satisfied."

* added sample passthrough/target jsons

* added sample passthrough and target jsons to README

* clarified sample passthrough/target jsons

* Clarified sample passthrough json

* Clarified sample target json

* documentation revisions

Signed-off-by: Amndeep Singh Mann <[email protected]>

* added samples to readme

Signed-off-by: Amndeep Singh Mann <[email protected]>

* 'sample' is highlighted in red cause it ain't valid json lol

Signed-off-by: Amndeep Singh Mann <[email protected]>

Signed-off-by: Amndeep Singh Mann <[email protected]>
Co-authored-by: Eugene Aronne <[email protected]>
Amndeep7 and ejaronne authored Sep 30, 2022
1 parent 8bc9ffd commit e37b456
Showing 19 changed files with 861 additions and 35 deletions.
24 changes: 11 additions & 13 deletions .github/workflows/e2e-ci.yml
@@ -13,29 +13,27 @@ jobs:
    steps:
    - uses: actions/checkout@v2

-   - name: Setup Node.js
-     uses: actions/setup-node@v1
-     with:
-       node-version: '16.x'
-
    - name: Cache node modules
      uses: actions/cache@v2
-     env:
-       cache-name: cache-node-modules
      with:
        # npm cache files are stored in `~/.npm` on Linux/macOS
        path: ~/.npm
-       key: ${{ runner.os }}-build-${{ env.cache-name }}-${{ hashFiles('**/yarn.lock') }}
+       key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
        restore-keys: |
-         ${{ runner.os }}-build-${{ env.cache-name }}-
-         ${{ runner.os }}-build-
-         ${{ runner.os }}-
+         ${{ runner.os }}-node-
+
+   - name: Setup Node.js
+     uses: actions/setup-node@v1
+     with:
+       node-version: '16.x'
    - name: Install dependencies
-     run: yarn install
+     run: npm install

    - name: Prepack
-     run: yarn prepack
+     run: npm run prepack

    - name: Run e2e tests
-     run: yarn test
+     run: npm run test
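Per the commit messages above, this hunk switches the CI workflow from yarn to npm: the cache key now hashes the npm lockfile rather than `yarn.lock`, the `env`-based cache-name indirection is dropped, and the install/prepack/test steps invoke `npm` directly.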
201 changes: 183 additions & 18 deletions README.md
@@ -51,6 +51,7 @@ The SAF CLI is the successor to [Heimdall Tools](https://github.com/mitre/heimda
* [View](#view) - Identify overall security status and deep-dive to solve specific security defects
* [Validate](#validate) - Verify pipeline thresholds
* [Generate](#generate) - Generate InSpec validation code, set pipeline thresholds, and generate options to support other saf commands.
* [Supplement](#supplement) - Supplement elements that provide contextual information in an HDF file such as `passthrough` or `target`
* Scan - Visit https://saf.mitre.org/#/validate to explore and run inspec profiles
* Harden - Visit https://saf.mitre.org/#/harden to explore and run hardening scripts

@@ -149,28 +150,44 @@ To update the SAF CLI on Windows, uninstall any existing version from your syste

### Attest

-Attesting to 'Not Reviewed' controls can be done with the `saf attest` commands. `saf attest create` lets you create attestation files and `saf attest apply` lets you apply attestation files
+Attest to 'Not Reviewed' controls: sometimes requirements can’t be tested automatically by security tools and hence require manual review, whereby someone interviews people and/or examines a system to confirm (i.e., attest as to) whether the control requirements have been satisfied.

#### Create Attestations
```
attest create             Create attestation files for use with `saf attest apply`
-FLAGS
-  -h, --help             Show CLI help.
-  -i, --input=<value>    (optional) An input HDF file used to search for controls
-  -o, --output=<value>   (required) The output filename
-  -t, --format=<option>  [default: json] (optional) The output file type
-                         <options: json|xlsx|yml|yaml>
+USAGE
+  $ saf attest create -o <attestation-file> [-i <hdf-json> -t <json | xlsx | yml | yaml>]
+FLAGS
+  -h, --help             Show CLI help.
+  -i, --input=<value>    (optional) An input HDF file to search for controls
+  -o, --output=<value>   (required) The output filename
+  -t, --format=<option>  [default: json] (optional) The output file type
+                         <options: json|xlsx|yml|yaml>
+EXAMPLES
+  $ saf attest create -o attestation.json -i hdf.json
+  $ saf attest create -o attestation.xlsx -t xlsx
```

#### Apply Attestations
```
attest apply              Apply one or more attestation files to one or more HDF results sets
-FLAGS
-  -h, --help              Show CLI help.
-  -i, --input=<value>...  (required) Your input HDF and Attestation file(s)
-  -o, --output=<value>    (required) Output file or folder (for multiple executions)
+USAGE
+  $ saf attest apply -i <input-hdf-json>... <attestation>... -o <output-hdf-path>
+FLAGS
+  -h, --help              Show CLI help.
+  -i, --input=<value>...  (required) Your input HDF and Attestation file(s)
+  -o, --output=<value>    (required) Output file or folder (for multiple executions)
+EXAMPLES
+  $ saf attest apply -i hdf.json attestation.json -o new-hdf.json
+  $ saf attest apply -i hdf1.json hdf2.json attestation.xlsx -o outputDir
```

### Convert
@@ -1001,14 +1018,8 @@ generate xccdf2inspec_stub Translate a DISA STIG XCCDF XML file int
  -s, --singleFile Output the resulting controls as a single file
```

#### Other

##### Notes

- Specifying the `--format` flag as either `cis` or `disa` will parse the input spreadsheet according to the standard formats for CIS Benchmark exports and DISA STIG exports, respectively.
@@ -1050,8 +1061,162 @@ ref: # InSpec keyword - saf will check this column for
Where the keys (`title`) are InSpec control attributes and the values (`- Title`) are the column headers in the input spreadsheet that correspond to that attribute.

&nbsp;
---

### Supplement

Supplement (ex. read or modify) elements that provide contextual information in an HDF file such as `passthrough` or `target`

#### Passthrough

Supplement (ex. read or modify) the `passthrough` element, which provides contextual information in the Heimdall Data Format results JSON file

```
EXAMPLE (combined read, modification, and overwrite of the original file)
$ saf supplement passthrough read -i hdf_with_passthrough.json | jq -rc '.key = "new value"' | xargs -0 -I{} saf supplement passthrough write -i hdf_with_passthrough.json -d {}
```
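In this one-liner, `read` emits the current `passthrough` object to stdout, `jq -rc` rewrites it as compact JSON (the `.key = "new value"` filter is a placeholder edit), and `xargs -0 -I{}` hands the entire modified document to `write` through the `-d` flag.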
Passthrough data can be any context/structure. See the sample below and more at https://github.com/mitre/saf/wiki/Supplement-HDF-files-with-additional-information-(ex.-%60passthrough%60,-%60target%60)
```json
{
  "CDM": {
    "HWAM": {
      "Asset_ID_Tattoo": "arn:aws:ec2:us-east-1:123456789012:instance/i-12345acbd5678efgh90",
      "Data_Center_ID": "1234-5678-ABCD-1BB1-CC12DD34EE56FF78",
      "FQDN": "i-12345acbd5678efgh90.ec2.internal",
      "Hostname": "i-12345acbd5678efgh90",
      "ipv4": "10.0.1.25",
      "ipv6": "none defined",
      "mac": "02:32:fd:e3:68:a1",
      "os": "Linux",
      "FISMA_ID": "ABCD2C21-7781-92AA-F126-FF987CZZZZ"
    },
    "CSM": {
      "Server_Type": "member server",
      "source_tool": "InSpec"
    }
  }
}
```
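If this sample were saved as, say, `cdm-passthrough.json` (a hypothetical filename), it could be attached to an HDF file in one step with the write subcommand documented below:
```
$ saf supplement passthrough write -i hdf.json -f cdm-passthrough.json
```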

##### Read

```
supplement passthrough read   Read the `passthrough` attribute in a given Heimdall Data Format JSON file and send it to stdout or write it to a file
USAGE
  $ saf supplement passthrough read -i <hdf-json> [-o <passthrough-json>]
FLAGS
  -h, --help            Show CLI help.
  -i, --input=<value>   (required) An input HDF file
  -o, --output=<value>  An output `passthrough` JSON file (otherwise the data is sent to stdout)
EXAMPLES
  $ saf supplement passthrough read -i hdf.json -o passthrough.json
```

##### Write

```
supplement passthrough write   Overwrite the `passthrough` attribute in a given HDF file with the provided `passthrough` JSON data
USAGE
  $ saf supplement passthrough write -i <input-hdf-json> (-f <input-passthrough-json> | -d <passthrough-json>) [-o <output-hdf-json>]
FLAGS
  -d, --passthroughData=<value>  Input passthrough-data (can be any valid JSON); this flag or `passthroughFile` must be provided
  -f, --passthroughFile=<value>  An input passthrough-data file (can contain any valid JSON); this flag or `passthroughData` must be provided
  -h, --help                     Show CLI help.
  -i, --input=<value>            (required) An input Heimdall Data Format file
  -o, --output=<value>           An output Heimdall Data Format JSON file (otherwise the input file is overwritten)
DESCRIPTION
  Passthrough data can be any context/structure. See sample ideas at https://github.com/mitre/saf/wiki/Supplement-HDF-files-with-additional-information-(ex.-%60passthrough%60,-%60target%60)
EXAMPLES
  $ saf supplement passthrough write -i hdf.json -d '{"a": 5}'
  $ saf supplement passthrough write -i hdf.json -f passthrough.json -o new-hdf.json
```

#### Target

Supplement (ex. read or modify) the `target` element, which provides contextual information in the Heimdall Data Format results JSON file

```
EXAMPLE (combined read, modification, and overwrite of the original file)
$ saf supplement target read -i hdf_with_target.json | jq -rc '.key = "new value"' | xargs -0 -I{} saf supplement target write -i hdf_with_target.json -d {}
```

Target data can be any context/structure. See the sample below and more at https://github.com/mitre/saf/wiki/Supplement-HDF-files-with-additional-information-(ex.-%60passthrough%60,-%60target%60)
```json
{
  "AWS": {
    "Resources": [
      {
        "Type": "AwsEc2Instance",
        "Id": "arn:aws:ec2:us-east-1:123456789012:instance/i-06036f0ccaa012345",
        "Partition": "aws",
        "Region": "us-east-1",
        "Details": {
          "AwsEc2Instance": {
            "Type": "t2.medium",
            "ImageId": "ami-0d716eddcc7b7abcd",
            "IpV4Addresses": [
              "10.0.0.27"
            ],
            "KeyName": "rhel7_1_10152021",
            "VpcId": "vpc-0b53ff8f37a06abcd",
            "SubnetId": "subnet-0ea14519a4ddaabcd"
          }
        }
      }
    ]
  }
}
```
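Likewise, assuming this sample were saved as `aws-target.json` (again, a hypothetical name), it could be applied with:
```
$ saf supplement target write -i hdf.json -f aws-target.json
```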

##### Read

```
supplement target read   Read the `target` attribute in a given Heimdall Data Format JSON file and send it to stdout or write it to a file
USAGE
  $ saf supplement target read -i <hdf-json> [-o <target-json>]
FLAGS
  -h, --help            Show CLI help.
  -i, --input=<value>   (required) An input HDF file
  -o, --output=<value>  An output `target` JSON file (otherwise the data is sent to stdout)
EXAMPLES
  $ saf supplement target read -i hdf.json -o target.json
```

##### Write

```
supplement target write   Overwrite the `target` attribute in a given HDF file with the provided `target` JSON data
USAGE
  $ saf supplement target write -i <input-hdf-json> (-f <input-target-json> | -d <target-json>) [-o <output-hdf-json>]
FLAGS
  -d, --targetData=<value>  Input target-data (can be any valid JSON); this flag or `targetFile` must be provided
  -f, --targetFile=<value>  An input target-data file (can contain any valid JSON); this flag or `targetData` must be provided
  -h, --help                Show CLI help.
  -i, --input=<value>       (required) An input Heimdall Data Format file
  -o, --output=<value>      An output Heimdall Data Format JSON file (otherwise the input file is overwritten)
DESCRIPTION
  Target data can be any context/structure. See sample ideas at https://github.com/mitre/saf/wiki/Supplement-HDF-files-with-additional-information-(ex.-%60passthrough%60,-%60target%60)
EXAMPLES
  $ saf supplement target write -i hdf.json -d '{"a": 5}'
  $ saf supplement target write -i hdf.json -f target.json -o new-hdf.json
```

# License and Author

12 changes: 12 additions & 0 deletions package.json
@@ -111,6 +111,9 @@
  },
  "topicSeparator": " ",
  "topics": {
+    "attest": {
+      "description": "[Attest] Attest to 'Not Reviewed' control requirements (that can’t be tested automatically by security tools and hence require manual review), helping to account for all requirements"
+    },
    "convert": {
      "description": "[Normalize] Convert security results from all your security tools between common data formats"
    },
@@ -123,6 +126,15 @@
    "scan": {
      "description": "[Scan] Scan to get detailed security testing results: Visit https://saf.mitre.org/#/validate to explore and run inspec profiles"
    },
+    "supplement": {
+      "description": "[Supplement] Supplement (ex. read or modify) elements that provide contextual information in the Heimdall Data Format results JSON file such as `passthrough` or `target`"
+    },
+    "supplement:passthrough": {
+      "description": "Supplement (ex. read or modify) the `passthrough` element, which provides contextual information in the Heimdall Data Format results JSON file"
+    },
+    "supplement:target": {
+      "description": "Supplement (ex. read or modify) the `target` element, which provides contextual information in the Heimdall Data Format results JSON file"
+    },
    "validate": {
      "description": "[Validate] Verify pipeline thresholds"
    },
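Since `topicSeparator` is set to a space at the top of this hunk, oclif surfaces these colon-delimited topic keys as space-separated subcommands: `supplement:passthrough`, for example, is invoked as `saf supplement passthrough read`.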
9 changes: 9 additions & 0 deletions src/commands/attest/apply.ts
@@ -8,6 +8,15 @@ import path from 'path'
import {convertFullPathToFilename} from '../../utils/global'

export default class ApplyAttestation extends Command {
+  static usage = 'attest apply -i <input-hdf-json>... <attestation>... -o <output-hdf-path>'
+
+  static description = 'Apply one or more attestation files to one or more HDF results sets'
+
+  static examples = [
+    'saf attest apply -i hdf.json attestation.json -o new-hdf.json',
+    'saf attest apply -i hdf1.json hdf2.json attestation.xlsx -o outputDir',
+  ]
+
  static flags = {
    help: Flags.help({char: 'h'}),
    input: Flags.string({char: 'i', required: true, multiple: true, description: 'Your input HDF and Attestation file(s)'}),
9 changes: 9 additions & 0 deletions src/commands/attest/create.ts
@@ -13,6 +13,15 @@ const MAX_SEARCH_RESULTS = 5
const prompt = promptSync()

export default class CreateAttestations extends Command {
+  static usage = 'attest create -o <attestation-file> [-i <hdf-json> -t <json | xlsx | yml | yaml>]'
+
+  static description = 'Create attestation files for use with `saf attest apply`'
+
+  static examples = [
+    'saf attest create -o attestation.json -i hdf.json',
+    'saf attest create -o attestation.xlsx -t xlsx',
+  ]
+
  static flags = {
    help: Flags.help({char: 'h'}),
    input: Flags.string({char: 'i', description: '(optional) An input HDF file to search for controls'}),
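These `static usage`, `static description`, and `static examples` properties are the metadata oclif renders into the USAGE and EXAMPLES help text quoted in the README sections above.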
31 changes: 31 additions & 0 deletions src/commands/supplement/passthrough/read.ts
@@ -0,0 +1,31 @@
import {Command, Flags} from '@oclif/core'
import {ExecJSON} from 'inspecjs'
import fs from 'fs'

export default class ReadPassthrough extends Command {
  static usage = 'supplement passthrough read -i <hdf-json> [-o <passthrough-json>]'

  static description = 'Read the `passthrough` attribute in a given Heimdall Data Format JSON file and send it to stdout or write it to a file'

  static examples = ['saf supplement passthrough read -i hdf.json -o passthrough.json']

  static flags = {
    help: Flags.help({char: 'h'}),
    input: Flags.string({char: 'i', required: true, description: 'An input HDF file'}),
    output: Flags.string({char: 'o', description: 'An output `passthrough` JSON file (otherwise the data is sent to stdout)'}),
  }

  async run() {
    const {flags} = await this.parse(ReadPassthrough)

    const input: ExecJSON.Execution & {passthrough?: unknown} = JSON.parse(fs.readFileSync(flags.input, 'utf8'))

    const passthrough = input.passthrough || {}

    if (flags.output) {
      fs.writeFileSync(flags.output, JSON.stringify(passthrough, null, 2))
    } else {
      process.stdout.write(JSON.stringify(passthrough, null, 2))
    }
  }
}
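The matching `write` command ships in this same changeset but is not shown in this excerpt. A rough sketch of how it presumably looks, inferred from the flags documented in the README above (`-i`, `-f`/`-d`, `-o`) — the class name and all internal details here are assumptions, not the committed source:

```typescript
import {Command, Flags} from '@oclif/core'
import {ExecJSON} from 'inspecjs'
import fs from 'fs'

export default class WritePassthrough extends Command {
  static usage = 'supplement passthrough write -i <input-hdf-json> (-f <input-passthrough-json> | -d <passthrough-json>) [-o <output-hdf-json>]'

  static description = 'Overwrite the `passthrough` attribute in a given HDF file with the provided `passthrough` JSON data'

  static flags = {
    help: Flags.help({char: 'h'}),
    input: Flags.string({char: 'i', required: true, description: 'An input Heimdall Data Format file'}),
    passthroughFile: Flags.string({char: 'f', exclusive: ['passthroughData'], description: 'An input passthrough-data file (can contain any valid JSON)'}),
    passthroughData: Flags.string({char: 'd', exclusive: ['passthroughFile'], description: 'Input passthrough-data (can be any valid JSON)'}),
    output: Flags.string({char: 'o', description: 'An output HDF JSON file (otherwise the input file is overwritten)'}),
  }

  async run() {
    const {flags} = await this.parse(WritePassthrough)

    const input: ExecJSON.Execution & {passthrough?: unknown} = JSON.parse(fs.readFileSync(flags.input, 'utf8'))

    // One of the two mutually exclusive data flags must be supplied.
    if (flags.passthroughData === undefined && flags.passthroughFile === undefined) {
      throw new Error('One of --passthroughData or --passthroughFile must be provided')
    }

    // Replace the passthrough attribute wholesale with the supplied JSON.
    input.passthrough = JSON.parse(flags.passthroughData ?? fs.readFileSync(flags.passthroughFile!, 'utf8'))

    // Per the documented -o behavior, fall back to overwriting the input file.
    fs.writeFileSync(flags.output ?? flags.input, JSON.stringify(input, null, 2))
  }
}
```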
