Checklist metadata validation and checklist mapper severities (#2750)
* input validation for checklist metadata

Signed-off-by: kemley76 <[email protected]>

* use hdf-converters in hdf2ckl

Signed-off-by: kemley76 <[email protected]>

* updated hdf2ckl tests

Signed-off-by: kemley76 <[email protected]>

* update tests based on changes to ckl mapper

Signed-off-by: Kaden Emley <[email protected]>

* update ckl metadata validation to use hdf-converters helper function

Signed-off-by: Kaden Emley <[email protected]>

* added ability to use local install of inspecjs

Signed-off-by: Kaden Emley <[email protected]>

* update checklist commands and tests

Signed-off-by: Kaden Emley <[email protected]>

* ensure threshold counts stay based on impact

Signed-off-by: Kaden Emley <[email protected]>

* added tests to ensure that converting with invalid metadata displays an error message

Signed-off-by: Kaden Emley <[email protected]>

* use checklist types from hdf-converters

Signed-off-by: Kaden Emley <[email protected]>

* remove redundant code in hdf2ckl command

Signed-off-by: Kaden Emley <[email protected]>

* use inspecJS to convert impact to severity

Signed-off-by: Kaden Emley <[email protected]>

* use checklist types from hdf-converters

Signed-off-by: Kaden Emley <[email protected]>

* fix test data

Signed-off-by: Kaden Emley <[email protected]>

* enforce enum matching for user input in generate ckl_metadata command

Signed-off-by: Kaden Emley <[email protected]>

* add backwards compatibility for old checklist metadata format

Signed-off-by: Kaden Emley <[email protected]>

* remove debug statement

Signed-off-by: Kaden Emley <[email protected]>

* fix code smells

Signed-off-by: Kaden Emley <[email protected]>

* linting

Signed-off-by: Kaden Emley <[email protected]>

* format every output JSON file with 2-space indent

Signed-off-by: Kaden Emley <[email protected]>

* add flags for all metadata fields on hdf2ckl command

Signed-off-by: Kaden Emley <[email protected]>

* clarify instructions on ckl metadata generation

Signed-off-by: Kaden Emley <[email protected]>

* change formatting from 4-space to 2-space indent

Signed-off-by: Kaden Emley <[email protected]>

* make version and release number optional in checklist metadata generation

Signed-off-by: Kaden Emley <[email protected]>

* update tests to reflect better formatted error messages

Signed-off-by: Kaden Emley <[email protected]>

* update markdown summary table to include row for severity: none

Signed-off-by: Kaden Emley <[email protected]>

* update code and tests to count N/A controls with severity other than none

Signed-off-by: Kaden Emley <[email protected]>

* fix code smells

Signed-off-by: Kaden Emley <[email protected]>

* revert addition of severity-none row to markdown summary table

Signed-off-by: Kaden Emley <[email protected]>

* remove heimdall version when running checklist tests

Signed-off-by: Kaden Emley <[email protected]>

* change return type of string | undefined to string | null

Signed-off-by: Kaden Emley <[email protected]>

---------

Signed-off-by: kemley76 <[email protected]>
Signed-off-by: Kaden Emley <[email protected]>
Co-authored-by: Amndeep Singh Mann <[email protected]>
kemley76 and Amndeep7 authored Jul 31, 2024
1 parent e867fb1 commit d0fc78a
Showing 71 changed files with 388,745 additions and 91,346 deletions.
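
Context for the severity-related commits above: checklist severities are now derived from HDF impact scores via inspecJS instead of being computed ad hoc in the CLI. As a rough sketch of that mapping (the cutoffs follow the usual HDF convention; the function name here is a hypothetical stand-in for the real inspecjs helper):

```typescript
// Sketch of the impact-to-severity mapping per the usual HDF convention.
// Hypothetical stand-in for the inspecjs helper referenced in the commits.
type Severity = 'none' | 'low' | 'medium' | 'high' | 'critical'

function severityFromImpact(impact: number): Severity {
  if (impact < 0 || impact > 1) throw new Error(`Impact out of range: ${impact}`)
  if (impact < 0.1) return 'none'
  if (impact < 0.4) return 'low'
  if (impact < 0.7) return 'medium'
  if (impact < 0.9) return 'high'
  return 'critical'
}
```

Note the related commit keeping threshold counts based on impact: severity is a display concern, while pass/fail threshold math continues to use the raw impact values.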
47 changes: 37 additions & 10 deletions README.md
@@ -383,20 +383,47 @@ convert hdf2ckl Translate a Heimdall Data Format JSON file into a
DISA checklist file
USAGE
$ saf convert hdf2ckl -i <hdf-scan-results-json> -o <output-ckl> [-h] [-m <metadata>] [-H <hostname>] [-F <fqdn>] [-M <mac-address>] [-I <ip-address>]
$ saf convert hdf2ckl -i <hdf-scan-results-json> -o <output-ckl> [-h] [-m <metadata>] [--profilename <value>] [--profiletitle <value>] [--version <value>] [--releasenumber <value>] [--releasedate <value>] [--marking <value>] [-H <value>] [-I <value>] [-M <value>] [-F <value>] [--targetcomment <value>] [--role Domain Controller|Member Server|None|Workstation] [--assettype Computing|Non-Computing] [--techarea |Application Review|Boundary Security|CDS Admin Review|CDS Technical Review|Database Review|Domain Name System (DNS)|Exchange Server|Host Based System Security (HBSS)|Internal Network|Mobility|Other Review|Releasable Networks (REL)|Releaseable Networks (REL)|Traditional Security|UNIX OS|VVOIP Review|Web Review|Windows OS] [--stigguid <value>] [--targetkey <value>] [--webdbsite <value> --webordatabase] [--webdbinstance <value>] [--vulidmapping gid|id]
FLAGS
-F, --fqdn=<fqdn> FQDN for CKL metadata
-H, --hostname=<hostname> Hostname for CKL metadata
-I, --ip=<ip-address> IP address for CKL metadata
-M, --mac=<mac-address> MAC address for CKL metadata
-h, --help Show CLI help.
-i, --input=<hdf-scan-results-json> (required) Input HDF file
-m, --metadata=<metadata> Metadata JSON file, generate one with "saf generate ckl_metadata"
-o, --output=<output-ckl> (required) Output CKL file
-h, --help Show CLI help.
-i, --input=<value> (required) Input HDF file
-o, --output=<value> (required) Output CKL file
CHECKLIST METADATA FLAGS
-F, --fqdn=<value> Fully Qualified Domain Name
-H, --hostname=<value> The name assigned to the asset within the network
-I, --ip=<value> IP address
-M, --mac=<value> MAC address
-m, --metadata=<value> Metadata JSON file, generate one with "saf generate ckl_metadata"
--assettype=<option> The category or classification of the asset
<options: Computing|Non-Computing>
--marking=<value> A security classification or designation of the asset, indicating its sensitivity level
--profilename=<value> Profile name
--profiletitle=<value> Profile title
--releasedate=<value> Profile release date
--releasenumber=<value> Profile release number
--role=<option> The primary function or role of the asset within the network or organization
<options: Domain Controller|Member Server|None|Workstation>
--stigguid=<value> A unique identifier associated with the STIG for the asset
--targetcomment=<value> Additional comments or notes about the asset
--targetkey=<value> A unique key or identifier for the asset within the checklist or inventory system
--techarea=<option> The technical area or domain to which the asset belongs
<options: |Application Review|Boundary Security|CDS Admin Review|CDS Technical Review|Database Review|Domain Name System (DNS)|Exchange Server|Host Based System Security (HBSS)|Internal Network|Mobility|Other Review|Releasable Networks (REL)|Releaseable Networks (REL)|Traditional Security|UNIX OS|VVOIP Review|Web Review|Windows OS>
--version=<value> Profile version number
--vulidmapping=<option> Which type of control identifier to map to the checklist ID
<options: gid|id>
--webdbinstance=<value> The specific instance of the web application or database running on the server
--webdbsite=<value> The specific site or application hosted on the web or database server
--webordatabase Indicates whether the STIG is primarily for either a web or database server
DESCRIPTION
Translate a Heimdall Data Format JSON file into a DISA checklist file
EXAMPLES
$ saf convert hdf2ckl -i rhel7-results.json -o rhel7.ckl --fqdn reverseproxy.example.org --hostname reverseproxy --ip 10.0.0.3 --mac 12:34:56:78:90
$ saf convert hdf2ckl -i rhel7-results.json -o rhel7.ckl --fqdn reverseproxy.example.org --hostname reverseproxy --ip 10.0.0.3 --mac 12:34:56:78:90:AB
$ saf convert hdf2ckl -i rhel8-results.json -o rhel8.ckl -m rhel8-metadata.json
```
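For orientation, here is a minimal sketch of a metadata file that could be passed via `-m`. The field names below are assumptions that mirror the checklist metadata flags above, not a confirmed schema; generate an authoritative file with `saf generate ckl_metadata` rather than writing one by hand.

```typescript
// Hypothetical contents of a metadata file for the -m flag. Field names are
// assumed to mirror the CHECKLIST METADATA FLAGS above -- generate a real
// file with "saf generate ckl_metadata" instead of hand-writing one.
const exampleCklMetadata = {
  profiles: [
    {name: 'Red Hat Enterprise Linux 8 STIG', version: 1, releasenumber: 14},
  ],
  hostname: 'reverseproxy',
  fqdn: 'reverseproxy.example.org',
  ip: '10.0.0.3',
  mac: '12:34:56:78:90:AB',
  role: 'Member Server',   // one of: Domain Controller|Member Server|None|Workstation
  assettype: 'Computing',  // one of: Computing|Non-Computing
  vulidmapping: 'gid',     // one of: gid|id
}
```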
[top](#convert-hdf-to-other-formats)
#### HDF to CSV
56 changes: 56 additions & 0 deletions pack-inspecjs.bat
@@ -0,0 +1,56 @@
ECHO OFF

SET CYPRESS_INSTALL_BINARY=0
SET PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true

SET original_dir=%cd%
ECHO %original_dir%

IF DEFINED npm_config_heimdall (
CD %npm_config_heimdall%/libs/inspecjs/
) ELSE (
CD ../heimdall2/libs/inspecjs/
)

IF DEFINED npm_config_branch (
CALL git switch %npm_config_branch% || EXIT /B %ERRORLEVEL%
) ELSE (
CALL git switch master || EXIT /B %ERRORLEVEL%
)

ECHO Executing - git fetch ...
CALL git fetch || EXIT /B %ERRORLEVEL%

ECHO Executing - git pull ...
CALL git pull || EXIT /B %ERRORLEVEL%

ECHO Executing - yarn install ...
CALL yarn install || EXIT /B %ERRORLEVEL%

ECHO Executing - yarn pack ...
CALL yarn pack || EXIT /B %ERRORLEVEL%

ECHO Finished generating the tarball

CD %original_dir%

ECHO Executing - npm install remote ...
CALL npm i || EXIT /B %ERRORLEVEL%

ECHO Executing - npm install local ...

IF DEFINED npm_config_heimdall (
FOR /f "tokens=*" %%a IN ('dir /b %npm_config_heimdall%\libs\inspecjs\inspecjs-v*.tgz') DO (
SET THIS_TAR_ZIP=%npm_config_heimdall%\libs\inspecjs\%%a
)
) ELSE (
FOR /f "tokens=*" %%a IN ('dir /b ..\heimdall2\libs\inspecjs\inspecjs-v*.tgz') DO (
SET THIS_TAR_ZIP=..\heimdall2\libs\inspecjs\%%a
)
)
CALL npm i %THIS_TAR_ZIP% || EXIT /B %ERRORLEVEL%

ECHO Executing - npm run prepack ...
CALL npm run prepack || EXIT /B %ERRORLEVEL%

ECHO Install of local inspecjs complete.
40 changes: 40 additions & 0 deletions pack-inspecjs.sh
@@ -0,0 +1,40 @@
#!/bin/bash

set -o errexit # abort on nonzero exit status
set -o nounset # abort on unbound variable
set -o pipefail # don't hide errors within pipes

ORIGINAL=$PWD
echo $ORIGINAL

cd "${npm_config_heimdall:-../heimdall2}"
cd libs/inspecjs

git switch "${npm_config_branch:-master}"

echo "Executing - git fetch ..."
git fetch

echo "Executing - git pull ..."
git pull

echo "Executing - yarn install ..."
CYPRESS_INSTALL_BINARY=0 PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true yarn install

echo "Executing - yarn pack ..."
yarn pack

echo "Finished generating the tarball"

cd "$ORIGINAL"

echo "Executing - npm install remote ..."
npm i

echo "Executing - npm install local ..."
npm i "${npm_config_heimdall:-../heimdall2}/libs/inspecjs/inspecjs-v"*".tgz"

echo "Executing - npm run prepack ..."
npm run prepack

echo "Install of local inspecjs complete."
5 changes: 4 additions & 1 deletion package.json
@@ -196,7 +196,10 @@
"prepack:darwin:linux": "rm -rf lib && tsc",
"pack-hdf-converters": "run-script-os",
"pack-hdf-converters:win32": "pack-hdf-converters.bat",
"pack-hdf-converters:darwin:linux": "./pack-hdf-converters.sh"
"pack-hdf-converters:darwin:linux": "./pack-hdf-converters.sh",
"pack-inspecjs": "run-script-os",
"pack-inspecjs:win32": "pack-inspecjs.bat",
"pack-inspecjs:darwin:linux": "./pack-inspecjs.sh"
},
"types": "lib/index.d.ts",
"jest": {
2 changes: 1 addition & 1 deletion src/commands/convert/asff2hdf.ts
@@ -196,7 +196,7 @@ export default class ASFF2HDF extends Command {
_.forOwn(results, (result, filename) => {
fs.writeFileSync(
path.join(flags.output, checkSuffix(filename)),
JSON.stringify(result),
JSON.stringify(result, null, 2),
)
})
}
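
This `JSON.stringify(result, null, 2)` change repeats through the converters below; a quick sketch of what the extra arguments do to the emitted files:

```typescript
const result = {profiles: [{name: 'example'}]}

// Before this commit: compact, single-line output.
JSON.stringify(result)
// => '{"profiles":[{"name":"example"}]}'

// After: the third argument pretty-prints with a 2-space indent.
JSON.stringify(result, null, 2)
// => {
//      "profiles": [
//        {
//          "name": "example"
//        }
//      ]
//    }
```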
2 changes: 1 addition & 1 deletion src/commands/convert/aws_config2hdf.ts
@@ -57,6 +57,6 @@ export default class AWSConfig2HDF extends Command {
region: flags.region,
}, !flags.insecure, flags.certificate ? fs.readFileSync(flags.certificate, 'utf8') : undefined) : new Mapper({region: flags.region}, !flags.insecure, flags.certificate ? fs.readFileSync(flags.certificate, 'utf8') : undefined)

fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(this.ensureRefs(await converter.toHdf())))
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(this.ensureRefs(await converter.toHdf()), null, 2))
}
}
2 changes: 1 addition & 1 deletion src/commands/convert/burpsuite2hdf.ts
@@ -25,6 +25,6 @@ export default class Burpsuite2HDF extends Command {
checkInput({data, filename: flags.input}, 'burp', 'BurpSuite Pro XML')

const converter = new Mapper(data, flags['with-raw'])
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf()))
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf(), null, 2))
}
}
8 changes: 6 additions & 2 deletions src/commands/convert/ckl2hdf.ts
@@ -23,7 +23,11 @@ export default class CKL2HDF extends Command {
const data = fs.readFileSync(flags.input, 'utf8')
checkInput({data, filename: flags.input}, 'checklist', 'DISA Checklist')

const converter = new Mapper(data, flags['with-raw'])
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf()))
try {
const converter = new Mapper(data, flags['with-raw'])
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf(), null, 2))
} catch (error) {
console.error(`Error converting to hdf:\n${error}`)
}
}
}
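
The new try/catch matters because, with the metadata validation added in this PR, the checklist mapper can now throw on malformed input instead of silently producing bad HDF. A minimal sketch of the failure path, under the assumption that the Mapper import matches the convention used by the other converter commands in this repository:

```typescript
import fs from 'fs'
// Import name assumed; treat this as a sketch, not a confirmed API.
import {ChecklistResults as Mapper} from '@mitre/hdf-converters'

// "bad-metadata.ckl" is a hypothetical file whose embedded metadata fails
// validation; per the tests added in this PR, the mapper is expected to
// throw, and the command surfaces a readable error instead of a stack trace.
const data = fs.readFileSync('bad-metadata.ckl', 'utf8')
try {
  const converter = new Mapper(data)
  console.log(JSON.stringify(converter.toHdf(), null, 2))
} catch (error) {
  console.error(`Error converting to hdf:\n${error}`)
}
```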
2 changes: 1 addition & 1 deletion src/commands/convert/conveyor2hdf.ts
@@ -29,7 +29,7 @@ export default class Conveyor2HDF extends Command {
for (const [filename, result] of Object.entries(results)) {
fs.writeFileSync(
path.join(flags.output, checkSuffix(filename)),
JSON.stringify(result),
JSON.stringify(result, null, 2),
)
}
}
2 changes: 1 addition & 1 deletion src/commands/convert/dbprotect2hdf.ts
@@ -25,6 +25,6 @@ export default class DBProtect2HDF extends Command {
checkInput({data, filename: flags.input}, 'dbProtect', 'DBProtect report in "Check Results Details" XML format')

const converter = new Mapper(data, flags['with-raw'])
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf()))
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf(), null, 2))
}
}
2 changes: 1 addition & 1 deletion src/commands/convert/fortify2hdf.ts
@@ -25,6 +25,6 @@ export default class Fortify2HDF extends Command {
checkInput({data, filename: flags.input}, 'fortify', 'Fortify results FVDL file')

const converter = new Mapper(data, flags['with-raw'])
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf()))
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf(), null, 2))
}
}
2 changes: 1 addition & 1 deletion src/commands/convert/gosec2hdf.ts
@@ -25,6 +25,6 @@ export default class Gosec2HDF extends Command {
checkInput({data, filename: flags.input}, 'gosec', 'gosec results JSON')

const converter = new Mapper(data, flags['with-raw'])
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf()))
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf(), null, 2))
}
}
4 changes: 2 additions & 2 deletions src/commands/convert/hdf2asff.ts
@@ -45,11 +45,11 @@ export default class HDF2ASFF extends Command {
fs.mkdirSync(outputFolder)
if (convertedSlices.length === 1) {
const outfilePath = path.join(outputFolder, convertFullPathToFilename(checkSuffix(flags.output)))
fs.writeFileSync(outfilePath, JSON.stringify(convertedSlices[0]))
fs.writeFileSync(outfilePath, JSON.stringify(convertedSlices[0], null, 2))
} else {
convertedSlices.forEach((slice, index) => {
const outfilePath = path.join(outputFolder, `${convertFullPathToFilename(checkSuffix(flags.output || '')).replace('.json', '')}.p${index}.json`)
fs.writeFileSync(outfilePath, JSON.stringify(slice))
fs.writeFileSync(outfilePath, JSON.stringify(slice, null, 2))
})
}
}
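
One detail of the multi-slice branch above: for an `-o` value of `findings.json` that splits into three slices, the loop writes `findings.p0.json`, `findings.p1.json`, and `findings.p2.json` inside the output folder, while the single-slice case keeps the plain `findings.json` name.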