diff --git a/CHANGELOG.md b/CHANGELOG.md index 81b7b782..6093437e 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -11,9 +11,21 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 - New module for preparing files necessary for `perform` (SuperJson, ProfileAST, MapAST, ProviderJson) - New module for mocking necessary files for `perform` - Support hiding of credentials used with new security scheme Digest +- New parameter `fullError` in method `run()` to enable returning whole `PerformError` instead of string +- New static function `report` in `SuperfaceTest` to report found provider changes +- Module `matcher` for comparing old and new HTTP traffic +- Module `analyzer` for determining impact of provider changes +- Module `reporter` for reporting provider changes throughout tests +- Class `ErrorCollector` for collecting errors in `matcher` +- Environment variable `UPDATE_TRAFFIC` to replace old traffic with new, if present +- Environment variable `DISABLE_PROVIDER_CHANGES_COVERAGE` to disable collecting of test reports +- Environment variable `USE_NEW_TRAFFIC` to test with newly recorded traffic +- Errors for module `matcher` +- Error `CoverageFileNotFoundError` for correct reporting ### Removed - Parameter `client` from constructor and method `run` +- Function for omitting timestamp from perform error `removeTimestamp` ### Changed - **BREAKING CHANGE:** Updated One-SDK to [v2.0.0](https://github.com/superfaceai/one-sdk-js/releases/tag/v2.0.0) @@ -21,6 +33,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 - Move functions used for recording in `SuperfaceTest` to seperate module - Use `SecurityConfiguration` (containing merged `SecurityValue` and `SecurityScheme` interfaces) instead of using them separately - Move parameter `testInstance` from superface components to second parameter in constructor +- Return value from method `run` to `PerformError | string` +- Does not overwrite HTTP traffic recording 
when in record mode; instead, saves new traffic next to the old one with suffix `-new` ## [2.0.3] - 2022-02-15 ### Changed diff --git a/README.md b/README.md index 9de7fc84..f6f025d4 100644 --- a/README.md +++ b/README.md @@ -89,15 +89,16 @@ const superface = new SuperfaceTest( path: 'nock-recordings', fixture: 'my-recording', enableReqheadersRecording: true, + testInstance: expect } ); ``` -Given nock configuration is also stored in class. Property `path` and `fixture` is used to configure location of recordings and property `enableReqheadersRecording` is used to enable/disable recording of request headers (This is turned off by default). +Since the library uses `nock` to record HTTP traffic during perform, the second parameter of the `SuperfaceTest` constructor is a nock configuration. It contains `path` and `fixture` to configure the location of recordings, property `enableReqheadersRecording` to enable/disable recording of request headers (turned off by default) and also property `testInstance`, which lets the testing library access current test names to generate unique hashes for recordings (currently only Jest and Mocha are supported). ### Running -To test your capabilities, use method `run()`, which encapsulates nock recording and UseCase perform. It expects test configuration (similar to initializing `SuperfaceTest` class) and input. You don't need to specify `profile`, `provider` or `useCase` if you already specified them when initializing `SuperfaceTest` class. +To test your capabilities, use method `run()`, which encapsulates `nock` recording and `BoundProfileProvider` perform. It expects test configuration (similar to initializing `SuperfaceTest` class) and input. You don't need to specify `profile`, `provider` or `useCase` if you already specified them when initializing `SuperfaceTest` class. 
```typescript import { SuperfaceTest } from '@superfaceai/testing'; @@ -123,16 +124,17 @@ import { SuperfaceTest } from '@superfaceai/testing'; describe('test', () => { let superface: SuperfaceTest; - afterEach(() => { - superface = new SuperfaceTest(); + beforeAll(() => { + superface = new SuperfaceTest({ + profile: 'profile', + provider: 'provider', + useCase: 'useCase', + }); }); it('performs corretly', async () => { await expect( superface.run({ - profile: 'profile', - provider: 'provider', - useCase: 'useCase', input: { some: 'input', }, @@ -142,13 +144,34 @@ describe('test', () => { }); ``` -Method `run()` will initialize Superface client, transform all components that are represented by string to corresponding instances, check whether map is locally present based on super.json, runs perform for given usecase and returns **result** or **error** value from perform (More about perform in [One-SDK docs](https://github.com/superfaceai/one-sdk-js#performing-the-use-case)). +Method `run()` initializes the `BoundProfileProvider` class, runs perform for the given use case and returns **result** or **error** value from perform (More about perform in [One-SDK docs](https://github.com/superfaceai/one-sdk-js#performing-the-use-case)). Since the testing library doesn't use `SuperfaceClient` anymore, it is **limited to local use only**. + +[OneSDK 2.0](https://github.com/superfaceai/one-sdk-js/releases/tag/v2.0.0) no longer contains a parser, so it looks for compiled `.ast.json` files next to the original ones. To support this, a parser was added to the testing library; it is used to parse files when no AST is found. + +Method `run` also has a second parameter containing options that set up processing of recordings, described below in [Recording](#recording). +Only one parameter from this group affects the result of method `run`: `fullError`. It makes `run` return the full error from OneSDK instead of a string. 
+ +```typescript +superface.run( + { + profile: 'profile', + provider: 'provider', + useCase: 'useCase', + input: { + some: 'input', + }, + }, + { + fullError: true, + } +); +``` You can then use this return value to test your capabilities (We recommend you to use jest [snapshot testing](https://jestjs.io/docs/snapshot-testing) as seen in example above). ### Recording -Method `run()` also records HTTP traffic with `nock` library during UseCase perform and saves recorded traffic to json file. Before perform, library will decide to record HTTP traffic based on environmental variable `SUPERFACE_LIVE_API` and current test configuration. +Method `run()` also records HTTP traffic, as mentioned above, and saves the recorded traffic to a JSON file. Before perform, the library decides whether to record HTTP traffic based on the environment variable `SUPERFACE_LIVE_API` and the current test configuration. Variable `SUPERFACE_LIVE_API` specifies configuration which needs to be matched to record HTTP traffic. @@ -216,7 +239,7 @@ superface.run( ); ``` -You can also enter your own processing functions along side `processRecordings` parameter. Both have same function signature and are called either before load or before save of recordings. +You can also enter your own processing functions alongside the `processRecordings` parameter. Both have the same function signature and are called either before load or before save of recordings (see [this sequence diagram](./docs/sequence_diagram.png)). ```typescript import { RecordingDefinitions, SuperfaceTest } from '@superfaceai/testing'; @@ -252,17 +275,54 @@ superface.run( ); ``` -## Debug +## Continuous testing -You can use enviroment variable `DEBUG` to enable logging throughout testing process. +The testing library supports continuous testing with live provider traffic. This means that you can run the testing library in record mode without worrying that the old recording of traffic gets overwritten. 
The testing library compares the old recording with the new one and determines changes. If it finds changes, it saves the new traffic next to the old one with suffix `-new`. + +This recording represents new traffic and you can test your capabilities with it. The first time it records new traffic, it also uses it for the map, so you can see whether the map works with it. You can also set the environment variable `USE_NEW_TRAFFIC=true` to mock new traffic instead of old when not in record mode (it looks for a recording with suffix `-new` next to the old one). + +When you think the new recording is safe to use for testing, you can set it up as the default with the env variable `UPDATE_TRAFFIC=true`. -## Reporting -To report found changes in traffic, you can implement your own function for reporting and pass it to `SuperfaceTest.report()`. Its signature should be: -```typescript +type TestReport = { + impact: MatchImpact; + profileId: string; + providerName: string; + useCaseName: string; + recordingPath: string; + input: NonPrimitive; + result: TestingReturn; + errors: ErrorCollection; +}[]; + +type AlertFunction = (report: TestReport) => unknown | Promise; +``` + +To disable collecting and reporting this information, you can set the environment variable `DISABLE_PROVIDER_CHANGES_COVERAGE=true`. + +## Debug + +You can use the environment variable `DEBUG` to enable logging throughout the testing process. 
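For example, the variable can also be set programmatically before the testing library is loaded, e.g. from a test-runner setup file (a sketch; the Jest `setupFiles` wiring is an assumption, not something this README prescribes):

```typescript
// Hypothetical Jest setup file, registered via `setupFiles` in jest.config.js.
// `DEBUG` must be set before the testing library is imported, because the
// `debug` package reads it when its loggers are created.
process.env.DEBUG = 'superface:testing*';
```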
-`DEBUG="superface:testing:recordings*"` or `DEBUG="superface:testing:recordings:sensitive"` will enable logging of replacing actual credentials +- `DEBUG="superface:testing*"` will enable all logging +- `DEBUG="superface:testing"` will enable logging of: + - perform results + - start and end of recording/mocking HTTP traffic + - start of `beforeRecordingLoad` and `beforeRecordingSave` functions +- `DEBUG=superface:testing:setup*` will enable logging of: + - setup of recording paths and superface components (profile, provider, usecase) + - setup of super.json and local map +- `DEBUG=superface:testing:hash*` will enable logging of hashing recordings +- `DEBUG="superface:testing:recordings"` will enable logging of processing sensitive information in recordings +- `DEBUG="superface:testing:recordings*"` or `DEBUG="superface:testing:recordings:sensitive"` will also enable logging of replacing actual credentials +- `DEBUG=superface:testing:matching*` enables logging of matching recordings +- `DEBUG=superface:testing:reporter*` enables logging of reporting + +If you encounter `NetworkError` or `SdkExecutionError` during testing with mocked traffic, it usually means that the request didn’t get through. If nock (used for loading mocked traffic) can’t match a recording, the request is denied. You can debug nock matching of recordings with `DEBUG=nock*` to see what went wrong. ## Known Limitations @@ -270,7 +330,8 @@ You can use enviroment variable `DEBUG` to enable logging throughout testing pro Recordings make it possible to run tests without calling the live API. This works by trying to match a request to the requests in the existing recordings. If a match is found, the recorded response is returned. However, since the testing client saves recording for each test run in a single file, it means multiple matching requests for the same use-case and input will overwrite each other. -A workaround is to use different inputs for each each test. 
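The overwriting happens because recordings for equivalent requests end up under the same key; keying recordings by the current test name avoids the clash, which is what passing a test instance or a custom hash phrase achieves. A minimal illustration of the idea (not the library's actual implementation; the helper name is made up):

```typescript
import { createHash } from 'crypto';

// Hypothetical helper: derive a stable recording key from a test name,
// so two tests with the same use case and input still get distinct keys.
function recordingKey(testName: string): string {
  return createHash('sha256').update(testName).digest('hex').slice(0, 8);
}

// Different test names yield different keys, so recordings don't collide.
const keyA = recordingKey('performs with input A');
const keyB = recordingKey('performs with input B');
```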
+To solve this, you can enter a test instance (`expect` from Jest) or specify a custom hash phrase to differentiate between runs. +Another workaround is to use different inputs for each test. ## Support diff --git a/docs/sequence_diagram.png b/docs/sequence_diagram.png new file mode 100644 index 00000000..7b163719 Binary files /dev/null and b/docs/sequence_diagram.png differ diff --git a/package.json b/package.json index f53fa670..f779e6e8 100644 --- a/package.json +++ b/package.json @@ -44,7 +44,10 @@ }, "dependencies": { "@superfaceai/ast": "^1.2.0", + "ajv": "^8.11.0", "debug": "^4.3.2", + "genson-js": "^0.0.8", + "http-encoding": "^1.5.1", "nock": "^13.1.3" }, "peerDependencies": { diff --git a/src/common/errors.test.ts b/src/common/errors.test.ts index ead6042f..60592e5e 100644 --- a/src/common/errors.test.ts +++ b/src/common/errors.test.ts @@ -1,11 +1,14 @@ import { assertIsIOError, + BaseURLNotFoundError, ComponentUndefinedError, + CoverageFileNotFoundError, InstanceMissingError, MapUndefinedError, ProfileUndefinedError, ProviderJsonUndefinedError, RecordingPathUndefinedError, + RecordingsNotFoundError, SuperJsonNotFoundError, UnexpectedError, } from './errors'; @@ -227,6 +230,59 @@ describe('errors', () => { }); }); + describe('when throwing RecordingsNotFoundError', () => { + const error = new RecordingsNotFoundError('path/to/recording.json'); + + it('throws in correct format', () => { + expect(() => { + throw error; + }).toThrow( + 'Recordings could not be found for running mocked tests at "path/to/recording.json".\nYou must call the live API first to record API traffic.\nUse the environment variable SUPERFACE_LIVE_API to call the API and record traffic.\nSee https://github.com/superfaceai/testing#recording to learn more.' 
+ ); + }); + + it('returns correct format', () => { + expect(error.toString()).toEqual( + 'RecordingsNotFoundError: Recordings could not be found for running mocked tests at "path/to/recording.json".\nYou must call the live API first to record API traffic.\nUse the environment variable SUPERFACE_LIVE_API to call the API and record traffic.\nSee https://github.com/superfaceai/testing#recording to learn more.' + ); + }); + }); + + describe('when throwing BaseURLNotFoundError', () => { + const error = new BaseURLNotFoundError('provider'); + + it('throws in correct format', () => { + expect(() => { + throw error; + }).toThrow( + 'No base URL was found for provider "provider", configure a service in provider.json.' + ); + }); + + it('returns correct format', () => { + expect(error.toString()).toEqual( + 'BaseURLNotFoundError: No base URL was found for provider "provider", configure a service in provider.json.' + ); + }); + }); + + describe('when throwing CoverageFileNotFoundError', () => { + const samplePath = 'path/to/coverage.json'; + const error = new CoverageFileNotFoundError(samplePath); + + it('throws in correct format', () => { + expect(() => { + throw error; + }).toThrow(`No coverage file at path "${samplePath}" found.`); + }); + + it('returns correct format', () => { + expect(error.toString()).toEqual( + `CoverageFileNotFoundError: No coverage file at path "${samplePath}" found.` + ); + }); + }); + describe('when asserting error is IO error', () => { it('throws developer error correctly', async () => { expect(() => assertIsIOError(null)).toThrow(new UnexpectedError('null')); diff --git a/src/common/errors.ts b/src/common/errors.ts index dbb80cbc..cb13a78d 100644 --- a/src/common/errors.ts +++ b/src/common/errors.ts @@ -1,20 +1,25 @@ import { SDKExecutionError } from '@superfaceai/one-sdk'; import { inspect } from 'util'; -class ErrorBase extends Error { - constructor(public kind: string, public override message: string) { +export class ErrorBase extends Error { 
+ constructor(kind: string, message: string) { super(message); - this.name = kind; Object.setPrototypeOf(this, ErrorBase.prototype); + + this.name = kind; } - get [Symbol.toStringTag](): string { - return this.kind; + public get [Symbol.toStringTag](): string { + return this.name; } - override toString(): string { - return `${this.kind}: ${this.message}`; + public get kind(): string { + return this.name; + } + + public override toString(): string { + return `${this.name}: ${this.message}`; } } @@ -107,10 +112,10 @@ export class SuperJsonLoadingFailedError extends ErrorBase { } export class RecordingsNotFoundError extends ErrorBase { - constructor() { + constructor(path: string) { super( 'RecordingsNotFoundError', - 'Recordings could not be found for running mocked tests.\nYou must call the live API first to record API traffic.\nUse the environment variable SUPERFACE_LIVE_API to call the API and record traffic.\nSee https://github.com/superfaceai/testing#recording to learn more.' + `Recordings could not be found for running mocked tests at "${path}".\nYou must call the live API first to record API traffic.\nUse the environment variable SUPERFACE_LIVE_API to call the API and record traffic.\nSee https://github.com/superfaceai/testing#recording to learn more.` ); } } @@ -124,6 +129,15 @@ export class BaseURLNotFoundError extends ErrorBase { } } +export class CoverageFileNotFoundError extends ErrorBase { + constructor(path: string) { + super( + 'CoverageFileNotFoundError', + `No coverage file at path "${path}" found.` + ); + } +} + export function assertIsIOError( error: unknown ): asserts error is { code: string } { diff --git a/src/common/format.ts b/src/common/format.ts index 6e39c1dc..2fbc3af8 100644 --- a/src/common/format.ts +++ b/src/common/format.ts @@ -1,11 +1,5 @@ import { join as joinPath } from 'path'; -const ISO_DATE_REGEX = - /(\d{4})-(\d{2})-(\d{2})T(\d{2}):(\d{2}):(\d{2}(?:\.\d*)?)((-(\d{2}):(\d{2})|Z)?)/gm; - -export const removeTimestamp = (payload: 
string): string => - payload.replace(ISO_DATE_REGEX, ''); - export function getFixtureName( profileId: string, providerName: string, diff --git a/src/common/io.test.ts b/src/common/io.test.ts index dba7b637..cabd5b00 100644 --- a/src/common/io.test.ts +++ b/src/common/io.test.ts @@ -1,16 +1,23 @@ import { NormalizedSuperJsonDocument } from '@superfaceai/ast'; -import { join } from 'path'; +import { join as joinPath, resolve as resolvePath } from 'path'; import { Writable } from 'stream'; import { mockSuperJson } from '../superface/mock/super-json'; -import { exists, rimraf, streamEnd, streamWrite } from './io'; +import { + exists, + mkdirQuiet, + readFilesInDir, + rimraf, + streamEnd, + streamWrite, +} from './io'; import { OutputStream } from './output-stream'; describe('IO functions', () => { - const WORKING_DIR = join('fixtures', 'io'); + const WORKING_DIR = joinPath('fixtures', 'io'); const FIXTURE = { - superJson: join('superface', 'super.json'), + superJson: joinPath('superface', 'super.json'), }; let INITIAL_CWD: string; @@ -107,4 +114,58 @@ describe('IO functions', () => { await expect(actualPromise).resolves.toBeUndefined(); }, 10000); }); + + describe('when reading files in directory', () => { + it('fails when directory does not exist', async () => { + const dirname = 'not-existing-directory'; + + await expect(readFilesInDir(dirname)).rejects.toThrow(); + }); + + it('returns empty array when directory has no files', async () => { + const dirname = 'test'; + await mkdirQuiet(dirname); + + await expect(readFilesInDir(dirname)).resolves.toEqual([]); + }); + + it('returns list of files in directory', async () => { + const dirname = 'test'; + const expectedFileName = joinPath(dirname, 'test.json'); + + // prepare + await mkdirQuiet(dirname); + await OutputStream.writeIfAbsent(expectedFileName, 'test'); + + await expect(readFilesInDir(dirname)).resolves.toEqual( + expect.arrayContaining([resolvePath(expectedFileName)]) + ); + }); + + it('returns list of files, 
even nested in directories', async () => { + const dirname = 'test'; + const expectedFileName1 = joinPath(dirname, 'test.json'); + const expectedFileName2 = joinPath(dirname, 'nested', 'test.json'); + const expectedFileName3 = joinPath( + dirname, + 'nested', + 'nested', + 'test.json' + ); + + // prepare + const options = { dirs: true }; + await OutputStream.writeIfAbsent(expectedFileName1, 'test', options); + await OutputStream.writeIfAbsent(expectedFileName2, 'test', options); + await OutputStream.writeIfAbsent(expectedFileName3, 'test', options); + + await expect(readFilesInDir(dirname)).resolves.toEqual( + expect.arrayContaining([ + resolvePath(expectedFileName1), + resolvePath(expectedFileName2), + resolvePath(expectedFileName3), + ]) + ); + }); + }); }); diff --git a/src/common/io.ts b/src/common/io.ts index 2f4023af..2d7b2760 100644 --- a/src/common/io.ts +++ b/src/common/io.ts @@ -1,4 +1,5 @@ import * as fs from 'fs'; +import { join as joinPath, resolve as resolvePath } from 'path'; import rimrafCallback from 'rimraf'; import { Writable } from 'stream'; import { promisify } from 'util'; @@ -9,6 +10,8 @@ export const access = promisify(fs.access); export const mkdir = promisify(fs.mkdir); export const readFile = promisify(fs.readFile); export const rimraf = promisify(rimrafCallback); +export const rename = promisify(fs.rename); +export const readdir = promisify(fs.readdir); export interface WritingOptions { append?: boolean; @@ -35,6 +38,26 @@ export async function exists(path: string): Promise { return true; } +/** + * Creates a directory without erroring if it already exists. + * Returns `true` if the directory was created. + */ +export async function mkdirQuiet(path: string): Promise { + try { + await mkdir(path); + } catch (err: unknown) { + assertIsIOError(err); + + // Allow `EEXIST` because scope directory already exists. + if (err.code === 'EEXIST') { + return; + } + + // Rethrow other errors. 
+ throw err; + } +} + /** * Reads a file and converts to string. * Returns `undefined` if reading fails for any reason. @@ -67,3 +90,21 @@ export function streamEnd(stream: Writable): Promise { stream.end(); }); } + +export async function readFilesInDir(path: string): Promise { + const resolvedPath = resolvePath(path); + const dirents = await readdir(path, { withFileTypes: true }); + const files: string[] = []; + + for (const dirent of dirents) { + if (dirent.isDirectory()) { + files.push( + ...(await readFilesInDir(joinPath(resolvedPath, dirent.name))) + ); + } else { + files.push(joinPath(resolvedPath, dirent.name)); + } + } + + return files; +} diff --git a/src/nock/analyzer.test.ts b/src/nock/analyzer.test.ts new file mode 100644 index 00000000..da8692ae --- /dev/null +++ b/src/nock/analyzer.test.ts @@ -0,0 +1,88 @@ +import { analyzeChangeImpact, MatchImpact } from './analyzer'; +import { + MatchErrorRequestHeaders, + MatchErrorResponse, + MatchErrorResponseHeaders, +} from './matcher.errors'; + +describe('Analyze module', () => { + describe('analyzeChangeImpact', () => { + describe('when got no errors', () => { + it('returns no impact', () => { + expect( + analyzeChangeImpact({ + added: [], + removed: [], + changed: [], + }) + ).toBe(MatchImpact.NONE); + }); + }); + + describe('when got no breaking errors', () => { + it('returns impact PATCH', () => { + expect( + analyzeChangeImpact({ + added: [ + new MatchErrorRequestHeaders( + 'Accept', + undefined, + 'application/json' + ), + ], + removed: [], + changed: [], + }) + ).toBe(MatchImpact.PATCH); + }); + }); + + describe('when got minor change errors', () => { + it('returns impact MINOR', () => { + expect( + analyzeChangeImpact({ + added: [ + new MatchErrorResponseHeaders( + 'content-type', + undefined, + 'application/json' + ), + ], + removed: [], + changed: [], + }) + ).toBe(MatchImpact.MINOR); + }); + }); + + describe('when got breaking change errors', () => { + it('returns impact MAJOR', () => { + expect( + 
analyzeChangeImpact({ + added: [ + new MatchErrorResponseHeaders( + 'content-type', + undefined, + 'application/json' + ), + ], + removed: [], + changed: [ + new MatchErrorResponse( + { + oldResponse: { + field1: 'value', + }, + newResponse: { + field: 'value', + }, + }, + 'Response format changed' + ), + ], + }) + ).toBe(MatchImpact.MAJOR); + }); + }); + }); +}); diff --git a/src/nock/analyzer.ts b/src/nock/analyzer.ts new file mode 100644 index 00000000..5a1b1334 --- /dev/null +++ b/src/nock/analyzer.ts @@ -0,0 +1,64 @@ +import { + ErrorCollection, + MatchError, + MatchErrorLength, + MatchErrorResponse, + MatchErrorResponseHeaders, + MatchErrorStatus, +} from './matcher.errors'; + +export enum MatchImpact { + MAJOR = 'major', + MINOR = 'minor', + PATCH = 'patch', + NONE = 'none', +} + +export function analyzeChangeImpact( + errors: ErrorCollection +): MatchImpact { + // check for breaking changes + const responseDoesNotMatch = [...errors.removed, ...errors.changed].some( + error => error instanceof MatchErrorResponse + ); + const statusCodeDoesNotMatch = errors.changed.some( + error => error instanceof MatchErrorStatus + ); + const responseHeadersDoesNotMatch = [ + ...errors.removed, + ...errors.changed, + ].some(error => error instanceof MatchErrorResponseHeaders); + const recordingsCountNotMatch = [...errors.added, ...errors.removed].some( + error => error instanceof MatchErrorLength + ); + + if ( + responseDoesNotMatch || + responseHeadersDoesNotMatch || + statusCodeDoesNotMatch || + recordingsCountNotMatch + ) { + return MatchImpact.MAJOR; + } + + // check for minor changes + const responseExtended = errors.added.some( + error => error instanceof MatchErrorResponse + ); + const responseHeadersAdded = errors.added.some( + error => error instanceof MatchErrorResponseHeaders + ); + + if (responseExtended || responseHeadersAdded) { + return MatchImpact.MINOR; + } + + const errorCount = + errors.added.length + errors.removed.length + errors.changed.length; + + if 
(errorCount !== 0) { + return MatchImpact.PATCH; + } + + return MatchImpact.NONE; +} diff --git a/src/nock/error-collector.ts b/src/nock/error-collector.ts new file mode 100644 index 00000000..721e0373 --- /dev/null +++ b/src/nock/error-collector.ts @@ -0,0 +1,35 @@ +import createDebug from 'debug'; + +import { ErrorCollection, ErrorType, MatchError } from './matcher.errors'; + +const debugMatching = createDebug('superface:testing:matching'); + +export class ErrorCollector { + private readonly added: MatchError[] = []; + private readonly removed: MatchError[] = []; + private readonly changed: MatchError[] = []; + + get count(): number { + return this.added.length + this.removed.length + this.changed.length; + } + + get errors(): ErrorCollection { + return { + added: this.added, + removed: this.removed, + changed: this.changed, + }; + } + + add(type: ErrorType, error: MatchError): void { + debugMatching(error.toString()); + + if (type === ErrorType.ADD) { + this.added.push(error); + } else if (type === ErrorType.REMOVE) { + this.removed.push(error); + } else if (type === ErrorType.CHANGE) { + this.changed.push(error); + } + } +} diff --git a/src/nock/matcher.errors.ts b/src/nock/matcher.errors.ts new file mode 100644 index 00000000..fd5c5242 --- /dev/null +++ b/src/nock/matcher.errors.ts @@ -0,0 +1,154 @@ +import { inspect } from 'util'; + +import { ErrorBase } from '../common/errors'; + +export class MatchError extends ErrorBase { + constructor(kind: string, message: string) { + super(kind, message); + } +} + +export class MatchErrorLength extends MatchError { + constructor(public oldLength: number, public newLength: number) { + super( + 'MatchErrorLength', + `Number of recorded HTTP calls do not match: ${oldLength} : ${newLength}` + ); + + Object.setPrototypeOf(this, MatchErrorLength.prototype); + } +} + +export class MatchErrorMethod extends MatchError { + constructor( + public oldMethod: string | undefined, + public newMethod: string | undefined + ) { + super( + 
'MatchErrorMethod', + `Request method does not match: "${oldMethod ?? 'not-existing'}" : "${ + newMethod ?? 'not-existing' + }"` + ); + + Object.setPrototypeOf(this, MatchErrorMethod.prototype); + } +} + +export class MatchErrorStatus extends MatchError { + constructor( + public oldStatus: number | undefined, + public newStatus: number | undefined + ) { + super( + 'MatchErrorStatus', + `Status codes do not match: "${oldStatus ?? 'not-existing'}" : "${ + newStatus ?? 'not-existing' + }"` + ); + + Object.setPrototypeOf(this, MatchErrorStatus.prototype); + } +} + +export class MatchErrorBaseURL extends MatchError { + constructor(public oldBaseURL: string, public newBaseURL: string) { + super( + 'MatchErrorBaseURL', + `Request Base URL does not match: "${oldBaseURL}" : "${newBaseURL}"` + ); + + Object.setPrototypeOf(this, MatchErrorBaseURL.prototype); + } +} + +export class MatchErrorPath extends MatchError { + constructor(public oldPath: string, public newPath: string) { + super('MatchErrorPath', `Paths do not match: "${oldPath}" : "${newPath}"`); + + Object.setPrototypeOf(this, MatchErrorPath.prototype); + } +} + +export class MatchErrorRequestHeaders extends MatchError { + constructor( + public headerName: string, + public oldRequestHeader?: string, + public newRequestHeader?: string + ) { + super( + 'MatchErrorRequestHeaders', + `Request header "${headerName}" does not match: "${ + oldRequestHeader ?? 'not-existing' + }" : "${newRequestHeader ?? 'not-existing'}"` + ); + + Object.setPrototypeOf(this, MatchErrorRequestHeaders.prototype); + } +} + +export class MatchErrorResponseHeaders extends MatchError { + constructor( + public headerName: string, + public oldResponseHeader?: string, + public newResponseHeader?: string + ) { + super( + 'MatchErrorResponseHeaders', + `Response header "${headerName}" does not match: "${ + oldResponseHeader ?? 'not-existing' + }" - "${newResponseHeader ?? 
'not-existing'}"` + ); + + Object.setPrototypeOf(this, MatchErrorResponseHeaders.prototype); + } +} + +export class MatchErrorRequestBody extends MatchError { + constructor( + payload: { oldRequestBody?: unknown; newRequestBody?: unknown } | string + ) { + const message = + typeof payload === 'string' + ? `Request body does not match: ${payload}` + : `Request body does not match: \nold:\n\`\`\`${ + inspect(payload.oldRequestBody) ?? 'not-existing' + }\`\`\`\nnew:\n\`\`\`${ + inspect(payload.newRequestBody) ?? 'not-existing' + }\`\`\``; + + super('MatchErrorRequestBody', message); + + Object.setPrototypeOf(this, MatchErrorRequestBody.prototype); + } +} + +export class MatchErrorResponse extends MatchError { + constructor( + public payload: { oldResponse?: unknown; newResponse?: unknown }, + public stringPayload: string + ) { + super( + 'MatchErrorResponse', + `Response does not match: \nold:\n\`\`\`${ + inspect(payload.oldResponse) ?? 'not-existing' + }\`\`\`\nnew:\n\`\`\`${ + inspect(payload.newResponse) ?? 
'not-existing' + }\`\`\`\n` + stringPayload + ); + + Object.setPrototypeOf(this, MatchErrorResponse.prototype); + } +} + +export type ErrorCollection = { + added: T[]; + removed: T[]; + changed: T[]; +}; + +export enum ErrorType { + ADD = 'add', + REMOVE = 'remove', + CHANGE = 'change', +} diff --git a/src/nock/matcher.test.ts b/src/nock/matcher.test.ts new file mode 100644 index 00000000..105579bc --- /dev/null +++ b/src/nock/matcher.test.ts @@ -0,0 +1,483 @@ +import { + ReplyBody, + RequestBodyMatcher, + RequestHeaderMatcher, +} from 'nock/types'; + +import { + RecordingDefinition, + RecordingDefinitions, +} from '../superface-test.interfaces'; +import { Matcher } from './matcher'; +import { + MatchErrorBaseURL, + MatchErrorLength, + MatchErrorMethod, + MatchErrorPath, + MatchErrorRequestBody, + MatchErrorRequestHeaders, + MatchErrorResponse, + MatchErrorResponseHeaders, + MatchErrorStatus, +} from './matcher.errors'; + +const oldContentType = 'application/json; charset=utf-8'; +const oldAccept = 'plain/text'; +const oldBody = 'body=true'; +const oldResponse = { + created: { + name: 'intelligence', + color: 'blue', + }, +}; + +const getRecording = (options?: { + baseUrl?: string; + method?: string; + path?: string; + body?: RequestBodyMatcher; + status?: number; + response?: ReplyBody; + rawHeaders?: string[]; + requestHeaders?: Record; +}): RecordingDefinition => ({ + scope: options?.baseUrl ?? 'https://localhost', + method: options?.method ?? 'POST', + path: options?.path ?? '/attributes?name=intelligence&color=blue', + body: options?.body ?? oldBody, + status: options?.status ?? 201, + response: options?.response ?? oldResponse, + rawHeaders: options?.rawHeaders ?? [ + 'Access-Control-Allow-Origin', + '*', + 'Access-Control-Allow-Headers', + 'Origin, X-Requested-With, Content-Type, Accept, Authorization', + 'Content-Type', + oldContentType, + ], + reqheaders: options?.requestHeaders ?? 
{ + Accept: oldAccept, + }, +}); + +const sampleRecordings: RecordingDefinitions = [getRecording()]; + +describe('Matcher', () => { + it('returns no errors when recordings match', async () => { + const newRecordings: RecordingDefinitions = [getRecording()]; + + await expect( + Matcher.match(sampleRecordings, newRecordings) + ).resolves.toEqual({ valid: true }); + }); + + describe('when number of recordings does not match', () => { + it('returns removed error for length of recordings', async () => { + const newRecordings: RecordingDefinitions = []; + + await expect( + Matcher.match(sampleRecordings, newRecordings) + ).resolves.toEqual({ + valid: false, + errors: { + added: [], + changed: [], + removed: [ + new MatchErrorLength(sampleRecordings.length, newRecordings.length), + ], + }, + }); + }); + + it('returns added error for length of recordings', async () => { + const newRecordings: RecordingDefinitions = sampleRecordings; + + await expect(Matcher.match([], newRecordings)).resolves.toEqual({ + valid: false, + errors: { + changed: [], + removed: [], + added: [new MatchErrorLength(0, newRecordings.length)], + }, + }); + }); + }); + + describe('when method does not match', () => { + it('returns changed error for method', async () => { + const newRecordings = [getRecording({ method: 'GET' })]; + + await expect( + Matcher.match(sampleRecordings, newRecordings) + ).resolves.toEqual({ + valid: false, + errors: { + added: [], + removed: [], + changed: [new MatchErrorMethod('POST', 'GET')], + }, + }); + }); + }); + + describe('when response status does not match', () => { + it('returns changed error for status', async () => { + const newRecordings = [getRecording({ status: 404 })]; + + await expect( + Matcher.match(sampleRecordings, newRecordings) + ).resolves.toEqual({ + valid: false, + errors: { + added: [], + removed: [], + changed: [new MatchErrorStatus(201, 404)], + }, + }); + }); + }); + + describe('when request base URL does not match', () => { + it('returns 
changed error for base url', async () => { + const newBaseUrl = 'https://localhost/new/path'; + const newRecordings = [getRecording({ baseUrl: newBaseUrl })]; + + await expect( + Matcher.match(sampleRecordings, newRecordings) + ).resolves.toEqual({ + valid: false, + errors: { + added: [], + removed: [], + changed: [new MatchErrorBaseURL('https://localhost', newBaseUrl)], + }, + }); + }); + }); + + describe('when request path does not match', () => { + it('returns changed error for path', async () => { + const newPath = '/new/path'; + const newRecordings = [getRecording({ path: newPath })]; + + await expect( + Matcher.match(sampleRecordings, newRecordings) + ).resolves.toEqual({ + valid: false, + errors: { + added: [], + removed: [], + changed: [ + new MatchErrorPath( + '/attributes?name=intelligence&color=blue', + newPath + ), + ], + }, + }); + }); + }); + + describe('when request headers does not match', () => { + it('returns changed error for header', async () => { + const newHeaderValue = 'application/json'; + const newRecordings = [ + getRecording({ requestHeaders: { Accept: newHeaderValue } }), + ]; + + await expect( + Matcher.match(sampleRecordings, newRecordings) + ).resolves.toEqual({ + valid: false, + errors: { + added: [], + removed: [], + changed: [ + new MatchErrorRequestHeaders('Accept', oldAccept, newHeaderValue), + ], + }, + }); + }); + + it('returns added error for header', async () => { + const newHeaderValue = 'application/json'; + const oldRecordings = [getRecording({ requestHeaders: {} })]; + const newRecordings = [ + getRecording({ requestHeaders: { Accept: newHeaderValue } }), + ]; + + await expect( + Matcher.match(oldRecordings, newRecordings) + ).resolves.toEqual({ + valid: false, + errors: { + removed: [], + changed: [], + added: [ + new MatchErrorRequestHeaders('Accept', undefined, newHeaderValue), + ], + }, + }); + }); + + it('returns removed error for header', async () => { + const newHeaderValue = undefined; + const newRecordings = 
[getRecording({ requestHeaders: {} })]; + + await expect( + Matcher.match(sampleRecordings, newRecordings) + ).resolves.toEqual({ + valid: false, + errors: { + added: [], + changed: [], + removed: [ + new MatchErrorRequestHeaders('Accept', oldAccept, newHeaderValue), + ], + }, + }); + }); + }); + + describe('when response headers does not match', () => { + it('returns changed error for header', async () => { + const newHeaderValue = 'application/json'; + const newRecordings = [ + getRecording({ rawHeaders: ['Content-Type', newHeaderValue] }), + ]; + + await expect( + Matcher.match(sampleRecordings, newRecordings) + ).resolves.toEqual({ + valid: false, + errors: { + added: [], + removed: [], + changed: [ + new MatchErrorResponseHeaders( + 'Content-Type', + oldContentType, + newHeaderValue + ), + ], + }, + }); + }); + + it('returns added error for header', async () => { + const newHeaderValue = 'application/json'; + const oldRecordings = [getRecording({ rawHeaders: [] })]; + const newRecordings = [ + getRecording({ rawHeaders: ['Content-Type', newHeaderValue] }), + ]; + + await expect( + Matcher.match(oldRecordings, newRecordings) + ).resolves.toEqual({ + valid: false, + errors: { + removed: [], + changed: [], + added: [ + new MatchErrorResponseHeaders( + 'Content-Type', + undefined, + newHeaderValue + ), + ], + }, + }); + }); + + it('returns removed error for header', async () => { + const newHeaderValue = undefined; + const newRecordings = [getRecording({ rawHeaders: [] })]; + + await expect( + Matcher.match(sampleRecordings, newRecordings) + ).resolves.toEqual({ + valid: false, + errors: { + added: [], + changed: [], + removed: [ + new MatchErrorResponseHeaders( + 'Content-Type', + oldContentType, + newHeaderValue + ), + ], + }, + }); + }); + }); + + describe('when request body does not match', () => { + it('returns changed error for request body', async () => { + const newBody = 'newBody=true'; + const newRecordings = [getRecording({ body: newBody })]; + + await 
expect( + Matcher.match(sampleRecordings, newRecordings) + ).resolves.toEqual({ + valid: false, + errors: { + added: [], + removed: [], + changed: [ + new MatchErrorRequestBody( + "data must have required property 'body'" + ), + ], + }, + }); + }); + + it('returns added error for request body as empty string', async () => { + const newBody = 'newBody=true'; + const oldRecordings = [getRecording({ body: '' })]; + const newRecordings = [getRecording({ body: newBody })]; + + await expect( + Matcher.match(oldRecordings, newRecordings) + ).resolves.toEqual({ + valid: false, + errors: { + added: [ + new MatchErrorRequestBody({ + oldRequestBody: undefined, + newRequestBody: { newBody: 'true' }, + }), + ], + removed: [], + changed: [], + }, + }); + }); + + it('returns added error for request body as undefined', async () => { + const newBody = 'newBody=true'; + const oldRecordings = [ + { + scope: 'https://localhost', + method: 'POST', + path: '/attributes?name=intelligence&color=blue', + body: undefined, + status: 201, + response: oldResponse, + rawHeaders: [ + 'Access-Control-Allow-Origin', + '*', + 'Access-Control-Allow-Headers', + 'Origin, X-Requested-With, Content-Type, Accept, Authorization', + 'Content-Type', + oldContentType, + ], + reqheaders: { + Accept: oldAccept, + }, + }, + ]; + const newRecordings = [getRecording({ body: newBody })]; + + await expect( + Matcher.match(oldRecordings, newRecordings) + ).resolves.toEqual({ + valid: false, + errors: { + added: [ + new MatchErrorRequestBody({ + oldRequestBody: undefined, + newRequestBody: { newBody: 'true' }, + }), + ], + removed: [], + changed: [], + }, + }); + }); + + it('returns removed error for request body as empty string', async () => { + const newRecordings = [getRecording({ body: '' })]; + + await expect( + Matcher.match(sampleRecordings, newRecordings) + ).resolves.toEqual({ + valid: false, + errors: { + added: [], + removed: [ + new MatchErrorRequestBody({ + oldRequestBody: { body: 'true' }, + 
newRequestBody: undefined, + }), + ], + changed: [], + }, + }); + }); + + it('returns removed error for request body as undefined', async () => { + const newRecordings = [ + { + scope: 'https://localhost', + method: 'POST', + path: '/attributes?name=intelligence&color=blue', + body: undefined, + status: 201, + response: oldResponse, + rawHeaders: [ + 'Access-Control-Allow-Origin', + '*', + 'Access-Control-Allow-Headers', + 'Origin, X-Requested-With, Content-Type, Accept, Authorization', + 'Content-Type', + oldContentType, + ], + reqheaders: { + Accept: oldAccept, + }, + }, + ]; + + await expect( + Matcher.match(sampleRecordings, newRecordings) + ).resolves.toEqual({ + valid: false, + errors: { + added: [], + removed: [ + new MatchErrorRequestBody({ + oldRequestBody: { body: 'true' }, + newRequestBody: undefined, + }), + ], + changed: [], + }, + }); + }); + }); + + describe('when response does not match', () => { + it('returns changed error for response', async () => { + const newResponse = {}; + const newRecordings = [getRecording({ response: newResponse })]; + + await expect( + Matcher.match(sampleRecordings, newRecordings) + ).resolves.toEqual({ + valid: false, + errors: { + added: [], + removed: [], + changed: [ + new MatchErrorResponse( + { oldResponse, newResponse }, + "data must have required property 'created'" + ), + ], + }, + }); + }); + }); +}); diff --git a/src/nock/matcher.ts b/src/nock/matcher.ts new file mode 100644 index 00000000..67664415 --- /dev/null +++ b/src/nock/matcher.ts @@ -0,0 +1,442 @@ +import Ajv from 'ajv'; +import createDebug from 'debug'; +import { createSchema } from 'genson-js/dist'; +import { inspect } from 'util'; + +import { UnexpectedError } from '../common/errors'; +import { readFileQuiet } from '../common/io'; +import { writeRecordings } from '../common/output-stream'; +import { + AnalysisResult, + RecordingDefinition, + RecordingDefinitions, +} from '../superface-test.interfaces'; +import { analyzeChangeImpact, MatchImpact } 
from './analyzer'; +import { ErrorCollector } from './error-collector'; +import { + ErrorCollection, + ErrorType, + MatchError, + MatchErrorBaseURL, + MatchErrorLength, + MatchErrorMethod, + MatchErrorPath, + MatchErrorRequestBody, + MatchErrorRequestHeaders, + MatchErrorResponse, + MatchErrorResponseHeaders, + MatchErrorStatus, +} from './matcher.errors'; +import { + decodeResponse, + getRequestHeader, + getResponseHeader, + parseBody, +} from './matcher.utils'; +import { composeRecordingPath } from './recorder'; + +export interface MatchHeaders { + old?: string; + new?: string; +} + +interface RequestHeaderMatch { + accept?: MatchHeaders; +} + +interface ResponseHeaderMatch { + contentEncoding?: MatchHeaders; + contentType?: MatchHeaders; + contentLength?: MatchHeaders; +} + +const schemaValidator = new Ajv({ + allErrors: true, +}); + +const debugMatching = createDebug('superface:testing:matching'); +const debugMatchingSensitive = createDebug( + 'superface:testing:matching:sensitive' +); +debugMatchingSensitive( + ` +WARNING: YOU HAVE ALLOWED LOGGING SENSITIVE INFORMATION. +THIS LOGGING LEVEL DOES NOT PREVENT LEAKING SECRETS AND SHOULD NOT BE USED IF THE LOGS ARE GOING TO BE SHARED. +CONSIDER DISABLING SENSITIVE INFORMATION LOGGING BY APPENDING THE DEBUG ENVIRONMENT VARIABLE WITH ",-*:sensitive". +` +); + +export type MatchResult = + | { valid: true } + | { valid: false; errors: ErrorCollection }; + +export class Matcher { + private static errorCollector: ErrorCollector; + + /** + * Matches old recording file to new recorded traffic. + * Assumes we always have correct order of HTTP calls. 
+ * + * @param oldTrafficDefs - old traffic loaded from recording file + * @param newTrafficDefs - new recorded traffic + * @returns object representing whether recordings match or not + */ + static async match( + oldTrafficDefs: RecordingDefinitions, + newTrafficDefs: RecordingDefinitions + ): Promise { + this.errorCollector = new ErrorCollector(); + + if (oldTrafficDefs.length < newTrafficDefs.length) { + this.errorCollector.add( + ErrorType.ADD, + new MatchErrorLength(oldTrafficDefs.length, newTrafficDefs.length) + ); + } else if (oldTrafficDefs.length > newTrafficDefs.length) { + this.errorCollector.add( + ErrorType.REMOVE, + new MatchErrorLength(oldTrafficDefs.length, newTrafficDefs.length) + ); + } + + for (let i = 0; i < oldTrafficDefs.length; i++) { + // access old recording and new recording + const oldTraffic = oldTrafficDefs[i]; + const newTraffic = newTrafficDefs[i]; + + await this.matchTraffic(oldTraffic, newTraffic); + } + + const { errors, count } = this.errorCollector; + + if (count === 0) { + debugMatching('No changes found'); + + return { valid: true }; + } else { + debugMatching(`Found ${count} ${count > 1 ? 
'errors' : 'error'}`); + + return { valid: false, errors }; + } + } + + private static async matchTraffic( + oldTraffic?: RecordingDefinition, + newTraffic?: RecordingDefinition + ): Promise { + if (!oldTraffic || !newTraffic) { + return; + } + + debugMatching( + `Matching HTTP calls ${oldTraffic.scope + oldTraffic.path} : ${ + newTraffic.scope + newTraffic.path + }` + ); + + // method + debugMatching('\trequest method'); + if (oldTraffic.method !== newTraffic.method) { + this.errorCollector.add( + ErrorType.CHANGE, + new MatchErrorMethod(oldTraffic.method, newTraffic.method) + ); + } + + // status + debugMatching('\tresponse status'); + if (oldTraffic.status !== newTraffic.status) { + this.errorCollector.add( + ErrorType.CHANGE, + new MatchErrorStatus(oldTraffic.status, newTraffic.status) + ); + } + + // base URL + debugMatching('\trequest base URL'); + if (oldTraffic.scope !== newTraffic.scope) { + this.errorCollector.add( + ErrorType.CHANGE, + new MatchErrorBaseURL(oldTraffic.scope, newTraffic.scope) + ); + } + + // path + debugMatching('\trequest path'); + if (oldTraffic.path !== newTraffic.path) { + this.errorCollector.add( + ErrorType.CHANGE, + new MatchErrorPath(oldTraffic.path, newTraffic.path) + ); + } + + // TODO: research nock types of request headers and parse them correctly + // request headers + const { accept } = this.matchRequestHeaders( + (oldTraffic.reqheaders as Record) ?? {}, + (newTraffic.reqheaders as Record) ?? {} + ); + + // response headers + const { contentEncoding } = this.matchResponseHeaders( + oldTraffic.rawHeaders ?? [], + newTraffic.rawHeaders ?? 
[] + ); + + // request body + this.matchRequestBody(oldTraffic.body, newTraffic.body, accept); + + // response + if (oldTraffic.response !== undefined) { + await this.matchResponse( + oldTraffic.response, + newTraffic.response, + contentEncoding + ); + } + } + + private static matchRequestHeaders( + oldHeaders: Record, + newHeaders: Record + ): RequestHeaderMatch { + debugMatching('\trequest headers'); + + const accept = getRequestHeader(oldHeaders, newHeaders, 'accept'); + + this.addError( + accept.old, + accept.new, + new MatchErrorRequestHeaders('Accept', accept.old, accept.new) + ); + + // list of other headers to add support for: + // ... + + return { + accept, + }; + } + + private static matchResponseHeaders( + oldHeaders: string[], + newHeaders: string[] + ): ResponseHeaderMatch { + debugMatching('\tresponse headers'); + + // match content type + const contentType = getResponseHeader( + oldHeaders, + newHeaders, + 'content-type' + ); + + if (contentType.old !== contentType.new) { + this.addError( + contentType.old, + contentType.new, + new MatchErrorResponseHeaders( + 'Content-Type', + contentType.old, + contentType.new + ) + ); + } + + // match content Encoding + const contentEncoding = getResponseHeader( + oldHeaders, + newHeaders, + 'content-encoding' + ); + + if (contentEncoding.old !== contentEncoding.new) { + this.addError( + contentEncoding.old, + contentEncoding.new, + new MatchErrorResponseHeaders( + 'Content-Encoding', + contentEncoding.old, + contentEncoding.new + ) + ); + } + + const contentLength = getResponseHeader( + oldHeaders, + newHeaders, + 'content-length' + ); + + // list of other headers to add support for: + // Access-Control-Allow-Origin, access-control-allow-headers, access-control-allow-methods + // Cache-Control, Vary, Transfer-Encoding, + // Pragma, Server, Connection, referrer-policy? 
+ + return { + contentType, + contentEncoding, + contentLength, + }; + } + + private static matchRequestBody( + oldBody: unknown, + newBody: unknown, + accept?: MatchHeaders + ): void { + debugMatching('\trequest body'); + + // TODO: try to parse string body or rather compare it plainly? + // if body is not string and is defined - expect valid JSON + let oldRequestBody = oldBody, + newRequestBody = newBody; + + if (typeof oldBody === 'string') { + oldRequestBody = parseBody(oldBody, accept?.old); + } + + if (typeof newBody === 'string') { + newRequestBody = parseBody(newBody, accept?.new); + } + + if (oldRequestBody === undefined && newRequestBody === undefined) { + return; + } + + // if old body is empty string or undefined - we dont create JSON scheme + if (oldRequestBody === undefined && newRequestBody !== undefined) { + this.errorCollector.add( + ErrorType.ADD, + new MatchErrorRequestBody({ oldRequestBody, newRequestBody }) + ); + + return; + } + + if (oldRequestBody !== undefined && newRequestBody === undefined) { + this.errorCollector.add( + ErrorType.REMOVE, + new MatchErrorRequestBody({ oldRequestBody, newRequestBody }) + ); + + return; + } + + // TODO: create own algorithm for parsing this + const valid = this.createAndValidateSchema(oldRequestBody, newRequestBody); + + if (!valid) { + debugMatchingSensitive(schemaValidator.errors); + + this.errorCollector.add( + ErrorType.CHANGE, + new MatchErrorRequestBody(schemaValidator.errorsText()) + ); + } + } + + private static async matchResponse( + oldRes: unknown, + newRes: unknown, + contentEncoding?: MatchHeaders + ): Promise { + debugMatching('\tresponse'); + let oldResponse = oldRes, + newResponse = newRes; + + // if responses are encoded - decode them + if (contentEncoding?.old !== undefined) { + debugMatching( + `Decoding old response based on ${contentEncoding.old} encoding` + ); + + oldResponse = await decodeResponse(oldRes, contentEncoding.old); + } + + if (contentEncoding?.new !== undefined) { + 
debugMatching( + `Decoding new response based on ${contentEncoding.new} encoding` + ); + + newResponse = await decodeResponse(newRes, contentEncoding.new); + } + + // validate responses + const valid = this.createAndValidateSchema(oldResponse, newResponse); + + if (!valid) { + debugMatching(schemaValidator.errors); + + this.errorCollector.add( + ErrorType.CHANGE, + new MatchErrorResponse( + { oldResponse, newResponse }, + schemaValidator.errorsText() + ) + ); + } + } + + private static createAndValidateSchema( + base: unknown, + payload: unknown + ): boolean { + const oldJsonSchema = createSchema(base); + + debugMatchingSensitive( + 'Generated JSON Schema from old recording:', + inspect(oldJsonSchema, true, 25) + ); + + return schemaValidator.validate(oldJsonSchema, payload); + } + + private static addError( + oldPayload: undefined | unknown, + newPayload: undefined | unknown, + error: MatchError + ) { + if (oldPayload === undefined && newPayload !== undefined) { + this.errorCollector.add(ErrorType.ADD, error); + } else if (oldPayload !== undefined && newPayload === undefined) { + this.errorCollector.add(ErrorType.REMOVE, error); + } else if (oldPayload !== newPayload) { + this.errorCollector.add(ErrorType.CHANGE, error); + } + } +} + +export async function matchTraffic( + oldRecordingPath: string, + newTraffic: RecordingDefinitions +): Promise { + // recording file exist -> record and compare new traffic + const oldRecording = await readFileQuiet( + composeRecordingPath(oldRecordingPath) + ); + + if (oldRecording === undefined) { + throw new UnexpectedError('Reading old recording file failed'); + } + + const oldRecordingDefs = JSON.parse(oldRecording) as RecordingDefinitions; + + // Match new HTTP traffic to saved for breaking changes + const match = await Matcher.match(oldRecordingDefs, newTraffic); + + if (match.valid) { + // do not save new recording as there were no breaking changes found + return { impact: MatchImpact.NONE }; + } else { + const impact = 
analyzeChangeImpact(match.errors); + + // Save new traffic + await writeRecordings( + composeRecordingPath(oldRecordingPath, 'new'), + newTraffic + ); + + return { impact, errors: match.errors }; + } +} diff --git a/src/nock/matcher.utils.test.ts b/src/nock/matcher.utils.test.ts new file mode 100644 index 00000000..b5c125a8 --- /dev/null +++ b/src/nock/matcher.utils.test.ts @@ -0,0 +1,126 @@ +import { UnexpectedError } from '../common/errors'; +import { + decodeResponse, + getRequestHeader, + getRequestHeaderValue, + getResponseHeader, + getResponseHeaderValue, + parseBody, +} from './matcher.utils'; + +describe('Matcher utils', () => { + describe('getRequestHeaderValue', () => { + it('returns string request header value', () => { + const headers = { 'Content-Type': 'application/json' }; + const expectedResult = 'application/json'; + + expect(getRequestHeaderValue('content-type', headers)).toBe( + expectedResult + ); + }); + + it('returns array of strings header value', () => { + const headers = { + 'Content-Type': ['application/json', 'application/xml'], + }; + const expectedResult = ['application/json', 'application/xml']; + + expect(getRequestHeaderValue('content-type', headers)).toEqual( + expectedResult + ); + }); + }); + + describe('getResponseHeaderValue', () => { + it('returns response header value', () => { + const headers = ['Content-Type', 'application/json']; + const expectedResult = 'application/json'; + + expect(getResponseHeaderValue('content-type', headers)).toBe( + expectedResult + ); + }); + }); + + describe('getResponseHeader', () => { + it('returns response header old and new values', () => { + const oldHeaders = ['Content-Type', 'application/json']; + const newHeaders = ['content-type', 'plain/text']; + const expectedResult = { + old: 'application/json', + new: 'plain/text', + }; + + expect(getResponseHeader(oldHeaders, newHeaders, 'content-type')).toEqual( + expectedResult + ); + }); + }); + + describe('getRequestHeader', () => { + it('returns 
request header value', () => { + const oldHeaders = { 'Content-Type': 'application/json' }; + const newHeaders = { 'content-type': 'plain/text' }; + const expectedResult = { + old: 'application/json', + new: 'plain/text', + }; + + expect(getRequestHeader(oldHeaders, newHeaders, 'content-type')).toEqual( + expectedResult + ); + }); + + it('returns request header value with multiple values', () => { + const oldHeaders = { + 'Content-Type': ['application/json', 'application/xml'], + }; + const newHeaders = { 'content-type': 'plain/text' }; + const expectedResult = { + old: 'application/json, application/xml', + new: 'plain/text', + }; + + expect(getRequestHeader(oldHeaders, newHeaders, 'content-type')).toEqual( + expectedResult + ); + }); + }); + + describe('decodeResponse', () => { + const contentEncoding = 'gzip'; + + it.each([() => ({ value: 1 }), { value: 1 }, '{value: 1}', true, 1])( + 'fails when specified response is not array of hex data', + async (response: unknown) => { + await expect( + decodeResponse(response, contentEncoding) + ).rejects.toThrowError( + new UnexpectedError( + `Response is encoded by "${contentEncoding}" and is not an array` + ) + ); + } + ); + }); + + describe('parseBody', () => { + it('returns undefined when body is empty string', () => { + const body = ''; + + expect(parseBody(body)).toBeUndefined(); + }); + + it.each([ + 'from%3D%7B%22name%22%3A%22test%22%7D%26to%3Dtest', + 'from={"name":"test"}&to=test', + ])('returns decoded request body', body => { + const expectedValue = { + from: { name: 'test' }, + to: 'test', + }; + + expect(parseBody(body)).toEqual(expectedValue); + }); + }); +}); diff --git a/src/nock/matcher.utils.ts b/src/nock/matcher.utils.ts new file mode 100644 index 00000000..4a9a953b --- /dev/null +++ b/src/nock/matcher.utils.ts @@ -0,0 +1,123 @@ +import { decodeBuffer } from 'http-encoding'; +import { ReplyBody } from 'nock/types'; +import { URLSearchParams } from 'url'; + +import { UnexpectedError } from 
'../common/errors';
+import { MatchHeaders } from './matcher';
+
+export function getRequestHeaderValue(
+  headerName: string,
+  payload: Record<string, string | string[]>
+): string | string[] | undefined {
+  const headerKey = Object.keys(payload).find(
+    key => key.toLowerCase() === headerName.toLowerCase()
+  );
+
+  return headerKey ? payload[headerKey] : undefined;
+}
+
+export function getRequestHeader(
+  oldHeaders: Record<string, string | string[]>,
+  newHeaders: Record<string, string | string[]>,
+  headerName: string
+): MatchHeaders {
+  let oldHeader = getRequestHeaderValue(headerName, oldHeaders);
+  let newHeader = getRequestHeaderValue(headerName, newHeaders);
+
+  if (Array.isArray(oldHeader)) {
+    oldHeader = oldHeader.join(', ');
+  }
+
+  if (Array.isArray(newHeader)) {
+    newHeader = newHeader.join(', ');
+  }
+
+  return {
+    old: oldHeader,
+    new: newHeader,
+  };
+}
+
+export function getResponseHeaderValue(
+  headerName: string,
+  payload: string[]
+): string | undefined {
+  for (let i = 0; i < payload.length; i += 2) {
+    if (payload[i].toLowerCase() === headerName.toLowerCase()) {
+      return payload[i + 1];
+    }
+  }
+
+  return undefined;
+}
+
+export function getResponseHeader(
+  oldHeaders: string[],
+  newHeaders: string[],
+  headerName: string
+): MatchHeaders {
+  const oldHeader = getResponseHeaderValue(headerName, oldHeaders);
+  const newHeader = getResponseHeaderValue(headerName, newHeaders);
+
+  return {
+    old: oldHeader,
+    new: newHeader,
+  };
+}
+
+/* eslint-disable-next-line @typescript-eslint/no-explicit-any */
+function composeBuffer(response: any[]): Buffer {
+  return Buffer.concat(response.map(res => Buffer.from(res, 'hex')));
+}
+
+export async function decodeResponse(
+  response: unknown,
+  contentEncoding: string
+): Promise<ReplyBody> {
+  if (!Array.isArray(response)) {
+    throw new UnexpectedError(
+      `Response is encoded by "${contentEncoding}" and is not an array`
+    );
+  }
+
+  const buffer = composeBuffer(response);
+
+  return JSON.parse(
+    (await decodeBuffer(buffer, contentEncoding)).toString()
+  ) as ReplyBody;
+}
+
+/**
+ * Expects something like `To=%2Bxxx&From=%2Bxxx&Body=Hello+World%21`
+ * and returns: `{ To: "+xxx", From: "+xxx", Body: "Hello World!" }`
+ *
+ * Limitation:
+ * since URLSearchParams always transforms params to strings, we can't
+ * generate a correct schema for a body that contains numbers or booleans
+ */
+export function parseBody(
+  body: string,
+  _accept?: string
+): Record<string, unknown> | undefined {
+  if (body === '') {
+    return undefined;
+  }
+
+  const parsedBody = decodeURIComponent(body);
+  const result: Record<string, unknown> = {};
+  const params = new URLSearchParams(parsedBody);
+
+  for (const [key, value] of params.entries()) {
+    // parse value as JSON when it looks like an object or array
+    let parsedValue: unknown;
+    if (value.startsWith('{') || value.startsWith('[')) {
+      parsedValue = JSON.parse(value);
+    } else {
+      parsedValue = value;
+    }
+
+    result[key] = parsedValue;
+  }
+
+  return result;
+}
diff --git a/src/nock/recorder.ts b/src/nock/recorder.ts
index 209c657c..ff86a902 100644
--- a/src/nock/recorder.ts
+++ b/src/nock/recorder.ts
@@ -2,21 +2,23 @@ import { BoundProfileProvider } from '@superfaceai/one-sdk';
 import createDebug from 'debug';
 import {
   activate as activateNock,
-  define,
+  define as loadRecordingDefinitions,
   disableNetConnect,
   isActive as isNockActive,
   recorder,
   restore as restoreRecordings,
 } from 'nock';
+import { basename, dirname, join as joinPath } from 'path';
 
 import {
   BaseURLNotFoundError,
   RecordingsNotFoundError,
   UnexpectedError,
 } from '../common/errors';
-import { exists, readFileQuiet } from '../common/io';
+import { exists, mkdirQuiet, readFileQuiet, rename } from '../common/io';
 import { writeRecordings } from '../common/output-stream';
 import {
+  AnalysisResult,
   InputVariables,
   ProcessingFunction,
   RecordingDefinitions,
@@ -24,8 +26,10 @@ import {
   assertsDefinitionsAreNotStrings,
   checkSensitiveInformation,
+  parseBooleanEnv,
   replaceCredentials,
 } from '../superface-test.utils';
+import { matchTraffic } from './matcher';
 
 const debug = createDebug('superface:testing');
 
@@ -74,6 +78,7 @@
export async function loadRecording({ beforeRecordingLoad?: ProcessingFunction; }; }): Promise { + const definitions = await getRecordings(recordingPath); const { parameters, security, services } = boundProfileProvider.configuration; const integrationParameters = parameters ?? {}; const baseUrl = services.getUrl(); @@ -82,20 +87,6 @@ export async function loadRecording({ throw new BaseURLNotFoundError(providerName); } - const recordingExists = await exists(recordingPath); - - if (!recordingExists) { - throw new RecordingsNotFoundError(); - } - - const definitionFile = await readFileQuiet(recordingPath); - - if (definitionFile === undefined) { - throw new UnexpectedError('Reading recording file failed'); - } - - const definitions = JSON.parse(definitionFile) as RecordingDefinitions; - if (options?.processRecordings) { //Use security configuration only replaceCredentials({ @@ -116,7 +107,7 @@ export async function loadRecording({ await options.beforeRecordingLoad(definitions); } - define(definitions); + loadRecordingDefinitions(definitions); debug('Loaded and mocked recorded traffic based on recording fixture'); @@ -150,7 +141,7 @@ export async function endRecording({ }; inputVariables?: InputVariables; beforeRecordingSave?: ProcessingFunction; -}): Promise { +}): Promise { const definitions = recorder.play(); recorder.clear(); restoreRecordings(); @@ -160,9 +151,9 @@ export async function endRecording({ ); if (definitions === undefined || definitions.length === 0) { - await writeRecordings(recordingPath, []); + await writeRecordings(composeRecordingPath(recordingPath), []); - return; + return undefined; } assertsDefinitionsAreNotStrings(definitions); @@ -196,12 +187,105 @@ export async function endRecording({ if ( security.length > 0 || - // securityValues.length > 0 || (integrationParameters && Object.values(integrationParameters).length > 0) ) { checkSensitiveInformation(definitions, security, integrationParameters); } - await writeRecordings(recordingPath, 
definitions); + const recordingExists = await exists(composeRecordingPath(recordingPath)); + + if (recordingExists) { + return await matchTraffic(recordingPath, definitions); + } + + // recording file does not exist -> record new traffic + await writeRecordings(composeRecordingPath(recordingPath), definitions); debug('Recorded definitions written'); + + return undefined; +} + +export function composeRecordingPath( + recordingPath: string, + version?: string +): string { + if (version === 'new') { + return `${recordingPath}-new.json`; + } + + if (version !== undefined) { + const baseDir = dirname(recordingPath); + const hash = basename(recordingPath); + + return joinPath(baseDir, 'old', `${hash}_${version}.json`); + } + + return `${recordingPath}.json`; +} + +export async function getRecordings( + recordingPath: string +): Promise { + // try to get new recordings if environment variable is set + const useNewTraffic = parseBooleanEnv(process.env.USE_NEW_TRAFFIC); + const newRecordingPath = composeRecordingPath(recordingPath, 'new'); + + if (useNewTraffic && (await exists(newRecordingPath))) { + return await parseRecordings(newRecordingPath); + } + + // otherwise use default ones + const currentRecordingPath = composeRecordingPath(recordingPath); + const recordingExists = await exists(currentRecordingPath); + + if (!recordingExists) { + throw new RecordingsNotFoundError(currentRecordingPath); + } + + return await parseRecordings(currentRecordingPath); +} + +export async function parseRecordings( + path: string +): Promise { + const definitionFile = await readFileQuiet(path); + + if (definitionFile === undefined) { + throw new UnexpectedError('Reading new recording file failed'); + } + + debug(`Parsing recording file at: "${path}"`); + + return JSON.parse(definitionFile) as RecordingDefinitions; +} + +export async function updateTraffic(recordingPath: string): Promise { + const pathToCurrent = composeRecordingPath(recordingPath); + const pathToNew = 
composeRecordingPath(recordingPath, 'new'); + + await mkdirQuiet(joinPath(dirname(pathToCurrent), 'old')); + + // TODO: compose version based on used map + let i = 0; + while (await exists(composeRecordingPath(recordingPath, `${i}`))) { + i++; + } + + await rename(pathToCurrent, composeRecordingPath(recordingPath, `${i}`)); + await rename(pathToNew, pathToCurrent); +} + +export async function canUpdateTraffic( + recordingPath: string +): Promise { + const updateTraffic = parseBooleanEnv(process.env.UPDATE_TRAFFIC); + + if ( + updateTraffic && + (await exists(composeRecordingPath(recordingPath, 'new'))) + ) { + return true; + } + + return false; } diff --git a/src/reporter.test.ts b/src/reporter.test.ts new file mode 100644 index 00000000..28fff5e9 --- /dev/null +++ b/src/reporter.test.ts @@ -0,0 +1,137 @@ +import { ok } from '@superfaceai/one-sdk'; +import { join as joinPath } from 'path'; +import { mocked } from 'ts-jest/utils'; + +import { CoverageFileNotFoundError } from './common/errors'; +import { exists, readFileQuiet, readFilesInDir } from './common/io'; +import { OutputStream } from './common/output-stream'; +import { MatchImpact } from './nock/analyzer'; +import { DEFAULT_COVERAGE_PATH, report, saveReport } from './reporter'; +import { ImpactResult, TestReport } from './superface-test.interfaces'; + +jest.mock('./common/io'); + +const sampleHash = 'XXX'; +const samplePath = joinPath('profile', 'provider', 'test'); +const sampleAnalysisResult: ImpactResult = { + errors: { added: [], changed: [], removed: [] }, + impact: MatchImpact.PATCH, +}; +const sampleTestResult = { + recordingPath: '', + profileId: 'profile', + providerName: 'provider', + useCaseName: 'test', +}; + +describe('Reporter module', () => { + describe('saveReport', () => { + it('composes path to coverage report', async () => { + const writeIfAbsentSpy = jest + .spyOn(OutputStream, 'writeIfAbsent') + .mockResolvedValue(true); + + const expectedPath = joinPath( + DEFAULT_COVERAGE_PATH, + 
samplePath, + `coverage-XXX.json` + ); + const expectedReport = { + ...sampleAnalysisResult, + ...sampleTestResult, + input: {}, + result: ok(''), + }; + + await expect( + saveReport({ + input: {}, + result: ok(''), + hash: sampleHash, + analysis: sampleAnalysisResult, + ...sampleTestResult, + }) + ).resolves.toBeUndefined(); + + expect(writeIfAbsentSpy).toBeCalledTimes(1); + expect(writeIfAbsentSpy).toBeCalledWith( + expectedPath, + JSON.stringify(expectedReport, null, 2), + { dirs: true, force: true } + ); + }); + + it('warns that writing failed, when writeIfAbsent returns false', async () => { + const consoleOutput: string[] = []; + const originalWarn = console.warn; + const mockedWarn = (output: string) => consoleOutput.push(output); + console.warn = mockedWarn; + + jest.spyOn(OutputStream, 'writeIfAbsent').mockResolvedValue(false); + + await saveReport({ + input: {}, + result: ok(''), + hash: sampleHash, + analysis: sampleAnalysisResult, + ...sampleTestResult, + }); + + expect(consoleOutput).toEqual(['Writing coverage data failed']); + + console.warn = originalWarn; + }); + }); + + describe('report', () => { + it('does not look for files, if coverage directory was not found', async () => { + const existsSpy = mocked(exists).mockResolvedValue(false); + const readFilesInDirSpy = mocked(readFilesInDir); + + await report((_analysis: TestReport) => ({})); + + expect(existsSpy).toBeCalledTimes(1); + expect(existsSpy).toBeCalledWith(DEFAULT_COVERAGE_PATH); + + expect(readFilesInDirSpy).not.toBeCalled(); + }); + + it('fails when found file cannot be read', async () => { + const samplePath = 'path/to/coverage.json'; + mocked(exists).mockResolvedValue(true); + const readFilesInDirSpy = mocked(readFilesInDir).mockResolvedValue([ + samplePath, + ]); + const readFileQuietSpy = + mocked(readFileQuiet).mockResolvedValue(undefined); + + await expect( + report((_analysis: TestReport) => ({})) + ).rejects.toThrowError(new CoverageFileNotFoundError(samplePath)); + 
expect(readFilesInDirSpy).toBeCalledTimes(1); + expect(readFilesInDirSpy).toBeCalledWith(DEFAULT_COVERAGE_PATH); + + expect(readFileQuietSpy).toBeCalledTimes(1); + expect(readFileQuietSpy).toBeCalledWith(samplePath); + }); + + it('calls injected alert function with parsed files', async () => { + const sampleReport = { + ...sampleAnalysisResult, + input: {}, + result: ok(''), + }; + const alertSpy = jest.fn(); + + mocked(exists).mockResolvedValue(true); + mocked(readFilesInDir).mockResolvedValue([samplePath]); + mocked(readFileQuiet).mockResolvedValue(JSON.stringify(sampleReport)); + + await expect(report(alertSpy)).resolves.toBeUndefined(); + + expect(alertSpy).toBeCalledTimes(1); + expect(alertSpy).toBeCalledWith([sampleReport]); + }); + }); +}); diff --git a/src/reporter.ts b/src/reporter.ts new file mode 100644 index 00000000..3b93c6b1 --- /dev/null +++ b/src/reporter.ts @@ -0,0 +1,152 @@ +import { NonPrimitive } from '@superfaceai/one-sdk'; +import createDebug from 'debug'; +import { join as joinPath } from 'path'; + +import { CoverageFileNotFoundError } from './common/errors'; +import { getFixtureName } from './common/format'; +import { exists, readFileQuiet, readFilesInDir, rimraf } from './common/io'; +import { OutputStream } from './common/output-stream'; +import { ErrorCollection, MatchError } from './nock/matcher.errors'; +import { + ImpactResult, + TestAnalysis, + TestingReturn, + TestReport, +} from './superface-test.interfaces'; + +export const DEFAULT_COVERAGE_PATH = 'superface-test-coverage'; + +const debug = createDebug('superface:testing:reporter'); + +/** + * Saves provider change report along with input and result + * on filesystem under /superface-test-coverage + */ +export async function saveReport({ + input, + result, + hash, + analysis, + recordingPath, + profileId, + providerName, + useCaseName, +}: { + input: NonPrimitive; + result: TestingReturn; + hash: string; + recordingPath: string; + analysis: ImpactResult; + profileId: string; + 
providerName: string; + useCaseName: string; +}): Promise<void> { + debug('Saving coverage report'); + const coveragePath = joinPath( + DEFAULT_COVERAGE_PATH, + getFixtureName(profileId, providerName, useCaseName), + `coverage-${hash}.json` + ); + + const data: TestAnalysis = { + ...analysis, + recordingPath, + profileId, + providerName, + useCaseName, + input, + result, + errors: parseErrors(analysis.errors), + }; + + debug(`Writing report on path "${coveragePath}"`); + const write = await OutputStream.writeIfAbsent( + coveragePath, + JSON.stringify(data, null, 2), + { dirs: true, force: true } + ); + + if (!write) { + console.warn('Writing coverage data failed'); + } +} + +/** + * Parses caught errors to strings. + * + * @param errors error collection with MatchError instances + * @returns error collection with strings + */ +function parseErrors( + errors: ErrorCollection<MatchError> +): ErrorCollection<string> { + const result: ErrorCollection<string> = { + added: [], + changed: [], + removed: [], + }; + + for (const error of errors.added) { + result.added.push(error.toString()); + } + + for (const error of errors.removed) { + result.removed.push(error.toString()); + } + + for (const error of errors.changed) { + result.changed.push(error.toString()); + } + + return result; +} + +export async function report( + alert: (analysis: TestReport) => unknown | Promise<unknown>, + _options?: { onlyFailedTests?: boolean } +): Promise<void> { + debug('Collecting reports from superface runs'); + if (!(await exists(DEFAULT_COVERAGE_PATH))) { + debug('Directory with reports is not created yet'); + + return; + } + + const paths = await readFilesInDir(DEFAULT_COVERAGE_PATH); + debug('Available paths:', paths.join('\n')); + + const report: TestReport = []; + for (const path of paths) { + const data = await readFileQuiet(path); + + if (!data) { + throw new CoverageFileNotFoundError(path); + } + + const coverage = JSON.parse(data) as TestAnalysis; + + report.push(coverage); + + // TODO: + // filter data based on parameters + } 
+ + // remove coverage as it is no longer needed + await rimraf(DEFAULT_COVERAGE_PATH); + + debug(`Alerting test analysis report. Analysis Count: ${report.length}`); + + if (report.length > 0) { + await alert(report); + } +} + +// TODO: collect coverage for completed test, put them into batch and add info about test result +// async function collect(): Promise<void> { +// // TODO: find a way to get to correct path without concurrency problems +// // const basePath = joinPath('./coverage'); +// // const write = await OutputStream.writeIfAbsent(path, data, { dirs: true }); +// // if (!write) { +// // console.warn('Writing coverage data failed'); +// // } +// } diff --git a/src/superface-test.interfaces.ts b/src/superface-test.interfaces.ts index be24ba13..158f6032 100644 --- a/src/superface-test.interfaces.ts +++ b/src/superface-test.interfaces.ts @@ -1,13 +1,19 @@ import { + MapInterpreterError, NonPrimitive, Primitive, Profile, + ProfileParameterError, Provider, Result, + UnexpectedError, UseCase, } from '@superfaceai/one-sdk'; import { Definition } from 'nock/types'; +import { MatchImpact } from './nock/analyzer'; +import { ErrorCollection, MatchError } from './nock/matcher.errors'; + export interface SuperfaceTestConfig { profile?: Profile | string; provider?: Provider | string; @@ -21,9 +27,15 @@ export interface HashOptions { testName?: string; } +export type AlertFunction = (report: TestReport) => unknown | Promise<unknown>; + export type SuperfaceTestRun = SuperfaceTestConfig & HashOptions; -export type TestingReturn = Result<unknown, string>; +export type PerformError = + | ProfileParameterError + | MapInterpreterError + | UnexpectedError; +export type TestingReturn = Result<unknown, PerformError | string>; export interface NockConfig { path?: string; @@ -46,4 +58,29 @@ export interface RecordingProcessOptions { beforeRecordingSave?: ProcessingFunction; beforeRecordingLoad?: ProcessingFunction; hideInput?: string[]; + fullError?: boolean; +} + +export interface NoImpactResult { + impact: MatchImpact.NONE; +} + +export 
interface ImpactResult { + impact: MatchImpact.MAJOR | MatchImpact.MINOR | MatchImpact.PATCH; + errors: ErrorCollection<MatchError>; } + +export type AnalysisResult = NoImpactResult | ImpactResult; + +export type TestAnalysis = { + impact: MatchImpact; + profileId: string; + providerName: string; + useCaseName: string; + recordingPath: string; + input: NonPrimitive; + result: TestingReturn; + errors: ErrorCollection<string>; +}; + +export type TestReport = TestAnalysis[]; diff --git a/src/superface-test.test.ts b/src/superface-test.test.ts index 97d7630b..468279c6 100644 --- a/src/superface-test.test.ts +++ b/src/superface-test.test.ts @@ -1,7 +1,7 @@ import { ApiKeyPlacement, SecurityType } from '@superfaceai/ast'; import { err, MapASTError, ok } from '@superfaceai/one-sdk'; import nock, { pendingMocks, recorder } from 'nock'; -import { join as joinPath } from 'path'; +import { join as joinPath, resolve as resolvePath } from 'path'; import { mocked } from 'ts-jest/utils'; import { RecordingsNotFoundError } from './common/errors'; @@ -9,10 +9,12 @@ import { matchWildCard } from './common/format'; import { exists, readFileQuiet } from './common/io'; import { writeRecordings } from './common/output-stream'; import { generate } from './generate-hash'; +import { Matcher } from './nock/matcher'; +import { MatchErrorResponse } from './nock/matcher.errors'; +import { saveReport } from './reporter'; import { prepareSuperface } from './superface/config'; import { mockSuperface } from './superface/mock/superface'; import { SuperfaceTest } from './superface-test'; -import { SuperfaceTestConfig } from './superface-test.interfaces'; import { HIDDEN_CREDENTIALS_PLACEHOLDER, HIDDEN_INPUT_PLACEHOLDER, @@ -38,12 +40,16 @@ jest.mock('./common/output-stream', () => ({ writeRecordings: jest.fn(), })); -const testPayload: SuperfaceTestConfig = { +const testPayload = { profile: 'profile', provider: 'provider', useCase: 'test', }; +jest.mock('./reporter', () => ({ + saveReport: jest.fn(), +})); + const 
DEFAULT_RECORDING_PATH = joinPath(process.cwd(), 'nock'); describe('SuperfaceTest', () => { @@ -53,6 +59,7 @@ describe('SuperfaceTest', () => { mocked(exists).mockReset(); mocked(matchWildCard).mockReset(); mocked(writeRecordings).mockReset(); + mocked(saveReport).mockReset(); }); describe('run', () => { @@ -270,6 +277,101 @@ describe('SuperfaceTest', () => { ] ); }); + + describe('and when old traffic already exists', () => { + it('does not write recordings when old and new traffic match', async () => { + superfaceTest = new SuperfaceTest(testPayload); + + const sampleRecording = { + scope: 'https://localhost', + path: `/path`, + status: 200, + response: { value: 1 }, + }; + + const writeRecordingsSpy = mocked(writeRecordings); + jest.spyOn(recorder, 'play').mockReturnValueOnce([sampleRecording]); + const matcherSpy = jest + .spyOn(Matcher, 'match') + .mockResolvedValue({ valid: true }); + + mocked(prepareSuperface).mockResolvedValue(mockSuperface()); + mocked(exists).mockResolvedValue(true); + mocked(matchWildCard).mockReturnValueOnce(true); + mocked(readFileQuiet).mockResolvedValue( + JSON.stringify([sampleRecording]) + ); + + await superfaceTest.run({ input: {} }); + + expect(writeRecordingsSpy).not.toBeCalled(); + expect(matcherSpy).toBeCalledTimes(1); + expect(matcherSpy).toBeCalledWith( + [sampleRecording], + [sampleRecording] + ); + }); + + it('writes recordings when old and new traffic do not match', async () => { + superfaceTest = new SuperfaceTest(testPayload); + + const oldRecording = { + scope: 'https://localhost', + path: `/path`, + status: 200, + response: { value: 1 }, + }; + + const newRecording = { + scope: 'https://localhost', + path: `/path`, + status: 200, + response: { new_value: 1 }, + }; + + jest.spyOn(recorder, 'play').mockReturnValueOnce([newRecording]); + + const writeRecordingsSpy = mocked(writeRecordings); + const errors = { + added: [], + removed: [], + changed: [ + new MatchErrorResponse( + { + oldResponse: { value: 1 }, + 
newResponse: { new_value: 1 }, + }, + 'response property "value" is not present' + ), + ], + }; + const matcherSpy = jest.spyOn(Matcher, 'match').mockResolvedValue({ + valid: false, + errors, + }); + const saveReportSpy = mocked(saveReport); + + mocked(prepareSuperface).mockResolvedValue(mockSuperface()); + mocked(exists).mockResolvedValue(true); + mocked(matchWildCard).mockReturnValueOnce(true); + mocked(readFileQuiet).mockResolvedValue( + JSON.stringify([oldRecording]) + ); + + await superfaceTest.run({ input: {} }); + + expect(matcherSpy).toBeCalledTimes(1); + expect(matcherSpy).toBeCalledWith([oldRecording], [newRecording]); + + expect(writeRecordingsSpy).toBeCalledTimes(1); + expect(writeRecordingsSpy).toBeCalledWith( + expect.stringContaining('new'), + [newRecording] + ); + + expect(saveReportSpy).toBeCalledTimes(1); + }); + }); }); describe('when hashing recordings', () => { @@ -358,6 +460,12 @@ describe('SuperfaceTest', () => { describe('when loading recordings', () => { it('throws when recording fixture does not exist', async () => { + const testName = 'my-test-name'; + const expectedHash = generate(testName); + const recordingPath = resolvePath( + `nock/${testPayload.profile}/${testPayload.provider}/${testPayload.useCase}/recording-${expectedHash}.json` + ); + superfaceTest = new SuperfaceTest(testPayload); const recorderSpy = jest.spyOn(recorder, 'rec'); @@ -365,9 +473,9 @@ describe('SuperfaceTest', () => { mocked(matchWildCard).mockReturnValueOnce(false); mocked(prepareSuperface).mockResolvedValue(mockSuperface()); - await expect(superfaceTest.run({ input: {} })).rejects.toThrowError( - new RecordingsNotFoundError() - ); + await expect( + superfaceTest.run({ input: {}, testName }) + ).rejects.toThrowError(new RecordingsNotFoundError(recordingPath)); expect(recorderSpy).not.toHaveBeenCalled(); }); @@ -415,8 +523,10 @@ describe('SuperfaceTest', () => { mocked(matchWildCard).mockReturnValueOnce(true); - await expect(superfaceTest.run({ input: {} 
})).resolves.toEqual({ - error: new MapASTError('error').toString(), + await expect( + superfaceTest.run({ input: {} }, { fullError: true }) + ).resolves.toEqual({ + error: new MapASTError('error'), }); }); diff --git a/src/superface-test.ts b/src/superface-test.ts index 7dde225b..d4ee2f03 100644 --- a/src/superface-test.ts +++ b/src/superface-test.ts @@ -1,32 +1,37 @@ -import { - err, - MapInterpreterError, - ok, - ProfileParameterError, - Result, - UnexpectedError, -} from '@superfaceai/one-sdk'; +import { err, ok, Result } from '@superfaceai/one-sdk'; import createDebug from 'debug'; import { enableNetConnect, recorder, restore as restoreRecordings } from 'nock'; import { join as joinPath } from 'path'; -import { RecordingProcessOptions } from '.'; -import { UnexpectedError as UnexpectedErrorTesting } from './common/errors'; -import { - getFixtureName, - matchWildCard, - removeTimestamp, -} from './common/format'; +import { UnexpectedError } from './common/errors'; +import { getFixtureName, matchWildCard } from './common/format'; import { IGenerator } from './generate-hash'; -import { endRecording, loadRecording, startRecording } from './nock/recorder'; +import { MatchImpact } from './nock/analyzer'; +import { + canUpdateTraffic, + endRecording, + loadRecording, + startRecording, + updateTraffic, +} from './nock/recorder'; +import { report, saveReport } from './reporter'; import { prepareSuperface } from './superface/config'; import { + AlertFunction, + AnalysisResult, NockConfig, + PerformError, + RecordingProcessOptions, SuperfaceTestConfig, SuperfaceTestRun, TestingReturn, } from './superface-test.interfaces'; -import { getGenerator, searchValues } from './superface-test.utils'; +import { + getGenerator, + mapError, + parseBooleanEnv, + searchValues, +} from './superface-test.utils'; const debug = createDebug('superface:testing'); const debugSetup = createDebug('superface:testing:setup'); @@ -34,6 +39,7 @@ const debugHashing = 
createDebug('superface:testing:hash'); export class SuperfaceTest { private nockConfig?: NockConfig; + private analysis?: AnalysisResult; private generator: IGenerator; public configuration: SuperfaceTestConfig | undefined; @@ -68,17 +74,22 @@ export class SuperfaceTest { debugHashing('Created hash:', hash); const recordingPath = this.setupRecordingPath( - getFixtureName(sf.profileId, sf.providerName, sf.usecaseName), + getFixtureName(sf.profileId, sf.providerName, sf.useCaseName), hash ); debugSetup('Prepared path to recording:', recordingPath); + // Replace currently supported traffic with new (with changes) + if (await canUpdateTraffic(recordingPath)) { + await updateTraffic(recordingPath); + } + // Parse env variable and check if test should be recorded const record = matchWildCard( sf.profileId, sf.providerName, - sf.usecaseName, + sf.useCaseName, process.env.SUPERFACE_LIVE_API ); const processRecordings = options?.processRecordings ?? true; @@ -101,16 +112,13 @@ export class SuperfaceTest { }); } - let result: Result< - unknown, - ProfileParameterError | MapInterpreterError | UnexpectedError - >; + let result: Result<unknown, PerformError>; try { // Run perform method on specified configuration - result = await sf.boundProfileProvider.perform(sf.usecaseName, input); + result = await sf.boundProfileProvider.perform(sf.useCaseName, input); if (record) { - await endRecording({ + this.analysis = await endRecording({ recordingPath, processRecordings, inputVariables, @@ -134,10 +142,33 @@ export class SuperfaceTest { throw error; } + if ( + this.analysis && + this.analysis.impact !== MatchImpact.NONE && + !parseBooleanEnv(process.env.DISABLE_PROVIDER_CHANGES_COVERAGE) + ) { + await saveReport({ + input, + result, + hash: this.generator.hash({ input, testName }), + recordingPath, + profileId: sf.profileId, + providerName: sf.providerName, + useCaseName: sf.useCaseName, + analysis: this.analysis, + }); + } + + this.analysis = undefined; + if (result.isErr()) { debug('Perform failed with 
error:', result.error.toString()); - return err(removeTimestamp(result.error.toString())); + if (options?.fullError) { + return err(mapError(result.error)); + } + + return err(result.error.toString()); } if (result.isOk()) { @@ -146,7 +177,20 @@ export class SuperfaceTest { return ok(result.value); } - throw new UnexpectedErrorTesting('Unexpected result object'); + throw new UnexpectedError('Unexpected result object'); + } + + // static async collectData(): Promise<void> { + // await collect(); + // } + + static async report( + alert: AlertFunction, + options?: { + onlyFailedTests?: boolean; + } + ): Promise<void> { + await report(alert, options); + } /** @@ -158,7 +202,7 @@ export class SuperfaceTest { return joinPath( path ?? joinPath(process.cwd(), 'nock'), fixtureName, - `${fixture ?? 'recording'}-${inputHash}.json` + `${fixture ?? 'recording'}-${inputHash}` ); } } diff --git a/src/superface-test.utils.ts b/src/superface-test.utils.ts index fc91e29b..93a7ae0e 100644 --- a/src/superface-test.utils.ts +++ b/src/superface-test.utils.ts @@ -10,7 +10,6 @@ import { } from '@superfaceai/one-sdk'; import createDebug from 'debug'; -import { InputVariables, RecordingDefinition, RecordingDefinitions } from '.'; import { IGenerator, InputGenerateHash, @@ -22,8 +21,14 @@ import { replaceInputInDefinition, replaceParameterInDefinition, } from './nock'; +import { + InputVariables, + PerformError, + RecordingDefinition, + RecordingDefinitions, +} from './superface-test.interfaces'; -const debug = createDebug('superface:testing'); +const debugRecording = createDebug('superface:testing:recordings'); export function assertsDefinitionsAreNotStrings( definitions: string[] | RecordingDefinition[] @@ -36,7 +41,7 @@ export function assertsDefinitionsAreNotStrings( } export function resolveCredential(securityValue: SecurityValues): string { - debug('Resolving security value:', securityValue.id); + debugRecording('Resolving security value:', securityValue.id); if ('apikey' in securityValue) { if 
(securityValue.apikey.startsWith('$')) { @@ -135,11 +140,11 @@ export function replaceCredentials({ beforeSave: boolean; baseUrl: string; }): void { - debug('Replacing credentials from recording definitions'); + debugRecording('Replacing credentials from recording definitions'); for (const definition of definitions) { for (const securityConfig of security) { - debug( + debugRecording( `Going through scheme with id: '${securityConfig.id}' and type: '${securityConfig.type}'` ); @@ -157,7 +162,7 @@ export function replaceCredentials({ } for (const [name, value] of Object.entries(integrationParameters)) { - debug('Going through integration parameter:', name); + debugRecording('Going through integration parameter:', name); replaceParameterInDefinition({ definition, @@ -168,7 +173,7 @@ export function replaceCredentials({ if (inputVariables) { for (const [name, value] of Object.entries(inputVariables)) { - debug('Going through input property:', name); + debugRecording('Going through input property:', name); replaceInputInDefinition({ definition, @@ -336,3 +341,29 @@ export function getGenerator(testInstance: unknown): IGenerator { return new InputGenerateHash(); } + +export function parseBooleanEnv(variable: string | undefined): boolean { + if (variable === 'true') { + return true; + } + + if (variable === 'false') { + return false; + } + + return false; +} + +/** + * @param error - error returned from perform + * @returns perform error without ast metadata + */ +export function mapError(error: PerformError): PerformError { + const result = error; + + if ('metadata' in result) { + delete result.metadata; + } + + return result; +} diff --git a/src/superface/config/prepare-superface.test.ts b/src/superface/config/prepare-superface.test.ts index f50e205e..754e77dc 100644 --- a/src/superface/config/prepare-superface.test.ts +++ b/src/superface/config/prepare-superface.test.ts @@ -49,7 +49,7 @@ describe('prepare superface module', () => { ).resolves.toEqual({ profileId: 
'profile', providerName: 'provider', - usecaseName: 'test', + useCaseName: 'test', files: expectedFiles, boundProfileProvider: expectedBoundProfileProvider, }); diff --git a/src/superface/config/prepare-superface.ts b/src/superface/config/prepare-superface.ts index 12e34c75..2a1d672e 100644 --- a/src/superface/config/prepare-superface.ts +++ b/src/superface/config/prepare-superface.ts @@ -26,7 +26,7 @@ export interface SuperfaceConfiguration { boundProfileProvider: BoundProfileProvider; profileId: string; providerName: string; - usecaseName: string; + useCaseName: string; files: { superJson: NormalizedSuperJsonDocument; profileAst: ProfileDocumentNode; @@ -54,13 +54,13 @@ export async function prepareSuperface( fileSystem: options?.fileSystem, }); - const usecaseName = + const useCaseName = payload.useCase instanceof UseCase ? payload.useCase.name : payload.useCase; return { profileId: profileAstId(files.profileAst), providerName: files.providerJson.name, - usecaseName, + useCaseName, files, boundProfileProvider: createBoundProfileProvider({ ...files, diff --git a/src/superface/mock/superface.ts b/src/superface/mock/superface.ts index d027edf6..b6ab84a6 100644 --- a/src/superface/mock/superface.ts +++ b/src/superface/mock/superface.ts @@ -78,7 +78,7 @@ export const mockSuperface = (options?: { }, profileId: options?.profile?.name ?? 'profile', providerName: options?.provider?.name ?? 'provider', - usecaseName: options?.useCaseName ?? 'test', + useCaseName: options?.useCaseName ?? 
'test', boundProfileProvider, }; }; diff --git a/yarn.lock b/yarn.lock index 3c683b43..0f59d0d0 100644 --- a/yarn.lock +++ b/yarn.lock @@ -940,6 +940,16 @@ ajv@^8.0.1: require-from-string "^2.0.2" uri-js "^4.2.2" +ajv@^8.11.0: + version "8.11.0" + resolved "https://registry.yarnpkg.com/ajv/-/ajv-8.11.0.tgz#977e91dd96ca669f54a11e23e378e33b884a565f" + integrity sha512-wGgprdCvMalC0BztXvitD2hC04YffAvtsUn93JbGXYLAtCUO4xd17mCCZQxUOItiBwZvJScWo8NIvQMQ71rdpg== + dependencies: + fast-deep-equal "^3.1.1" + json-schema-traverse "^1.0.0" + require-from-string "^2.0.2" + uri-js "^4.2.2" + ansi-colors@^4.1.1: version "4.1.1" resolved "https://registry.yarnpkg.com/ansi-colors/-/ansi-colors-4.1.1.tgz#cbb9ae256bf750af1eab344f229aa27fe94ba348" @@ -1107,6 +1117,11 @@ braces@^3.0.1: dependencies: fill-range "^7.0.1" +brotli-wasm@^1.1.0: + version "1.2.0" + resolved "https://registry.yarnpkg.com/brotli-wasm/-/brotli-wasm-1.2.0.tgz#0f99b97b0020c8152308c277388aecf2a06b6e32" + integrity sha512-PdDi7awF36zFujZyFJb9UNrP1l+If7iCgXhLKE1SpwqFQSK2yc7w2dysOmME7p325yQaZNvae7ruzypB3YhFxA== + browser-process-hrtime@^1.0.0: version "1.0.0" resolved "https://registry.yarnpkg.com/browser-process-hrtime/-/browser-process-hrtime-1.0.0.tgz#3c9b4b7d782c8121e56f10106d84c0d0ffc94626" @@ -1845,6 +1860,11 @@ functional-red-black-tree@^1.0.1: resolved "https://registry.yarnpkg.com/functional-red-black-tree/-/functional-red-black-tree-1.0.1.tgz#1b0ab3bd553b2a0d6399d29c0e3ea0b252078327" integrity sha1-GwqzvVU7Kg1jmdKcDj6gslIHgyc= +genson-js@^0.0.8: + version "0.0.8" + resolved "https://registry.yarnpkg.com/genson-js/-/genson-js-0.0.8.tgz#b5a2c9dad7b821b0c08f103ccbe2de88e88b1f34" + integrity sha512-4NUusDTwF+lzYh72uKV+Uvpky9iPO+YDIMpGImA5pbHfLV9HwgRCA4hYjGu78V4J4Cx2IZRTFfRERn9aUs74mw== + gensync@^1.0.0-beta.2: version "1.0.0-beta.2" resolved "https://registry.yarnpkg.com/gensync/-/gensync-1.0.0-beta.2.tgz#32a6ee76c3d7f52d46b2b1ae5d93fea8580a25e0" @@ -1973,6 +1993,15 @@ html-escaper@^2.0.0: resolved 
"https://registry.yarnpkg.com/html-escaper/-/html-escaper-2.0.2.tgz#dfd60027da36a36dfcbe236262c00a5822681453" integrity sha512-H2iMtd0I4Mt5eYiapRdIDjp+XzelXQ0tFE4JS7YFwFevXXMmOp9myNrUvCg0D6ws8iqkRPBfKHgbwig1SmlLfg== +http-encoding@^1.5.1: + version "1.5.1" + resolved "https://registry.yarnpkg.com/http-encoding/-/http-encoding-1.5.1.tgz#ae0b48fbe97b5a2e0a211fdb3e5412bead8d9865" + integrity sha512-2m4JnG1Z5RX5pRMdccyp6rX1jVo4LO+ussQzWdwR4AmrWhtX0KP1NyslVAFAspQwMxt2P00CCWXIBKj7ILZLpQ== + dependencies: + brotli-wasm "^1.1.0" + pify "^5.0.0" + zstd-codec "^0.1.4" + http-proxy-agent@^4.0.1: version "4.0.1" resolved "https://registry.yarnpkg.com/http-proxy-agent/-/http-proxy-agent-4.0.1.tgz#8a8c8ef7f5932ccf953c296ca8291b95aa74aa3a" @@ -3136,6 +3165,11 @@ pify@^3.0.0: resolved "https://registry.yarnpkg.com/pify/-/pify-3.0.0.tgz#e5a4acd2c101fdf3d9a4d07f0dbc4db49dd28176" integrity sha1-5aSs0sEB/fPZpNB/DbxNtJ3SgXY= +pify@^5.0.0: + version "5.0.0" + resolved "https://registry.yarnpkg.com/pify/-/pify-5.0.0.tgz#1f5eca3f5e87ebec28cc6d54a0e4aaf00acc127f" + integrity sha512-eW/gHNMlxdSP6dmG6uJip6FXN0EQBwm2clYYd8Wul42Cwu/DK8HEftzsapcNdYe2MfLiIwZqsDk2RDEsTE79hA== + pirates@^4.0.1: version "4.0.1" resolved "https://registry.yarnpkg.com/pirates/-/pirates-4.0.1.tgz#643a92caf894566f91b2b986d2c66950a8e2fb87" @@ -3902,3 +3936,8 @@ yargs@^16.0.3: string-width "^4.2.0" y18n "^5.0.5" yargs-parser "^20.2.2" + +zstd-codec@^0.1.4: + version "0.1.4" + resolved "https://registry.yarnpkg.com/zstd-codec/-/zstd-codec-0.1.4.tgz#6abb311b63cfacbd06e72797ee6c6e1c7c65248c" + integrity sha512-KYnWoFWgGtWyQEKNnUcb3u8ZtKO8dn5d8u+oGpxPlopqsPyv60U8suDyfk7Z7UtAO6Sk5i1aVcAs9RbaB1n36A==