From d48b5224c09f0791fa89985e38eb574b2b43f280 Mon Sep 17 00:00:00 2001 From: Chengzhong Wu Date: Fri, 6 Dec 2024 14:22:06 +0000 Subject: [PATCH 01/88] doc: fix module.md headings PR-URL: https://github.com/nodejs/node/pull/56131 Reviewed-By: Antoine du Hamel Reviewed-By: Luigi Pinca Reviewed-By: Marco Ippolito --- doc/api/module.md | 338 +++++++++++++++++++++++----------------------- 1 file changed, 169 insertions(+), 169 deletions(-) diff --git a/doc/api/module.md b/doc/api/module.md index 19346182bf42d7..42809cea3b9565 100644 --- a/doc/api/module.md +++ b/doc/api/module.md @@ -64,159 +64,6 @@ const require = createRequire(import.meta.url); const siblingModule = require('./sibling-module'); ``` -### `module.constants.compileCacheStatus` - - - -> Stability: 1.1 - Active Development - -The following constants are returned as the `status` field in the object returned by -[`module.enableCompileCache()`][] to indicate the result of the attempt to enable the -[module compile cache][]. - - - - - - - - - - - - - - - - - - - - - - -
Constant | Description
ENABLED - Node.js has enabled the compile cache successfully. The directory used to store the - compile cache will be returned in the directory field in the - returned object. -
ALREADY_ENABLED - The compile cache has already been enabled before, either by a previous call to - module.enableCompileCache(), or by the NODE_COMPILE_CACHE=dir - environment variable. The directory used to store the - compile cache will be returned in the directory field in the - returned object. -
FAILED - Node.js fails to enable the compile cache. This can be caused by the lack of - permission to use the specified directory, or various kinds of file system errors. - The detail of the failure will be returned in the message field in the - returned object. -
DISABLED - Node.js cannot enable the compile cache because the environment variable - NODE_DISABLE_COMPILE_CACHE=1 has been set. -
- -### `module.enableCompileCache([cacheDir])` - - - -> Stability: 1.1 - Active Development - -* `cacheDir` {string|undefined} Optional path to specify the directory where the compile cache - will be stored/retrieved. -* Returns: {Object} - * `status` {integer} One of the [`module.constants.compileCacheStatus`][] - * `message` {string|undefined} If Node.js cannot enable the compile cache, this contains - the error message. Only set if `status` is `module.constants.compileCacheStatus.FAILED`. - * `directory` {string|undefined} If the compile cache is enabled, this contains the directory - where the compile cache is stored. Only set if `status` is - `module.constants.compileCacheStatus.ENABLED` or - `module.constants.compileCacheStatus.ALREADY_ENABLED`. - -Enable [module compile cache][] in the current Node.js instance. - -If `cacheDir` is not specified, Node.js will either use the directory specified by the -[`NODE_COMPILE_CACHE=dir`][] environment variable if it's set, or use -`path.join(os.tmpdir(), 'node-compile-cache')` otherwise. For general use cases, it's -recommended to call `module.enableCompileCache()` without specifying the `cacheDir`, -so that the directory can be overridden by the `NODE_COMPILE_CACHE` environment -variable when necessary. - -Since compile cache is supposed to be a quiet optimization that is not required for the -application to be functional, this method is designed to not throw any exception when the -compile cache cannot be enabled. Instead, it will return an object containing an error -message in the `message` field to aid debugging. -If compile cache is enabled successfully, the `directory` field in the returned object -contains the path to the directory where the compile cache is stored. The `status` -field in the returned object would be one of the `module.constants.compileCacheStatus` -values to indicate the result of the attempt to enable the [module compile cache][]. - -This method only affects the current Node.js instance. To enable it in child worker threads, -either call this method in child worker threads too, or set the -`process.env.NODE_COMPILE_CACHE` value to compile cache directory so the behavior can -be inherited into the child workers. The directory can be obtained either from the -`directory` field returned by this method, or with [`module.getCompileCacheDir()`][]. - -#### Module compile cache - - - -The module compile cache can be enabled either using the [`module.enableCompileCache()`][] -method or the [`NODE_COMPILE_CACHE=dir`][] environment variable. After it is enabled, -whenever Node.js compiles a CommonJS or a ECMAScript Module, it will use on-disk -[V8 code cache][] persisted in the specified directory to speed up the compilation. -This may slow down the first load of a module graph, but subsequent loads of the same module -graph may get a significant speedup if the contents of the modules do not change. - -To clean up the generated compile cache on disk, simply remove the cache directory. The cache -directory will be recreated the next time the same directory is used for for compile cache -storage. To avoid filling up the disk with stale cache, it is recommended to use a directory -under the [`os.tmpdir()`][]. If the compile cache is enabled by a call to -[`module.enableCompileCache()`][] without specifying the directory, Node.js will use -the [`NODE_COMPILE_CACHE=dir`][] environment variable if it's set, or defaults -to `path.join(os.tmpdir(), 'node-compile-cache')` otherwise. 
To locate the compile cache
directory used by a running Node.js instance, use [`module.getCompileCacheDir()`][].

Currently when using the compile cache with [V8 JavaScript code coverage][], the
coverage being collected by V8 may be less precise in functions that are
deserialized from the code cache. It's recommended to turn this off when
running tests to generate precise coverage.

The enabled module compile cache can be disabled by the [`NODE_DISABLE_COMPILE_CACHE=1`][]
environment variable. This can be useful when the compile cache leads to unexpected or
undesired behaviors (e.g. less precise test coverage).

Compilation cache generated by one version of Node.js can not be reused by a different
version of Node.js. Cache generated by different versions of Node.js will be stored
separately if the same base directory is used to persist the cache, so they can co-exist.

At the moment, when the compile cache is enabled and a module is loaded afresh, the
code cache is generated from the compiled code immediately, but will only be written
to disk when the Node.js instance is about to exit. This is subject to change. The
[`module.flushCompileCache()`][] method can be used to ensure the accumulated code cache
is flushed to disk in case the application wants to spawn other Node.js instances
and let them share the cache long before the parent exits.

### `module.getCompileCacheDir()`

> Stability: 1.1 - Active Development

* Returns: {string|undefined} Path to the [module compile cache][] directory if it is enabled,
  or `undefined` otherwise.

### `module.findPackageJSON(specifier[, base])`

+### Module compile cache
+
+The module compile cache can be enabled either using the [`module.enableCompileCache()`][]
+method or the [`NODE_COMPILE_CACHE=dir`][] environment variable. After it is enabled,
+whenever Node.js compiles a CommonJS or an ECMAScript Module, it will use on-disk
+[V8 code cache][] persisted in the specified directory to speed up the compilation.
+This may slow down the first load of a module graph, but subsequent loads of the same module
+graph may get a significant speedup if the contents of the modules do not change.
+
+To clean up the generated compile cache on disk, simply remove the cache directory. The cache
+directory will be recreated the next time the same directory is used for compile cache
+storage. To avoid filling up the disk with stale cache, it is recommended to use a directory
+under the [`os.tmpdir()`][]. If the compile cache is enabled by a call to
+[`module.enableCompileCache()`][] without specifying the directory, Node.js will use
+the [`NODE_COMPILE_CACHE=dir`][] environment variable if it's set, or defaults
+to `path.join(os.tmpdir(), 'node-compile-cache')` otherwise. To locate the compile cache
+directory used by a running Node.js instance, use [`module.getCompileCacheDir()`][].
+
+Currently when using the compile cache with [V8 JavaScript code coverage][], the
+coverage being collected by V8 may be less precise in functions that are
+deserialized from the code cache. It's recommended to turn this off when
+running tests to generate precise coverage.
+
+The enabled module compile cache can be disabled by the [`NODE_DISABLE_COMPILE_CACHE=1`][]
+environment variable. This can be useful when the compile cache leads to unexpected or
+undesired behaviors (e.g. less precise test coverage).
+
+Compilation cache generated by one version of Node.js cannot be reused by a different
+version of Node.js.
Cache generated by different versions of Node.js will be stored +separately if the same base directory is used to persist the cache, so they can co-exist. + +At the moment, when the compile cache is enabled and a module is loaded afresh, the +code cache is generated from the compiled code immediately, but will only be written +to disk when the Node.js instance is about to exit. This is subject to change. The +[`module.flushCompileCache()`][] method can be used to ensure the accumulated code cache +is flushed to disk in case the application wants to spawn other Node.js instances +and let them share the cache long before the parent exits. + +### `module.constants.compileCacheStatus` + + + +> Stability: 1.1 - Active Development + +The following constants are returned as the `status` field in the object returned by +[`module.enableCompileCache()`][] to indicate the result of the attempt to enable the +[module compile cache][]. + + + + + + + + + + + + + + + + + + + + + + +
Constant | Description
ENABLED + Node.js has enabled the compile cache successfully. The directory used to store the + compile cache will be returned in the directory field in the + returned object. +
ALREADY_ENABLED + The compile cache has already been enabled before, either by a previous call to + module.enableCompileCache(), or by the NODE_COMPILE_CACHE=dir + environment variable. The directory used to store the + compile cache will be returned in the directory field in the + returned object. +
FAILED + Node.js fails to enable the compile cache. This can be caused by the lack of + permission to use the specified directory, or various kinds of file system errors. + The detail of the failure will be returned in the message field in the + returned object. +
DISABLED + Node.js cannot enable the compile cache because the environment variable + NODE_DISABLE_COMPILE_CACHE=1 has been set. +
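
For example (a minimal sketch that assumes only the fields documented for
[`module.enableCompileCache()`][] below), a caller can branch on the returned `status`:

```mjs
import { enableCompileCache, constants } from 'node:module';

const { status, message, directory } = enableCompileCache();

if (status === constants.compileCacheStatus.FAILED) {
  // `message` is only set when enabling the cache failed.
  console.warn(`Could not enable the compile cache: ${message}`);
} else if (status !== constants.compileCacheStatus.DISABLED) {
  // ENABLED or ALREADY_ENABLED: `directory` points at the cache location.
  console.log(`Compile cache directory: ${directory}`);
}
```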
+
+### `module.enableCompileCache([cacheDir])`
+
+> Stability: 1.1 - Active Development
+
+* `cacheDir` {string|undefined} Optional path to specify the directory where the compile cache
+  will be stored/retrieved.
+* Returns: {Object}
+  * `status` {integer} One of the [`module.constants.compileCacheStatus`][]
+  * `message` {string|undefined} If Node.js cannot enable the compile cache, this contains
+    the error message. Only set if `status` is `module.constants.compileCacheStatus.FAILED`.
+  * `directory` {string|undefined} If the compile cache is enabled, this contains the directory
+    where the compile cache is stored. Only set if `status` is
+    `module.constants.compileCacheStatus.ENABLED` or
+    `module.constants.compileCacheStatus.ALREADY_ENABLED`.
+
+Enable [module compile cache][] in the current Node.js instance.
+
+If `cacheDir` is not specified, Node.js will either use the directory specified by the
+[`NODE_COMPILE_CACHE=dir`][] environment variable if it's set, or use
+`path.join(os.tmpdir(), 'node-compile-cache')` otherwise. For general use cases, it's
+recommended to call `module.enableCompileCache()` without specifying the `cacheDir`,
+so that the directory can be overridden by the `NODE_COMPILE_CACHE` environment
+variable when necessary.
+
+Since the compile cache is supposed to be a quiet optimization that is not required for the
+application to be functional, this method is designed to not throw any exception when the
+compile cache cannot be enabled. Instead, it will return an object containing an error
+message in the `message` field to aid debugging.
+If the compile cache is enabled successfully, the `directory` field in the returned object
+contains the path to the directory where the compile cache is stored. The `status`
+field in the returned object would be one of the `module.constants.compileCacheStatus`
+values to indicate the result of the attempt to enable the [module compile cache][].
+
+This method only affects the current Node.js instance. To enable it in child worker threads,
+either call this method in child worker threads too, or set the
+`process.env.NODE_COMPILE_CACHE` value to the compile cache directory so the behavior can
+be inherited into the child workers. The directory can be obtained either from the
+`directory` field returned by this method, or with [`module.getCompileCacheDir()`][].
+
+### `module.flushCompileCache()`
+
+> Stability: 1.1 - Active Development
+
+Flush the [module compile cache][] accumulated from modules already loaded
+in the current Node.js instance to disk. This returns after all the flushing
+file system operations come to an end, whether they succeed or not. If there
+are any errors, this will fail silently, since compile cache misses should not
+interfere with the actual operation of the application.
+
+### `module.getCompileCacheDir()`
+
+> Stability: 1.1 - Active Development
+
+* Returns: {string|undefined} Path to the [module compile cache][] directory if it is enabled,
+  or `undefined` otherwise.
+

## Customization Hooks

@@ -1285,21 +1300,6 @@ added:
`path` is the resolved path for the file for which a corresponding source
map should be fetched.

-### `module.flushCompileCache()`
-
-> Stability: 1.1 - Active Development
-
-Flush the [module compile cache][] accumulated from modules already loaded
-in the current Node.js instance to disk. This returns after all the flushing
-file system operations come to an end, no matter they succeed or not.
If there -are any errors, this will fail silently, since compile cache misses should not -interfere with the actual operation of the application. - ### Class: `module.SourceMap` + +> Stability: 1.1 - Active development + +* `options` {Object} + * `load` {Function|undefined} See [load hook][]. **Default:** `undefined`. + * `resolve` {Function|undefined} See [resolve hook][]. **Default:** `undefined`. + +Register [hooks][] that customize Node.js module resolution and loading behavior. +See [Customization hooks][]. + ### `module.stripTypeScriptTypes(code[, options])` -> Stability: 1.2 - Release candidate - +> Stability: 1.2 - Release candidate (asynchronous version) +> Stability: 1.1 - Active development (synchronous version) + +There are two types of module customization hooks that are currently supported: + +1. `module.register(specifier[, parentURL][, options])` which takes a module that + exports asynchronous hook functions. The functions are run on a separate loader + thread. +2. `module.registerHooks(options)` which takes synchronous hook functions that are + run directly on the thread where the module is loaded. + ### Enabling -Module resolution and loading can be customized by registering a file which -exports a set of hooks. This can be done using the [`register`][] method -from `node:module`, which you can run before your application code by -using the `--import` flag: +Module resolution and loading can be customized by: + +1. Registering a file which exports a set of asynchronous hook functions, using the + [`register`][] method from `node:module`, +2. Registering a set of synchronous hook functions using the [`registerHooks`][] method + from `node:module`. + +The hooks can be registered before the application code is run by using the +[`--import`][] or [`--require`][] flag: ```bash node --import ./register-hooks.js ./my-app.js +node --require ./register-hooks.js ./my-app.js ``` ```mjs // register-hooks.js +// This file can only be require()-ed if it doesn't contain top-level await. +// Use module.register() to register asynchronous hooks in a dedicated thread. import { register } from 'node:module'; - register('./hooks.mjs', import.meta.url); ``` @@ -556,24 +590,46 @@ register('./hooks.mjs', import.meta.url); // register-hooks.js const { register } = require('node:module'); const { pathToFileURL } = require('node:url'); - +// Use module.register() to register asynchronous hooks in a dedicated thread. register('./hooks.mjs', pathToFileURL(__filename)); ``` -The file passed to `--import` can also be an export from a dependency: +```mjs +// Use module.registerHooks() to register synchronous hooks in the main thread. +import { registerHooks } from 'node:module'; +registerHooks({ + resolve(specifier, context, nextResolve) { /* implementation */ }, + load(url, context, nextLoad) { /* implementation */ }, +}); +``` + +```cjs +// Use module.registerHooks() to register synchronous hooks in the main thread. +const { registerHooks } = require('node:module'); +registerHooks({ + resolve(specifier, context, nextResolve) { /* implementation */ }, + load(url, context, nextLoad) { /* implementation */ }, +}); +``` + +The file passed to `--import` or `--require` can also be an export from a dependency: ```bash node --import some-package/register ./my-app.js +node --require some-package/register ./my-app.js ``` Where `some-package` has an [`"exports"`][] field defining the `/register` export to map to a file that calls `register()`, like the following `register-hooks.js` example. 
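
For illustration, a hypothetical `package.json` for `some-package` could define that
export as follows (a sketch; the registration file name is an assumption):

```json
{
  "name": "some-package",
  "exports": {
    "./register": "./register-hooks.js"
  }
}
```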
-Using `--import` ensures that the hooks are registered before any application
-files are imported, including the entry point of the application. Alternatively,
-`register` can be called from the entry point, but dynamic `import()` must be
-used for any code that should be run after the hooks are registered:
+Using `--import` or `--require` ensures that the hooks are registered before any
+application files are imported, including the entry point of the application and for
+any worker threads by default as well.
+
+Alternatively, `register()` and `registerHooks()` can be called from the entry point,
+though dynamic `import()` must be used for any ESM code that should be run after the hooks
+are registered.

```mjs
import { register } from 'node:module';
@@ -596,18 +652,52 @@ register('http-to-https', pathToFileURL(__filename));
import('./my-app.js');
```

+Customization hooks will run for any modules loaded later than the registration
+and the modules they reference via `import` and the built-in `require`.
+The `require` function created by users using `module.createRequire()` can only be
+customized by the synchronous hooks.
+
In this example, we are registering the `http-to-https` hooks, but they will
-only be available for subsequently imported modules—in this case, `my-app.js`
-and anything it references via `import` (and optionally `require`). If the
-`import('./my-app.js')` had instead been a static `import './my-app.js'`, the
+only be available for subsequently imported modules — in this case, `my-app.js`
+and anything it references via `import` or built-in `require` in CommonJS dependencies.
+
+If the `import('./my-app.js')` had instead been a static `import './my-app.js'`, the
app would have _already_ been loaded **before** the `http-to-https` hooks were
registered. This is due to the ES modules specification, where static imports are
evaluated from the leaves of the tree first, then back to the trunk. There can
be static imports _within_ `my-app.js`, which will not be evaluated until
`my-app.js` is dynamically imported.

-`my-app.js` can also be CommonJS. Customization hooks will run for any
-modules that it references via `import` (and optionally `require`).
+If synchronous hooks are used, `import`, `require`, and user `require` functions created
+using `createRequire()` are all supported.
+
+```mjs
+import { registerHooks, createRequire } from 'node:module';
+
+registerHooks({ /* implementation of synchronous hooks */ });
+
+const require = createRequire(import.meta.url);
+
+// The synchronous hooks affect import, require() and user require() functions
+// created through createRequire().
+await import('./my-app.js');
+require('./my-app-2.js');
+```
+
+```cjs
+const { registerHooks, createRequire } = require('node:module');
+
+registerHooks({ /* implementation of synchronous hooks */ });
+
+const userRequire = createRequire(__filename);
+
+// The synchronous hooks affect import, require() and user require() functions
+// created through createRequire().
+import('./my-app.js');
+require('./my-app-2.js');
+userRequire('./my-app-3.js');
+```

Finally, if all you want to do is register hooks before your app runs and you
don't want to create a separate file for that purpose, you can pass a `data:`
@@ -657,9 +747,36 @@ earlier registered hooks transpile into JavaScript.

The `register` method cannot be called from within the module that defines the
hooks.
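
As a sketch of the `data:` URL approach mentioned above, the hooks can be registered
inline, mirroring the invocations used in the Examples section below (`./hooks.mjs` is
assumed to be a file exporting the hook functions):

```bash
node --import 'data:text/javascript,import { register } from "node:module"; import { pathToFileURL } from "node:url"; register(pathToFileURL("./hooks.mjs"));' ./my-app.js
```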
+Chaining of `registerHooks` works similarly. If synchronous and asynchronous
+hooks are mixed, the synchronous hooks always run before the asynchronous
+hooks start running; that is, the `next` hook invoked by the last synchronous
+hook in the chain is the entry into the asynchronous hooks.
+
+```mjs
+// entrypoint.mjs
+import { registerHooks } from 'node:module';
+
+const hook1 = { /* implementation of hooks */ };
+const hook2 = { /* implementation of hooks */ };
+// hook2 runs before hook1.
+registerHooks(hook1);
+registerHooks(hook2);
+```
+
+```cjs
+// entrypoint.cjs
+const { registerHooks } = require('node:module');
+
+const hook1 = { /* implementation of hooks */ };
+const hook2 = { /* implementation of hooks */ };
+// hook2 runs before hook1.
+registerHooks(hook1);
+registerHooks(hook2);
+```
+
### Communication with module customization hooks

-Module customization hooks run on a dedicated thread, separate from the main
+Asynchronous hooks run on a dedicated thread, separate from the main
thread that runs application code. This means mutating global variables won't
affect the other thread(s), and message channels must be used to communicate
between the threads.

@@ -708,8 +825,13 @@ register('./my-hooks.mjs', {
});
```

+Synchronous module hooks are run on the same thread where the application code is
+run. They can directly mutate the globals of the context accessed by the main thread.
+
### Hooks

+#### Asynchronous hooks accepted by `module.register()`
+
The [`register`][] method can be used to register a module that exports a set
of hooks. The hooks are functions that are called by Node.js to customize the
module resolution and loading process. The exported functions must have specific
names and signatures, and they must be exported as named exports.

```mjs
export async function initialize({ number, port }) {
  // Receives data from `register`.
}

export async function resolve(specifier, context, nextResolve) {
  // Take an `import` or `require` specifier and resolve it to a URL.
}

export async function load(url, context, nextLoad) {
  // Take a resolved URL and return the source code to be evaluated.
}
```

+Asynchronous hooks are run in a separate thread, isolated from the main thread where
+application code runs. That means it is a different [realm][]. The hooks thread
+may be terminated by the main thread at any time, so do not depend on
+asynchronous operations (like `console.log`) to complete. They are inherited into
+child workers by default.
+
+#### Synchronous hooks accepted by `module.registerHooks()`
+
+> Stability: 1.1 - Active development
+
+The `module.registerHooks()` method accepts synchronous hook functions.
+`initialize()` is not supported nor necessary, as the hook implementer
+can simply run the initialization code directly before the call to
+`module.registerHooks()`.
+
+```mjs
+function resolve(specifier, context, nextResolve) {
+  // Take an `import` or `require` specifier and resolve it to a URL.
+}
+
+function load(url, context, nextLoad) {
+  // Take a resolved URL and return the source code to be evaluated.
+}
+```
+
+Synchronous hooks are run in the same thread and the same [realm][] where the modules
+are loaded. Unlike the asynchronous hooks, they are not inherited into child worker
+threads by default, though if the hooks are registered using a file preloaded by
+[`--import`][] or [`--require`][], child worker threads can inherit the preloaded scripts
+via `process.execArgv` inheritance. See [the documentation of `Worker`][] for details.
+
+In synchronous hooks, users can expect `console.log()` to complete in the same way that
+they expect `console.log()` in module code to complete.
+
+#### Conventions of hooks
+
Hooks are part of a [chain][], even if that chain consists of only one custom
(user-provided) hook and the default hook, which is always present.
Hook functions nest: each one must always return a plain object, and chaining happens @@ -741,11 +903,6 @@ hook that returns without calling `next()` _and_ without returning prevent unintentional breaks in the chain. Return `shortCircuit: true` from a hook to signal that the chain is intentionally ending at your hook. -Hooks are run in a separate thread, isolated from the main thread where -application code runs. That means it is a different [realm][]. The hooks thread -may be terminated by the main thread at any time, so do not depend on -asynchronous operations (like `console.log`) to complete. - #### `initialize()` -> Stability: 1.2 - Release candidate +> Stability: 1.2 - Release candidate (asynchronous version) +> Stability: 1.1 - Active development (synchronous version) * `specifier` {string} * `context` {Object} @@ -863,7 +1028,9 @@ changes: Node.js default `resolve` hook after the last user-supplied `resolve` hook * `specifier` {string} * `context` {Object} -* Returns: {Object|Promise} +* Returns: {Object|Promise} The asynchronous version takes either an object containing the + following properties, or a `Promise` that will resolve to such an object. The + synchronous version only accepts an object returned synchronously. * `format` {string|null|undefined} A hint to the load hook (it might be ignored) `'builtin' | 'commonjs' | 'json' | 'module' | 'wasm'` @@ -873,8 +1040,9 @@ changes: terminate the chain of `resolve` hooks. **Default:** `false` * `url` {string} The absolute URL to which this input resolves -> **Warning** Despite support for returning promises and async functions, calls -> to `resolve` may block the main thread which can impact performance. +> **Warning** In the case of the asynchronous version, despite support for returning +> promises and async functions, calls to `resolve` may still block the main thread which +> can impact performance. The `resolve` hook chain is responsible for telling Node.js where to find and how to cache a given `import` statement or expression, or `require` call. It can @@ -889,8 +1057,8 @@ the internal module cache. The `resolve` hook is responsible for returning an `importAttributes` object if the module should be cached with different attributes than were present in the source code. -The `conditions` property in `context` is an array of conditions for -[package exports conditions][Conditional exports] that apply to this resolution +The `conditions` property in `context` is an array of conditions that will be used +to match [package exports conditions][Conditional exports] for this resolution request. They can be used for looking up conditional mappings elsewhere or to modify the list when calling the default resolution logic. @@ -900,7 +1068,11 @@ Node.js module specifier resolution behavior_ when calling `defaultResolve`, the `context.conditions` array passed to it _must_ include _all_ elements of the `context.conditions` array originally passed into the `resolve` hook. + + ```mjs +// Asynchronous version accepted by module.register(). export async function resolve(specifier, context, nextResolve) { const { parentURL = null } = context; @@ -930,10 +1102,21 @@ export async function resolve(specifier, context, nextResolve) { } ``` +```mjs +// Synchronous version accepted by module.registerHooks(). +function resolve(specifier, context, nextResolve) { + // Similar to the asynchronous resolve() above, since that one does not have + // any asynchronous logic. 
+}
+```
+
#### `load(url, context, nextLoad)`

-> Stability: 1.2 - Release candidate
+> Stability: 1.2 - Release candidate (asynchronous version)
+> Stability: 1.1 - Active development (synchronous version)

* `url` {string} The URL returned by the `resolve` chain
* `context` {Object}
@@ -958,7 +1142,9 @@ changes:
  Node.js default `load` hook after the last user-supplied `load` hook
  * `url` {string}
  * `context` {Object}
-* Returns: {Object}
+* Returns: {Object|Promise} The asynchronous version takes either an object containing the
+  following properties, or a `Promise` that will resolve to such an object. The
+  synchronous version only accepts an object returned synchronously.
  * `format` {string}
  * `shortCircuit` {undefined|boolean} A signal that this hook intends to
    terminate the chain of `load` hooks. **Default:** `false`
@@ -981,7 +1167,10 @@ The final value of `format` must be one of the following:

The value of `source` is ignored for type `'builtin'` because currently it is
not possible to replace the value of a Node.js builtin (core) module.

-Omitting vs providing a `source` for `'commonjs'` has very different effects:
+##### Caveat in the asynchronous `load` hook
+
+When using the asynchronous `load` hook, omitting vs providing a `source` for
+`'commonjs'` has very different effects:

* When a `source` is provided, all `require` calls from this module will be
  processed by the ESM loader with registered `resolve` and `load` hooks; all
@@ -995,7 +1184,12 @@ Omitting vs providing a `source` for `'commonjs'` has very different effects:
  registered hooks. This behavior for nullish `source` is temporary — in the
  future, nullish `source` will not be supported.

-The Node.js internal `load` implementation, which is the value of `next` for the
+These caveats do not apply to the synchronous `load` hook, in which case
+the complete set of CommonJS APIs is available to the customized CommonJS
+modules, and `require`/`require.resolve` always go through the registered
+hooks.
+
+The Node.js internal asynchronous `load` implementation, which is the value of `next` for the
last hook in the `load` chain, returns `null` for `source` when `format` is
`'commonjs'` for backward compatibility. Here is an example hook that would
opt-in to using the non-default behavior:

```mjs
import { readFile } from 'node:fs/promises';

+// Asynchronous version accepted by module.register(). This fix is not needed
+// for the synchronous version accepted by module.registerHooks().
export async function load(url, context, nextLoad) {
  const result = await nextLoad(url, context);
  if (result.format === 'commonjs') {
@@ -1012,9 +1208,14 @@ opt-in to using the non-default behavior:
  }
```

-> **Warning**: The ESM `load` hook and namespaced exports from CommonJS modules
-> are incompatible. Attempting to use them together will result in an empty
-> object from the import. This may be addressed in the future.
+This doesn't apply to the synchronous `load` hook either, in which case the
+`source` returned contains source code loaded by the next hook, regardless
+of module format.
+
+> **Warning**: The asynchronous `load` hook and namespaced exports from CommonJS
+> modules are incompatible. Attempting to use them together will result in an empty
+> object from the import. This may be addressed in the future. This does not apply
+> to the synchronous `load` hook, in which case exports can be used as usual.

> These types all correspond to classes defined in ECMAScript.
@@ -1030,6 +1231,7 @@ reading files from disk. It could also be used to map an unrecognized format to a supported one, for example `yaml` to `module`. ```mjs +// Asynchronous version accepted by module.register(). export async function load(url, context, nextLoad) { const { format } = context; @@ -1053,6 +1255,14 @@ export async function load(url, context, nextLoad) { } ``` +```mjs +// Synchronous version accepted by module.registerHooks(). +function load(url, context, nextLoad) { + // Similar to the asynchronous load() above, since that one does not have + // any asynchronous logic. +} +``` + In a more advanced scenario, this can also be used to transform an unsupported source to a supported one (see [Examples](#examples) below). @@ -1111,6 +1321,10 @@ With the preceding hooks module, running prints the current version of CoffeeScript per the module at the URL in `main.mjs`. + + #### Transpilation Sources that are in formats Node.js doesn't understand can be converted into @@ -1119,6 +1333,8 @@ JavaScript using the [`load` hook][load hook]. This is less performant than transpiling source files before running Node.js; transpiler hooks should only be used for development and testing purposes. +##### Asynchronous version + ```mjs // coffeescript-hooks.mjs import { readFile } from 'node:fs/promises'; @@ -1184,6 +1400,57 @@ async function getPackageType(url) { } ``` +##### Synchronous version + +```mjs +// coffeescript-sync-hooks.mjs +import { readFileSync } from 'node:fs/promises'; +import { registerHooks } from 'node:module'; +import { dirname, extname, resolve as resolvePath } from 'node:path'; +import { cwd } from 'node:process'; +import { fileURLToPath, pathToFileURL } from 'node:url'; +import coffeescript from 'coffeescript'; + +const extensionsRegex = /\.(coffee|litcoffee|coffee\.md)$/; + +function load(url, context, nextLoad) { + if (extensionsRegex.test(url)) { + const format = getPackageType(url); + + const { source: rawSource } = nextLoad(url, { ...context, format }); + const transformedSource = coffeescript.compile(rawSource.toString(), url); + + return { + format, + shortCircuit: true, + source: transformedSource, + }; + } + + return nextLoad(url); +} + +function getPackageType(url) { + const isFilePath = !!extname(url); + const dir = isFilePath ? 
dirname(fileURLToPath(url)) : url;
+  const packagePath = resolvePath(dir, 'package.json');
+
+  let type;
+  try {
+    const filestring = readFileSync(packagePath, { encoding: 'utf8' });
+    type = JSON.parse(filestring).type;
+  } catch (err) {
+    if (err?.code !== 'ENOENT') console.error(err);
+  }
+  if (type) return type;
+  return dir.length > 1 && getPackageType(resolvePath(dir, '..'));
+}
+
+registerHooks({ load });
+```
+
+#### Running hooks
+
```coffee
# main.coffee
import { scream } from './scream.coffee'
console.log scream 'hello, world'

import { version } from 'node:process'
console.log "Brought to you by Node.js version #{version}"
```

```coffee
# scream.coffee
export scream = (str) -> str.toUpperCase()
```

-With the preceding hooks module, running
+With the preceding hooks modules, running
`node --import 'data:text/javascript,import { register } from "node:module"; import { pathToFileURL } from "node:url"; register(pathToFileURL("./coffeescript-hooks.mjs"));' ./main.coffee`
+or `node --import ./coffeescript-sync-hooks.mjs ./main.coffee`
causes `main.coffee` to be turned into JavaScript after its source code is
loaded from disk but before Node.js executes it; and so on for any `.coffee`,
`.litcoffee` or `.coffee.md` files referenced via `import` statements of any

#### Import maps

The previous two examples defined `load` hooks. This is an example of a
`resolve` hook that decides which specifiers to override to other URLs (this is a very simplistic
implementation of a small subset of the "import maps" specification).

+##### Asynchronous version
+
```mjs
// import-map-hooks.js
import fs from 'node:fs/promises';

const { imports } = JSON.parse(await fs.readFile('import-map.json', 'utf-8'));

export async function resolve(specifier, context, nextResolve) {
  if (Object.hasOwn(imports, specifier)) {
    return nextResolve(imports[specifier], context);
  }

  return nextResolve(specifier, context);
}
```

+##### Synchronous version
+
+```mjs
+// import-map-sync-hooks.js
+import fs from 'node:fs';
+import module from 'node:module';
+
+const { imports } = JSON.parse(fs.readFileSync('import-map.json', 'utf-8'));
+
+function resolve(specifier, context, nextResolve) {
+  if (Object.hasOwn(imports, specifier)) {
+    return nextResolve(imports[specifier], context);
+  }
+
+  return nextResolve(specifier, context);
+}
+
+module.registerHooks({ resolve });
+```
+
+##### Using the hooks
+
With these files:

```mjs
// main.js
import 'a-module';
```

```json
// import-map.json
{
  "imports": {
    "a-module": "./some-module.js"
  }
}
```

```js
// some-module.js
console.log('some module!');
```

Running `node --import 'data:text/javascript,import { register } from "node:module"; import { pathToFileURL } from "node:url"; register(pathToFileURL("./import-map-hooks.js"));' main.js`
+or `node --import ./import-map-sync-hooks.js main.js`
should print `some module!`.
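
Based on the `ModuleHooks` class added in `lib/internal/modules/customization_hooks.js`
below, `registerHooks()` returns a handle exposing a `deregister()` method. A sketch of
temporarily installed hooks, assuming that behavior:

```mjs
import { registerHooks } from 'node:module';

const handle = registerHooks({
  resolve(specifier, context, nextResolve) {
    // Log each resolution, then defer to the rest of the chain.
    console.log(`resolving ${specifier}`);
    return nextResolve(specifier, context);
  },
});

await import('./my-app.js'); // Loaded through the hook above.

handle.deregister(); // Subsequent loads skip the deregistered hook.
```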
## Source map v3 support @@ -1404,6 +1697,8 @@ returned object contains the following keys: [V8 code cache]: https://v8.dev/blog/code-caching-for-devs [`"exports"`]: packages.md#exports [`--enable-source-maps`]: cli.md#--enable-source-maps +[`--import`]: cli.md#--importmodule +[`--require`]: cli.md#-r---require-module [`ArrayBuffer`]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer [`NODE_COMPILE_CACHE=dir`]: cli.md#node_compile_cachedir [`NODE_DISABLE_COMPILE_CACHE=1`]: cli.md#node_disable_compile_cache1 @@ -1419,6 +1714,7 @@ returned object contains the following keys: [`module.getCompileCacheDir()`]: #modulegetcompilecachedir [`module`]: #the-module-object [`os.tmpdir()`]: os.md#ostmpdir +[`registerHooks`]: #moduleregisterhooksoptions [`register`]: #moduleregisterspecifier-parenturl-options [`string`]: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String [`util.TextDecoder`]: util.md#class-utiltextdecoder @@ -1429,7 +1725,9 @@ returned object contains the following keys: [module wrapper]: modules.md#the-module-wrapper [prefix-only modules]: modules.md#built-in-modules-with-mandatory-node-prefix [realm]: https://tc39.es/ecma262/#realm +[resolve hook]: #resolvespecifier-context-nextresolve [source map include directives]: https://sourcemaps.info/spec.html#h.lmz475t4mvbx +[the documentation of `Worker`]: worker_threads.md#new-workerfilename-options [transferable objects]: worker_threads.md#portpostmessagevalue-transferlist [transform TypeScript features]: typescript.md#typescript-features [type-stripping]: typescript.md#type-stripping diff --git a/lib/internal/modules/cjs/loader.js b/lib/internal/modules/cjs/loader.js index f99d0fc2a7a0eb..1aadb45f936d6c 100644 --- a/lib/internal/modules/cjs/loader.js +++ b/lib/internal/modules/cjs/loader.js @@ -102,6 +102,7 @@ const kIsCachedByESMLoader = Symbol('kIsCachedByESMLoader'); const kRequiredModuleSymbol = Symbol('kRequiredModuleSymbol'); const kIsExecuting = Symbol('kIsExecuting'); +const kURL = Symbol('kURL'); const kFormat = Symbol('kFormat'); // Set first due to cycle with ESM loader functions. 
@@ -112,6 +113,9 @@ module.exports = { kModuleCircularVisited, initializeCJS, Module, + findLongestRegisteredExtension, + resolveForCJSWithHooks, + loadSourceForCJSWithHooks: loadSource, wrapSafe, wrapModuleLoad, kIsMainSymbol, @@ -157,6 +161,15 @@ const { stripBOM, toRealPath, } = require('internal/modules/helpers'); +const { + convertCJSFilenameToURL, + convertURLToCJSFilename, + loadHooks, + loadWithHooks, + registerHooks, + resolveHooks, + resolveWithHooks, +} = require('internal/modules/customization_hooks'); const { stripTypeScriptModuleTypes } = require('internal/modules/typescript'); const packageJsonReader = require('internal/modules/package_json_reader'); const { getOptionValue, getEmbedderOptions } = require('internal/options'); @@ -173,6 +186,7 @@ const { ERR_REQUIRE_CYCLE_MODULE, ERR_REQUIRE_ESM, ERR_UNKNOWN_BUILTIN_MODULE, + ERR_UNKNOWN_MODULE_FORMAT, }, setArrowMessage, } = require('internal/errors'); @@ -585,7 +599,7 @@ function trySelfParentPath(parent) { * @param {string} parentPath The path of the parent module * @param {string} request The module request to resolve */ -function trySelf(parentPath, request) { +function trySelf(parentPath, request, conditions) { if (!parentPath) { return false; } const pkg = packageJsonReader.getNearestParentPackageJSON(parentPath); @@ -606,7 +620,7 @@ function trySelf(parentPath, request) { const { packageExportsResolve } = require('internal/modules/esm/resolve'); return finalizeEsmResolution(packageExportsResolve( pathToFileURL(pkg.path), expansion, pkg.data, - pathToFileURL(parentPath), getCjsConditions()), parentPath, pkg.path); + pathToFileURL(parentPath), conditions), parentPath, pkg.path); } catch (e) { if (e.code === 'ERR_MODULE_NOT_FOUND') { throw createEsmNotFoundErr(request, pkg.path); @@ -627,7 +641,7 @@ const EXPORTS_PATTERN = /^((?:@[^/\\%]+\/)?[^./\\%][^/\\%]*)(\/.*)?$/; * @param {string} nmPath The path to the module. * @param {string} request The request for the module. */ -function resolveExports(nmPath, request) { +function resolveExports(nmPath, request, conditions) { // The implementation's behavior is meant to mirror resolution in ESM. const { 1: name, 2: expansion = '' } = RegExpPrototypeExec(EXPORTS_PATTERN, request) || kEmptyObject; @@ -639,7 +653,7 @@ function resolveExports(nmPath, request) { const { packageExportsResolve } = require('internal/modules/esm/resolve'); return finalizeEsmResolution(packageExportsResolve( pathToFileURL(pkgPath + '/package.json'), '.' 
+ expansion, pkg, null, - getCjsConditions()), null, pkgPath); + conditions), null, pkgPath); } catch (e) { if (e.code === 'ERR_MODULE_NOT_FOUND') { throw createEsmNotFoundErr(request, pkgPath + '/package.json'); @@ -681,7 +695,7 @@ function getDefaultExtensions() { * @param {boolean} isMain Whether the request is the main app entry point * @returns {string | false} */ -Module._findPath = function(request, paths, isMain) { +Module._findPath = function(request, paths, isMain, conditions = getCjsConditions()) { const absoluteRequest = path.isAbsolute(request); if (absoluteRequest) { paths = ['']; @@ -736,7 +750,7 @@ Module._findPath = function(request, paths, isMain) { } if (!absoluteRequest) { - const exportsResolved = resolveExports(curPath, request); + const exportsResolved = resolveExports(curPath, request, conditions); if (exportsResolved) { return exportsResolved; } @@ -1017,6 +1031,153 @@ function getExportsForCircularRequire(module) { return module.exports; } +/** + * Resolve a module request for CommonJS, invoking hooks from module.registerHooks() + * if necessary. + * @param {string} specifier + * @param {Module|undefined} parent + * @param {boolean} isMain + * @returns {{url?: string, format?: string, parentURL?: string, filename: string}} + */ +function resolveForCJSWithHooks(specifier, parent, isMain) { + let defaultResolvedURL; + let defaultResolvedFilename; + let format; + + function defaultResolveImpl(specifier, parent, isMain, options) { + // For backwards compatibility, when encountering requests starting with node:, + // throw ERR_UNKNOWN_BUILTIN_MODULE on failure or return the normalized ID on success + // without going into Module._resolveFilename. + let normalized; + if (StringPrototypeStartsWith(specifier, 'node:')) { + normalized = BuiltinModule.normalizeRequirableId(specifier); + if (!normalized) { + throw new ERR_UNKNOWN_BUILTIN_MODULE(specifier); + } + defaultResolvedURL = specifier; + format = 'builtin'; + return normalized; + } + return Module._resolveFilename(specifier, parent, isMain, options).toString(); + } + + // Fast path: no hooks, just return simple results. + if (!resolveHooks.length) { + const filename = defaultResolveImpl(specifier, parent, isMain); + return { __proto__: null, url: defaultResolvedURL, filename, format }; + } + + // Slow path: has hooks, do the URL conversions and invoke hooks with contexts. + let parentURL; + if (parent) { + if (!parent[kURL] && parent.filename) { + parent[kURL] = convertCJSFilenameToURL(parent.filename); + } + parentURL = parent[kURL]; + } + + // This is used as the last nextResolve for the resolve hooks. + function defaultResolve(specifier, context) { + // TODO(joyeecheung): parent and isMain should be part of context, then we + // no longer need to use a different defaultResolve for every resolution. + defaultResolvedFilename = defaultResolveImpl(specifier, parent, isMain, { + __proto__: null, + conditions: context.conditions, + }); + + defaultResolvedURL = convertCJSFilenameToURL(defaultResolvedFilename); + return { __proto__: null, url: defaultResolvedURL }; + } + + const resolveResult = resolveWithHooks(specifier, parentURL, /* importAttributes */ undefined, + getCjsConditions(), defaultResolve); + const { url } = resolveResult; + format = resolveResult.format; + + let filename; + if (url === defaultResolvedURL) { // Not overridden, skip the re-conversion. 
+    filename = defaultResolvedFilename;
+  } else {
+    filename = convertURLToCJSFilename(url);
+  }
+
+  return { __proto__: null, url, format, filename, parentURL };
+}
+
+/**
+ * @typedef {import('internal/modules/customization_hooks').ModuleLoadContext} ModuleLoadContext;
+ * @typedef {import('internal/modules/customization_hooks').ModuleLoadResult} ModuleLoadResult;
+ */
+
+/**
+ * Load the source code of a module based on format.
+ * @param {string} filename Filename of the module.
+ * @param {string|undefined|null} format Format of the module.
+ * @returns {string|null}
+ */
+function defaultLoadImpl(filename, format) {
+  switch (format) {
+    case undefined:
+    case null:
+    case 'module':
+    case 'commonjs':
+    case 'json':
+    case 'module-typescript':
+    case 'commonjs-typescript':
+    case 'typescript': {
+      return fs.readFileSync(filename, 'utf8');
+    }
+    case 'builtin':
+      return null;
+    default:
+      // A URL is not necessarily available here - convert on the spot for the error message.
+      throw new ERR_UNKNOWN_MODULE_FORMAT(format, convertCJSFilenameToURL(filename));
+  }
+}
+
+/**
+ * Construct a last nextLoad() for load hooks invoked for the CJS loader.
+ * @param {string} url URL passed from the hook.
+ * @param {string} filename Filename inferred from the URL.
+ * @returns {(url: string, context: ModuleLoadContext) => ModuleLoadResult}
+ */
+function getDefaultLoad(url, filename) {
+  return function defaultLoad(urlFromHook, context) {
+    // If the URL is the same as the original one, skip the conversion.
+    const isLoadingOriginalModule = (urlFromHook === url);
+    const filenameFromHook = isLoadingOriginalModule ? filename : convertURLToCJSFilename(urlFromHook);
+    const source = defaultLoadImpl(filenameFromHook, context.format);
+    // Format from context is directly returned, because format detection should only be
+    // done after the entire load chain is completed.
+    return { source, format: context.format };
+  };
+}
+
+/**
+ * Load a specified builtin module, invoking load hooks if necessary.
+ * @param {string} id The module ID (without the node: prefix)
+ * @param {string} url The module URL (with the node: prefix)
+ * @param {string} format Format from resolution.
+ * @returns {any} If there are no load hooks or the load hooks do not override the format of the
+ * builtin, load and return the exports of the builtin. Otherwise, return undefined.
+ */
+function loadBuiltinWithHooks(id, url, format) {
+  if (loadHooks.length) {
+    url ??= `node:${id}`;
+    // TODO(joyeecheung): do we really want to invoke the load hook for the builtins?
+    const loadResult = loadWithHooks(url, format || 'builtin', /* importAttributes */ undefined,
+      getCjsConditions(), getDefaultLoad(url, id));
+    if (loadResult.format && loadResult.format !== 'builtin') {
+      return undefined; // Format has been overridden, return undefined for the caller to continue loading.
+    }
+  }
+
+  // No hooks or the hooks have not overridden the format. Load it as a builtin module and return the
+  // exports.
+  const mod = loadBuiltinModule(id);
+  return mod.exports;
+}
+
/**
 * Load a module from cache if it exists, otherwise create a new module instance.
 * 1. If a module already exists in the cache: return its exports object.
@@ -1051,19 +1212,18 @@ Module._load = function(request, parent, isMain) { } } - if (StringPrototypeStartsWith(request, 'node:')) { - // Slice 'node:' prefix - const id = StringPrototypeSlice(request, 5); + const { url, format, filename } = resolveForCJSWithHooks(request, parent, isMain); - if (!BuiltinModule.canBeRequiredByUsers(id)) { - throw new ERR_UNKNOWN_BUILTIN_MODULE(request); + // For backwards compatibility, if the request itself starts with node:, load it before checking + // Module._cache. Otherwise, load it after the check. + if (StringPrototypeStartsWith(request, 'node:')) { + const result = loadBuiltinWithHooks(filename, url, format); + if (result) { + return result; } - - const module = loadBuiltinModule(id, request); - return module.exports; + // The format of the builtin has been overridden by user hooks. Continue loading. } - const filename = Module._resolveFilename(request, parent, isMain); const cachedModule = Module._cache[filename]; if (cachedModule !== undefined) { updateChildren(parent, cachedModule, true); @@ -1088,8 +1248,11 @@ Module._load = function(request, parent, isMain) { } if (BuiltinModule.canBeRequiredWithoutScheme(filename)) { - const mod = loadBuiltinModule(filename, request); - return mod.exports; + const result = loadBuiltinWithHooks(filename, url, format); + if (result) { + return result; + } + // The format of the builtin has been overridden by user hooks. Continue loading. } // Don't call updateChildren(), Module constructor already does. @@ -1108,6 +1271,10 @@ Module._load = function(request, parent, isMain) { reportModuleToWatchMode(filename); Module._cache[filename] = module; module[kIsCachedByESMLoader] = false; + // If there are resolve hooks, carry the context information into the + // load hooks for the module keyed by the (potentially customized) filename. + module[kURL] = url; + module[kFormat] = format; } if (parent !== undefined) { @@ -1150,11 +1317,13 @@ Module._load = function(request, parent, isMain) { * @param {ResolveFilenameOptions} options Options object * @typedef {object} ResolveFilenameOptions * @property {string[]} paths Paths to search for modules in + * @property {string[]} conditions Conditions used for resolution. */ Module._resolveFilename = function(request, parent, isMain, options) { if (BuiltinModule.normalizeRequirableId(request)) { return request; } + const conditions = (options?.conditions) || getCjsConditions(); let paths; @@ -1200,7 +1369,7 @@ Module._resolveFilename = function(request, parent, isMain, options) { try { const { packageImportsResolve } = require('internal/modules/esm/resolve'); return finalizeEsmResolution( - packageImportsResolve(request, pathToFileURL(parentPath), getCjsConditions()), + packageImportsResolve(request, pathToFileURL(parentPath), conditions), parentPath, pkg.path, ); @@ -1215,7 +1384,7 @@ Module._resolveFilename = function(request, parent, isMain, options) { // Try module self resolution first const parentPath = trySelfParentPath(parent); - const selfResolved = trySelf(parentPath, request); + const selfResolved = trySelf(parentPath, request, conditions); if (selfResolved) { const cacheKey = request + '\x00' + (paths.length === 1 ? paths[0] : ArrayPrototypeJoin(paths, '\x00')); @@ -1224,7 +1393,7 @@ Module._resolveFilename = function(request, parent, isMain, options) { } // Look up the filename first, since that's the cache key. 
- const filename = Module._findPath(request, paths, isMain); + const filename = Module._findPath(request, paths, isMain, conditions); if (filename) { return filename; } const requireStack = []; for (let cursor = parent; @@ -1291,8 +1460,8 @@ Module.prototype.load = function(filename) { debug('load %j for module %j', filename, this.id); assert(!this.loaded); - this.filename = filename; - this.paths = Module._nodeModulePaths(path.dirname(filename)); + this.filename ??= filename; + this.paths ??= Module._nodeModulePaths(path.dirname(filename)); const extension = findLongestRegisteredExtension(filename); @@ -1572,27 +1741,41 @@ Module.prototype._compile = function(content, filename, format) { }; /** - * Get the source code of a module, using cached ones if it's cached. + * Get the source code of a module, using cached ones if it's cached. This is used + * for TypeScript, JavaScript and JSON loading. * After this returns, mod[kFormat], mod[kModuleSource] and mod[kURL] will be set. * @param {Module} mod Module instance whose source is potentially already cached. * @param {string} filename Absolute path to the file of the module. * @returns {{source: string, format?: string}} */ function loadSource(mod, filename, formatFromNode) { - if (formatFromNode !== undefined) { + if (mod[kFormat] === undefined) { mod[kFormat] = formatFromNode; } - const format = mod[kFormat]; + // If the module was loaded before, just return. + if (mod[kModuleSource] !== undefined) { + return { source: mod[kModuleSource], format: mod[kFormat] }; + } - let source = mod[kModuleSource]; - if (source !== undefined) { - mod[kModuleSource] = undefined; - } else { - // TODO(joyeecheung): we can read a buffer instead to speed up - // compilation. - source = fs.readFileSync(filename, 'utf8'); + // Fast path: no hooks, just load it and return. + if (!loadHooks.length) { + const source = defaultLoadImpl(filename, formatFromNode); + return { source, format: formatFromNode }; + } + + if (mod[kURL] === undefined) { + mod[kURL] = convertCJSFilenameToURL(filename); } - return { source, format }; + + const loadResult = loadWithHooks(mod[kURL], mod[kFormat], /* importAttributes */ undefined, getCjsConditions(), + getDefaultLoad(mod[kURL], filename)); + + // Reset the module properties with load hook results. 
+ if (loadResult.format !== undefined) { + mod[kFormat] = loadResult.format; + } + mod[kModuleSource] = loadResult.source; + return { source: mod[kModuleSource], format: mod[kFormat] }; } /** @@ -1610,7 +1793,6 @@ function loadMTS(mod, filename) { * @param {Module} module CJS module instance * @param {string} filename The file path of the module */ - function loadCTS(module, filename) { const loadResult = loadSource(module, filename, 'commonjs-typescript'); module._compile(loadResult.source, filename, loadResult.format); @@ -1724,7 +1906,7 @@ Module._extensions['.js'] = function(module, filename) { * @param {string} filename The file path of the module */ Module._extensions['.json'] = function(module, filename) { - const content = fs.readFileSync(filename, 'utf8'); + const { source: content } = loadSource(module, filename, 'json'); try { setOwnProperty(module, 'exports', JSONParse(stripBOM(content))); @@ -1878,3 +2060,4 @@ ObjectDefineProperty(Module.prototype, 'constructor', { // Backwards compatibility Module.Module = Module; +Module.registerHooks = registerHooks; diff --git a/lib/internal/modules/customization_hooks.js b/lib/internal/modules/customization_hooks.js new file mode 100644 index 00000000000000..c7a7a6d53dffd8 --- /dev/null +++ b/lib/internal/modules/customization_hooks.js @@ -0,0 +1,366 @@ +'use strict'; + +const { + ArrayPrototypeFindIndex, + ArrayPrototypePush, + ArrayPrototypeSplice, + ObjectFreeze, + StringPrototypeStartsWith, + Symbol, +} = primordials; +const { + isAnyArrayBuffer, + isArrayBufferView, +} = require('internal/util/types'); + +const { BuiltinModule } = require('internal/bootstrap/realm'); +const { + ERR_INVALID_RETURN_PROPERTY_VALUE, +} = require('internal/errors').codes; +const { validateFunction } = require('internal/validators'); +const { isAbsolute } = require('path'); +const { pathToFileURL, fileURLToPath } = require('internal/url'); + +let debug = require('internal/util/debuglog').debuglog('module_hooks', (fn) => { + debug = fn; +}); + +/** @typedef {import('internal/modules/cjs/loader.js').Module} Module */ +/** + * @typedef {(specifier: string, context: ModuleResolveContext, nextResolve: ResolveHook) + * => ModuleResolveResult} ResolveHook + * @typedef {(url: string, context: ModuleLoadContext, nextLoad: LoadHook) + * => ModuleLoadResult} LoadHook + */ + +// Use arrays for better insertion and iteration performance, we don't care +// about deletion performance as much. +const resolveHooks = []; +const loadHooks = []; +const hookId = Symbol('kModuleHooksIdKey'); +let nextHookId = 0; + +class ModuleHooks { + /** + * @param {ResolveHook|undefined} resolve User-provided hook. + * @param {LoadHook|undefined} load User-provided hook. + */ + constructor(resolve, load) { + this[hookId] = Symbol(`module-hook-${nextHookId++}`); + // Always initialize all hooks, if it's unspecified it'll be an owned undefined. + this.resolve = resolve; + this.load = load; + + if (resolve) { + ArrayPrototypePush(resolveHooks, this); + } + if (load) { + ArrayPrototypePush(loadHooks, this); + } + + ObjectFreeze(this); + } + // TODO(joyeecheung): we may want methods that allow disabling/enabling temporarily + // which just sets the item in the array to undefined temporarily. + // TODO(joyeecheung): this can be the [Symbol.dispose] implementation to pair with + // `using` when the explicit resource management proposal is shipped by V8. + /** + * Deregister the hook instance. 
+   */
+  deregister() {
+    const id = this[hookId];
+    let index = ArrayPrototypeFindIndex(resolveHooks, (hook) => hook[hookId] === id);
+    if (index !== -1) {
+      ArrayPrototypeSplice(resolveHooks, index, 1);
+    }
+    index = ArrayPrototypeFindIndex(loadHooks, (hook) => hook[hookId] === id);
+    if (index !== -1) {
+      ArrayPrototypeSplice(loadHooks, index, 1);
+    }
+  }
+};
+
+/**
+ * TODO(joyeecheung): take an optional description?
+ * @param {{ resolve?: ResolveHook, load?: LoadHook }} hooks User-provided hooks
+ * @returns {ModuleHooks}
+ */
+function registerHooks(hooks) {
+  const { resolve, load } = hooks;
+  if (resolve) {
+    validateFunction(resolve, 'hooks.resolve');
+  }
+  if (load) {
+    validateFunction(load, 'hooks.load');
+  }
+  return new ModuleHooks(resolve, load);
+}
+
+/**
+ * @param {string} filename
+ * @returns {string}
+ */
+function convertCJSFilenameToURL(filename) {
+  if (!filename) { return filename; }
+  const builtinId = BuiltinModule.normalizeRequirableId(filename);
+  if (builtinId) {
+    return `node:${builtinId}`;
+  }
+  // Handle the case where filename is neither a path, nor a built-in id,
+  // which is possible via monkey-patching.
+  if (isAbsolute(filename)) {
+    return pathToFileURL(filename).href;
+  }
+  return filename;
+}
+
+/**
+ * @param {string} url
+ * @returns {string}
+ */
+function convertURLToCJSFilename(url) {
+  if (!url) { return url; }
+  const builtinId = BuiltinModule.normalizeRequirableId(url);
+  if (builtinId) {
+    return builtinId;
+  }
+  if (StringPrototypeStartsWith(url, 'file://')) {
+    return fileURLToPath(url);
+  }
+  return url;
+}
+
+/**
+ * Convert a list of hooks into a function that can be used to do an operation through
+ * a chain of hooks. If any of the hooks returns without calling the next hook, it
+ * must return shortCircuit: true, so that ending the chain early is explicit and
+ * not the result of forgetting to invoke the next hook.
+ * @param {ModuleHooks[]} hooks A list of hook instances whose hook functions take
+ *   `nextHook` as the last argument.
+ * @param {'load'|'resolve'} name Name of the hook in ModuleHooks.
+ * @param {Function} defaultStep The default step in the chain.
+ * @param {Function} validate A function that validates and sanitizes the result returned by the chain.
+ * @returns {Function}
+ */
+function buildHooks(hooks, name, defaultStep, validate) {
+  let lastRunIndex = hooks.length;
+  function wrapHook(index, userHook, next) {
+    return function wrappedHook(...args) {
+      lastRunIndex = index;
+      const hookResult = userHook(...args, next);
+      if (lastRunIndex > 0 && lastRunIndex === index && !hookResult.shortCircuit) {
+        throw new ERR_INVALID_RETURN_PROPERTY_VALUE('true', name, 'shortCircuit',
+                                                    hookResult.shortCircuit);
+      }
+      return validate(...args, hookResult);
+    };
+  }
+  const chain = [wrapHook(0, defaultStep)];
+  for (let i = 0; i < hooks.length; ++i) {
+    const wrappedHook = wrapHook(i + 1, hooks[i][name], chain[i]);
+    ArrayPrototypePush(chain, wrappedHook);
+  }
+  return chain[chain.length - 1];
+}
+
+/**
+ * @typedef {object} ModuleResolveResult
+ * @property {string} url Resolved URL of the module.
+ * @property {string|undefined} format Format of the module.
+ * @property {ImportAttributes|undefined} importAttributes Import attributes for the request.
+ * @property {boolean|undefined} shortCircuit Whether the next hook has been skipped.
+ */
+
+/**
+ * Validate the result returned by a chain of resolve hooks.
+ * @param {string} specifier Specifier passed into the hooks.
+ * @param {ModuleResolveContext} context Context passed into the hooks.
+ * @param {ModuleResolveResult} result Result produced by resolve hooks.
+ * @returns {ModuleResolveResult}
+ */
+function validateResolve(specifier, context, result) {
+  const { url, format, importAttributes } = result;
+  if (typeof url !== 'string') {
+    throw new ERR_INVALID_RETURN_PROPERTY_VALUE(
+      'a URL string',
+      'resolve',
+      'url',
+      url,
+    );
+  }
+
+  if (format && typeof format !== 'string') {
+    throw new ERR_INVALID_RETURN_PROPERTY_VALUE(
+      'a string',
+      'resolve',
+      'format',
+      format,
+    );
+  }
+
+  if (importAttributes && typeof importAttributes !== 'object') {
+    throw new ERR_INVALID_RETURN_PROPERTY_VALUE(
+      'an object',
+      'resolve',
+      'importAttributes',
+      importAttributes,
+    );
+  }
+
+  return {
+    __proto__: null,
+    url,
+    format,
+    importAttributes,
+  };
+}
+
+/**
+ * @typedef {object} ModuleLoadResult
+ * @property {string|undefined} format Format of the loaded module.
+ * @property {string|ArrayBuffer|TypedArray} source Source code of the module.
+ * @property {boolean|undefined} shortCircuit Whether the next hook has been skipped.
+ */
+
+/**
+ * Validate the result returned by a chain of load hooks.
+ * @param {string} url URL passed into the hooks.
+ * @param {ModuleLoadContext} context Context passed into the hooks.
+ * @param {ModuleLoadResult} result Result produced by load hooks.
+ * @returns {ModuleLoadResult}
+ */
+function validateLoad(url, context, result) {
+  const { source, format } = result;
+  // To align with module.register(), the load hooks are still invoked for
+  // the builtins even though the default load step only provides null as source,
+  // and any source content for builtins provided by the user hooks is ignored.
+  if (!StringPrototypeStartsWith(url, 'node:') &&
+      typeof result.source !== 'string' &&
+      !isAnyArrayBuffer(source) &&
+      !isArrayBufferView(source)) {
+    throw new ERR_INVALID_RETURN_PROPERTY_VALUE(
+      'a string, an ArrayBuffer, or a TypedArray',
+      'load',
+      'source',
+      source,
+    );
+  }
+
+  if (typeof format !== 'string' && format !== undefined) {
+    throw new ERR_INVALID_RETURN_PROPERTY_VALUE(
+      'a string',
+      'load',
+      'format',
+      format,
+    );
+  }
+
+  return {
+    __proto__: null,
+    format,
+    source,
+  };
+}
+
+class ModuleResolveContext {
+  /**
+   * Context for the resolve hook.
+   * @param {string|undefined} parentURL Parent URL.
+   * @param {ImportAttributes|undefined} importAttributes Import attributes.
+   * @param {string[]} conditions Conditions.
+   */
+  constructor(parentURL, importAttributes, conditions) {
+    this.parentURL = parentURL;
+    this.importAttributes = importAttributes;
+    this.conditions = conditions;
+    // TODO(joyeecheung): a field to differentiate between require and import?
+  }
+};
+
+class ModuleLoadContext {
+  /**
+   * Context for the load hook.
+   * @param {string|undefined} format Format of the module.
+   * @param {ImportAttributes|undefined} importAttributes Import attributes.
+   * @param {string[]} conditions Conditions.
+   */
+  constructor(format, importAttributes, conditions) {
+    this.format = format;
+    this.importAttributes = importAttributes;
+    this.conditions = conditions;
+  }
+};
+
+let decoder;
+/**
+ * Load module source for a url, through a hooks chain if it exists.
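+ * (ArrayBuffer and TypedArray sources returned by hooks are decoded into
+ * strings for the known text formats.)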
+ * @param {string} url + * @param {string|undefined} originalFormat + * @param {ImportAttributes|undefined} importAttributes + * @param {string[]} conditions + * @param {(url: string, context: ModuleLoadContext) => ModuleLoadResult} defaultLoad + * @returns {ModuleLoadResult} + */ +function loadWithHooks(url, originalFormat, importAttributes, conditions, defaultLoad) { + debug('loadWithHooks', url, originalFormat); + const context = new ModuleLoadContext(originalFormat, importAttributes, conditions); + if (loadHooks.length === 0) { + return defaultLoad(url, context); + } + + const runner = buildHooks(loadHooks, 'load', defaultLoad, validateLoad); + + const result = runner(url, context); + const { source, format } = result; + if (!isAnyArrayBuffer(source) && !isArrayBufferView(source)) { + return result; + } + + switch (format) { + // Text formats: + case undefined: + case 'module': + case 'commonjs': + case 'json': + case 'module-typescript': + case 'commonjs-typescript': + case 'typescript': { + decoder ??= new (require('internal/encoding').TextDecoder)(); + result.source = decoder.decode(source); + break; + } + default: + break; + } + return result; +} + +/** + * Resolve module request to a url, through a hooks chain if it exists. + * @param {string} specifier + * @param {string|undefined} parentURL + * @param {ImportAttributes|undefined} importAttributes + * @param {string[]} conditions + * @param {(specifier: string, context: ModuleResolveContext) => ModuleResolveResult} defaultResolve + * @returns {ModuleResolveResult} + */ +function resolveWithHooks(specifier, parentURL, importAttributes, conditions, defaultResolve) { + debug('resolveWithHooks', specifier, parentURL, importAttributes); + const context = new ModuleResolveContext(parentURL, importAttributes, conditions); + if (resolveHooks.length === 0) { + return defaultResolve(specifier, context); + } + + const runner = buildHooks(resolveHooks, 'resolve', defaultResolve, validateResolve); + + return runner(specifier, context); +} + +module.exports = { + convertCJSFilenameToURL, + convertURLToCJSFilename, + loadHooks, + loadWithHooks, + registerHooks, + resolveHooks, + resolveWithHooks, +}; diff --git a/lib/internal/modules/esm/loader.js b/lib/internal/modules/esm/loader.js index c5594e07d667c3..c52f388754d5f1 100644 --- a/lib/internal/modules/esm/loader.js +++ b/lib/internal/modules/esm/loader.js @@ -42,6 +42,12 @@ const { ModuleWrap, kEvaluating, kEvaluated } = internalBinding('module_wrap'); const { urlToFilename, } = require('internal/modules/helpers'); +const { + resolveHooks, + resolveWithHooks, + loadHooks, + loadWithHooks, +} = require('internal/modules/customization_hooks'); let defaultResolve, defaultLoad, defaultLoadSync, importMetaInitializer; const { tracingChannel } = require('diagnostics_channel'); @@ -137,7 +143,7 @@ class ModuleLoader { /** * Customizations to pass requests to. - * + * @type {import('./hooks.js').Hooks} * Note that this value _MUST_ be set with `setCustomizations` * because it needs to copy `customizations.allowImportMetaResolve` * to this property and failure to do so will cause undefined @@ -350,7 +356,7 @@ class ModuleLoader { // TODO(joyeecheung): consolidate cache behavior and use resolveSync() and // loadSync() here. 
-    const resolveResult = this.#cachedDefaultResolve(specifier, parentURL, importAttributes);
+    const resolveResult = this.#cachedResolveSync(specifier, parentURL, importAttributes);
     const { url, format } = resolveResult;
     if (!getOptionValue('--experimental-require-module')) {
       throw new ERR_REQUIRE_ESM(url, true);
@@ -375,8 +381,7 @@ class ModuleLoader {
       return job;
     }
 
-    defaultLoadSync ??= require('internal/modules/esm/load').defaultLoadSync;
-    const loadResult = defaultLoadSync(url, { format, importAttributes });
+    const loadResult = this.#loadSync(url, { format, importAttributes });
 
     // Use the synchronous commonjs translator which can deal with cycles.
     const finalFormat = loadResult.format === 'commonjs' ? 'commonjs-sync' : loadResult.format;
@@ -580,6 +585,10 @@ class ModuleLoader {
    */
   resolve(specifier, parentURL, importAttributes) {
     specifier = `${specifier}`;
+    if (resolveHooks.length) {
+      // Has module.registerHooks() hooks, use the synchronous variant that can handle both hooks.
+      return this.resolveSync(specifier, parentURL, importAttributes);
+    }
     if (this.#customizations) {
       // Only has module.register hooks.
       return this.#customizations.resolve(specifier, parentURL, importAttributes);
     }
@@ -606,7 +615,26 @@ class ModuleLoader {
   }
 
   /**
-   * This is the default resolve step for future synchronous hooks, which incorporates asynchronous hooks
+   * Either return a cached resolution, or perform the synchronous resolution, and
+   * cache the result.
+   * @param {string} specifier See {@link resolve}.
+   * @param {string} [parentURL] See {@link resolve}.
+   * @param {ImportAttributes} importAttributes See {@link resolve}.
+   * @returns {{ format: string, url: string }}
+   */
+  #cachedResolveSync(specifier, parentURL, importAttributes) {
+    const requestKey = this.#resolveCache.serializeKey(specifier, importAttributes);
+    const cachedResult = this.#resolveCache.get(requestKey, parentURL);
+    if (cachedResult != null) {
+      return cachedResult;
+    }
+    const result = this.resolveSync(specifier, parentURL, importAttributes);
+    this.#resolveCache.set(requestKey, parentURL, result);
+    return result;
+  }
+
+  /**
+   * This is the default resolve step for module.registerHooks(), which incorporates asynchronous hooks
    * from module.register() which are run in a blocking fashion for it to be synchronous.
   * @param {string|URL} specifier See {@link resolveSync}.
   * @param {{ parentURL?: string, importAttributes: ImportAttributes}} context See {@link resolveSync}.
@@ -624,7 +652,7 @@ class ModuleLoader {
    * asynchronous resolve hooks from module.register(), it will block until the results are returned
    * from the loader thread for this to be synchronous.
    * This is here to support `import.meta.resolve()`, `require()` in imported CJS, and
-   * future synchronous hooks.
+   * `module.registerHooks()` hooks.
    *
    * TODO(joyeecheung): consolidate the cache behavior and use this in require(esm).
   * @param {string|URL} specifier See {@link resolve}.
   * @param {string} [parentURL] See {@link resolve}.
   * @param {ImportAttributes} importAttributes See {@link resolve}.
   * @returns {{ format: string, url: string }}
   */
  resolveSync(specifier, parentURL, importAttributes = { __proto__: null }) {
-    return this.#resolveAndMaybeBlockOnLoaderThread(`${specifier}`, { parentURL, importAttributes });
+    specifier = `${specifier}`;
+    if (resolveHooks.length) {
+      // Has module.registerHooks() hooks, chain the asynchronous hooks in the default step.
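+      // (The default step handed to resolveWithHooks() is the one that talks to
+      // the module.register() hooks, blocking on the loader thread when needed,
+      // so both kinds of hooks compose.)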
+ return resolveWithHooks(specifier, parentURL, importAttributes, this.#defaultConditions, + this.#resolveAndMaybeBlockOnLoaderThread.bind(this)); + } + return this.#resolveAndMaybeBlockOnLoaderThread(specifier, { parentURL, importAttributes }); } /** @@ -662,6 +696,10 @@ class ModuleLoader { * @returns {Promise<{ format: ModuleFormat, source: ModuleSource }>} */ async load(url, context) { + if (loadHooks.length) { + // Has module.registerHooks() hooks, use the synchronous variant that can handle both hooks. + return this.#loadSync(url, context); + } if (this.#customizations) { return this.#customizations.load(url, context); } @@ -671,7 +709,7 @@ class ModuleLoader { } /** - * This is the default load step for future synchronous hooks, which incorporates asynchronous hooks + * This is the default load step for module.registerHooks(), which incorporates asynchronous hooks * from module.register() which are run in a blocking fashion for it to be synchronous. * @param {string} url See {@link load} * @param {object} context See {@link load} @@ -689,7 +727,7 @@ class ModuleLoader { * Similar to {@link load} but this is always run synchronously. If there are asynchronous hooks * from module.register(), this blocks on the loader thread for it to return synchronously. * - * This is here to support `require()` in imported CJS and future synchronous hooks. + * This is here to support `require()` in imported CJS and `module.registerHooks()` hooks. * * TODO(joyeecheung): consolidate the cache behavior and use this in require(esm). * @param {string} url See {@link load} @@ -697,6 +735,13 @@ class ModuleLoader { * @returns {{ format: ModuleFormat, source: ModuleSource }} */ #loadSync(url, context) { + if (loadHooks.length) { + // Has module.registerHooks() hooks, chain the asynchronous hooks in the default step. + // TODO(joyeecheung): construct the ModuleLoadContext in the loaders directly instead + // of converting them from plain objects in the hooks. + return loadWithHooks(url, context.format, context.importAttributes, this.#defaultConditions, + this.#loadAndMaybeBlockOnLoaderThread.bind(this)); + } return this.#loadAndMaybeBlockOnLoaderThread(url, context); } diff --git a/lib/internal/modules/esm/module_job.js b/lib/internal/modules/esm/module_job.js index 8fba05e7b8f699..8039e2f57a500f 100644 --- a/lib/internal/modules/esm/module_job.js +++ b/lib/internal/modules/esm/module_job.js @@ -131,7 +131,8 @@ class ModuleJob extends ModuleJobBase { // Iterate with index to avoid calling into userspace with `Symbol.iterator`. for (let idx = 0; idx < moduleRequests.length; idx++) { const { specifier, attributes } = moduleRequests[idx]; - + // TODO(joyeecheung): resolve all requests first, then load them in another + // loop so that hooks can pre-fetch sources off-thread. const dependencyJobPromise = this.#loader.getModuleJobForImport( specifier, this.url, attributes, ); diff --git a/lib/internal/modules/helpers.js b/lib/internal/modules/helpers.js index 1e4b623af77877..c3122118cab75d 100644 --- a/lib/internal/modules/helpers.js +++ b/lib/internal/modules/helpers.js @@ -97,15 +97,14 @@ function getCjsConditions() { /** * Provide one of Node.js' public modules to user code. 
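+ * Returns `undefined` if the module cannot be required by user code.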
* @param {string} id - The identifier/specifier of the builtin module to load - * @param {string} request - The module requiring or importing the builtin module */ -function loadBuiltinModule(id, request) { +function loadBuiltinModule(id) { if (!BuiltinModule.canBeRequiredByUsers(id)) { return; } /** @type {import('internal/bootstrap/realm.js').BuiltinModule} */ const mod = BuiltinModule.map.get(id); - debug('load built-in module %s', request); + debug('load built-in module %s', id); // compileForPublicLoader() throws if canBeRequiredByUsers is false: mod.compileForPublicLoader(); return mod; diff --git a/test/fixtures/module-hooks/add-hook.js b/test/fixtures/module-hooks/add-hook.js new file mode 100644 index 00000000000000..807a73953c3d6b --- /dev/null +++ b/test/fixtures/module-hooks/add-hook.js @@ -0,0 +1,30 @@ +'use strict'; +const { fileURLToPath } = require('url'); +const { registerHooks } = require('module'); + +// This is a simplified version of the pirates package API to +// check that a similar API can be built on top of the public +// hooks. +function addHook(hook, options) { + function load(url, context, nextLoad) { + const result = nextLoad(url, context); + const index = url.lastIndexOf('.'); + const ext = url.slice(index); + if (!options.exts.includes(ext)) { + return result; + } + const filename = fileURLToPath(url); + if (!options.matcher(filename)) { + return result; + } + return { ...result, source: hook(result.source.toString(), filename) } + } + + const registered = registerHooks({ load }); + + return function revert() { + registered.deregister(); + }; +} + +module.exports = { addHook }; diff --git a/test/fixtures/module-hooks/get-stats.js b/test/fixtures/module-hooks/get-stats.js new file mode 100644 index 00000000000000..fa5869a455cea1 --- /dev/null +++ b/test/fixtures/module-hooks/get-stats.js @@ -0,0 +1,20 @@ +'use strict'; + +const path = require('path'); + +// Adapted from https://github.com/watson/module-details-from-path/blob/master/index.js +// used by require-in-the-middle to check the logic is still compatible with our new hooks. +exports.getStats = function getStats(filepath) { + const segments = filepath.split(path.sep); + const index = segments.lastIndexOf('node_modules'); + if (index === -1) return {}; + if (!segments[index + 1]) return {}; + const scoped = segments[index + 1][0] === '@'; + const name = scoped ? segments[index + 1] + '/' + segments[index + 2] : segments[index + 1]; + const offset = scoped ? 
3 : 2; + return { + name: name, + basedir: segments.slice(0, index + offset).join(path.sep), + path: segments.slice(index + offset).join(path.sep) + } +}; diff --git a/test/fixtures/module-hooks/load-from-this-dir.js b/test/fixtures/module-hooks/load-from-this-dir.js new file mode 100644 index 00000000000000..e1c51d2f43db32 --- /dev/null +++ b/test/fixtures/module-hooks/load-from-this-dir.js @@ -0,0 +1,4 @@ +'use strict'; + +exports.require = require; +exports.import = (id) => import(id); diff --git a/test/fixtures/module-hooks/log-user.cts b/test/fixtures/module-hooks/log-user.cts new file mode 100644 index 00000000000000..2b2754f48c4c74 --- /dev/null +++ b/test/fixtures/module-hooks/log-user.cts @@ -0,0 +1,3 @@ +const { UserAccount, UserType } = require('./user.ts'); +const account: typeof UserAccount = new UserAccount('john', 100, UserType.Admin); +console.log(account); diff --git a/test/fixtures/module-hooks/log-user.mts b/test/fixtures/module-hooks/log-user.mts new file mode 100644 index 00000000000000..9e2c3bfe1a3bb8 --- /dev/null +++ b/test/fixtures/module-hooks/log-user.mts @@ -0,0 +1,4 @@ +import { UserAccount, UserType } from './user.ts'; +import { log } from 'node:console'; +const account: UserAccount = new UserAccount('john', 100, UserType.Admin); +log(account); diff --git a/test/fixtures/module-hooks/node_modules/bar-esm/bar-esm.js b/test/fixtures/module-hooks/node_modules/bar-esm/bar-esm.js new file mode 100644 index 00000000000000..2130577ddf4b51 --- /dev/null +++ b/test/fixtures/module-hooks/node_modules/bar-esm/bar-esm.js @@ -0,0 +1 @@ +export const $key = 'bar-esm'; diff --git a/test/fixtures/module-hooks/node_modules/bar-esm/package.json b/test/fixtures/module-hooks/node_modules/bar-esm/package.json new file mode 100644 index 00000000000000..3c3282814fb87c --- /dev/null +++ b/test/fixtures/module-hooks/node_modules/bar-esm/package.json @@ -0,0 +1,6 @@ +{ + "name": "bar-esm", + "main": "bar-esm.js", + "type": "module", + "version": "1.0.0" +} diff --git a/test/fixtures/module-hooks/node_modules/bar/bar.js b/test/fixtures/module-hooks/node_modules/bar/bar.js new file mode 100644 index 00000000000000..4d1a1e6dc010fd --- /dev/null +++ b/test/fixtures/module-hooks/node_modules/bar/bar.js @@ -0,0 +1,3 @@ +module.exports = { + $key: 'bar' +}; diff --git a/test/fixtures/module-hooks/node_modules/bar/package.json b/test/fixtures/module-hooks/node_modules/bar/package.json new file mode 100644 index 00000000000000..0a2e2f7d1dad6b --- /dev/null +++ b/test/fixtures/module-hooks/node_modules/bar/package.json @@ -0,0 +1,6 @@ +{ + "name": "bar", + "main": "bar.js", + "version": "1.0.0" +} + diff --git a/test/fixtures/module-hooks/node_modules/foo-esm/foo-esm.js b/test/fixtures/module-hooks/node_modules/foo-esm/foo-esm.js new file mode 100644 index 00000000000000..caf20f7cf2b78e --- /dev/null +++ b/test/fixtures/module-hooks/node_modules/foo-esm/foo-esm.js @@ -0,0 +1 @@ +export const $key = 'foo-esm'; \ No newline at end of file diff --git a/test/fixtures/module-hooks/node_modules/foo-esm/package.json b/test/fixtures/module-hooks/node_modules/foo-esm/package.json new file mode 100644 index 00000000000000..2a98229ba262a3 --- /dev/null +++ b/test/fixtures/module-hooks/node_modules/foo-esm/package.json @@ -0,0 +1,7 @@ +{ + "name": "foo-esm", + "type": "module", + "main": "foo-esm.js", + "version": "1.0.0" +} + diff --git a/test/fixtures/module-hooks/node_modules/foo/foo.js b/test/fixtures/module-hooks/node_modules/foo/foo.js new file mode 100644 index 00000000000000..91592faf7ce0a6 --- 
/dev/null +++ b/test/fixtures/module-hooks/node_modules/foo/foo.js @@ -0,0 +1,3 @@ +module.exports = { + $key: 'foo' +}; diff --git a/test/fixtures/module-hooks/node_modules/foo/package.json b/test/fixtures/module-hooks/node_modules/foo/package.json new file mode 100644 index 00000000000000..53416530e84f2f --- /dev/null +++ b/test/fixtures/module-hooks/node_modules/foo/package.json @@ -0,0 +1,6 @@ +{ + "name": "foo", + "main": "foo.js", + "version": "1.0.0" +} + diff --git a/test/fixtures/module-hooks/redirected-assert.js b/test/fixtures/module-hooks/redirected-assert.js new file mode 100644 index 00000000000000..9855afd7ee3a3c --- /dev/null +++ b/test/fixtures/module-hooks/redirected-assert.js @@ -0,0 +1 @@ +exports.exports_for_test = 'redirected assert' diff --git a/test/fixtures/module-hooks/redirected-fs.js b/test/fixtures/module-hooks/redirected-fs.js new file mode 100644 index 00000000000000..84631b34c3539a --- /dev/null +++ b/test/fixtures/module-hooks/redirected-fs.js @@ -0,0 +1 @@ +export const exports_for_test = 'redirected fs'; diff --git a/test/fixtures/module-hooks/redirected-zlib.js b/test/fixtures/module-hooks/redirected-zlib.js new file mode 100644 index 00000000000000..9c2fcd5ac75b40 --- /dev/null +++ b/test/fixtures/module-hooks/redirected-zlib.js @@ -0,0 +1 @@ +exports.exports_for_test = 'redirected zlib'; diff --git a/test/fixtures/module-hooks/register-typescript-hooks.js b/test/fixtures/module-hooks/register-typescript-hooks.js new file mode 100644 index 00000000000000..2f9177124ab304 --- /dev/null +++ b/test/fixtures/module-hooks/register-typescript-hooks.js @@ -0,0 +1,4 @@ +'use strict'; + +const { registerHooks } = require('node:module'); +registerHooks(require('./typescript-transpiler')); diff --git a/test/fixtures/module-hooks/typescript-transpiler.js b/test/fixtures/module-hooks/typescript-transpiler.js new file mode 100644 index 00000000000000..b8cb638332ce85 --- /dev/null +++ b/test/fixtures/module-hooks/typescript-transpiler.js @@ -0,0 +1,71 @@ +'use strict'; + +const ts = require('../snapshot/typescript'); +const extensions = { + '.cts': 'commonjs-typescript', + '.mts': 'module-typescript', + '.ts': 'typescript', +}; + +const output = { + 'commonjs-typescript': { + options: { module: ts.ModuleKind.CommonJS }, + format: 'commonjs', + }, + 'module-typescript': { + options: { module: ts.ModuleKind.ESNext }, + format: 'module', + }, + 'typescript': { + options: { module: ts.ModuleKind.NodeNext }, + format: 'commonjs', + }, +}; + +function resolve(specifier, context, nextResolve) { + const resolved = nextResolve(specifier, context); + const index = resolved.url.lastIndexOf('.'); + if (index === -1) { + return resolved; + } + const ext = resolved.url.slice(index); + const supportedFormat = extensions[ext]; + if (!supportedFormat) { + return resolved; + } + const result = { + ...resolved, + format: supportedFormat, + }; + return result; +} + +let decoder; +function load(url, context, nextLoad) { + const loadResult = nextLoad(url, context); + const { source, format } = loadResult; + + if (!format || !format.includes('typescript')) { + return { format, source }; + } + + let str = source; + if (typeof str !== 'string') { + decoder ??= new TextDecoder(); + str = decoder.decode(source); + } + const transpiled = ts.transpileModule(str, { + compilerOptions: output[format].options + }); + + const result = { + ...loadResult, + format: output[format].format, + source: transpiled.outputText, + }; + + return result; +} + +exports.load = load; +exports.resolve = resolve; diff 
--git a/test/fixtures/module-hooks/user.ts b/test/fixtures/module-hooks/user.ts new file mode 100644 index 00000000000000..f4e064b2739345 --- /dev/null +++ b/test/fixtures/module-hooks/user.ts @@ -0,0 +1,18 @@ +enum UserType { + Staff, + Admin, +}; + +class UserAccount { + name: string; + id: number; + type: UserType; + + constructor(name: string, id: number, type: UserType) { + this.name = name; + this.id = id; + this.type = type; + } +} + +export { UserAccount, UserType }; diff --git a/test/module-hooks/module-hooks.status b/test/module-hooks/module-hooks.status new file mode 100644 index 00000000000000..cb697c3ae80155 --- /dev/null +++ b/test/module-hooks/module-hooks.status @@ -0,0 +1,7 @@ +prefix module-hooks + +# To mark a test as flaky, list the test name in the appropriate section +# below, without ".js", followed by ": PASS,FLAKY". Example: +# sample-test : PASS,FLAKY + +[true] # This section applies to all platforms diff --git a/test/module-hooks/test-module-hooks-import-wasm.mjs b/test/module-hooks/test-module-hooks-import-wasm.mjs new file mode 100644 index 00000000000000..f2c357cd50390c --- /dev/null +++ b/test/module-hooks/test-module-hooks-import-wasm.mjs @@ -0,0 +1,35 @@ +// Flags: --no-experimental-wasm-modules +// This tests that module.registerHooks() can be used to support unknown formats, like +// import(wasm) (without --experimental-wasm-modules). +import '../common/index.mjs'; + +import assert from 'node:assert'; +import { registerHooks, createRequire } from 'node:module'; +import { readFileSync } from 'node:fs'; + +registerHooks({ + load(url, context, nextLoad) { + assert.match(url, /simple\.wasm$/); + const source = + `const buf = Buffer.from([${Array.from(readFileSync(new URL(url))).join(',')}]); + const compiled = new WebAssembly.Module(buf); + const { exports } = new WebAssembly.Instance(compiled); + export default exports; + export { exports as 'module.exports' }; + `; + return { + shortCircuit: true, + source, + format: 'module', + }; + }, +}); + +// Checks that it works with require. +const require = createRequire(import.meta.url); +const { add } = require('../fixtures/simple.wasm'); +assert.strictEqual(add(1, 2), 3); + +// Checks that it works with import. +const { default: { add: add2 } } = await import('../fixtures/simple.wasm'); +assert.strictEqual(add2(1, 2), 3); diff --git a/test/module-hooks/test-module-hooks-load-buffers.js b/test/module-hooks/test-module-hooks-load-buffers.js new file mode 100644 index 00000000000000..07f7374fd96161 --- /dev/null +++ b/test/module-hooks/test-module-hooks-load-buffers.js @@ -0,0 +1,50 @@ +'use strict'; + +require('../common'); +const assert = require('assert'); +const { registerHooks } = require('module'); + +// This tests that the source in the load hook can be returned as +// array buffers or array buffer views. 
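+// (TextEncoder#encode() returns a Uint8Array, i.e. an ArrayBuffer view; its
+// .buffer property is the underlying ArrayBuffer. Both shapes are used below.)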
+const arrayBufferSource = 'module.exports = "arrayBuffer"';
+const arrayBufferViewSource = 'module.exports = "arrayBufferView"';
+
+const encoder = new TextEncoder();
+
+const hook1 = registerHooks({
+  resolve(specifier, context, nextResolve) {
+    return { shortCircuit: true, url: `test://${specifier}` };
+  },
+  load(url, context, nextLoad) {
+    const result = nextLoad(url, context);
+    if (url === 'test://array_buffer') {
+      assert.deepStrictEqual(result.source, encoder.encode(arrayBufferSource).buffer);
+    } else if (url === 'test://array_buffer_view') {
+      assert.deepStrictEqual(result.source, encoder.encode(arrayBufferViewSource));
+    }
+    return result;
+  },
+});
+
+const hook2 = registerHooks({
+  load(url, context, nextLoad) {
+    if (url === 'test://array_buffer') {
+      return {
+        shortCircuit: true,
+        source: encoder.encode(arrayBufferSource).buffer,
+      };
+    } else if (url === 'test://array_buffer_view') {
+      return {
+        shortCircuit: true,
+        source: encoder.encode(arrayBufferViewSource),
+      };
+    }
+    assert.fail('unreachable');
+  },
+});
+
+assert.strictEqual(require('array_buffer'), 'arrayBuffer');
+assert.strictEqual(require('array_buffer_view'), 'arrayBufferView');
+
+hook1.deregister();
+hook2.deregister();
diff --git a/test/module-hooks/test-module-hooks-load-builtin-import.mjs b/test/module-hooks/test-module-hooks-load-builtin-import.mjs
new file mode 100644
index 00000000000000..f78c69692fe04b
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-load-builtin-import.mjs
@@ -0,0 +1,29 @@
+import { mustCall } from '../common/index.mjs';
+import assert from 'node:assert';
+import { registerHooks } from 'node:module';
+import process from 'node:process';
+
+// This tests that imported builtins get null as source from the default
+// step, and the returned source is ignored.
+// TODO(joyeecheung): this is to align with the module.register() behavior
+// but perhaps the load hooks should not be invoked for builtins at all.
+
+// Pick a builtin that's unlikely to be loaded already - like zlib.
+assert(!process.moduleLoadList.includes('NativeModule zlib'));
+
+const hook = registerHooks({
+  load: mustCall(function load(url, context, nextLoad) {
+    assert.strictEqual(url, 'node:zlib');
+    const result = nextLoad(url, context);
+    assert.strictEqual(result.source, null);
+    return {
+      source: 'throw new Error("I should not be thrown")',
+      format: 'builtin',
+    };
+  }),
+});
+
+const ns = await import('node:zlib');
+assert.strictEqual(typeof ns.createGzip, 'function');
+
+hook.deregister();
diff --git a/test/module-hooks/test-module-hooks-load-builtin-require.js b/test/module-hooks/test-module-hooks-load-builtin-require.js
new file mode 100644
index 00000000000000..78f732d2dd9207
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-load-builtin-require.js
@@ -0,0 +1,29 @@
+'use strict';
+
+const common = require('../common');
+const assert = require('assert');
+const { registerHooks } = require('module');
+
+// This tests that required builtins get null as source from the default
+// step, and the returned source is ignored.
+// TODO(joyeecheung): this is to align with the module.register() behavior
+// but perhaps the load hooks should not be invoked for builtins at all.
+
+// Pick a builtin that's unlikely to be loaded already - like zlib.
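+// (process.moduleLoadList records every native module loaded so far in this
+// process, so the assertion below guards that assumption.)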
+assert(!process.moduleLoadList.includes('NativeModule zlib'));
+
+const hook = registerHooks({
+  load: common.mustCall(function load(url, context, nextLoad) {
+    assert.strictEqual(url, 'node:zlib');
+    const result = nextLoad(url, context);
+    assert.strictEqual(result.source, null);
+    return {
+      source: 'throw new Error("I should not be thrown")',
+      format: 'builtin',
+    };
+  }),
+});
+
+assert.strictEqual(typeof require('zlib').createGzip, 'function');
+
+hook.deregister();
diff --git a/test/module-hooks/test-module-hooks-load-chained.js b/test/module-hooks/test-module-hooks-load-chained.js
new file mode 100644
index 00000000000000..5227658262a752
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-load-chained.js
@@ -0,0 +1,34 @@
+'use strict';
+
+require('../common');
+const assert = require('assert');
+const { registerHooks } = require('module');
+
+// Test that multiple loaders work together.
+const hook1 = registerHooks({
+  load(url, context, nextLoad) {
+    const result = nextLoad(url, context);
+    assert.strictEqual(result.source, '');
+    return {
+      source: 'exports.hello = "world"',
+      format: 'commonjs',
+    };
+  },
+});
+
+const hook2 = registerHooks({
+  load(url, context, nextLoad) {
+    const result = nextLoad(url, context);
+    assert.strictEqual(result.source, 'exports.hello = "world"');
+    return {
+      source: 'export const hello = "world"',
+      format: 'module',
+    };
+  },
+});
+
+const mod = require('../fixtures/empty.js');
+assert.strictEqual(mod.hello, 'world');
+
+hook1.deregister();
+hook2.deregister();
diff --git a/test/module-hooks/test-module-hooks-load-detection.js b/test/module-hooks/test-module-hooks-load-detection.js
new file mode 100644
index 00000000000000..9915b98440355b
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-load-detection.js
@@ -0,0 +1,21 @@
+'use strict';
+
+require('../common');
+const assert = require('assert');
+const { registerHooks } = require('module');
+
+// Test that module syntax detection works.
+const hook = registerHooks({
+  load(url, context, nextLoad) {
+    const result = nextLoad(url, context);
+    assert.strictEqual(result.source, '');
+    return {
+      source: 'export const hello = "world"',
+    };
+  },
+});
+
+const mod = require('../fixtures/empty.js');
+assert.strictEqual(mod.hello, 'world');
+
+hook.deregister();
diff --git a/test/module-hooks/test-module-hooks-load-esm-mock.js b/test/module-hooks/test-module-hooks-load-esm-mock.js
new file mode 100644
index 00000000000000..88941b5d685f07
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-load-esm-mock.js
@@ -0,0 +1,51 @@
+'use strict';
+
+// This tests that a pirates-like load hook works.
+
+const common = require('../common');
+const assert = require('assert');
+const fixtures = require('../common/fixtures');
+const { readFileSync } = require('fs');
+
+const loader = require('../fixtures/module-hooks/load-from-this-dir');
+const { addHook } = require('../fixtures/module-hooks/add-hook');
+
+const matcherArgs = [];
+function matcher(filename) {
+  matcherArgs.push(filename);
+  return true;
+}
+
+const hookArgs = [];
+function hook(code, filename) {
+  hookArgs.push({ code, filename });
+  return code.replace('$key', 'hello');
+}
+
+(async () => {
+  const revert = addHook(hook, { exts: ['.js'], matcher });
+
+  {
+    const foo = await loader.import('foo-esm');
+    const filename = fixtures.path('module-hooks', 'node_modules', 'foo-esm', 'foo-esm.js');
+    assert.deepStrictEqual(matcherArgs, [filename]);
+    const code = readFileSync(filename, 'utf-8');
+    assert.deepStrictEqual(hookArgs, [{ code, filename }]);
+    assert.deepStrictEqual({ ...foo }, { hello: 'foo-esm' });
+  }
+
+  matcherArgs.splice(0, 1);
+  hookArgs.splice(0, 1);
+
+  revert();
+
+  // Later loads are unaffected.
+
+  {
+    const bar = await loader.import('bar-esm');
+    assert.deepStrictEqual(matcherArgs, []);
+    assert.deepStrictEqual(hookArgs, []);
+    assert.deepStrictEqual({ ...bar }, { $key: 'bar-esm' });
+  }
+
+})().catch(common.mustNotCall());
diff --git a/test/module-hooks/test-module-hooks-load-esm.js b/test/module-hooks/test-module-hooks-load-esm.js
new file mode 100644
index 00000000000000..88941b5d685f07
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-load-esm.js
@@ -0,0 +1,51 @@
+'use strict';
+
+// This tests that a pirates-like load hook works.
+
+const common = require('../common');
+const assert = require('assert');
+const fixtures = require('../common/fixtures');
+const { readFileSync } = require('fs');
+
+const loader = require('../fixtures/module-hooks/load-from-this-dir');
+const { addHook } = require('../fixtures/module-hooks/add-hook');
+
+const matcherArgs = [];
+function matcher(filename) {
+  matcherArgs.push(filename);
+  return true;
+}
+
+const hookArgs = [];
+function hook(code, filename) {
+  hookArgs.push({ code, filename });
+  return code.replace('$key', 'hello');
+}
+
+(async () => {
+  const revert = addHook(hook, { exts: ['.js'], matcher });
+
+  {
+    const foo = await loader.import('foo-esm');
+    const filename = fixtures.path('module-hooks', 'node_modules', 'foo-esm', 'foo-esm.js');
+    assert.deepStrictEqual(matcherArgs, [filename]);
+    const code = readFileSync(filename, 'utf-8');
+    assert.deepStrictEqual(hookArgs, [{ code, filename }]);
+    assert.deepStrictEqual({ ...foo }, { hello: 'foo-esm' });
+  }
+
+  matcherArgs.splice(0, 1);
+  hookArgs.splice(0, 1);
+
+  revert();
+
+  // Later loads are unaffected.
+
+  {
+    const bar = await loader.import('bar-esm');
+    assert.deepStrictEqual(matcherArgs, []);
+    assert.deepStrictEqual(hookArgs, []);
+    assert.deepStrictEqual({ ...bar }, { $key: 'bar-esm' });
+  }
+
+})().catch(common.mustNotCall());
diff --git a/test/module-hooks/test-module-hooks-load-invalid.js b/test/module-hooks/test-module-hooks-load-invalid.js
new file mode 100644
index 00000000000000..7836a864ca57b9
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-load-invalid.js
@@ -0,0 +1,39 @@
+'use strict';
+
+require('../common');
+const assert = require('assert');
+const { registerHooks } = require('module');
+
+// This tests that invalid return values from load hooks are not accepted.
+
+const hook = registerHooks({
+  resolve(specifier, context, nextResolve) {
+    return { shortCircuit: true, url: `test://${specifier}` };
+  },
+  load(url, context, nextLoad) {
+    const result = { shortCircuit: true };
+    if (url.endsWith('array')) {
+      result.source = [];
+    } else if (url.endsWith('null')) {
+      result.source = null;
+    } else if (url.endsWith('number')) {
+      result.source = 1;
+    } else if (url.endsWith('boolean')) {
+      result.source = true;
+    } else if (url.endsWith('function')) {
+      result.source = () => {};
+    } else if (url.endsWith('object')) {
+      result.source = {};
+    }
+    return result;
+  },
+});
+
+for (const item of ['undefined', 'array', 'null', 'number', 'boolean', 'function', 'object']) {
+  assert.throws(() => { require(item); }, {
+    code: 'ERR_INVALID_RETURN_PROPERTY_VALUE',
+    message: /"source" from the "load" hook/,
+  });
+}
+
+hook.deregister();
diff --git a/test/module-hooks/test-module-hooks-load-mock.js b/test/module-hooks/test-module-hooks-load-mock.js
new file mode 100644
index 00000000000000..bf00182bc32bb4
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-load-mock.js
@@ -0,0 +1,48 @@
+'use strict';
+
+// This tests that a pirates-like load hook works.
+
+require('../common');
+const assert = require('assert');
+const fixtures = require('../common/fixtures');
+const { readFileSync } = require('fs');
+
+const loader = require('../fixtures/module-hooks/load-from-this-dir');
+const { addHook } = require('../fixtures/module-hooks/add-hook');
+
+const matcherArgs = [];
+function matcher(filename) {
+  matcherArgs.push(filename);
+  return true;
+}
+
+const hookArgs = [];
+function hook(code, filename) {
+  hookArgs.push({ code, filename });
+  return code.replace('$key', 'hello');
+}
+
+const revert = addHook(hook, { exts: ['.js'], matcher });
+
+{
+  const foo = loader.require('foo');
+  const filename = fixtures.path('module-hooks', 'node_modules', 'foo', 'foo.js');
+  assert.deepStrictEqual(matcherArgs, [filename]);
+  const code = readFileSync(filename, 'utf-8');
+  assert.deepStrictEqual(hookArgs, [{ code, filename }]);
+  assert.deepStrictEqual(foo, { hello: 'foo' });
+}
+
+matcherArgs.splice(0, 1);
+hookArgs.splice(0, 1);
+
+revert();
+
+// Later loads are unaffected.
+
+{
+  const bar = loader.require('bar');
+  assert.deepStrictEqual(matcherArgs, []);
+  assert.deepStrictEqual(hookArgs, []);
+  assert.deepStrictEqual(bar, { $key: 'bar' });
+}
diff --git a/test/module-hooks/test-module-hooks-load-short-circuit-required-middle.js b/test/module-hooks/test-module-hooks-load-short-circuit-required-middle.js
new file mode 100644
index 00000000000000..a3d7d9c28cc50d
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-load-short-circuit-required-middle.js
@@ -0,0 +1,33 @@
+'use strict';
+
+require('../common');
+const assert = require('assert');
+const { registerHooks } = require('module');
+
+// Test that shortCircuit is required in a middle hook when nextLoad is not called.
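+// (Hooks run in the reverse order of registration: hook2 below is entered first
+// and hook1 is its nextLoad, so hook2 returning early without shortCircuit: true
+// must throw.)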
+const hook1 = registerHooks({
+  load(url, context, nextLoad) {
+    return nextLoad(url, context);
+  },
+});
+const hook2 = registerHooks({
+  load(url, context, nextLoad) {
+    if (url.includes('empty')) {
+      return {
+        format: 'commonjs',
+        source: 'module.exports = "modified"',
+      };
+    }
+    return nextLoad(url, context);
+  },
+});
+
+assert.throws(() => {
+  require('../fixtures/empty.js');
+}, {
+  code: 'ERR_INVALID_RETURN_PROPERTY_VALUE',
+  message: /shortCircuit/,
+});
+
+hook1.deregister();
+hook2.deregister();
diff --git a/test/module-hooks/test-module-hooks-load-short-circuit-required-start.js b/test/module-hooks/test-module-hooks-load-short-circuit-required-start.js
new file mode 100644
index 00000000000000..7de85018427bc3
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-load-short-circuit-required-start.js
@@ -0,0 +1,29 @@
+'use strict';
+
+require('../common');
+const assert = require('assert');
+const { registerHooks } = require('module');
+
+// Test that shortCircuit is required in the starting hook when nextLoad is not called.
+const hook = registerHooks({
+  load(url, context, nextLoad) {
+    if (url.includes('empty')) {
+      return {
+        format: 'commonjs',
+        source: 'module.exports = "modified"',
+      };
+    }
+    return nextLoad(url, context);
+  },
+});
+
+assert.throws(() => {
+  require('../fixtures/empty.js');
+}, {
+  code: 'ERR_INVALID_RETURN_PROPERTY_VALUE',
+  message: /shortCircuit/,
+});
+
+const baz = require('../fixtures/baz.js');
+assert.strictEqual(baz, 'perhaps I work');
+hook.deregister();
diff --git a/test/module-hooks/test-module-hooks-load-short-circuit.js b/test/module-hooks/test-module-hooks-load-short-circuit.js
new file mode 100644
index 00000000000000..d4f3d2f2341cb7
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-load-short-circuit.js
@@ -0,0 +1,28 @@
+'use strict';
+
+const common = require('../common');
+const assert = require('assert');
+const { registerHooks } = require('module');
+
+// Test that shortCircuit works for the load hook.
+const hook1 = registerHooks({ + load: common.mustNotCall(), +}); +const hook2 = registerHooks({ + load(url, context, nextLoad) { + if (url.includes('empty')) { + return { + format: 'commonjs', + source: 'module.exports = "modified"', + shortCircuit: true, + }; + } + return nextLoad(url, context); + }, +}); + +const value = require('../fixtures/empty.js'); +assert.strictEqual(value, 'modified'); + +hook1.deregister(); +hook2.deregister(); diff --git a/test/module-hooks/test-module-hooks-preload.js b/test/module-hooks/test-module-hooks-preload.js new file mode 100644 index 00000000000000..a88cd672a59a78 --- /dev/null +++ b/test/module-hooks/test-module-hooks-preload.js @@ -0,0 +1,49 @@ +'use strict'; + +require('../common'); +const fixtures = require('../common/fixtures.js'); +const { spawnSyncAndAssert } = require('../common/child_process.js'); + +spawnSyncAndAssert(process.execPath, + [ + '--require', + fixtures.path('module-hooks', 'register-typescript-hooks.js'), + fixtures.path('module-hooks', 'log-user.cts'), + ], { + trim: true, + stdout: 'UserAccount { name: \'john\', id: 100, type: 1 }', + }); + +spawnSyncAndAssert(process.execPath, + [ + '--experimental-strip-types', + '--no-experimental-transform-types', + '--require', + fixtures.path('module-hooks', 'register-typescript-hooks.js'), + fixtures.path('module-hooks', 'log-user.cts'), + ], { + trim: true, + stdout: 'UserAccount { name: \'john\', id: 100, type: 1 }', + }); + +spawnSyncAndAssert(process.execPath, + [ + '--import', + fixtures.fileURL('module-hooks', 'register-typescript-hooks.js'), + fixtures.path('module-hooks', 'log-user.mts'), + ], { + trim: true, + stdout: 'UserAccount { name: \'john\', id: 100, type: 1 }', + }); + +spawnSyncAndAssert(process.execPath, + [ + '--experimental-strip-types', + '--no-experimental-transform-types', + '--import', + fixtures.fileURL('module-hooks', 'register-typescript-hooks.js'), + fixtures.path('module-hooks', 'log-user.mts'), + ], { + trim: true, + stdout: 'UserAccount { name: \'john\', id: 100, type: 1 }', + }); diff --git a/test/module-hooks/test-module-hooks-require-wasm.js b/test/module-hooks/test-module-hooks-require-wasm.js new file mode 100644 index 00000000000000..b4276bcc749a01 --- /dev/null +++ b/test/module-hooks/test-module-hooks-require-wasm.js @@ -0,0 +1,34 @@ +// Flags: --no-experimental-wasm-modules +'use strict'; + +// This tests that module.registerHooks() can be used to support unknown formats, like +// require(wasm) and import(wasm) (without --experimental-wasm-modules). +const common = require('../common'); + +const assert = require('assert'); +const { registerHooks } = require('module'); +const { readFileSync } = require('fs'); + +registerHooks({ + load(url, context, nextLoad) { + assert.match(url, /simple\.wasm$/); + const source = + `const buf = Buffer.from([${Array.from(readFileSync(new URL(url))).join(',')}]); + const compiled = new WebAssembly.Module(buf); + module.exports = (new WebAssembly.Instance(compiled)).exports;`; + return { + shortCircuit: true, + source, + format: 'commonjs', + }; + }, +}); + +// Checks that it works with require. +const { add } = require('../fixtures/simple.wasm'); +assert.strictEqual(add(1, 2), 3); + +(async () => { // Checks that it works with import. 
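+  // (The hook returned format: 'commonjs', so import() exposes the WebAssembly
+  // exports on the `default` binding, like any other required CJS module.)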
+  const { default: { add } } = await import('../fixtures/simple.wasm');
+  assert.strictEqual(add(1, 2), 3);
+})().then(common.mustCall());
diff --git a/test/module-hooks/test-module-hooks-resolve-builtin-builtin-import.mjs b/test/module-hooks/test-module-hooks-resolve-builtin-builtin-import.mjs
new file mode 100644
index 00000000000000..b7c31678137e5e
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-resolve-builtin-builtin-import.mjs
@@ -0,0 +1,27 @@
+import '../common/index.mjs';
+import assert from 'node:assert';
+import { registerHooks } from 'node:module';
+import process from 'node:process';
+
+// This tests that builtins can be redirected to another builtin.
+// Pick a builtin that's unlikely to be loaded already - like zlib.
+assert(!process.moduleLoadList.includes('NativeModule zlib'));
+
+const hook = registerHooks({
+  resolve(specifier, context, nextResolve) {
+    if (specifier === 'node:assert') {
+      return {
+        url: 'node:zlib',
+        shortCircuit: true,
+      };
+    }
+  },
+});
+
+
+// Check assert, which is already loaded.
+// zlib.createGzip is a function.
+const redirected = await import('node:assert');
+assert.strictEqual(typeof redirected.createGzip, 'function');
+
+hook.deregister();
diff --git a/test/module-hooks/test-module-hooks-resolve-builtin-builtin-require.js b/test/module-hooks/test-module-hooks-resolve-builtin-builtin-require.js
new file mode 100644
index 00000000000000..6de7b0d23d6675
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-resolve-builtin-builtin-require.js
@@ -0,0 +1,26 @@
+'use strict';
+
+require('../common');
+const assert = require('assert');
+const { registerHooks } = require('module');
+
+// This tests that builtins can be redirected to another builtin.
+// Pick a builtin that's unlikely to be loaded already - like zlib.
+assert(!process.moduleLoadList.includes('NativeModule zlib'));
+
+const hook = registerHooks({
+  resolve(specifier, context, nextResolve) {
+    if (specifier === 'assert') {
+      return {
+        url: 'node:zlib',
+        shortCircuit: true,
+      };
+    }
+  },
+});
+
+// Check assert, which is already loaded.
+// zlib.createGzip is a function.
+assert.strictEqual(typeof require('assert').createGzip, 'function');
+
+hook.deregister();
diff --git a/test/module-hooks/test-module-hooks-resolve-builtin-on-disk-import.mjs b/test/module-hooks/test-module-hooks-resolve-builtin-on-disk-import.mjs
new file mode 100644
index 00000000000000..0afd294298c814
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-resolve-builtin-on-disk-import.mjs
@@ -0,0 +1,36 @@
+import '../common/index.mjs';
+import { fileURL } from '../common/fixtures.mjs';
+import assert from 'node:assert';
+import { registerHooks } from 'node:module';
+import process from 'node:process';
+
+// This tests that builtins can be redirected to a local file.
+// Pick a builtin that's unlikely to be loaded already - like zlib.
+assert(!process.moduleLoadList.includes('NativeModule zlib'));
+
+const hook = registerHooks({
+  resolve(specifier, context, nextResolve) {
+    // FIXME(joyeecheung): when it gets redirected to a CommonJS module, the
+    // ESM loader invokes the CJS loader with the resolved URL again even when
+    // it already has the url and source code. Fix it so that the hooks are
+    // skipped during the second loading.
+    if (!specifier.startsWith('node:')) {
+      return nextResolve(specifier, context);
+    }
+    return {
+      url: fileURL(
+        'module-hooks',
+        `redirected-${specifier.replace('node:', '')}.js`).href,
+      shortCircuit: true,
+    };
+  },
+});
+
+// Check assert, which is already loaded.
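+// (The fact that assert was loaded earlier does not matter: this new import
+// request still goes through the resolve hook, which redirects it.)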
+assert.strictEqual((await import('node:assert')).exports_for_test, 'redirected assert');
+// Check zlib, which is not yet loaded.
+assert.strictEqual((await import('node:zlib')).exports_for_test, 'redirected zlib');
+// Check fs, which is redirected to an ES module.
+assert.strictEqual((await import('node:fs')).exports_for_test, 'redirected fs');
+
+hook.deregister();
diff --git a/test/module-hooks/test-module-hooks-resolve-builtin-on-disk-require.js b/test/module-hooks/test-module-hooks-resolve-builtin-on-disk-require.js
new file mode 100644
index 00000000000000..0006975867ce9c
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-resolve-builtin-on-disk-require.js
@@ -0,0 +1,29 @@
+'use strict';
+
+require('../common');
+
+const assert = require('assert');
+const { registerHooks } = require('module');
+const fixtures = require('../common/fixtures');
+
+// This tests that builtins can be redirected to a local file.
+// Pick a builtin that's unlikely to be loaded already - like zlib.
+assert(!process.moduleLoadList.includes('NativeModule zlib'));
+
+const hook = registerHooks({
+  resolve(specifier, context, nextResolve) {
+    return {
+      url: fixtures.fileURL('module-hooks', `redirected-${specifier}.js`).href,
+      shortCircuit: true,
+    };
+  },
+});
+
+// Check assert, which is already loaded.
+assert.strictEqual(require('assert').exports_for_test, 'redirected assert');
+// Check zlib, which is not yet loaded.
+assert.strictEqual(require('zlib').exports_for_test, 'redirected zlib');
+// Check fs, which is redirected to an ES module.
+assert.strictEqual(require('fs').exports_for_test, 'redirected fs');
+
+hook.deregister();
diff --git a/test/module-hooks/test-module-hooks-resolve-invalid.js b/test/module-hooks/test-module-hooks-resolve-invalid.js
new file mode 100644
index 00000000000000..48f121dfe70b31
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-resolve-invalid.js
@@ -0,0 +1,36 @@
+'use strict';
+
+require('../common');
+const assert = require('assert');
+const { registerHooks } = require('module');
+
+// This tests that invalid return values from resolve hooks are not accepted.
+
+const hook = registerHooks({
+  resolve(specifier, context, nextResolve) {
+    const result = { shortCircuit: true };
+    if (specifier === 'array') {
+      result.url = [];
+    } else if (specifier === 'null') {
+      result.url = null;
+    } else if (specifier === 'number') {
+      result.url = 1;
+    } else if (specifier === 'boolean') {
+      result.url = true;
+    } else if (specifier === 'function') {
+      result.url = () => {};
+    } else if (specifier === 'object') {
+      result.url = {};
+    }
+    return result;
+  },
+});
+
+for (const item of ['undefined', 'array', 'null', 'number', 'boolean', 'function', 'object']) {
+  assert.throws(() => { require(item); }, {
+    code: 'ERR_INVALID_RETURN_PROPERTY_VALUE',
+    message: /"url" from the "resolve" hook/,
+  });
+}
+
+hook.deregister();
diff --git a/test/module-hooks/test-module-hooks-resolve-load-import-inline-typescript-override.mjs b/test/module-hooks/test-module-hooks-resolve-load-import-inline-typescript-override.mjs
new file mode 100644
index 00000000000000..18e8d20ef2d93b
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-resolve-load-import-inline-typescript-override.mjs
@@ -0,0 +1,11 @@
+// Flags: --experimental-strip-types --no-experimental-transform-types
+// This tests that a mini TypeScript loader works with resolve and
+// load hooks when overriding --experimental-strip-types in ESM.
+import '../common/index.mjs';
+import assert from 'node:assert';
+
+await import('../fixtures/module-hooks/register-typescript-hooks.js');
+// Test inline import(); if the override fails, this should fail too, because enums
+// are not supported when --experimental-transform-types is disabled.
+const { UserAccount, UserType } = await import('../fixtures/module-hooks/user.ts');
+assert.strictEqual((new UserAccount('foo', 1, UserType.Admin).name), 'foo');
diff --git a/test/module-hooks/test-module-hooks-resolve-load-import-inline-typescript.mjs b/test/module-hooks/test-module-hooks-resolve-load-import-inline-typescript.mjs
new file mode 100644
index 00000000000000..797597764308c2
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-resolve-load-import-inline-typescript.mjs
@@ -0,0 +1,11 @@
+// Flags: --no-experimental-strip-types --no-experimental-transform-types
+// This tests that a mini TypeScript loader works with resolve and
+// load hooks when TypeScript support is disabled.
+import '../common/index.mjs';
+import assert from 'node:assert';
+
+await import('../fixtures/module-hooks/register-typescript-hooks.js');
+// Test inline import(); if the override fails, this should fail too, because enums
+// are not supported when --experimental-transform-types is disabled.
+const { UserAccount, UserType } = await import('../fixtures/module-hooks/user.ts');
+assert.strictEqual((new UserAccount('foo', 1, UserType.Admin).name), 'foo');
diff --git a/test/module-hooks/test-module-hooks-resolve-load-require-inline-typescript-override.js b/test/module-hooks/test-module-hooks-resolve-load-require-inline-typescript-override.js
new file mode 100644
index 00000000000000..967e362c70413f
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-resolve-load-require-inline-typescript-override.js
@@ -0,0 +1,13 @@
+'use strict';
+// Flags: --experimental-strip-types --no-experimental-transform-types
+// This tests that a mini TypeScript loader works with resolve and
+// load hooks when overriding --experimental-strip-types in CJS.
+
+require('../common');
+const assert = require('assert');
+
+require('../fixtures/module-hooks/register-typescript-hooks.js');
+// Test inline require(); if the override fails, this should fail too, because enums
+// are not supported when --experimental-transform-types is disabled.
+const { UserAccount, UserType } = require('../fixtures/module-hooks/user.ts');
+assert.strictEqual((new UserAccount('foo', 1, UserType.Admin).name), 'foo');
diff --git a/test/module-hooks/test-module-hooks-resolve-load-require-inline-typescript.js b/test/module-hooks/test-module-hooks-resolve-load-require-inline-typescript.js
new file mode 100644
index 00000000000000..4366438f684262
--- /dev/null
+++ b/test/module-hooks/test-module-hooks-resolve-load-require-inline-typescript.js
@@ -0,0 +1,12 @@
+'use strict';
+// Flags: --no-experimental-strip-types --no-experimental-transform-types
+// This tests that a mini TypeScript loader works with resolve and
+// load hooks when TypeScript support is disabled.
+
+require('../common');
+const assert = require('assert');
+
+// Test inline require().
+require('../fixtures/module-hooks/register-typescript-hooks.js'); +const { UserAccount, UserType } = require('../fixtures/module-hooks/user.ts'); +assert.strictEqual((new UserAccount('foo', 1, UserType.Admin).name), 'foo'); diff --git a/test/module-hooks/test-module-hooks-resolve-short-circuit-required-middle.js b/test/module-hooks/test-module-hooks-resolve-short-circuit-required-middle.js new file mode 100644 index 00000000000000..1275304f997d9f --- /dev/null +++ b/test/module-hooks/test-module-hooks-resolve-short-circuit-required-middle.js @@ -0,0 +1,32 @@ +'use strict'; + +require('../common'); +const assert = require('assert'); +const { registerHooks } = require('module'); + +// Test that shortCircuit is required in a middle hook when nextResolve is not called. +const hook1 = registerHooks({ + resolve(specifier, context, nextResolve) { + return nextResolve(specifier, context); + }, +}); +const hook2 = registerHooks({ + resolve(specifier, context, nextResolve) { + if (specifier === 'bar') { + return { + url: 'node:bar', + }; + } + return nextResolve(specifier, context); + }, +}); + +assert.throws(() => { + require('bar'); +}, { + code: 'ERR_INVALID_RETURN_PROPERTY_VALUE', + message: /shortCircuit/, +}); + +hook1.deregister(); +hook2.deregister(); diff --git a/test/module-hooks/test-module-hooks-resolve-short-circuit-required-start.js b/test/module-hooks/test-module-hooks-resolve-short-circuit-required-start.js new file mode 100644 index 00000000000000..69c68212b0d025 --- /dev/null +++ b/test/module-hooks/test-module-hooks-resolve-short-circuit-required-start.js @@ -0,0 +1,28 @@ +'use strict'; + +require('../common'); +const assert = require('assert'); +const { registerHooks } = require('module'); + +// Test that shortCircuit is required in the starting hook when nextResolve is not called. +const hook = registerHooks({ + resolve(specifier, context, nextResolve) { + if (specifier === 'foo') { + return { + url: 'node:foo', + }; + } + return nextResolve(specifier, context); + }, +}); + +assert.throws(() => { + require('foo'); +}, { + code: 'ERR_INVALID_RETURN_PROPERTY_VALUE', + message: /shortCircuit/, +}); + +const baz = require('../fixtures/baz.js'); +assert.strictEqual(baz, 'perhaps I work'); +hook.deregister(); diff --git a/test/module-hooks/test-module-hooks-resolve-short-circuit.js b/test/module-hooks/test-module-hooks-resolve-short-circuit.js new file mode 100644 index 00000000000000..83e7057fa1ab7a --- /dev/null +++ b/test/module-hooks/test-module-hooks-resolve-short-circuit.js @@ -0,0 +1,29 @@ +'use strict'; + +const common = require('../common'); +const assert = require('assert'); +const { registerHooks } = require('module'); + +// Test that shortCircuit works for the resolve hook. 
+const source1 = 'module.exports = "modified"';
+const hook1 = registerHooks({
+  load: common.mustNotCall(),
+});
+const hook2 = registerHooks({
+  load(url, context, nextLoad) {
+    if (url.includes('empty')) {
+      return {
+        format: 'commonjs',
+        source: source1,
+        shortCircuit: true,
+      };
+    }
+    return nextLoad(url, context);
+  },
+});
+
+const value = require('../fixtures/empty.js');
+assert.strictEqual(value, 'modified');
+
+hook1.deregister();
+hook2.deregister();
diff --git a/test/module-hooks/testcfg.py b/test/module-hooks/testcfg.py
new file mode 100644
index 00000000000000..f904b1e9170fde
--- /dev/null
+++ b/test/module-hooks/testcfg.py
@@ -0,0 +1,6 @@
+import sys, os
+sys.path.append(os.path.join(os.path.dirname(__file__), '..'))
+import testpy
+
+def GetConfiguration(context, root):
+  return testpy.ParallelTestConfiguration(context, root, 'module-hooks')
diff --git a/test/parallel/test-bootstrap-modules.js b/test/parallel/test-bootstrap-modules.js
index 12adfaa7f5c5e1..c0ba01d3891477 100644
--- a/test/parallel/test-bootstrap-modules.js
+++ b/test/parallel/test-bootstrap-modules.js
@@ -98,6 +98,7 @@ expected.beforePreExec = new Set([
   'Internal Binding contextify',
   'NativeModule internal/vm',
   'NativeModule internal/modules/helpers',
+  'NativeModule internal/modules/customization_hooks',
   'NativeModule internal/modules/package_json_reader',
   'Internal Binding module_wrap',
   'NativeModule internal/modules/cjs/loader',
diff --git a/test/parallel/test-repl.js b/test/parallel/test-repl.js
index 610c7813e0439c..c2670c6cc942b4 100644
--- a/test/parallel/test-repl.js
+++ b/test/parallel/test-repl.js
@@ -51,6 +51,7 @@ async function runReplTests(socket, prompt, tests) {
     // Expect can be a single line or multiple lines
     const expectedLines = Array.isArray(expect) ? expect : [ expect ];
 
+    console.error('\n------------');
     console.error('out:', JSON.stringify(send));
     socket.write(`${send}\n`);
 
@@ -593,17 +594,18 @@ const errorTests = [
   // REPL should get a normal require() function, not one that allows
   // access to internal modules without the --expose-internals flag.
   {
-    send: 'require("internal/repl")',
+    // Shrink the stack trace to avoid having to update this test whenever the
+    // implementation of require() changes. The limit is set to 4 because
+    // setting it any lower breaks the error formatting and the message becomes
+    // "Uncaught [Error...", which is probably a bug.
+    send: 'Error.stackTraceLimit = 4; require("internal/repl")',
     expect: [
       /^Uncaught Error: Cannot find module 'internal\/repl'/,
       /^Require stack:/,
-      /^- /,
-      /^ {4}at .*/, // at Module._resolveFilename
-      /^ {4}at .*/, // at Module._load
-      /^ {4}at .*/, // at TracingChannel.traceSync
-      /^ {4}at .*/, // at wrapModuleLoad
-      /^ {4}at .*/, // at Module.require
-      /^ {4}at .*/, // at require
+      /^- /, // This only tests MODULE_NOT_FOUND, so skip the rest of the stack trace
+      /^ {4}at .*/, // A stack frame that has to be captured, otherwise the error message is formatted incorrectly
+      /^ {4}at .*/, // A stack frame that has to be captured, otherwise the error message is formatted incorrectly
+      /^ {4}at .*/, // A stack frame that has to be captured, otherwise the error message is formatted incorrectly
" code: 'MODULE_NOT_FOUND',", " requireStack: [ '' ]", '}', From 8e780bc5ae764a9eb25e2e5abdd08dd11961dcc2 Mon Sep 17 00:00:00 2001 From: Joyee Cheung Date: Sat, 2 Nov 2024 09:42:29 +0100 Subject: [PATCH 22/88] module: use synchronous hooks for preparsing in import(cjs) PR-URL: https://github.com/nodejs/node/pull/55698 Reviewed-By: Geoffrey Booth Reviewed-By: Chengzhong Wu Reviewed-By: Guy Bedford --- lib/internal/modules/esm/translators.js | 65 +++++++++++++------------ 1 file changed, 34 insertions(+), 31 deletions(-) diff --git a/lib/internal/modules/esm/translators.js b/lib/internal/modules/esm/translators.js index a9a3234befe10f..5b2a865582e5cd 100644 --- a/lib/internal/modules/esm/translators.js +++ b/lib/internal/modules/esm/translators.js @@ -26,7 +26,7 @@ const { const { BuiltinModule } = require('internal/bootstrap/realm'); const assert = require('internal/assert'); const { readFileSync } = require('fs'); -const { dirname, extname, isAbsolute } = require('path'); +const { dirname, extname } = require('path'); const { assertBufferSource, loadBuiltinModule, @@ -42,6 +42,9 @@ const { kModuleSource, kModuleExport, kModuleExportNames, + findLongestRegisteredExtension, + resolveForCJSWithHooks, + loadSourceForCJSWithHooks, } = require('internal/modules/cjs/loader'); const { fileURLToPath, pathToFileURL, URL } = require('internal/url'); let debug = require('internal/util/debuglog').debuglog('esm', (fn) => { @@ -171,17 +174,18 @@ const cjsCache = new SafeMap(); * @param {string} url - The URL of the module. * @param {string} source - The source code of the module. * @param {boolean} isMain - Whether the module is the main module. + * @param {string} format - Format of the module. * @param {typeof loadCJSModule} [loadCJS=loadCJSModule] - The function to load the CommonJS module. * @returns {ModuleWrap} The ModuleWrap object for the CommonJS module. */ -function createCJSModuleWrap(url, source, isMain, loadCJS = loadCJSModule) { +function createCJSModuleWrap(url, source, isMain, format, loadCJS = loadCJSModule) { debug(`Translating CJSModule ${url}`); const filename = urlToFilename(url); // In case the source was not provided by the `load` step, we need fetch it now. source = stringify(source ?? getSource(new URL(url)).source); - const { exportNames, module } = cjsPreparseModuleExports(filename, source); + const { exportNames, module } = cjsPreparseModuleExports(filename, source, isMain, format); cjsCache.set(url, module); const wrapperNames = [...exportNames, 'module.exports']; @@ -228,7 +232,7 @@ function createCJSModuleWrap(url, source, isMain, loadCJS = loadCJSModule) { translators.set('commonjs-sync', function requireCommonJS(url, source, isMain) { initCJSParseSync(); - return createCJSModuleWrap(url, source, isMain, (module, source, url, filename, isMain) => { + return createCJSModuleWrap(url, source, isMain, 'commonjs', (module, source, url, filename, isMain) => { assert(module === CJSModule._cache[filename]); wrapModuleLoad(filename, null, isMain); }); @@ -240,7 +244,7 @@ translators.set('require-commonjs', (url, source, isMain) => { initCJSParseSync(); assert(cjsParse); - return createCJSModuleWrap(url, source); + return createCJSModuleWrap(url, source, isMain, 'commonjs'); }); // Handle CommonJS modules referenced by `require` calls. 
@@ -249,7 +253,7 @@ translators.set('require-commonjs-typescript', (url, source, isMain) => { emitExperimentalWarning('Type Stripping'); assert(cjsParse); const code = stripTypeScriptModuleTypes(stringify(source), url); - return createCJSModuleWrap(url, code); + return createCJSModuleWrap(url, code, isMain, 'commonjs-typescript'); }); // Handle CommonJS modules referenced by `import` statements or expressions, @@ -273,16 +277,17 @@ translators.set('commonjs', function commonjsStrategy(url, source, isMain) { } catch { // Continue regardless of error. } - return createCJSModuleWrap(url, source, isMain, cjsLoader); + return createCJSModuleWrap(url, source, isMain, 'commonjs', cjsLoader); }); /** * Pre-parses a CommonJS module's exports and re-exports. * @param {string} filename - The filename of the module. * @param {string} [source] - The source code of the module. + * @param {boolean} isMain - Whether it is pre-parsing for the entry point. + * @param {string} format */ -function cjsPreparseModuleExports(filename, source) { - // TODO: Do we want to keep hitting the user mutable CJS loader here? +function cjsPreparseModuleExports(filename, source, isMain, format) { let module = CJSModule._cache[filename]; if (module && module[kModuleExportNames] !== undefined) { return { module, exportNames: module[kModuleExportNames] }; @@ -293,10 +298,15 @@ function cjsPreparseModuleExports(filename, source) { module.filename = filename; module.paths = CJSModule._nodeModulePaths(module.path); module[kIsCachedByESMLoader] = true; - module[kModuleSource] = source; CJSModule._cache[filename] = module; } + if (source === undefined) { + ({ source } = loadSourceForCJSWithHooks(module, filename, format)); + } + module[kModuleSource] = source; + + debug(`Preparsing exports of ${filename}`); let exports, reexports; try { ({ exports, reexports } = cjsParse(source || '')); @@ -310,34 +320,27 @@ function cjsPreparseModuleExports(filename, source) { // Set first for cycles. module[kModuleExportNames] = exportNames; + // If there are any re-exports e.g. `module.exports = { ...require(...) }`, + // pre-parse the dependencies to find transitively exported names. if (reexports.length) { - module.filename = filename; - module.paths = CJSModule._nodeModulePaths(module.path); + module.filename ??= filename; + module.paths ??= CJSModule._nodeModulePaths(dirname(filename)); + for (let i = 0; i < reexports.length; i++) { + debug(`Preparsing re-exports of '${filename}'`); const reexport = reexports[i]; let resolved; + let format; try { - // TODO: this should be calling the `resolve` hook chain instead. - // Doing so would mean dropping support for CJS in the loader thread, as - // this call needs to be sync from the perspective of the main thread, - // which we can do via HooksProxy and Atomics, but we can't do within - // the loaders thread. Until this is done, the lexer will use the - // monkey-patchable CJS loader to get the path to the module file to - // load (which may or may not be aligned with the URL that the `resolve` - // hook have returned). - resolved = CJSModule._resolveFilename(reexport, module); - } catch { + ({ format, filename: resolved } = resolveForCJSWithHooks(reexport, module, false)); + } catch (e) { + debug(`Failed to resolve '${reexport}', skipping`, e); continue; } - // TODO: this should be calling the `load` hook chain and check if it returns - // `format: 'commonjs'` instead of relying on file extensions. 
- const ext = extname(resolved); - if ((ext === '.js' || ext === '.cjs' || !CJSModule._extensions[ext]) && - isAbsolute(resolved)) { - // TODO: this should be calling the `load` hook chain to get the source - // (and fallback to reading the FS only if the source is nullish). - const source = readFileSync(resolved, 'utf-8'); - const { exportNames: reexportNames } = cjsPreparseModuleExports(resolved, source); + + if (format === 'commonjs' || + (!BuiltinModule.normalizeRequirableId(resolved) && findLongestRegisteredExtension(resolved) === '.js')) { + const { exportNames: reexportNames } = cjsPreparseModuleExports(resolved, undefined, false, format); for (const name of reexportNames) { exportNames.add(name); } From a0c4a5f1224a2a7b9556db7aa93d703cf43e389c Mon Sep 17 00:00:00 2001 From: "Node.js GitHub Bot" Date: Mon, 9 Dec 2024 19:36:23 -0500 Subject: [PATCH 23/88] test: update WPT for url to 6fa3fe8a92 MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/56136 Reviewed-By: Michaël Zasso Reviewed-By: Antoine du Hamel --- test/fixtures/wpt/README.md | 2 +- test/fixtures/wpt/url/resources/toascii.json | 173 +++++++++++++++++- test/fixtures/wpt/versions.json | 2 +- .../test-whatwg-url-custom-domainto.js | 2 +- .../test-whatwg-url-toascii.js | 4 +- test/wpt/status/url.json | 16 ++ 6 files changed, 193 insertions(+), 6 deletions(-) rename test/{parallel => known_issues}/test-whatwg-url-custom-domainto.js (96%) rename test/{parallel => known_issues}/test-whatwg-url-toascii.js (97%) diff --git a/test/fixtures/wpt/README.md b/test/fixtures/wpt/README.md index 8336539a7d70d4..cc97c8787a2c98 100644 --- a/test/fixtures/wpt/README.md +++ b/test/fixtures/wpt/README.md @@ -28,7 +28,7 @@ Last update: - resource-timing: https://github.com/web-platform-tests/wpt/tree/22d38586d0/resource-timing - resources: https://github.com/web-platform-tests/wpt/tree/1e140d63ec/resources - streams: https://github.com/web-platform-tests/wpt/tree/bc9dcbbf1a/streams -- url: https://github.com/web-platform-tests/wpt/tree/67880a4eb8/url +- url: https://github.com/web-platform-tests/wpt/tree/6fa3fe8a92/url - user-timing: https://github.com/web-platform-tests/wpt/tree/5ae85bf826/user-timing - wasm/jsapi: https://github.com/web-platform-tests/wpt/tree/cde25e7e3c/wasm/jsapi - wasm/webapi: https://github.com/web-platform-tests/wpt/tree/fd1b23eeaa/wasm/webapi diff --git a/test/fixtures/wpt/url/resources/toascii.json b/test/fixtures/wpt/url/resources/toascii.json index d02c4c7e86654c..6445db80e3c8f3 100644 --- a/test/fixtures/wpt/url/resources/toascii.json +++ b/test/fixtures/wpt/url/resources/toascii.json @@ -1,5 +1,6 @@ [ - "This resource is focused on highlighting issues with UTS #46 ToASCII", + "This contains assorted IDNA tests that IdnaTestV2 might not cover.", + "Feel free to deduplicate with a clear commit message.", { "comment": "Label with hyphens in 3rd and 4th position", "input": "aa--", @@ -198,5 +199,175 @@ { "input": ">\u00AD\u0338", "output": "xn--hdh" + }, + "Tests below are from WebKit (fast/url/idna2003.html & fast/url/idna2008.html; contributed by Chris Weber back in 2011).", + { + "input": "fa\u00DF.de", + "output": "xn--fa-hia.de" + }, + { + "input": "\u03B2\u03CC\u03BB\u03BF\u03C2.com", + "output": "xn--nxasmm1c.com" + }, + { + "input": "\u0DC1\u0DCA\u200D\u0DBB\u0DD3.com", + "output": "xn--10cl1a0b660p.com" + }, + { + "input": "\u0646\u0627\u0645\u0647\u200C\u0627\u06CC.com", + "output": "xn--mgba3gch31f060k.com" + }, + { + "input": 
"www.loo\u0138out.net", + "output": "www.xn--looout-5bb.net" + }, + { + "input": "\u15EF\u15EF\u15EF.lookout.net", + "output": "xn--1qeaa.lookout.net" + }, + { + "input": "www.lookout.\u0441\u043E\u043C", + "output": "www.lookout.xn--l1adi" + }, + { + "input": "www\u2025lookout.net", + "output": null + }, + { + "input": "www.lookout\u2027net", + "output": "www.xn--lookoutnet-406e" + }, + { + "input": "www.lookout.net\u2A7480", + "output": null + }, + { + "input": "www\u00A0.lookout.net", + "output": null + }, + { + "input": "\u1680lookout.net", + "output": null + }, + { + "input": "\u001flookout.net", + "output": null + }, + { + "input": "look\u06DDout.net", + "output": null + }, + { + "input": "look\u180Eout.net", + "output": null + }, + { + "input": "look\u2060out.net", + "output": "lookout.net" + }, + { + "input": "look\uFEFFout.net", + "output": "lookout.net" + }, + { + "input": "look\uD83F\uDFFEout.net", + "output": null + }, + { + "input": "look\uFFFAout.net", + "output": null + }, + { + "input": "look\u2FF0out.net", + "output": null + }, + { + "input": "look\u0341out.net", + "output": "xn--looout-kp7b.net" + }, + { + "input": "look\u202Eout.net", + "output": null + }, + { + "input": "look\u206Bout.net", + "output": null + }, + { + "input": "look\uDB40\uDC01out.net", + "output": null + }, + { + "input": "look\uDB40\uDC20out.net", + "output": null + }, + { + "input": "look\u05BEout.net", + "output": null + }, + { + "input": "B\u00FCcher.de", + "output": "xn--bcher-kva.de" + }, + { + "input": "\u2665.net", + "output": "xn--g6h.net" + }, + { + "input": "\u0378.net", + "output": null + }, + { + "input": "\u04C0.com", + "output": null + }, + { + "comment": "This is U+2F868 (which is mapped to U+36FC starting with Unicode 16.0)", + "input": "\uD87E\uDC68.com", + "output": "xn--snl.com" + }, + { + "input": "\u2183.com", + "output": null + }, + { + "input": "look\u034Fout.net", + "output": "lookout.net" + }, + { + "input": "gOoGle.com", + "output": "google.com" + }, + { + "input": "\u09dc.com", + "output": "xn--15b8c.com" + }, + { + "input": "\u1E9E.com", + "output": "xn--zca.com" + }, + { + "input": "\u1E9E.foo.com", + "output": "xn--zca.foo.com" + }, + { + "input": "-foo.bar.com", + "output": "-foo.bar.com" + }, + { + "input": "foo-.bar.com", + "output": "foo-.bar.com" + }, + { + "input": "ab--cd.com", + "output": "ab--cd.com" + }, + { + "input": "xn--0.com", + "output": null + }, + { + "input": "foo\u0300.bar.com", + "output": "xn--fo-3ja.bar.com" } ] diff --git a/test/fixtures/wpt/versions.json b/test/fixtures/wpt/versions.json index a3e0dd60ed66ab..2560056dba990d 100644 --- a/test/fixtures/wpt/versions.json +++ b/test/fixtures/wpt/versions.json @@ -72,7 +72,7 @@ "path": "streams" }, "url": { - "commit": "67880a4eb83ca9aa732eec4b35a1971ff5bf37ff", + "commit": "6fa3fe8a929be45422cd46a8961e647e13d0cab8", "path": "url" }, "user-timing": { diff --git a/test/parallel/test-whatwg-url-custom-domainto.js b/test/known_issues/test-whatwg-url-custom-domainto.js similarity index 96% rename from test/parallel/test-whatwg-url-custom-domainto.js rename to test/known_issues/test-whatwg-url-custom-domainto.js index b7458d7a8e1a86..9e70e34f7095d3 100644 --- a/test/parallel/test-whatwg-url-custom-domainto.js +++ b/test/known_issues/test-whatwg-url-custom-domainto.js @@ -13,7 +13,7 @@ const { domainToASCII, domainToUnicode } = require('url'); const tests = require('../fixtures/url-idna'); const fixtures = require('../common/fixtures'); const wptToASCIITests = require( - fixtures.path('wpt', 'url', 
'resources', 'toascii.json') + fixtures.path('wpt', 'url', 'resources', 'toascii.json'), ); { diff --git a/test/parallel/test-whatwg-url-toascii.js b/test/known_issues/test-whatwg-url-toascii.js similarity index 97% rename from test/parallel/test-whatwg-url-toascii.js rename to test/known_issues/test-whatwg-url-toascii.js index e5180bfb344127..0d2485f0d38398 100644 --- a/test/parallel/test-whatwg-url-toascii.js +++ b/test/known_issues/test-whatwg-url-toascii.js @@ -10,8 +10,8 @@ const { test, assert_equals, assert_throws } = require('../common/wpt').harness; const request = { response: require( - fixtures.path('wpt', 'url', 'resources', 'toascii.json') - ) + fixtures.path('wpt', 'url', 'resources', 'toascii.json'), + ), }; // The following tests are copied from WPT. Modifications to them should be diff --git a/test/wpt/status/url.json b/test/wpt/status/url.json index 96dafd91b707d5..d361048874d72d 100644 --- a/test/wpt/status/url.json +++ b/test/wpt/status/url.json @@ -13,6 +13,22 @@ ] } }, + "toascii.window.js": { + "fail": { + "note": "Unicode 15.1", + "expected": [ + "\uD87E\uDC68.com (using URL)", + "\uD87E\uDC68.com (using URL.host)", + "\uD87E\uDC68.com (using URL.hostname)", + "\u1E9E.com (using URL)", + "\u1E9E.com (using URL.host)", + "\u1E9E.com (using URL.hostname)", + "\u1E9E.foo.com (using URL)", + "\u1E9E.foo.com (using URL.host)", + "\u1E9E.foo.com (using URL.hostname)" + ] + } + }, "url-setters-a-area.window.js": { "skip": "already tested in url-setters.any.js" }, From 54308c51bb90d0f6b581f32c30118a16d6dbb668 Mon Sep 17 00:00:00 2001 From: "Node.js GitHub Bot" Date: Mon, 9 Dec 2024 20:04:41 -0500 Subject: [PATCH 24/88] deps: update sqlite to 3.47.2 MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/56178 Reviewed-By: Rafael Gonzaga Reviewed-By: Michaël Zasso Reviewed-By: Marco Ippolito --- deps/sqlite/sqlite3.c | 59 +++++++++++++++++++++++-------------------- deps/sqlite/sqlite3.h | 6 ++--- 2 files changed, 35 insertions(+), 30 deletions(-) diff --git a/deps/sqlite/sqlite3.c b/deps/sqlite/sqlite3.c index 099c5482f68df0..c748d033461fae 100644 --- a/deps/sqlite/sqlite3.c +++ b/deps/sqlite/sqlite3.c @@ -1,6 +1,6 @@ /****************************************************************************** ** This file is an amalgamation of many separate C source files from SQLite -** version 3.47.1. By combining all the individual C code files into this +** version 3.47.2. By combining all the individual C code files into this ** single large file, the entire code can be compiled as a single translation ** unit. This allows many compilers to do optimizations that would not be ** possible if the files were compiled separately. Performance improvements @@ -18,7 +18,7 @@ ** separate file. This file contains only code for the core SQLite library. ** ** The content in this amalgamation comes from Fossil check-in -** b95d11e958643b969c47a8e5857f3793b9e6. +** 2aabe05e2e8cae4847a802ee2daddc1d7413. */ #define SQLITE_CORE 1 #define SQLITE_AMALGAMATION 1 @@ -462,9 +462,9 @@ extern "C" { ** [sqlite3_libversion_number()], [sqlite3_sourceid()], ** [sqlite_version()] and [sqlite_source_id()]. 
*/ -#define SQLITE_VERSION "3.47.1" -#define SQLITE_VERSION_NUMBER 3047001 -#define SQLITE_SOURCE_ID "2024-11-25 12:07:48 b95d11e958643b969c47a8e5857f3793b9e69700b8f1469371386369a26e577e" +#define SQLITE_VERSION "3.47.2" +#define SQLITE_VERSION_NUMBER 3047002 +#define SQLITE_SOURCE_ID "2024-12-07 20:39:59 2aabe05e2e8cae4847a802ee2daddc1d7413d8fc560254d93ee3e72c14685b6c" /* ** CAPI3REF: Run-Time Library Version Numbers @@ -35697,8 +35697,8 @@ SQLITE_PRIVATE int sqlite3AtoF(const char *z, double *pResult, int length, u8 en int eValid = 1; /* True exponent is either not used or is well-formed */ int nDigit = 0; /* Number of digits processed */ int eType = 1; /* 1: pure integer, 2+: fractional -1 or less: bad UTF16 */ + u64 s2; /* round-tripped significand */ double rr[2]; - u64 s2; assert( enc==SQLITE_UTF8 || enc==SQLITE_UTF16LE || enc==SQLITE_UTF16BE ); *pResult = 0.0; /* Default return value, in case of an error */ @@ -35801,7 +35801,7 @@ SQLITE_PRIVATE int sqlite3AtoF(const char *z, double *pResult, int length, u8 en e = (e*esign) + d; /* Try to adjust the exponent to make it smaller */ - while( e>0 && s<(LARGEST_UINT64/10) ){ + while( e>0 && s<((LARGEST_UINT64-0x7ff)/10) ){ s *= 10; e--; } @@ -35811,11 +35811,16 @@ SQLITE_PRIVATE int sqlite3AtoF(const char *z, double *pResult, int length, u8 en } rr[0] = (double)s; - s2 = (u64)rr[0]; -#if defined(_MSC_VER) && _MSC_VER<1700 - if( s2==0x8000000000000000LL ){ s2 = 2*(u64)(0.5*rr[0]); } -#endif - rr[1] = s>=s2 ? (double)(s - s2) : -(double)(s2 - s); + assert( sizeof(s2)==sizeof(rr[0]) ); + memcpy(&s2, &rr[0], sizeof(s2)); + if( s2<=0x43efffffffffffffLL ){ + s2 = (u64)rr[0]; + rr[1] = s>=s2 ? (double)(s - s2) : -(double)(s2 - s); + }else{ + rr[1] = 0.0; + } + assert( rr[1]<=1.0e-10*rr[0] ); /* Equal only when rr[0]==0.0 */ + if( e>0 ){ while( e>=100 ){ e -= 100; @@ -147605,32 +147610,32 @@ static Expr *substExpr( if( pSubst->isOuterJoin ){ ExprSetProperty(pNew, EP_CanBeNull); } - if( ExprHasProperty(pExpr,EP_OuterON|EP_InnerON) ){ - sqlite3SetJoinExpr(pNew, pExpr->w.iJoin, - pExpr->flags & (EP_OuterON|EP_InnerON)); - } - sqlite3ExprDelete(db, pExpr); - pExpr = pNew; - if( pExpr->op==TK_TRUEFALSE ){ - pExpr->u.iValue = sqlite3ExprTruthValue(pExpr); - pExpr->op = TK_INTEGER; - ExprSetProperty(pExpr, EP_IntValue); + if( pNew->op==TK_TRUEFALSE ){ + pNew->u.iValue = sqlite3ExprTruthValue(pNew); + pNew->op = TK_INTEGER; + ExprSetProperty(pNew, EP_IntValue); } /* Ensure that the expression now has an implicit collation sequence, ** just as it did when it was a column of a view or sub-query. */ { - CollSeq *pNat = sqlite3ExprCollSeq(pSubst->pParse, pExpr); + CollSeq *pNat = sqlite3ExprCollSeq(pSubst->pParse, pNew); CollSeq *pColl = sqlite3ExprCollSeq(pSubst->pParse, pSubst->pCList->a[iColumn].pExpr ); - if( pNat!=pColl || (pExpr->op!=TK_COLUMN && pExpr->op!=TK_COLLATE) ){ - pExpr = sqlite3ExprAddCollateString(pSubst->pParse, pExpr, + if( pNat!=pColl || (pNew->op!=TK_COLUMN && pNew->op!=TK_COLLATE) ){ + pNew = sqlite3ExprAddCollateString(pSubst->pParse, pNew, (pColl ? 
pColl->zName : "BINARY") ); } } - ExprClearProperty(pExpr, EP_Collate); + ExprClearProperty(pNew, EP_Collate); + if( ExprHasProperty(pExpr,EP_OuterON|EP_InnerON) ){ + sqlite3SetJoinExpr(pNew, pExpr->w.iJoin, + pExpr->flags & (EP_OuterON|EP_InnerON)); + } + sqlite3ExprDelete(db, pExpr); + pExpr = pNew; } } }else{ @@ -254938,7 +254943,7 @@ static void fts5SourceIdFunc( ){ assert( nArg==0 ); UNUSED_PARAM2(nArg, apUnused); - sqlite3_result_text(pCtx, "fts5: 2024-11-25 12:07:48 b95d11e958643b969c47a8e5857f3793b9e69700b8f1469371386369a26e577e", -1, SQLITE_TRANSIENT); + sqlite3_result_text(pCtx, "fts5: 2024-12-07 20:39:59 2aabe05e2e8cae4847a802ee2daddc1d7413d8fc560254d93ee3e72c14685b6c", -1, SQLITE_TRANSIENT); } /* diff --git a/deps/sqlite/sqlite3.h b/deps/sqlite/sqlite3.h index dbecc3fe896cf7..d8ce1482a352af 100644 --- a/deps/sqlite/sqlite3.h +++ b/deps/sqlite/sqlite3.h @@ -146,9 +146,9 @@ extern "C" { ** [sqlite3_libversion_number()], [sqlite3_sourceid()], ** [sqlite_version()] and [sqlite_source_id()]. */ -#define SQLITE_VERSION "3.47.1" -#define SQLITE_VERSION_NUMBER 3047001 -#define SQLITE_SOURCE_ID "2024-11-25 12:07:48 b95d11e958643b969c47a8e5857f3793b9e69700b8f1469371386369a26e577e" +#define SQLITE_VERSION "3.47.2" +#define SQLITE_VERSION_NUMBER 3047002 +#define SQLITE_SOURCE_ID "2024-12-07 20:39:59 2aabe05e2e8cae4847a802ee2daddc1d7413d8fc560254d93ee3e72c14685b6c" /* ** CAPI3REF: Run-Time Library Version Numbers From fa667d609ea59f995cb06fdbe4b6ccda55159e29 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Micha=C3=ABl=20Zasso?= Date: Tue, 10 Dec 2024 11:10:16 +0100 Subject: [PATCH 25/88] tools: remove has_absl_stringify from gyp file File was removed in https://github.com/nodejs/node/pull/54536 PR-URL: https://github.com/nodejs/node/pull/56157 Reviewed-By: Richard Lau Reviewed-By: Luigi Pinca --- tools/v8_gypfiles/v8.gyp | 1 - 1 file changed, 1 deletion(-) diff --git a/tools/v8_gypfiles/v8.gyp b/tools/v8_gypfiles/v8.gyp index 9b5d8e2f648467..9acad07d966a35 100644 --- a/tools/v8_gypfiles/v8.gyp +++ b/tools/v8_gypfiles/v8.gyp @@ -2401,7 +2401,6 @@ '<(ABSEIL_ROOT)/absl/strings/cord_buffer.cc', '<(ABSEIL_ROOT)/absl/strings/escaping.h', '<(ABSEIL_ROOT)/absl/strings/escaping.cc', - '<(ABSEIL_ROOT)/absl/strings/has_absl_stringify.h', '<(ABSEIL_ROOT)/absl/strings/has_ostream_operator.h', '<(ABSEIL_ROOT)/absl/strings/internal/charconv_bigint.h', '<(ABSEIL_ROOT)/absl/strings/internal/charconv_bigint.cc', From f10239fde781b6fdd7c13c8e2cec16d4e3071280 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?G=C3=BCrg=C3=BCn=20Day=C4=B1o=C4=9Flu?= Date: Tue, 10 Dec 2024 13:18:18 +0100 Subject: [PATCH 26/88] lib: remove redundant global regexps MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/56182 Reviewed-By: Michaël Zasso Reviewed-By: Juan José Arboleda Reviewed-By: Yagiz Nizipli Reviewed-By: LiviaMedeiros Reviewed-By: Luigi Pinca --- lib/internal/util/debuglog.js | 2 +- lib/url.js | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/lib/internal/util/debuglog.js b/lib/internal/util/debuglog.js index 271c9d1497d88f..96b10c4bbac767 100644 --- a/lib/internal/util/debuglog.js +++ b/lib/internal/util/debuglog.js @@ -173,7 +173,7 @@ function formatTime(ms) { } function safeTraceLabel(label) { - return label.replace(/\\/g, '\\\\').replaceAll('"', '\\"'); + return label.replaceAll('\\', '\\\\').replaceAll('"', '\\"'); } /** diff --git a/lib/url.js b/lib/url.js index ef1b1a23d9a5c8..8acec11816f88e 100644 --- a/lib/url.js +++ 
b/lib/url.js @@ -705,7 +705,7 @@ Url.prototype.format = function format() { } } - search = search.replace(/#/g, '%23'); + search = search.replaceAll('#', '%23'); if (hash && hash.charCodeAt(0) !== CHAR_HASH) hash = '#' + hash; From 595851b5ed40d71d51bc43c08a95794c539ac59a Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?H=C3=BCseyin=20A=C3=A7acak?= <110401522+huseyinacacak-janea@users.noreply.github.com> Date: Tue, 10 Dec 2024 16:31:21 +0300 Subject: [PATCH 27/88] fs,win: fix readdir for named pipe MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/56110 Fixes: https://github.com/nodejs/node/issues/56002 Refs: https://github.com/nodejs/node/pull/55623 Refs: https://github.com/nodejs/node/pull/56088 Reviewed-By: Gerhard Stöbich Reviewed-By: Luigi Pinca --- src/node_file.cc | 21 +++++++++++++++++++++ test/parallel/test-fs-readdir-pipe.js | 21 +++++++++++++++++++++ 2 files changed, 42 insertions(+) create mode 100644 test/parallel/test-fs-readdir-pipe.js diff --git a/src/node_file.cc b/src/node_file.cc index 5a50aacb1b939d..34a86ef7f140d7 100644 --- a/src/node_file.cc +++ b/src/node_file.cc @@ -1986,8 +1986,29 @@ static void ReadDir(const FunctionCallbackInfo& args) { BufferValue path(isolate, args[0]); CHECK_NOT_NULL(*path); +#ifdef _WIN32 + // On Windows, some API functions accept paths with trailing slashes, + // while others do not. This code checks if the input path ends with + // a slash (either '/' or '\\') and, if so, ensures that the processed + // path also ends with a trailing backslash ('\\'). + bool slashCheck = false; + if (path.ToStringView().ends_with("/") || + path.ToStringView().ends_with("\\")) { + slashCheck = true; + } +#endif + ToNamespacedPath(env, &path); +#ifdef _WIN32 + if (slashCheck) { + size_t new_length = path.length() + 1; + path.AllocateSufficientStorage(new_length + 1); + path.SetLengthAndZeroTerminate(new_length); + path.out()[new_length - 1] = '\\'; + } +#endif + const enum encoding encoding = ParseEncoding(isolate, args[1], UTF8); bool with_types = args[2]->IsTrue(); diff --git a/test/parallel/test-fs-readdir-pipe.js b/test/parallel/test-fs-readdir-pipe.js new file mode 100644 index 00000000000000..592e7a3d54009f --- /dev/null +++ b/test/parallel/test-fs-readdir-pipe.js @@ -0,0 +1,21 @@ +'use strict'; + +const common = require('../common'); +const assert = require('assert'); +const { readdir, readdirSync } = require('fs'); + +if (!common.isWindows) { + common.skip('This test is specific to Windows to test enumerate pipes'); +} + +// Ref: https://github.com/nodejs/node/issues/56002 +// This test is specific to Windows. + +const pipe = '\\\\.\\pipe\\'; + +const { length } = readdirSync(pipe); +assert.ok(length >= 0, `${length} is not greater or equal to 0`); + +readdir(pipe, common.mustSucceed((files) => { + assert.ok(files.length >= 0, `${files.length} is not greater or equal to 0`); +})); From e4922ab15fd2c31d58d40866688c5def1ee2d5a0 Mon Sep 17 00:00:00 2001 From: Yuan-Ming Hsu <48866415+technic960183@users.noreply.github.com> Date: Wed, 11 Dec 2024 03:13:22 +0800 Subject: [PATCH 28/88] doc: fix incorrect link to style guide The link to the style guide in `pull-requests.md` linked to the main `README.md` instead of `doc/README.md`. This commit fixes the link. 
Refs: https://github.com/nodejs/node/pull/41119
PR-URL: https://github.com/nodejs/node/pull/56181
Reviewed-By: Gireesh Punathil
Reviewed-By: Rafael Gonzaga
Reviewed-By: Luigi Pinca
---
 doc/contributing/pull-requests.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/doc/contributing/pull-requests.md b/doc/contributing/pull-requests.md
index 295e9d3695c47e..2ad538b3fd8e29 100644
--- a/doc/contributing/pull-requests.md
+++ b/doc/contributing/pull-requests.md
@@ -122,7 +122,7 @@ If you are modifying code, please be sure to run `make lint` (or
 code style guide.
 
 Any documentation you write (including code comments and API documentation)
-should follow the [Style Guide](../../README.md). Code samples
+should follow the [Style Guide](../../doc/README.md). Code samples
 included in the API docs will also be checked when running `make lint` (or
 `vcbuild.bat lint` on Windows). If you are adding to or deprecating an API,
 add or change the appropriate YAML documentation. Use `REPLACEME` for the
From a9e67280e7a0d3e55b86289d00d668451b4b5d73 Mon Sep 17 00:00:00 2001
From: Michael Dawson
Date: Fri, 29 Nov 2024 19:56:17 +0000
Subject: [PATCH 29/88] doc: add ambassador message - benefits of Node.js
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Add an initial message to be promoted.

Signed-off-by: Michael Dawson
PR-URL: https://github.com/nodejs/node/pull/56085
Reviewed-By: James M Snell
Reviewed-By: Ulises Gascón
Reviewed-By: Gireesh Punathil
Reviewed-By: Marco Ippolito
---
 .../advocacy-ambassador-program.md | 35 +++++++++++++++++++
 1 file changed, 35 insertions(+)

diff --git a/doc/contributing/advocacy-ambassador-program.md b/doc/contributing/advocacy-ambassador-program.md
index cfb8c5cb1cd484..31d8fd58a1a4bf 100644
--- a/doc/contributing/advocacy-ambassador-program.md
+++ b/doc/contributing/advocacy-ambassador-program.md
@@ -130,3 +130,38 @@ or the information to be shared.
 Add a list of GitHub handles for those within the project that have
 volunteered to be contacated when necessary by ambassadors to get more
 info about the message to be promoted.
+
+#### Node.js is a great choice for a JavaScript runtime
+
+##### Goal
+
+Highlight the benefits of choosing Node.js as your backend JavaScript runtime. Focus on what is great
+about Node.js without drawing comparisons to alternatives. We don't want to say negative things about
+other options, only highlight what is great about Node.js as a choice.
+
+Some of the things to highlight include:
+
+* How widely it is used (you never get fired for choosing Node.js).
+* The openness of the project. It is part of the OpenJS Foundation and its governance is set up to prevent
+  any one company from dominating the project. Decisions are made by the collaborators (of which there are quite
+  a few) versus a small number of people.
+* It has predictable and stable releases and has delivered on the release schedule since 2015.
+* It has a well-defined security release process and manages security releases well.
+* As the de facto standard, it has the highest likelihood of being supported for a given package on npm.
+* It is not dependent on any one company for its continued existence, reducing the risk of using it.
+* The large number of platforms supported.
+* Asynchronous, non-blocking I/O architecture drives high transactional throughput, making it ideal for web workloads.
+* Single-threaded programming model enables very low resource consumption, making it ideal for containerised workloads.
+* Highly vibrant ecosystem with enterprise support from many vendors. + +#### Related Links + +* +* +* +* + for slide usage and topping recent surveys. + +#### Project contacts + +* @mhdawson From 683cc1579679a3a094aa0a470666947ca0140720 Mon Sep 17 00:00:00 2001 From: Livia Medeiros Date: Wed, 11 Dec 2024 08:44:07 +0900 Subject: [PATCH 30/88] test: move localizationd data from `test-icu-env` to external file PR-URL: https://github.com/nodejs/node/pull/55618 Reviewed-By: Richard Lau Reviewed-By: Luigi Pinca Reviewed-By: James M Snell Reviewed-By: Antoine du Hamel --- test/fixtures/icu/localizationData-v74.2.json | 128 +++++++++++++++++ test/fixtures/icu/localizationData-v75.1.json | 128 +++++++++++++++++ test/fixtures/icu/localizationData-v76.1.json | 128 +++++++++++++++++ test/parallel/test-icu-env.js | 132 +++++------------- tools/icu/update-test-data.mjs | 81 +++++++++++ 5 files changed, 503 insertions(+), 94 deletions(-) create mode 100644 test/fixtures/icu/localizationData-v74.2.json create mode 100644 test/fixtures/icu/localizationData-v75.1.json create mode 100644 test/fixtures/icu/localizationData-v76.1.json create mode 100644 tools/icu/update-test-data.mjs diff --git a/test/fixtures/icu/localizationData-v74.2.json b/test/fixtures/icu/localizationData-v74.2.json new file mode 100644 index 00000000000000..65671ba5acb299 --- /dev/null +++ b/test/fixtures/icu/localizationData-v74.2.json @@ -0,0 +1,128 @@ +{ + "dateStrings": { + "en": "Fri Jul 25 1980 01:35:33 GMT+0100 (Central European Standard Time)", + "zh": "Fri Jul 25 1980 01:35:33 GMT+0100 (中欧标准时间)", + "hi": "Fri Jul 25 1980 01:35:33 GMT+0100 (मध्य यूरोपीय मानक समय)", + "es": "Fri Jul 25 1980 01:35:33 GMT+0100 (hora estándar de Europa central)", + "fr": "Fri Jul 25 1980 01:35:33 GMT+0100 (heure normale d’Europe centrale)", + "ar": "Fri Jul 25 1980 01:35:33 GMT+0100 (توقيت وسط أوروبا الرسمي)", + "bn": "Fri Jul 25 1980 01:35:33 GMT+0100 (মধ্য ইউরোপীয় মানক সময়)", + "ru": "Fri Jul 25 1980 01:35:33 GMT+0100 (Центральная Европа, стандартное время)", + "pt": "Fri Jul 25 1980 01:35:33 GMT+0100 (Horário Padrão da Europa Central)", + "ur": "Fri Jul 25 1980 01:35:33 GMT+0100 (وسطی یورپ کا معیاری وقت)", + "id": "Fri Jul 25 1980 01:35:33 GMT+0100 (Waktu Standar Eropa Tengah)", + "de": "Fri Jul 25 1980 01:35:33 GMT+0100 (Mitteleuropäische Normalzeit)", + "ja": "Fri Jul 25 1980 01:35:33 GMT+0100 (中央ヨーロッパ標準時)", + "pcm": "Fri Jul 25 1980 01:35:33 GMT+0100 (Mídúl Yúrop Fíksd Taim)", + "mr": "Fri Jul 25 1980 01:35:33 GMT+0100 (मध्‍य युरोपियन प्रमाण वेळ)", + "te": "Fri Jul 25 1980 01:35:33 GMT+0100 (సెంట్రల్ యూరోపియన్ ప్రామాణిక సమయం)" + }, + "dateTimeFormats": { + "en": "7/25/1980, 1:35:33 AM", + "zh": "1980/7/25 01:35:33", + "hi": "25/7/1980, 1:35:33 am", + "es": "25/7/1980, 1:35:33", + "fr": "25/07/1980 01:35:33", + "ar": "٢٥‏/٧‏/١٩٨٠، ١:٣٥:٣٣ ص", + "bn": "২৫/৭/১৯৮০, ১:৩৫:৩৩ AM", + "ru": "25.07.1980, 01:35:33", + "pt": "25/07/1980, 01:35:33", + "ur": "25/7/1980، 1:35:33 AM", + "id": "25/7/1980, 01.35.33", + "de": "25.7.1980, 01:35:33", + "ja": "1980/7/25 1:35:33", + "pcm": "25/7/1980 01:35:33", + "mr": "२५/७/१९८०, १:३५:३३ AM", + "te": "25/7/1980 1:35:33 AM" + }, + "dateFormats": { + "en": "7/25/1980", + "zh": "1980/7/25", + "hi": "25/7/1980", + "es": "25/7/1980", + "fr": "25/07/1980", + "ar": "٢٥‏/٧‏/١٩٨٠", + "bn": "২৫/৭/১৯৮০", + "ru": "25.07.1980", + "pt": "25/07/1980", + "ur": "25/7/1980", + "id": "25/7/1980", + "de": "25.7.1980", + "ja": "1980/7/25", + "pcm": "25/7/1980", + "mr": "२५/७/१९८०", + "te": "25/7/1980" + }, + "displayNames": { + 
"en": "Switzerland", + "zh": "瑞士", + "hi": "स्विट्ज़रलैंड", + "es": "Suiza", + "fr": "Suisse", + "ar": "سويسرا", + "bn": "সুইজারল্যান্ড", + "ru": "Швейцария", + "pt": "Suíça", + "ur": "سوئٹزر لینڈ", + "id": "Swiss", + "de": "Schweiz", + "ja": "スイス", + "pcm": "Swítsaland", + "mr": "स्वित्झर्लंड", + "te": "స్విట్జర్లాండ్" + }, + "numberFormats": { + "en": "275,760.913", + "zh": "275,760.913", + "hi": "2,75,760.913", + "es": "275.760,913", + "fr": "275 760,913", + "ar": "٢٧٥٬٧٦٠٫٩١٣", + "bn": "২,৭৫,৭৬০.৯১৩", + "ru": "275 760,913", + "pt": "275.760,913", + "ur": "275,760.913", + "id": "275.760,913", + "de": "275.760,913", + "ja": "275,760.913", + "pcm": "275,760.913", + "mr": "२,७५,७६०.९१३", + "te": "2,75,760.913" + }, + "pluralRules": { + "en": "other", + "zh": "other", + "hi": "one", + "es": "other", + "fr": "one", + "ar": "zero", + "bn": "one", + "ru": "many", + "pt": "one", + "ur": "other", + "id": "other", + "de": "other", + "ja": "other", + "pcm": "one", + "mr": "other", + "te": "other" + }, + "relativeTime": { + "en": "586,920.617 hours ago", + "zh": "586,920.617小时前", + "hi": "5,86,920.617 घंटे पहले", + "es": "hace 586.920,617 horas", + "fr": "il y a 586 920,617 heures", + "ar": "قبل ٥٨٦٬٩٢٠٫٦١٧ ساعة", + "bn": "৫,৮৬,৯২০.৬১৭ ঘন্টা আগে", + "ru": "586 920,617 часа назад", + "pt": "há 586.920,617 horas", + "ur": "586,920.617 گھنٹے پہلے", + "id": "586.920,617 jam yang lalu", + "de": "vor 586.920,617 Stunden", + "ja": "586,920.617 時間前", + "pcm": "586,920.617 áwa wé dọ́n pas", + "mr": "५,८६,९२०.६१७ तासांपूर्वी", + "te": "5,86,920.617 గంటల క్రితం" + } +} diff --git a/test/fixtures/icu/localizationData-v75.1.json b/test/fixtures/icu/localizationData-v75.1.json new file mode 100644 index 00000000000000..65671ba5acb299 --- /dev/null +++ b/test/fixtures/icu/localizationData-v75.1.json @@ -0,0 +1,128 @@ +{ + "dateStrings": { + "en": "Fri Jul 25 1980 01:35:33 GMT+0100 (Central European Standard Time)", + "zh": "Fri Jul 25 1980 01:35:33 GMT+0100 (中欧标准时间)", + "hi": "Fri Jul 25 1980 01:35:33 GMT+0100 (मध्य यूरोपीय मानक समय)", + "es": "Fri Jul 25 1980 01:35:33 GMT+0100 (hora estándar de Europa central)", + "fr": "Fri Jul 25 1980 01:35:33 GMT+0100 (heure normale d’Europe centrale)", + "ar": "Fri Jul 25 1980 01:35:33 GMT+0100 (توقيت وسط أوروبا الرسمي)", + "bn": "Fri Jul 25 1980 01:35:33 GMT+0100 (মধ্য ইউরোপীয় মানক সময়)", + "ru": "Fri Jul 25 1980 01:35:33 GMT+0100 (Центральная Европа, стандартное время)", + "pt": "Fri Jul 25 1980 01:35:33 GMT+0100 (Horário Padrão da Europa Central)", + "ur": "Fri Jul 25 1980 01:35:33 GMT+0100 (وسطی یورپ کا معیاری وقت)", + "id": "Fri Jul 25 1980 01:35:33 GMT+0100 (Waktu Standar Eropa Tengah)", + "de": "Fri Jul 25 1980 01:35:33 GMT+0100 (Mitteleuropäische Normalzeit)", + "ja": "Fri Jul 25 1980 01:35:33 GMT+0100 (中央ヨーロッパ標準時)", + "pcm": "Fri Jul 25 1980 01:35:33 GMT+0100 (Mídúl Yúrop Fíksd Taim)", + "mr": "Fri Jul 25 1980 01:35:33 GMT+0100 (मध्‍य युरोपियन प्रमाण वेळ)", + "te": "Fri Jul 25 1980 01:35:33 GMT+0100 (సెంట్రల్ యూరోపియన్ ప్రామాణిక సమయం)" + }, + "dateTimeFormats": { + "en": "7/25/1980, 1:35:33 AM", + "zh": "1980/7/25 01:35:33", + "hi": "25/7/1980, 1:35:33 am", + "es": "25/7/1980, 1:35:33", + "fr": "25/07/1980 01:35:33", + "ar": "٢٥‏/٧‏/١٩٨٠، ١:٣٥:٣٣ ص", + "bn": "২৫/৭/১৯৮০, ১:৩৫:৩৩ AM", + "ru": "25.07.1980, 01:35:33", + "pt": "25/07/1980, 01:35:33", + "ur": "25/7/1980، 1:35:33 AM", + "id": "25/7/1980, 01.35.33", + "de": "25.7.1980, 01:35:33", + "ja": "1980/7/25 1:35:33", + "pcm": "25/7/1980 01:35:33", + "mr": "२५/७/१९८०, १:३५:३३ AM", + "te": "25/7/1980 1:35:33 AM" + 
}, + "dateFormats": { + "en": "7/25/1980", + "zh": "1980/7/25", + "hi": "25/7/1980", + "es": "25/7/1980", + "fr": "25/07/1980", + "ar": "٢٥‏/٧‏/١٩٨٠", + "bn": "২৫/৭/১৯৮০", + "ru": "25.07.1980", + "pt": "25/07/1980", + "ur": "25/7/1980", + "id": "25/7/1980", + "de": "25.7.1980", + "ja": "1980/7/25", + "pcm": "25/7/1980", + "mr": "२५/७/१९८०", + "te": "25/7/1980" + }, + "displayNames": { + "en": "Switzerland", + "zh": "瑞士", + "hi": "स्विट्ज़रलैंड", + "es": "Suiza", + "fr": "Suisse", + "ar": "سويسرا", + "bn": "সুইজারল্যান্ড", + "ru": "Швейцария", + "pt": "Suíça", + "ur": "سوئٹزر لینڈ", + "id": "Swiss", + "de": "Schweiz", + "ja": "スイス", + "pcm": "Swítsaland", + "mr": "स्वित्झर्लंड", + "te": "స్విట్జర్లాండ్" + }, + "numberFormats": { + "en": "275,760.913", + "zh": "275,760.913", + "hi": "2,75,760.913", + "es": "275.760,913", + "fr": "275 760,913", + "ar": "٢٧٥٬٧٦٠٫٩١٣", + "bn": "২,৭৫,৭৬০.৯১৩", + "ru": "275 760,913", + "pt": "275.760,913", + "ur": "275,760.913", + "id": "275.760,913", + "de": "275.760,913", + "ja": "275,760.913", + "pcm": "275,760.913", + "mr": "२,७५,७६०.९१३", + "te": "2,75,760.913" + }, + "pluralRules": { + "en": "other", + "zh": "other", + "hi": "one", + "es": "other", + "fr": "one", + "ar": "zero", + "bn": "one", + "ru": "many", + "pt": "one", + "ur": "other", + "id": "other", + "de": "other", + "ja": "other", + "pcm": "one", + "mr": "other", + "te": "other" + }, + "relativeTime": { + "en": "586,920.617 hours ago", + "zh": "586,920.617小时前", + "hi": "5,86,920.617 घंटे पहले", + "es": "hace 586.920,617 horas", + "fr": "il y a 586 920,617 heures", + "ar": "قبل ٥٨٦٬٩٢٠٫٦١٧ ساعة", + "bn": "৫,৮৬,৯২০.৬১৭ ঘন্টা আগে", + "ru": "586 920,617 часа назад", + "pt": "há 586.920,617 horas", + "ur": "586,920.617 گھنٹے پہلے", + "id": "586.920,617 jam yang lalu", + "de": "vor 586.920,617 Stunden", + "ja": "586,920.617 時間前", + "pcm": "586,920.617 áwa wé dọ́n pas", + "mr": "५,८६,९२०.६१७ तासांपूर्वी", + "te": "5,86,920.617 గంటల క్రితం" + } +} diff --git a/test/fixtures/icu/localizationData-v76.1.json b/test/fixtures/icu/localizationData-v76.1.json new file mode 100644 index 00000000000000..cb519d2bea2faa --- /dev/null +++ b/test/fixtures/icu/localizationData-v76.1.json @@ -0,0 +1,128 @@ +{ + "dateStrings": { + "en": "Fri Jul 25 1980 01:35:33 GMT+0100 (Central European Standard Time)", + "zh": "Fri Jul 25 1980 01:35:33 GMT+0100 (中欧标准时间)", + "hi": "Fri Jul 25 1980 01:35:33 GMT+0100 (मध्य यूरोपीय मानक समय)", + "es": "Fri Jul 25 1980 01:35:33 GMT+0100 (hora estándar de Europa central)", + "fr": "Fri Jul 25 1980 01:35:33 GMT+0100 (heure normale d’Europe centrale)", + "ar": "Fri Jul 25 1980 01:35:33 GMT+0100 (توقيت وسط أوروبا الرسمي)", + "bn": "Fri Jul 25 1980 01:35:33 GMT+0100 (মধ্য ইউরোপীয় মানক সময়)", + "ru": "Fri Jul 25 1980 01:35:33 GMT+0100 (Центральная Европа, стандартное время)", + "pt": "Fri Jul 25 1980 01:35:33 GMT+0100 (Horário Padrão da Europa Central)", + "ur": "Fri Jul 25 1980 01:35:33 GMT+0100 (وسطی یورپ کا معیاری وقت)", + "id": "Fri Jul 25 1980 01:35:33 GMT+0100 (Waktu Standar Eropa Tengah)", + "de": "Fri Jul 25 1980 01:35:33 GMT+0100 (Mitteleuropäische Normalzeit)", + "ja": "Fri Jul 25 1980 01:35:33 GMT+0100 (中央ヨーロッパ標準時)", + "pcm": "Fri Jul 25 1980 01:35:33 GMT+0100 (Mídúl Yúrop Fíksd Taim)", + "mr": "Fri Jul 25 1980 01:35:33 GMT+0100 (मध्‍य युरोपियन प्रमाण वेळ)", + "te": "Fri Jul 25 1980 01:35:33 GMT+0100 (సెంట్రల్ యూరోపియన్ ప్రామాణిక సమయం)" + }, + "dateTimeFormats": { + "en": "7/25/1980, 1:35:33 AM", + "zh": "1980/7/25 01:35:33", + "hi": "25/7/1980, 1:35:33 am", + "es": "25/7/1980, 
1:35:33", + "fr": "25/07/1980 01:35:33", + "ar": "25‏/7‏/1980، 1:35:33 ص", + "bn": "২৫/৭/১৯৮০, ১:৩৫:৩৩ AM", + "ru": "25.07.1980, 01:35:33", + "pt": "25/07/1980, 01:35:33", + "ur": "25/7/1980، 1:35:33 AM", + "id": "25/7/1980, 01.35.33", + "de": "25.7.1980, 01:35:33", + "ja": "1980/7/25 1:35:33", + "pcm": "25/7/1980 01:35:33", + "mr": "२५/७/१९८०, १:३५:३३ AM", + "te": "25/7/1980 1:35:33 AM" + }, + "dateFormats": { + "en": "7/25/1980", + "zh": "1980/7/25", + "hi": "25/7/1980", + "es": "25/7/1980", + "fr": "25/07/1980", + "ar": "25‏/7‏/1980", + "bn": "২৫/৭/১৯৮০", + "ru": "25.07.1980", + "pt": "25/07/1980", + "ur": "25/7/1980", + "id": "25/7/1980", + "de": "25.7.1980", + "ja": "1980/7/25", + "pcm": "25/7/1980", + "mr": "२५/७/१९८०", + "te": "25/7/1980" + }, + "displayNames": { + "en": "Switzerland", + "zh": "瑞士", + "hi": "स्विट्ज़रलैंड", + "es": "Suiza", + "fr": "Suisse", + "ar": "سويسرا", + "bn": "সুইজারল্যান্ড", + "ru": "Швейцария", + "pt": "Suíça", + "ur": "سوئٹزر لینڈ", + "id": "Swiss", + "de": "Schweiz", + "ja": "スイス", + "pcm": "Swítsaland", + "mr": "स्वित्झर्लंड", + "te": "స్విట్జర్లాండ్" + }, + "numberFormats": { + "en": "275,760.913", + "zh": "275,760.913", + "hi": "2,75,760.913", + "es": "275.760,913", + "fr": "275 760,913", + "ar": "275,760.913", + "bn": "২,৭৫,৭৬০.৯১৩", + "ru": "275 760,913", + "pt": "275.760,913", + "ur": "275,760.913", + "id": "275.760,913", + "de": "275.760,913", + "ja": "275,760.913", + "pcm": "275,760.913", + "mr": "२,७५,७६०.९१३", + "te": "2,75,760.913" + }, + "pluralRules": { + "en": "other", + "zh": "other", + "hi": "one", + "es": "other", + "fr": "one", + "ar": "zero", + "bn": "one", + "ru": "many", + "pt": "one", + "ur": "other", + "id": "other", + "de": "other", + "ja": "other", + "pcm": "one", + "mr": "other", + "te": "other" + }, + "relativeTime": { + "en": "586,920.617 hours ago", + "zh": "586,920.617小时前", + "hi": "5,86,920.617 घंटे पहले", + "es": "hace 586.920,617 horas", + "fr": "il y a 586 920,617 heures", + "ar": "قبل 586,920.617 ساعة", + "bn": "৫,৮৬,৯২০.৬১৭ ঘন্টা আগে", + "ru": "586 920,617 часа назад", + "pt": "há 586.920,617 horas", + "ur": "586,920.617 گھنٹے پہلے", + "id": "586.920,617 jam yang lalu", + "de": "vor 586.920,617 Stunden", + "ja": "586,920.617 時間前", + "pcm": "586,920.617 áwa wé dọ́n pas", + "mr": "५,८६,९२०.६१७ तासांपूर्वी", + "te": "5,86,920.617 గంటల క్రితం" + } +} diff --git a/test/parallel/test-icu-env.js b/test/parallel/test-icu-env.js index 7a153d41beae5a..afa36132f60e8d 100644 --- a/test/parallel/test-icu-env.js +++ b/test/parallel/test-icu-env.js @@ -2,11 +2,38 @@ const common = require('../common'); const assert = require('assert'); const { execFileSync } = require('child_process'); +const { readFileSync, globSync } = require('fs'); +const { path } = require('../common/fixtures'); + + +// This test checks for regressions in environment variable handling and +// caching, but the localization data originated from ICU might change +// over time. +// +// The json file can be updated using `tools/icu/update-test-data.js` +// whenever ICU is updated. Run the update script if this test fails after +// an ICU update, and verify that only expected values are updated. +// Typically, only a few strings change with each ICU update. If this script +// suddenly generates identical values for all locales, it indicates a bug. +// Editing json file manually is also fine. 
+const localizationDataFile = path(`icu/localizationData-v${process.versions.icu}.json`); + +let localizationData; +try { + localizationData = JSON.parse(readFileSync(localizationDataFile)); +} catch ({ code }) { + assert.strictEqual(code, 'ENOENT'); + + // No data for current version, try latest known version + const [ latestVersion ] = globSync('test/fixtures/icu/localizationData-*.json') + .map((file) => file.match(/localizationData-v(.*)\.json/)[1]) + .sort((a, b) => b.localeCompare(a, undefined, { numeric: true })); + console.log(`The ICU is v${process.versions.icu}, but there is no fixture for this version. ` + + `Trying the latest known version: v${latestVersion}. If this test fails with a few strings changed ` + + `after ICU update, run this: \n${process.argv[0]} tools/icu/update-test-data.mjs\n`); + localizationData = JSON.parse(readFileSync(path(`icu/localizationData-v${latestVersion}.json`))); +} -// system-icu should not be tested -const hasBuiltinICU = process.config.variables.icu_gyp_path === 'tools/icu/icu-generic.gyp'; -if (!hasBuiltinICU) - common.skip('system ICU'); // small-icu doesn't support non-English locales const hasFullICU = (() => { @@ -100,45 +127,11 @@ if (isMockable) { ); assert.deepStrictEqual( locales.map((LANG) => runEnvOutside({ LANG, TZ: 'Europe/Zurich' }, 'new Date(333333333333).toString()')), - [ - 'Fri Jul 25 1980 01:35:33 GMT+0100 (Central European Standard Time)', - 'Fri Jul 25 1980 01:35:33 GMT+0100 (中欧标准时间)', - 'Fri Jul 25 1980 01:35:33 GMT+0100 (मध्य यूरोपीय मानक समय)', - 'Fri Jul 25 1980 01:35:33 GMT+0100 (hora estándar de Europa central)', - 'Fri Jul 25 1980 01:35:33 GMT+0100 (heure normale d’Europe centrale)', - 'Fri Jul 25 1980 01:35:33 GMT+0100 (توقيت وسط أوروبا الرسمي)', - 'Fri Jul 25 1980 01:35:33 GMT+0100 (মধ্য ইউরোপীয় মানক সময়)', - 'Fri Jul 25 1980 01:35:33 GMT+0100 (Центральная Европа, стандартное время)', - 'Fri Jul 25 1980 01:35:33 GMT+0100 (Horário Padrão da Europa Central)', - 'Fri Jul 25 1980 01:35:33 GMT+0100 (وسطی یورپ کا معیاری وقت)', - 'Fri Jul 25 1980 01:35:33 GMT+0100 (Waktu Standar Eropa Tengah)', - 'Fri Jul 25 1980 01:35:33 GMT+0100 (Mitteleuropäische Normalzeit)', - 'Fri Jul 25 1980 01:35:33 GMT+0100 (中央ヨーロッパ標準時)', - 'Fri Jul 25 1980 01:35:33 GMT+0100 (Mídúl Yúrop Fíksd Taim)', - 'Fri Jul 25 1980 01:35:33 GMT+0100 (मध्‍य युरोपियन प्रमाण वेळ)', - 'Fri Jul 25 1980 01:35:33 GMT+0100 (సెంట్రల్ యూరోపియన్ ప్రామాణిక సమయం)', - ] + Object.values(localizationData.dateStrings) ); assert.deepStrictEqual( locales.map((LANG) => runEnvOutside({ LANG, TZ: 'Europe/Zurich' }, 'new Date(333333333333).toLocaleString()')), - [ - '7/25/1980, 1:35:33 AM', - '1980/7/25 01:35:33', - '25/7/1980, 1:35:33 am', - '25/7/1980, 1:35:33', - '25/07/1980 01:35:33', - '25‏/7‏/1980، 1:35:33 ص', - '২৫/৭/১৯৮০, ১:৩৫:৩৩ AM', - '25.07.1980, 01:35:33', - '25/07/1980, 01:35:33', - '25/7/1980، 1:35:33 AM', - '25/7/1980, 01.35.33', - '25.7.1980, 01:35:33', - '1980/7/25 1:35:33', - '25/7/1980 01:35:33', - '२५/७/१९८०, १:३५:३३ AM', - '25/7/1980 1:35:33 AM', - ] + Object.values(localizationData.dateTimeFormats) ); assert.strictEqual( runEnvOutside({ LANG: 'en' }, '["z", "ä"].sort(new Intl.Collator().compare)'), @@ -152,72 +145,23 @@ if (isMockable) { locales.map( (LANG) => runEnvOutside({ LANG, TZ: 'Europe/Zurich' }, 'new Intl.DateTimeFormat().format(333333333333)') ), - [ - '7/25/1980', '1980/7/25', - '25/7/1980', '25/7/1980', - '25/07/1980', '25‏/7‏/1980', - '২৫/৭/১৯৮০', '25.07.1980', - '25/07/1980', '25/7/1980', - '25/7/1980', '25.7.1980', - '1980/7/25', '25/7/1980', 
- '२५/७/१९८०', '25/7/1980', - ] + Object.values(localizationData.dateFormats) ); assert.deepStrictEqual( locales.map((LANG) => runEnvOutside({ LANG }, 'new Intl.DisplayNames(undefined, { type: "region" }).of("CH")')), - [ - 'Switzerland', '瑞士', - 'स्विट्ज़रलैंड', 'Suiza', - 'Suisse', 'سويسرا', - 'সুইজারল্যান্ড', 'Швейцария', - 'Suíça', 'سوئٹزر لینڈ', - 'Swiss', 'Schweiz', - 'スイス', 'Swítsaland', - 'स्वित्झर्लंड', 'స్విట్జర్లాండ్', - ] + Object.values(localizationData.displayNames) ); assert.deepStrictEqual( locales.map((LANG) => runEnvOutside({ LANG }, 'new Intl.NumberFormat().format(275760.913)')), - [ - '275,760.913', '275,760.913', - '2,75,760.913', '275.760,913', - '275 760,913', '275,760.913', - '২,৭৫,৭৬০.৯১৩', '275 760,913', - '275.760,913', '275,760.913', - '275.760,913', '275.760,913', - '275,760.913', '275,760.913', - '२,७५,७६०.९१३', '2,75,760.913', - ] + Object.values(localizationData.numberFormats) ); assert.deepStrictEqual( locales.map((LANG) => runEnvOutside({ LANG }, 'new Intl.PluralRules().select(0)')), - [ - 'other', 'other', 'one', 'other', - 'one', 'zero', 'one', 'many', - 'one', 'other', 'other', 'other', - 'other', 'one', 'other', 'other', - ] + Object.values(localizationData.pluralRules) ); assert.deepStrictEqual( locales.map((LANG) => runEnvOutside({ LANG }, 'new Intl.RelativeTimeFormat().format(-586920.617, "hour")')), - [ - '586,920.617 hours ago', - '586,920.617小时前', - '5,86,920.617 घंटे पहले', - 'hace 586.920,617 horas', - 'il y a 586 920,617 heures', - 'قبل 586,920.617 ساعة', - '৫,৮৬,৯২০.৬১৭ ঘন্টা আগে', - '586 920,617 часа назад', - 'há 586.920,617 horas', - '586,920.617 گھنٹے پہلے', - '586.920,617 jam yang lalu', - 'vor 586.920,617 Stunden', - '586,920.617 時間前', - '586,920.617 áwa wé dọ́n pas', - '५,८६,९२०.६१७ तासांपूर्वी', - '5,86,920.617 గంటల క్రితం', - ] + Object.values(localizationData.relativeTime) ); } diff --git a/tools/icu/update-test-data.mjs b/tools/icu/update-test-data.mjs new file mode 100644 index 00000000000000..fae784b07e958e --- /dev/null +++ b/tools/icu/update-test-data.mjs @@ -0,0 +1,81 @@ +/* + * This script updates the `test/fixtures/icu/localizationData.json` data + * used by `test/parallel/test-icu-env.js` test. + * Run this script after an ICU update if locale-specific output changes are + * causing the test to fail. + * Typically, only a few strings change with each ICU update. If this script + * suddenly generates identical values for all locales, it indicates a bug. + * Note that Node.js must be built with either `--with-intl=full-icu` after + * updating ICU, or with `--with-intl=system-icu` if system version matches. + * Wrong version or small-icu might produce wrong values. + * Manually editing the json file is fine, too. 
+ */ + +import { execFileSync } from 'node:child_process'; +import { writeFileSync } from 'node:fs'; + +const locales = [ + 'en', 'zh', 'hi', 'es', + 'fr', 'ar', 'bn', 'ru', + 'pt', 'ur', 'id', 'de', + 'ja', 'pcm', 'mr', 'te', +]; + +const outputFilePath = new URL(`../../test/fixtures/icu/localizationData-v${process.versions.icu}.json`, import.meta.url); + +const runEnvCommand = (envVars, code) => + execFileSync( + process.execPath, + ['-e', `process.stdout.write(String(${code}));`], + { env: { ...process.env, ...envVars }, encoding: 'utf8' }, + ); + +// Generate the localization data for all locales +const localizationData = locales.reduce((acc, locale) => { + acc.dateStrings[locale] = runEnvCommand( + { LANG: locale, TZ: 'Europe/Zurich' }, + `new Date(333333333333).toString()`, + ); + + acc.dateTimeFormats[locale] = runEnvCommand( + { LANG: locale, TZ: 'Europe/Zurich' }, + `new Date(333333333333).toLocaleString()`, + ); + + acc.dateFormats[locale] = runEnvCommand( + { LANG: locale, TZ: 'Europe/Zurich' }, + `new Intl.DateTimeFormat().format(333333333333)`, + ); + + acc.displayNames[locale] = runEnvCommand( + { LANG: locale }, + `new Intl.DisplayNames(undefined, { type: "region" }).of("CH")`, + ); + + acc.numberFormats[locale] = runEnvCommand( + { LANG: locale }, + `new Intl.NumberFormat().format(275760.913)`, + ); + + acc.pluralRules[locale] = runEnvCommand( + { LANG: locale }, + `new Intl.PluralRules().select(0)`, + ); + + acc.relativeTime[locale] = runEnvCommand( + { LANG: locale }, + `new Intl.RelativeTimeFormat().format(-586920.617, "hour")`, + ); + + return acc; +}, { + dateStrings: {}, + dateTimeFormats: {}, + dateFormats: {}, + displayNames: {}, + numberFormats: {}, + pluralRules: {}, + relativeTime: {}, +}); + +writeFileSync(outputFilePath, JSON.stringify(localizationData, null, 2) + '\n'); From 5665e86da65f3443c70cc32cf68cb0cb442eeb10 Mon Sep 17 00:00:00 2001 From: Shima Ryuhei <65934663+islandryu@users.noreply.github.com> Date: Wed, 11 Dec 2024 08:52:29 +0900 Subject: [PATCH 31/88] module: prevent main thread exiting before esm worker ends PR-URL: https://github.com/nodejs/node/pull/56183 Reviewed-By: Matteo Collina Reviewed-By: Antoine du Hamel Reviewed-By: Jacob Smith --- lib/internal/modules/esm/worker.js | 6 ++++-- .../test-esm-loader-spawn-promisified.mjs | 16 ++++++++++++++++ test/fixtures/es-module-loaders/hooks-custom.mjs | 7 +++++++ 3 files changed, 27 insertions(+), 2 deletions(-) diff --git a/lib/internal/modules/esm/worker.js b/lib/internal/modules/esm/worker.js index 311d77fb099384..0213df7a92a0eb 100644 --- a/lib/internal/modules/esm/worker.js +++ b/lib/internal/modules/esm/worker.js @@ -215,8 +215,6 @@ async function customizedModuleWorker(lock, syncCommPort, errorHandler) { (port ?? syncCommPort).postMessage(wrapMessage('error', exception)); } - AtomicsAdd(lock, WORKER_TO_MAIN_THREAD_NOTIFICATION, 1); - AtomicsNotify(lock, WORKER_TO_MAIN_THREAD_NOTIFICATION); if (shouldRemoveGlobalErrorHandler) { process.off('uncaughtException', errorHandler); } @@ -225,6 +223,10 @@ async function customizedModuleWorker(lock, syncCommPort, errorHandler) { // We keep checking for new messages to not miss any. clearImmediate(immediate); immediate = setImmediate(checkForMessages).unref(); + // To prevent the main thread from terminating before this function completes after unlocking, + // the following process is executed at the end of the function. 
+ AtomicsAdd(lock, WORKER_TO_MAIN_THREAD_NOTIFICATION, 1); + AtomicsNotify(lock, WORKER_TO_MAIN_THREAD_NOTIFICATION); } } diff --git a/test/es-module/test-esm-loader-spawn-promisified.mjs b/test/es-module/test-esm-loader-spawn-promisified.mjs index 628ff3f0d423e5..2f27f7850f646e 100644 --- a/test/es-module/test-esm-loader-spawn-promisified.mjs +++ b/test/es-module/test-esm-loader-spawn-promisified.mjs @@ -285,4 +285,20 @@ describe('Loader hooks parsing modules', { concurrency: !process.env.TEST_PARALL assert.strictEqual(code, 0); assert.strictEqual(signal, null); }); + + it('throw maximum call stack error on the loader', async () => { + const { code, signal, stdout, stderr } = await spawnPromisified(execPath, [ + '--no-warnings', + '--experimental-loader', + fixtures.fileURL('/es-module-loaders/hooks-custom.mjs'), + '--input-type=module', + '--eval', + 'await import("esmHook/maximumCallStack.mjs")', + ]); + + assert(stderr.includes('Maximum call stack size exceeded')); + assert.strictEqual(stdout, ''); + assert.strictEqual(code, 1); + assert.strictEqual(signal, null); + }); }); diff --git a/test/fixtures/es-module-loaders/hooks-custom.mjs b/test/fixtures/es-module-loaders/hooks-custom.mjs index 3c38649a88794f..5109d20f4d3711 100644 --- a/test/fixtures/es-module-loaders/hooks-custom.mjs +++ b/test/fixtures/es-module-loaders/hooks-custom.mjs @@ -105,5 +105,12 @@ export function load(url, context, next) { }; } + if (url.endsWith('esmHook/maximumCallStack.mjs')) { + function recurse() { + recurse(); + } + recurse(); + } + return next(url); } From f3b3ff85e0d084fa1a493e112c7fe0e5c47f09e1 Mon Sep 17 00:00:00 2001 From: Anton Kastritskii Date: Wed, 11 Dec 2024 01:43:13 +0000 Subject: [PATCH 32/88] doc: call out import.meta is only supported in ES modules MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/56186 Reviewed-By: Matteo Collina Reviewed-By: Michaël Zasso --- doc/api/esm.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/api/esm.md b/doc/api/esm.md index ecb27680129737..23348d28c3649c 100644 --- a/doc/api/esm.md +++ b/doc/api/esm.md @@ -340,7 +340,7 @@ modules it can be used to load ES modules. * {Object} The `import.meta` meta property is an `Object` that contains the following -properties. +properties. It is only supported in ES modules. 
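To make the scoping concrete, a small illustration (editor's sketch, not part of the patch; the file name is hypothetical):

```js
// app.mjs (an ES module): `import.meta` is available.
console.log(import.meta.url);
```

In a CommonJS file, the same statement is a `SyntaxError`, since `import.meta` is only valid in ES module code.
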
 ### `import.meta.dirname`

From 77397c5013e7f5595f7184d8d62da1111a84ec75 Mon Sep 17 00:00:00 2001
From: Antoine du Hamel
Date: Wed, 11 Dec 2024 12:11:38 +0100
Subject: [PATCH 33/88] util: do not rely on mutable `Object` and `Function`'s
 `constructor` prop

PR-URL: https://github.com/nodejs/node/pull/56188
Fixes: https://github.com/nodejs/node/issues/55924
Reviewed-By: Ruben Bridgewater
Reviewed-By: Jordan Harband
---
 lib/internal/util/inspect.js       | 30 +++++++++++++++++++++++++-----
 test/parallel/test-util-inspect.js | 30 ++++++++++++++++++++++++++++++
 2 files changed, 55 insertions(+), 5 deletions(-)

diff --git a/lib/internal/util/inspect.js b/lib/internal/util/inspect.js
index a697459468d7b9..1f1be555c96e08 100644
--- a/lib/internal/util/inspect.js
+++ b/lib/internal/util/inspect.js
@@ -22,8 +22,11 @@ const {
   DatePrototypeToISOString,
   DatePrototypeToString,
   ErrorPrototypeToString,
+  Function,
+  FunctionPrototype,
   FunctionPrototypeBind,
   FunctionPrototypeCall,
+  FunctionPrototypeSymbolHasInstance,
   FunctionPrototypeToString,
   JSONStringify,
   MapPrototypeEntries,
@@ -50,6 +53,7 @@ const {
   ObjectGetPrototypeOf,
   ObjectIs,
   ObjectKeys,
+  ObjectPrototype,
   ObjectPrototypeHasOwnProperty,
   ObjectPrototypePropertyIsEnumerable,
   ObjectSeal,
@@ -593,10 +597,26 @@ function isInstanceof(object, proto) {
   }
 }
 
+// Special-case for some builtin prototypes in case their `constructor` property has been tampered with.
+const wellKnownPrototypes = new SafeMap();
+wellKnownPrototypes.set(ObjectPrototype, { name: 'Object', constructor: Object });
+wellKnownPrototypes.set(FunctionPrototype, { name: 'Function', constructor: Function });
+
 function getConstructorName(obj, ctx, recurseTimes, protoProps) {
   let firstProto;
   const tmp = obj;
   while (obj || isUndetectableObject(obj)) {
+    const wellKnownPrototypeNameAndConstructor = wellKnownPrototypes.get(obj);
+    if (wellKnownPrototypeNameAndConstructor != null) {
+      const { name, constructor } = wellKnownPrototypeNameAndConstructor;
+      if (FunctionPrototypeSymbolHasInstance(constructor, tmp)) {
+        if (protoProps !== undefined && firstProto !== obj) {
+          addPrototypeProperties(
+            ctx, tmp, firstProto || tmp, recurseTimes, protoProps);
+        }
+        return name;
+      }
+    }
     const descriptor = ObjectGetOwnPropertyDescriptor(obj, 'constructor');
     if (descriptor !== undefined &&
         typeof descriptor.value === 'function' &&
@@ -954,7 +974,11 @@ function formatRaw(ctx, value, recurseTimes, typedArray) {
   if (noIterator) {
     keys = getKeys(value, ctx.showHidden);
     braces = ['{', '}'];
-    if (constructor === 'Object') {
+    if (typeof value === 'function') {
+      base = getFunctionBase(value, constructor, tag);
+      if (keys.length === 0 && protoProps === undefined)
+        return ctx.stylize(base, 'special');
+    } else if (constructor === 'Object') {
       if (isArgumentsObject(value)) {
         braces[0] = '[Arguments] {';
       } else if (tag !== '') {
@@ -963,10 +987,6 @@ function formatRaw(ctx, value, recurseTimes, typedArray) {
       if (keys.length === 0 && protoProps === undefined) {
         return `${braces[0]}}`;
       }
-    } else if (typeof value === 'function') {
-      base = getFunctionBase(value, constructor, tag);
-      if (keys.length === 0 && protoProps === undefined)
-        return ctx.stylize(base, 'special');
     } else if (isRegExp(value)) {
       // Make RegExps say that they are RegExps
       base = RegExpPrototypeToString(
diff --git a/test/parallel/test-util-inspect.js b/test/parallel/test-util-inspect.js
index 3da292fc663c35..04fc82cfc1aa80 100644
--- a/test/parallel/test-util-inspect.js
+++ b/test/parallel/test-util-inspect.js
@@ -3323,3 +3323,33 @@ assert.strictEqual(
   }
 }), '{ 
[Symbol(Symbol.iterator)]: [Getter] }');
 }
+
+{
+  const o = {};
+  const { prototype: BuiltinPrototype } = Object;
+  const desc = Reflect.getOwnPropertyDescriptor(BuiltinPrototype, 'constructor');
+  Object.defineProperty(BuiltinPrototype, 'constructor', {
+    get: () => BuiltinPrototype,
+    configurable: true,
+  });
+  assert.strictEqual(
+    util.inspect(o),
+    '{}',
+  );
+  Object.defineProperty(BuiltinPrototype, 'constructor', desc);
+}
+
+{
+  const o = { f() {} };
+  const { prototype: BuiltinPrototype } = Function;
+  const desc = Reflect.getOwnPropertyDescriptor(BuiltinPrototype, 'constructor');
+  Object.defineProperty(BuiltinPrototype, 'constructor', {
+    get: () => BuiltinPrototype,
+    configurable: true,
+  });
+  assert.strictEqual(
+    util.inspect(o),
+    '{ f: [Function: f] }',
+  );
+  Object.defineProperty(BuiltinPrototype, 'constructor', desc);
+}

From b0ebd23e52aff9e96be7a2b58b17da515a237f76 Mon Sep 17 00:00:00 2001
From: ZYSzys
Date: Wed, 11 Dec 2024 21:37:21 +0800
Subject: [PATCH 34/88] http2: support ALPNCallback option
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

PR-URL: https://github.com/nodejs/node/pull/56187
Fixes: https://github.com/nodejs/node/issues/55994
Refs: https://github.com/nodejs/node/pull/45190
Reviewed-By: Tim Perry
Reviewed-By: Juan José Arboleda
Reviewed-By: Luigi Pinca
---
 lib/internal/http2/core.js       | 10 +++++--
 test/parallel/test-http2-alpn.js | 47 ++++++++++++++++++++++++++++++++
 2 files changed, 54 insertions(+), 3 deletions(-)
 create mode 100644 test/parallel/test-http2-alpn.js

diff --git a/lib/internal/http2/core.js b/lib/internal/http2/core.js
index 6ce633092bca4b..b41e1baee24644 100644
--- a/lib/internal/http2/core.js
+++ b/lib/internal/http2/core.js
@@ -3136,9 +3136,13 @@ function initializeOptions(options) {
 
 function initializeTLSOptions(options, servername) {
   options = initializeOptions(options);
-  options.ALPNProtocols = ['h2'];
-  if (options.allowHTTP1 === true)
-    options.ALPNProtocols.push('http/1.1');
+
+  if (!options.ALPNCallback) {
+    options.ALPNProtocols = ['h2'];
+    if (options.allowHTTP1 === true)
+      options.ALPNProtocols.push('http/1.1');
+  }
+
   if (servername !== undefined && !options.servername)
     options.servername = servername;
   return options;
diff --git a/test/parallel/test-http2-alpn.js b/test/parallel/test-http2-alpn.js
new file mode 100644
index 00000000000000..a073d26e576cce
--- /dev/null
+++ b/test/parallel/test-http2-alpn.js
@@ -0,0 +1,47 @@
+'use strict';
+const common = require('../common');
+const fixtures = require('../common/fixtures');
+
+// This test verifies that the http2 server supports the ALPNCallback option.
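+// The ALPNCallback is invoked with { servername, protocols } during the TLS
+// handshake; the string it returns is selected as the connection's ALPN
+// protocol, and returning undefined rejects the handshake.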
+ +if (!common.hasCrypto) common.skip('missing crypto'); + +const assert = require('assert'); +const h2 = require('http2'); +const tls = require('tls'); + +{ + // Server sets two incompatible ALPN options: + assert.throws(() => h2.createSecureServer({ + ALPNCallback: () => 'a', + ALPNProtocols: ['b', 'c'] + }), (error) => error.code === 'ERR_TLS_ALPN_CALLBACK_WITH_PROTOCOLS'); +} + +{ + const server = h2.createSecureServer({ + key: fixtures.readKey('rsa_private.pem'), + cert: fixtures.readKey('rsa_cert.crt'), + ALPNCallback: () => 'a', + }); + + server.on( + 'secureConnection', + common.mustCall((socket) => { + assert.strictEqual(socket.alpnProtocol, 'a'); + socket.end(); + server.close(); + }) + ); + + server.listen(0, function() { + const client = tls.connect({ + port: server.address().port, + rejectUnauthorized: false, + ALPNProtocols: ['a'], + }, common.mustCall(() => { + assert.strictEqual(client.alpnProtocol, 'a'); + client.end(); + })); + }); +} From 8325fa5c04fa2156364800db80591472ba412f0f Mon Sep 17 00:00:00 2001 From: Stephen Belanger Date: Wed, 11 Dec 2024 21:37:36 +0800 Subject: [PATCH 35/88] worker: fix crash when a worker joins after exit If a worker has not already joined before running to completion it will join in a SetImmediateThreadsafe which could occur after the worker has already ended by other means. Mutating a JS object at that point would fail because the isolate is already disposed. PR-URL: https://github.com/nodejs/node/pull/56191 Reviewed-By: Matteo Collina Reviewed-By: Anna Henningsen Reviewed-By: Santiago Gimeno --- src/node_worker.cc | 3 +++ 1 file changed, 3 insertions(+) diff --git a/src/node_worker.cc b/src/node_worker.cc index e8026fe24c7021..f64609cf045441 100644 --- a/src/node_worker.cc +++ b/src/node_worker.cc @@ -449,6 +449,9 @@ void Worker::JoinThread() { env()->remove_sub_worker_context(this); + // Join may happen after the worker exits and disposes the isolate + if (!env()->can_call_into_js()) return; + { HandleScope handle_scope(env()->isolate()); Context::Scope context_scope(env()->context()); From 9863d2756623f09dc7af5b01f5bded36d7b6d3ae Mon Sep 17 00:00:00 2001 From: Joyee Cheung Date: Mon, 9 Dec 2024 15:47:25 +0100 Subject: [PATCH 36/88] module: only emit require(esm) warning under --trace-require-module require(esm) is relatively stable now and the experimental warning has run its course - it's now more troublesome than useful. This patch changes it to no longer emit a warning unless `--trace-require-module` is explicitly used. The flag supports two modes: - `--trace-require-module=all`: emit warnings for all usages - `--trace-require-module=no-node-modules`: emit warnings for usages that do not come from a `node_modules` folder. 
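An illustrative invocation:

    node --trace-require-module=no-node-modules app.js

This emits the warning only for require(esm) usages outside of a
`node_modules` folder.
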
PR-URL: https://github.com/nodejs/node/pull/56194 Fixes: https://github.com/nodejs/node/issues/55417 Reviewed-By: James M Snell Reviewed-By: Antoine du Hamel Reviewed-By: Yagiz Nizipli Reviewed-By: Matteo Collina Reviewed-By: Geoffrey Booth Reviewed-By: Marco Ippolito Reviewed-By: Rafael Gonzaga --- doc/api/cli.md | 13 +++++++++ doc/api/modules.md | 11 +++++-- lib/internal/modules/cjs/loader.js | 29 +++++++++++-------- src/node_options.cc | 12 ++++++++ src/node_options.h | 1 + test/es-module/test-require-module-preload.js | 11 ------- test/es-module/test-require-module-warning.js | 14 +++++++-- test/es-module/test-require-module.js | 10 ------- .../test-require-node-modules-warning.js | 24 +++++++++++---- test/es-module/test-typescript-commonjs.mjs | 1 - test/es-module/test-typescript.mjs | 2 -- 11 files changed, 80 insertions(+), 48 deletions(-) diff --git a/doc/api/cli.md b/doc/api/cli.md index 8a4f376360008f..7d22f746860b89 100644 --- a/doc/api/cli.md +++ b/doc/api/cli.md @@ -2613,6 +2613,18 @@ added: Prints a stack trace whenever an environment is exited proactively, i.e. invoking `process.exit()`. +### `--trace-require-module=mode` + + + +Prints information about usage of [Loading ECMAScript modules using `require()`][]. + +When `mode` is `all`, all usage is printed. When `mode` is `no-node-modules`, usage +from the `node_modules` folder is excluded. + ### `--trace-sigint` + +An error occurred while loading a SQLite extension. + ### `ERR_MEMORY_ALLOCATION_FAILED` diff --git a/doc/api/permissions.md b/doc/api/permissions.md index a03285e28641e8..d03913e7858d66 100644 --- a/doc/api/permissions.md +++ b/doc/api/permissions.md @@ -147,6 +147,8 @@ There are constraints you need to know before using this system: flags that can be set via runtime through `v8.setFlagsFromString`. * OpenSSL engines cannot be requested at runtime when the Permission Model is enabled, affecting the built-in crypto, https, and tls modules. +* Run-Time Loadable Extensions cannot be loaded when the Permission Model is + enabled, affecting the sqlite module. * Using existing file descriptors via the `node:fs` module bypasses the Permission Model. diff --git a/doc/api/sqlite.md b/doc/api/sqlite.md index cdf5d690521a06..29d9525d6ea4d8 100644 --- a/doc/api/sqlite.md +++ b/doc/api/sqlite.md @@ -108,6 +108,10 @@ added: v22.5.0 [double-quoted string literals][]. This is not recommended but can be enabled for compatibility with legacy database schemas. **Default:** `false`. + * `allowExtension` {boolean} If `true`, the `loadExtension` SQL function + and the `loadExtension()` method are enabled. + You can call `enableLoadExtension(false)` later to disable this feature. + **Default:** `false`. Constructs a new `DatabaseSync` instance. @@ -120,6 +124,30 @@ added: v22.5.0 Closes the database connection. An exception is thrown if the database is not open. This method is a wrapper around [`sqlite3_close_v2()`][]. +### `database.loadExtension(path)` + + + +* `path` {string} The path to the shared library to load. + +Loads a shared library into the database connection. This method is a wrapper +around [`sqlite3_load_extension()`][]. It is required to enable the +`allowExtension` option when constructing the `DatabaseSync` instance. + +### `database.enableLoadExtension(allow)` + + + +* `allow` {boolean} Whether to allow loading extensions. + +Enables or disables the `loadExtension` SQL function, and the `loadExtension()` +method. 
If `allowExtension` was `false` when the database was constructed,
+you cannot enable loading extensions later, for security reasons.
+
 ### `database.exec(sql)`

-> Stability: 1.1 - Active development
+> Stability: 2 - Stable.
 
 This flag configures file system read permissions using the
 [Permission Model][].
 
@@ -210,7 +213,7 @@ Examples can be found in the [File System Permissions][]
 documentation.
 
 The initializer module also needs to be allowed. Consider the following
 example:
 
 ```console
-$ node --experimental-permission index.js
+$ node --permission index.js
 Error: Access to this API has been restricted
     at node:internal/main/run_main_module:23:47 {
 
@@ -223,7 +226,7 @@
 The process needs to have access to the `index.js` module:
 
 ```bash
-node --experimental-permission --allow-fs-read=/path/to/index.js index.js
+node --permission --allow-fs-read=/path/to/index.js index.js
 ```
 
 ### `--allow-fs-write`
 
@@ -231,12 +234,15 @@ node --experimental-permission --allow-fs-read=/path/to/index.js index.js
 
-> Stability: 1.1 - Active development
+> Stability: 2 - Stable.
 
 This flag configures file system write permissions using the
 [Permission Model][].
 
@@ -282,7 +288,7 @@ new WASI({
 ```
 
 ```console
-$ node --experimental-permission --allow-fs-read=* index.js
+$ node --permission --allow-fs-read=* index.js
 Error: Access to this API has been restricted
     at node:internal/main/run_main_module:30:49 {
 
@@ -313,7 +319,7 @@ new Worker(__filename);
 ```
 
 ```console
-$ node --experimental-permission --allow-fs-read=* index.js
+$ node --permission --allow-fs-read=* index.js
 Error: Access to this API has been restricted
     at node:internal/main/run_main_module:17:47 {
 
@@ -949,24 +955,6 @@ added:
 
 Enable experimental support for the network inspection with Chrome DevTools.
 
-### `--experimental-permission`
-
-
-> Stability: 1.1 - Active development
-
-Enable the Permission Model for current process. When enabled, the
-following permissions are restricted:
-
-* File System - manageable through
-  [`--allow-fs-read`][], [`--allow-fs-write`][] flags
-* Child Process - manageable through [`--allow-child-process`][] flag
-* Worker Threads - manageable through [`--allow-worker`][] flag
-* WASI - manageable through [`--allow-wasi`][] flag
-* Addons - manageable through [`--allow-addons`][] flag
-
 ### `--experimental-print-required-tla`

+### `--permission`
+
+
+> Stability: 2 - Stable.
+
+Enable the Permission Model for the current process. When enabled, the
+following permissions are restricted:
+
+* File System - manageable through
+  [`--allow-fs-read`][], [`--allow-fs-write`][] flags
+* Child Process - manageable through [`--allow-child-process`][] flag
+* Worker Threads - manageable through [`--allow-worker`][] flag
+* WASI - manageable through [`--allow-wasi`][] flag
+* Addons - manageable through [`--allow-addons`][] flag
+
 ### `--preserve-symlinks`

-> Stability: 1.1 - Active development
+> Stability: 2 - Stable.
 
 The Node.js Permission Model is a mechanism for restricting access to specific
 resources during execution.
-The API exists behind a flag [`--experimental-permission`][] which when enabled,
+The API exists behind the [`--permission`][] flag which, when enabled,
 will restrict access to all available permissions.
-The available permissions are documented by the [`--experimental-permission`][]
+The available permissions are documented by the [`--permission`][]
 flag.
-When starting Node.js with `--experimental-permission`, +When starting Node.js with `--permission`, the ability to access the file system through the `fs` module, spawn processes, use `node:worker_threads`, use native addons, use WASI, and enable the runtime inspector will be restricted. ```console -$ node --experimental-permission index.js +$ node --permission index.js Error: Access to this API has been restricted at node:internal/main/run_main_module:23:47 { @@ -64,7 +64,7 @@ flag. For WASI, use the [`--allow-wasi`][] flag. #### Runtime API -When enabling the Permission Model through the [`--experimental-permission`][] +When enabling the Permission Model through the [`--permission`][] flag a new property `permission` is added to the `process` object. This property contains one function: @@ -90,10 +90,8 @@ To allow access to the file system, use the [`--allow-fs-read`][] and [`--allow-fs-write`][] flags: ```console -$ node --experimental-permission --allow-fs-read=* --allow-fs-write=* index.js +$ node --permission --allow-fs-read=* --allow-fs-write=* index.js Hello world! -(node:19836) ExperimentalWarning: Permission is an experimental feature -(Use `node --trace-warnings ...` to show where the warning was created) ``` The valid arguments for both flags are: @@ -167,5 +165,5 @@ There are constraints you need to know before using this system: [`--allow-fs-write`]: cli.md#--allow-fs-write [`--allow-wasi`]: cli.md#--allow-wasi [`--allow-worker`]: cli.md#--allow-worker -[`--experimental-permission`]: cli.md#--experimental-permission +[`--permission`]: cli.md#--permission [`permission.has()`]: process.md#processpermissionhasscope-reference diff --git a/doc/api/process.md b/doc/api/process.md index 5bf05b1d909860..379694b95d8472 100644 --- a/doc/api/process.md +++ b/doc/api/process.md @@ -3103,7 +3103,7 @@ added: v20.0.0 * {Object} -This API is available through the [`--experimental-permission`][] flag. +This API is available through the [`--permission`][] flag. `process.permission` is an object whose methods are used to manage permissions for the current process. Additional documentation is available in the @@ -4440,8 +4440,8 @@ cases: [`'exit'`]: #event-exit [`'message'`]: child_process.md#event-message [`'uncaughtException'`]: #event-uncaughtexception -[`--experimental-permission`]: cli.md#--experimental-permission [`--no-deprecation`]: cli.md#--no-deprecation +[`--permission`]: cli.md#--permission [`--unhandled-rejections`]: cli.md#--unhandled-rejectionsmode [`Buffer`]: buffer.md [`ChildProcess.disconnect()`]: child_process.md#subprocessdisconnect diff --git a/doc/node.1 b/doc/node.1 index e38ce7f0431e62..2692c1848de359 100644 --- a/doc/node.1 +++ b/doc/node.1 @@ -171,8 +171,8 @@ Specify the .Ar module to use as a custom module loader. . -.It Fl -experimental-permission -Enable the experimental permission model. +.It Fl -permission +Enable the permission model. . .It Fl -experimental-shadow-realm Use this flag to enable ShadowRealm support. 
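As a companion to the rename, a minimal runtime check against the Permission Model (editor's sketch; `process.permission.has()` is the API referenced in the documents above):

```js
// Run with: node --permission --allow-fs-read=/tmp app.js
if (process.permission.has('fs.read', '/tmp')) {
  console.log('reads under /tmp are allowed');
} else {
  console.log('fs.read was not granted for /tmp');
}
```
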
diff --git a/lib/internal/process/permission.js b/lib/internal/process/permission.js index 7a6dd80d1d01f3..bfdfe29fe4739f 100644 --- a/lib/internal/process/permission.js +++ b/lib/internal/process/permission.js @@ -9,16 +9,16 @@ const { validateString, validateBuffer } = require('internal/validators'); const { Buffer } = require('buffer'); const { isBuffer } = Buffer; -let experimentalPermission; +let _permission; module.exports = ObjectFreeze({ __proto__: null, isEnabled() { - if (experimentalPermission === undefined) { + if (_permission === undefined) { const { getOptionValue } = require('internal/options'); - experimentalPermission = getOptionValue('--experimental-permission'); + _permission = getOptionValue('--permission'); } - return experimentalPermission; + return _permission; }, has(scope, reference) { validateString(scope, 'scope'); diff --git a/lib/internal/process/pre_execution.js b/lib/internal/process/pre_execution.js index 41ebf85900b100..b3aba59674b82b 100644 --- a/lib/internal/process/pre_execution.js +++ b/lib/internal/process/pre_execution.js @@ -520,14 +520,13 @@ function initializeClusterIPC() { } function initializePermission() { - const experimentalPermission = getOptionValue('--experimental-permission'); - if (experimentalPermission) { + const permission = getOptionValue('--permission'); + if (permission) { process.binding = function binding(_module) { throw new ERR_ACCESS_DENIED('process.binding'); }; // Guarantee path module isn't monkey-patched to bypass permission model ObjectFreeze(require('path')); - emitExperimentalWarning('Permission'); const { has } = require('internal/process/permission'); const warnFlags = [ '--allow-addons', @@ -579,7 +578,7 @@ function initializePermission() { ArrayPrototypeForEach(availablePermissionFlags, (flag) => { const value = getOptionValue(flag); if (value.length) { - throw new ERR_MISSING_OPTION('--experimental-permission'); + throw new ERR_MISSING_OPTION('--permission'); } }); } diff --git a/src/env.cc b/src/env.cc index 8842f69e9bf58f..d4426432d67ba6 100644 --- a/src/env.cc +++ b/src/env.cc @@ -920,7 +920,7 @@ Environment::Environment(IsolateData* isolate_data, std::move(traced_value)); } - if (options_->experimental_permission) { + if (options_->permission) { permission()->EnablePermissions(); // The process shouldn't be able to neither // spawn/worker nor use addons or enable inspector diff --git a/src/node_options.cc b/src/node_options.cc index 83ee298acfad00..5c24a48411123d 100644 --- a/src/node_options.cc +++ b/src/node_options.cc @@ -456,11 +456,12 @@ EnvironmentOptionsParser::EnvironmentOptionsParser() { "experimental ES Module import.meta.resolve() parentURL support", &EnvironmentOptions::experimental_import_meta_resolve, kAllowedInEnvvar); - AddOption("--experimental-permission", + AddOption("--permission", "enable the permission system", - &EnvironmentOptions::experimental_permission, + &EnvironmentOptions::permission, kAllowedInEnvvar, false); + AddAlias("--experimental-permission", "--permission"); AddOption("--allow-fs-read", "allow permissions to read the filesystem", &EnvironmentOptions::allow_fs_read, diff --git a/src/node_options.h b/src/node_options.h index c0a8cb29f90627..24ad821837934f 100644 --- a/src/node_options.h +++ b/src/node_options.h @@ -132,7 +132,7 @@ class EnvironmentOptions : public Options { bool experimental_import_meta_resolve = false; std::string input_type; // Value of --input-type bool entry_is_url = false; - bool experimental_permission = false; + bool permission = false; std::vector 
<std::string> allow_fs_read;
   std::vector<std::string> allow_fs_write;
   bool allow_addons = false;
diff --git a/test/addons/no-addons/permission.js b/test/addons/no-addons/permission.js
index 0fbcd2bb1ee782..1d1bbf6e95468e 100644
--- a/test/addons/no-addons/permission.js
+++ b/test/addons/no-addons/permission.js
@@ -1,4 +1,4 @@
-// Flags: --experimental-permission --allow-fs-read=*
+// Flags: --permission --allow-fs-read=*
 
 'use strict';
 
diff --git a/test/es-module/test-cjs-legacyMainResolve-permission.js b/test/es-module/test-cjs-legacyMainResolve-permission.js
index 392bfb753d7764..fcebc22ccf2929 100644
--- a/test/es-module/test-cjs-legacyMainResolve-permission.js
+++ b/test/es-module/test-cjs-legacyMainResolve-permission.js
@@ -1,6 +1,6 @@
 'use strict';
 
-// Flags: --expose-internals --experimental-permission --allow-fs-read=* --allow-child-process
+// Flags: --expose-internals --permission --allow-fs-read=* --allow-child-process
 
 require('../common');
 
@@ -40,7 +40,7 @@ describe('legacyMainResolve', () => {
           process.execPath,
           [
             '--expose-internals',
-            '--experimental-permission',
+            '--permission',
            ...allowReadFiles,
             '-e',
             `
@@ -98,7 +98,7 @@ describe('legacyMainResolve', () => {
         process.execPath,
         [
           '--expose-internals',
-          '--experimental-permission',
+          '--permission',
           ...allowReadFiles,
           '-e',
           `
diff --git a/test/es-module/test-esm-loader-hooks.mjs b/test/es-module/test-esm-loader-hooks.mjs
index 4a4d15648a79b5..ed5c27cbc4b84f 100644
--- a/test/es-module/test-esm-loader-hooks.mjs
+++ b/test/es-module/test-esm-loader-hooks.mjs
@@ -182,7 +182,7 @@ describe('Loader hooks', { concurrency: !process.env.TEST_PARALLEL }, () => {
   it('should work without worker permission', async () => {
     const { code, signal, stdout, stderr } = await spawnPromisified(execPath, [
       '--no-warnings',
-      '--experimental-permission',
+      '--permission',
       '--allow-fs-read',
       '*',
       '--experimental-loader',
@@ -199,7 +199,7 @@ describe('Loader hooks', { concurrency: !process.env.TEST_PARALLEL }, () => {
   it('should allow loader hooks to spawn workers when allowed by the CLI flags', async () => {
     const { code, signal, stdout, stderr } = await spawnPromisified(execPath, [
       '--no-warnings',
-      '--experimental-permission',
+      '--permission',
       '--allow-worker',
       '--allow-fs-read',
       '*',
@@ -217,7 +217,7 @@ describe('Loader hooks', { concurrency: !process.env.TEST_PARALLEL }, () => {
   it('should not allow loader hooks to spawn workers if restricted by the CLI flags', async () => {
     const { code, signal, stdout, stderr } = await spawnPromisified(execPath, [
       '--no-warnings',
-      '--experimental-permission',
+      '--permission',
      '--allow-fs-read',
       '*',
       '--experimental-loader',
diff --git a/test/fixtures/dotenv/node-options.env b/test/fixtures/dotenv/node-options.env
index f74ac01bc28de7..bd3be820f64e2b 100644
--- a/test/fixtures/dotenv/node-options.env
+++ b/test/fixtures/dotenv/node-options.env
@@ -1,6 +1,6 @@
 CUSTOM_VARIABLE=hello-world
 NODE_NO_WARNINGS=1
-NODE_OPTIONS="--experimental-permission --allow-fs-read=*"
+NODE_OPTIONS="--permission --allow-fs-read=*"
 TZ=Pacific/Honolulu
 UV_THREADPOOL_SIZE=5
 BASIC=overridden
diff --git a/test/parallel/test-cli-bad-options.js b/test/parallel/test-cli-bad-options.js
index 8a77e94babb4fa..6868541325302d 100644
--- a/test/parallel/test-cli-bad-options.js
+++ b/test/parallel/test-cli-bad-options.js
@@ -14,8 +14,8 @@ if (process.features.inspector) {
 }
 requiresArgument('--eval');
 
-missingOption('--allow-fs-read=*', '--experimental-permission');
-missingOption('--allow-fs-write=*', '--experimental-permission');
+missingOption('--allow-fs-read=*', 
'--permission'); +missingOption('--allow-fs-write=*', '--permission'); function missingOption(option, requiredOption) { const r = spawnSync(process.execPath, [option], { encoding: 'utf8' }); diff --git a/test/parallel/test-cli-permission-deny-fs.js b/test/parallel/test-cli-permission-deny-fs.js index d38c4a61adbcfc..d5744cac94db3d 100644 --- a/test/parallel/test-cli-permission-deny-fs.js +++ b/test/parallel/test-cli-permission-deny-fs.js @@ -12,7 +12,7 @@ const path = require('path'); const { status, stdout } = spawnSync( process.execPath, [ - '--experimental-permission', '-e', + '--permission', '-e', `console.log(process.permission.has("fs")); console.log(process.permission.has("fs.read")); console.log(process.permission.has("fs.write"));`, @@ -31,7 +31,7 @@ const path = require('path'); const { status, stdout } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '--allow-fs-write', tmpPath, '-e', `console.log(process.permission.has("fs")); console.log(process.permission.has("fs.read")); @@ -51,7 +51,7 @@ const path = require('path'); const { status, stdout } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '--allow-fs-write', '*', '-e', `console.log(process.permission.has("fs")); console.log(process.permission.has("fs.read")); @@ -70,7 +70,7 @@ const path = require('path'); const { status, stdout } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '--allow-fs-read', '*', '-e', `console.log(process.permission.has("fs")); console.log(process.permission.has("fs.read")); @@ -89,7 +89,7 @@ const path = require('path'); const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '--allow-fs-write=*', '-p', 'fs.readFileSync(process.execPath)', ] @@ -104,7 +104,7 @@ const path = require('path'); const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '-p', 'fs.readFileSync(process.execPath)', ] @@ -119,7 +119,7 @@ const path = require('path'); const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '--allow-fs-read=*', '-p', 'fs.writeFileSync("policy-deny-example.md", "# test")', ] @@ -145,7 +145,7 @@ const path = require('path'); const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', `--allow-fs-read=${firstPath}`, file, ] diff --git a/test/parallel/test-cli-permission-multiple-allow.js b/test/parallel/test-cli-permission-multiple-allow.js index 57ce15535300d5..3ff1935e7de1f4 100644 --- a/test/parallel/test-cli-permission-multiple-allow.js +++ b/test/parallel/test-cli-permission-multiple-allow.js @@ -12,7 +12,7 @@ const path = require('path'); const { status, stdout } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '--allow-fs-write', tmpPath, '--allow-fs-write', otherPath, '-e', `console.log(process.permission.has("fs")); console.log(process.permission.has("fs.read")); @@ -36,7 +36,7 @@ const path = require('path'); const { status, stdout } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '--allow-fs-write', tmpPath, '--allow-fs-write', @@ -63,7 +63,7 @@ const path = require('path'); const { status, stdout, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '--allow-fs-read=*', `--allow-fs-write=${filePath}`, '-e', diff --git a/test/parallel/test-compile-cache-api-permission.js 
b/test/parallel/test-compile-cache-api-permission.js index 4163cadce1428f..1a0123161b1c36 100644 --- a/test/parallel/test-compile-cache-api-permission.js +++ b/test/parallel/test-compile-cache-api-permission.js @@ -26,7 +26,7 @@ const fs = require('fs'); spawnSyncAndAssert( process.execPath, [ - '--experimental-permission', `--allow-fs-read=${scriptDir}`, `--allow-fs-write=${scriptDir}`, + '--permission', `--allow-fs-read=${scriptDir}`, `--allow-fs-write=${scriptDir}`, '-r', wrapper, empty, ], { diff --git a/test/parallel/test-compile-cache-permission-allowed.js b/test/parallel/test-compile-cache-permission-allowed.js index 76dbfab720d8df..43ce4c274780db 100644 --- a/test/parallel/test-compile-cache-permission-allowed.js +++ b/test/parallel/test-compile-cache-permission-allowed.js @@ -23,7 +23,7 @@ function testAllowed(readDir, writeDir, envDir) { spawnSyncAndAssert( process.execPath, [ - '--experimental-permission', + '--permission', `--allow-fs-read=${dummyDir}`, `--allow-fs-read=${readDir}`, `--allow-fs-write=${writeDir}`, @@ -47,7 +47,7 @@ function testAllowed(readDir, writeDir, envDir) { spawnSyncAndAssert( process.execPath, [ - '--experimental-permission', + '--permission', `--allow-fs-read=${dummyDir}`, `--allow-fs-read=${readDir}`, `--allow-fs-write=${writeDir}`, diff --git a/test/parallel/test-compile-cache-permission-disallowed.js b/test/parallel/test-compile-cache-permission-disallowed.js index dbbb38fb99f240..9870de81c5d031 100644 --- a/test/parallel/test-compile-cache-permission-disallowed.js +++ b/test/parallel/test-compile-cache-permission-disallowed.js @@ -24,7 +24,7 @@ function testDisallowed(dummyDir, cacheDirInPermission, cacheDirInEnv) { spawnSyncAndAssert( process.execPath, [ - '--experimental-permission', + '--permission', `--allow-fs-read=${dummyDir}`, // No read or write permission for cache dir. 
`--allow-fs-write=${dummyDir}`, script, @@ -47,7 +47,7 @@ function testDisallowed(dummyDir, cacheDirInPermission, cacheDirInEnv) { spawnSyncAndAssert( process.execPath, [ - '--experimental-permission', + '--permission', `--allow-fs-read=${dummyDir}`, `--allow-fs-read=${cacheDirInPermission}`, // Read-only `--allow-fs-write=${dummyDir}`, @@ -71,7 +71,7 @@ function testDisallowed(dummyDir, cacheDirInPermission, cacheDirInEnv) { spawnSyncAndAssert( process.execPath, [ - '--experimental-permission', + '--permission', `--allow-fs-read=${dummyDir}`, `--allow-fs-write=${cacheDirInPermission}`, // Write-only script, diff --git a/test/parallel/test-permission-allow-addons-cli.js b/test/parallel/test-permission-allow-addons-cli.js index 2254d9920cbe71..484f16e0acb3b5 100644 --- a/test/parallel/test-permission-allow-addons-cli.js +++ b/test/parallel/test-permission-allow-addons-cli.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-addons --allow-fs-read=* +// Flags: --permission --allow-addons --allow-fs-read=* 'use strict'; const common = require('../common'); diff --git a/test/parallel/test-permission-allow-child-process-cli.js b/test/parallel/test-permission-allow-child-process-cli.js index d805c6fb973c3c..1569b2b5e87459 100644 --- a/test/parallel/test-permission-allow-child-process-cli.js +++ b/test/parallel/test-permission-allow-child-process-cli.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-child-process --allow-fs-read=* +// Flags: --permission --allow-child-process --allow-fs-read=* 'use strict'; const common = require('../common'); diff --git a/test/parallel/test-permission-allow-wasi-cli.js b/test/parallel/test-permission-allow-wasi-cli.js index f6f1cfe3c895fb..c6bea9fb39cf0a 100644 --- a/test/parallel/test-permission-allow-wasi-cli.js +++ b/test/parallel/test-permission-allow-wasi-cli.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-wasi --allow-fs-read=* +// Flags: --permission --allow-wasi --allow-fs-read=* 'use strict'; const common = require('../common'); diff --git a/test/parallel/test-permission-allow-worker-cli.js b/test/parallel/test-permission-allow-worker-cli.js index ae5a28fdae3597..3dcafea7a3fa35 100644 --- a/test/parallel/test-permission-allow-worker-cli.js +++ b/test/parallel/test-permission-allow-worker-cli.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-worker --allow-fs-read=* +// Flags: --permission --allow-worker --allow-fs-read=* 'use strict'; require('../common'); diff --git a/test/parallel/test-permission-child-process-cli.js b/test/parallel/test-permission-child-process-cli.js index 76586a1c538bed..dfea008a60407b 100644 --- a/test/parallel/test-permission-child-process-cli.js +++ b/test/parallel/test-permission-child-process-cli.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-fs-read=* +// Flags: --permission --allow-fs-read=* 'use strict'; const common = require('../common'); diff --git a/test/parallel/test-permission-experimental.js b/test/parallel/test-permission-experimental.js deleted file mode 100644 index bec66e5a731a95..00000000000000 --- a/test/parallel/test-permission-experimental.js +++ /dev/null @@ -1,13 +0,0 @@ -// Flags: --experimental-permission --allow-fs-read=* -'use strict'; - -const common = require('../common'); -common.skipIfWorker(); -const assert = require('assert'); - -// This test ensures that the experimental message is emitted -// when using permission system - -process.on('warning', common.mustCall((warning) => { - assert.match(warning.message, /Permission is an 
experimental feature/); -}, 1)); diff --git a/test/parallel/test-permission-fs-absolute-path.js b/test/parallel/test-permission-fs-absolute-path.js index b7897743941d2e..2c2257052c8b02 100644 --- a/test/parallel/test-permission-fs-absolute-path.js +++ b/test/parallel/test-permission-fs-absolute-path.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-fs-read=* --allow-child-process +// Flags: --permission --allow-fs-read=* --allow-child-process 'use strict'; const common = require('../common'); @@ -13,7 +13,7 @@ const { spawnSync } = require('child_process'); const { status, stdout } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '--allow-fs-read', '*', '--allow-fs-write', path.resolve('../fixtures/permission/deny/regular-file.md'), '-e', diff --git a/test/parallel/test-permission-fs-internal-module-stat.js b/test/parallel/test-permission-fs-internal-module-stat.js index f0b9d86f0809a8..fd0222cc34fa2e 100644 --- a/test/parallel/test-permission-fs-internal-module-stat.js +++ b/test/parallel/test-permission-fs-internal-module-stat.js @@ -1,4 +1,4 @@ -// Flags: --expose-internals --experimental-permission --allow-fs-read=test/common* --allow-fs-read=tools* --allow-fs-read=test/parallel* --allow-child-process +// Flags: --expose-internals --permission --allow-fs-read=test/common* --allow-fs-read=tools* --allow-fs-read=test/parallel* --allow-child-process 'use strict'; const common = require('../common'); diff --git a/test/parallel/test-permission-fs-read.js b/test/parallel/test-permission-fs-read.js index 5be993c9df6be5..ed8e866a6a4c10 100644 --- a/test/parallel/test-permission-fs-read.js +++ b/test/parallel/test-permission-fs-read.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-fs-read=* --allow-fs-write=* --allow-child-process +// Flags: --permission --allow-fs-read=* --allow-fs-write=* --allow-child-process 'use strict'; const common = require('../common'); @@ -28,7 +28,7 @@ const commonPath = path.join(__filename, '../../common'); const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', `--allow-fs-read=${file}`, `--allow-fs-read=${commonPathWildcard}`, file, + '--permission', `--allow-fs-read=${file}`, `--allow-fs-read=${commonPathWildcard}`, file, ], { env: { diff --git a/test/parallel/test-permission-fs-relative-path.js b/test/parallel/test-permission-fs-relative-path.js index 628e9918660088..3b115ee35d1227 100644 --- a/test/parallel/test-permission-fs-relative-path.js +++ b/test/parallel/test-permission-fs-relative-path.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-fs-read=* --allow-child-process +// Flags: --permission --allow-fs-read=* --allow-child-process 'use strict'; const common = require('../common'); @@ -12,7 +12,7 @@ const { spawnSync } = require('child_process'); const { status, stdout } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '--allow-fs-read', '*', '--allow-fs-write', '../fixtures/permission/deny/regular-file.md', '-e', diff --git a/test/parallel/test-permission-fs-require.js b/test/parallel/test-permission-fs-require.js index 6a2e9201dac7b4..5d3a407708371e 100644 --- a/test/parallel/test-permission-fs-require.js +++ b/test/parallel/test-permission-fs-require.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-fs-read=* --allow-child-process +// Flags: --permission --allow-fs-read=* --allow-child-process 'use strict'; const common = require('../common'); @@ -14,7 +14,7 @@ const { spawnSync } = 
require('node:child_process'); const { status, stdout, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '--allow-fs-read', mainModule, '--allow-fs-read', requiredModule, mainModule, @@ -31,7 +31,7 @@ const { spawnSync } = require('node:child_process'); const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '--allow-fs-read', mainModule, mainModule, ] @@ -48,7 +48,7 @@ const { spawnSync } = require('node:child_process'); const { status, stdout, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '--allow-fs-read', mainModule, '--allow-fs-read', requiredModule, mainModule, @@ -65,7 +65,7 @@ const { spawnSync } = require('node:child_process'); const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '--allow-fs-read', mainModule, mainModule, ] diff --git a/test/parallel/test-permission-fs-symlink-relative.js b/test/parallel/test-permission-fs-symlink-relative.js index 4cc7d920593c23..cf9b37ea79b059 100644 --- a/test/parallel/test-permission-fs-symlink-relative.js +++ b/test/parallel/test-permission-fs-symlink-relative.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-fs-read=* --allow-fs-write=* +// Flags: --permission --allow-fs-read=* --allow-fs-write=* 'use strict'; const common = require('../common'); diff --git a/test/parallel/test-permission-fs-symlink-target-write.js b/test/parallel/test-permission-fs-symlink-target-write.js index e2b4aa2a657442..f55b19fa764a89 100644 --- a/test/parallel/test-permission-fs-symlink-target-write.js +++ b/test/parallel/test-permission-fs-symlink-target-write.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-fs-read=* --allow-fs-write=* --allow-child-process +// Flags: --permission --allow-fs-read=* --allow-fs-write=* --allow-child-process 'use strict'; const common = require('../common'); @@ -35,7 +35,7 @@ fs.writeFileSync(path.join(readWriteFolder, 'file'), 'NO evil file contents'); const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', `--allow-fs-read=${file}`, `--allow-fs-read=${commonPathWildcard}`, `--allow-fs-read=${readOnlyFolder}`, `--allow-fs-read=${readWriteFolder}`, `--allow-fs-write=${readWriteFolder}`, `--allow-fs-write=${writeOnlyFolder}`, file, diff --git a/test/parallel/test-permission-fs-symlink.js b/test/parallel/test-permission-fs-symlink.js index c7d753c267c1e7..92965c960177d4 100644 --- a/test/parallel/test-permission-fs-symlink.js +++ b/test/parallel/test-permission-fs-symlink.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-fs-read=* --allow-fs-write=* --allow-child-process +// Flags: --permission --allow-fs-read=* --allow-fs-write=* --allow-child-process 'use strict'; const common = require('../common'); @@ -36,7 +36,7 @@ const symlinkFromBlockedFile = tmpdir.resolve('example-symlink.md'); const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', `--allow-fs-read=${file}`, `--allow-fs-read=${commonPathWildcard}`, `--allow-fs-read=${symlinkFromBlockedFile}`, `--allow-fs-write=${symlinkFromBlockedFile}`, file, diff --git a/test/parallel/test-permission-fs-traversal-path.js b/test/parallel/test-permission-fs-traversal-path.js index d618c3e4f79879..03571c2d01c861 100644 --- a/test/parallel/test-permission-fs-traversal-path.js +++ b/test/parallel/test-permission-fs-traversal-path.js @@ -1,4 +1,4 @@ -// Flags: 
--experimental-permission --allow-fs-read=* --allow-fs-write=* --allow-child-process +// Flags: --permission --allow-fs-read=* --allow-fs-write=* --allow-child-process 'use strict'; const common = require('../common'); @@ -30,7 +30,7 @@ const commonPathWildcard = path.join(__filename, '../../common*'); const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', `--allow-fs-read=${file}`, `--allow-fs-read=${commonPathWildcard}`, `--allow-fs-read=${allowedFolder}`, `--allow-fs-write=${allowedFolder}`, file, diff --git a/test/parallel/test-permission-fs-wildcard.js b/test/parallel/test-permission-fs-wildcard.js index 7aa8c34fd65cb1..adca56ed0dba6d 100644 --- a/test/parallel/test-permission-fs-wildcard.js +++ b/test/parallel/test-permission-fs-wildcard.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-fs-read=* --allow-child-process +// Flags: --permission --allow-fs-read=* --allow-child-process 'use strict'; const common = require('../common'); @@ -31,7 +31,7 @@ if (common.isWindows) { const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', ...allowList.flatMap((path) => ['--allow-fs-read', path]), '-e', ` @@ -66,7 +66,7 @@ if (common.isWindows) { const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', ...allowList.flatMap((path) => ['--allow-fs-read', path]), '-e', ` @@ -91,7 +91,7 @@ if (common.isWindows) { const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', `--allow-fs-read=${file}`, `--allow-fs-read=${commonPathWildcard}`, ...allowList.flatMap((path) => ['--allow-fs-read', path]), file, ], @@ -104,7 +104,7 @@ if (common.isWindows) { const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '--allow-fs-read=/a/b/*', '--allow-fs-read=/a/b/d', '--allow-fs-read=/etc/passwd.*', diff --git a/test/parallel/test-permission-fs-windows-path.js b/test/parallel/test-permission-fs-windows-path.js index 552f8e1c21694b..6869b347cf283f 100644 --- a/test/parallel/test-permission-fs-windows-path.js +++ b/test/parallel/test-permission-fs-windows-path.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-fs-read=* --allow-child-process +// Flags: --permission --allow-fs-read=* --allow-child-process 'use strict'; const common = require('../common'); @@ -13,7 +13,7 @@ if (!common.isWindows) { { const { stdout, status } = spawnSync(process.execPath, [ - '--experimental-permission', '--allow-fs-write', 'C:\\\\', '-e', + '--permission', '--allow-fs-write', 'C:\\\\', '-e', 'console.log(process.permission.has("fs.write", "C:\\\\"))', ]); assert.strictEqual(stdout.toString(), 'true\n'); @@ -22,7 +22,7 @@ if (!common.isWindows) { { const { stdout, status, stderr } = spawnSync(process.execPath, [ - '--experimental-permission', '--allow-fs-write="\\\\?\\C:\\"', '-e', + '--permission', '--allow-fs-write="\\\\?\\C:\\"', '-e', 'console.log(process.permission.has("fs.write", "C:\\\\"))', ]); assert.strictEqual(stdout.toString(), 'false\n', stderr.toString()); @@ -31,7 +31,7 @@ if (!common.isWindows) { { const { stdout, status, stderr } = spawnSync(process.execPath, [ - '--experimental-permission', '--allow-fs-write', 'C:\\', '-e', + '--permission', '--allow-fs-write', 'C:\\', '-e', `const path = require('path'); console.log(process.permission.has('fs.write', path.toNamespacedPath('C:\\\\')))`, ]); @@ -41,7 +41,7 @@ if (!common.isWindows) { { const { 
stdout, status, stderr } = spawnSync(process.execPath, [ - '--experimental-permission', '--allow-fs-write', 'C:\\*', '-e', + '--permission', '--allow-fs-write', 'C:\\*', '-e', "console.log(process.permission.has('fs.write', '\\\\\\\\A\\\\C:\\Users'))", ]); assert.strictEqual(stdout.toString(), 'false\n', stderr.toString()); diff --git a/test/parallel/test-permission-fs-write-report.js b/test/parallel/test-permission-fs-write-report.js index c8f6673de03d83..111f73b7bcc1ed 100644 --- a/test/parallel/test-permission-fs-write-report.js +++ b/test/parallel/test-permission-fs-write-report.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-fs-read=* +// Flags: --permission --allow-fs-read=* 'use strict'; const common = require('../common'); diff --git a/test/parallel/test-permission-fs-write-v8.js b/test/parallel/test-permission-fs-write-v8.js index bb33c307544a37..85cb9a5519b3af 100644 --- a/test/parallel/test-permission-fs-write-v8.js +++ b/test/parallel/test-permission-fs-write-v8.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-fs-read=* +// Flags: --permission --allow-fs-read=* 'use strict'; const common = require('../common'); diff --git a/test/parallel/test-permission-fs-write.js b/test/parallel/test-permission-fs-write.js index 626c00e5c007a2..34eab7a40005db 100644 --- a/test/parallel/test-permission-fs-write.js +++ b/test/parallel/test-permission-fs-write.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-fs-read=* --allow-child-process +// Flags: --permission --allow-fs-read=* --allow-child-process 'use strict'; const common = require('../common'); @@ -24,7 +24,7 @@ const file = fixtures.path('permission', 'fs-write.js'); const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '--allow-fs-read=*', `--allow-fs-write=${regularFile}`, `--allow-fs-write=${commonPath}`, file, diff --git a/test/parallel/test-permission-has.js b/test/parallel/test-permission-has.js index 3be45c5b2a410a..bf23af014c7a40 100644 --- a/test/parallel/test-permission-has.js +++ b/test/parallel/test-permission-has.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-fs-read=* +// Flags: --permission --allow-fs-read=* 'use strict'; const common = require('../common'); diff --git a/test/parallel/test-permission-inspector-brk.js b/test/parallel/test-permission-inspector-brk.js index e1bd8e9bbb0a34..61c9c799ba7eb6 100644 --- a/test/parallel/test-permission-inspector-brk.js +++ b/test/parallel/test-permission-inspector-brk.js @@ -14,7 +14,7 @@ common.skipIfInspectorDisabled(); const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '--allow-fs-read=*', '--inspect-brk', file, @@ -29,7 +29,7 @@ common.skipIfInspectorDisabled(); const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '--inspect-brk', '--eval', 'console.log("Hi!")', diff --git a/test/parallel/test-permission-inspector.js b/test/parallel/test-permission-inspector.js index d4afd8d93bc2f7..9d3bf485fc4348 100644 --- a/test/parallel/test-permission-inspector.js +++ b/test/parallel/test-permission-inspector.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-fs-read=* --allow-child-process +// Flags: --permission --allow-fs-read=* --allow-child-process 'use strict'; const common = require('../common'); @@ -26,7 +26,7 @@ if (!common.hasCrypto) const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', + '--permission', '-e', 
'(new (require("inspector")).Session()).connect()', ], diff --git a/test/parallel/test-permission-no-addons.js b/test/parallel/test-permission-no-addons.js index 4a1fc635a99bc7..a3ae6f4be10641 100644 --- a/test/parallel/test-permission-no-addons.js +++ b/test/parallel/test-permission-no-addons.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-fs-read=* +// Flags: --permission --allow-fs-read=* 'use strict'; const common = require('../common'); diff --git a/test/parallel/test-permission-processbinding.js b/test/parallel/test-permission-processbinding.js index 0dd6fd450152cd..47a1364f19e303 100644 --- a/test/parallel/test-permission-processbinding.js +++ b/test/parallel/test-permission-processbinding.js @@ -13,13 +13,13 @@ const fixtures = require('../common/fixtures'); const file = fixtures.path('permission', 'processbinding.js'); // Due to linting rules-utils.js:isBinding check, process.binding() should -// not be called when --experimental-permission is enabled. +// not be called when --permission is enabled. // Always spawn a child process { const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', '--allow-fs-read=*', file, + '--permission', '--allow-fs-read=*', file, ], ); assert.strictEqual(status, 0, stderr.toString()); diff --git a/test/parallel/test-permission-warning-flags.js b/test/parallel/test-permission-warning-flags.js index 87fcb7ff7f3158..9b20248eae18e9 100644 --- a/test/parallel/test-permission-warning-flags.js +++ b/test/parallel/test-permission-warning-flags.js @@ -15,7 +15,7 @@ for (const flag of warnFlags) { const { status, stderr } = spawnSync( process.execPath, [ - '--experimental-permission', flag, '-e', + '--permission', flag, '-e', 'setTimeout(() => {}, 1)', ] ); diff --git a/test/parallel/test-permission-wasi.js b/test/parallel/test-permission-wasi.js index 1a6cde013097b7..01291e685570f3 100644 --- a/test/parallel/test-permission-wasi.js +++ b/test/parallel/test-permission-wasi.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-fs-read=* +// Flags: --permission --allow-fs-read=* 'use strict'; const common = require('../common'); diff --git a/test/parallel/test-permission-worker-threads-cli.js b/test/parallel/test-permission-worker-threads-cli.js index e817a7877226c1..efd98b2a3881aa 100644 --- a/test/parallel/test-permission-worker-threads-cli.js +++ b/test/parallel/test-permission-worker-threads-cli.js @@ -1,4 +1,4 @@ -// Flags: --experimental-permission --allow-fs-read=* +// Flags: --permission --allow-fs-read=* 'use strict'; const common = require('../common'); diff --git a/test/parallel/test-process-load-env-file.js b/test/parallel/test-process-load-env-file.js index 795b8773d955cb..1dada3aa9b7016 100644 --- a/test/parallel/test-process-load-env-file.js +++ b/test/parallel/test-process-load-env-file.js @@ -78,7 +78,7 @@ describe('process.loadEnvFile()', () => { `.trim(); const child = await common.spawnPromisified( process.execPath, - [ '--eval', code, '--experimental-permission' ], + [ '--eval', code, '--permission' ], { cwd: __dirname }, ); assert.match(child.stderr, /Error: Access to this API has been restricted/); diff --git a/test/parallel/test-repl-permission-model.js b/test/parallel/test-repl-permission-model.js index 66f2a147652f8d..938f5121163a23 100644 --- a/test/parallel/test-repl-permission-model.js +++ b/test/parallel/test-repl-permission-model.js @@ -1,6 +1,6 @@ 'use strict'; -// Flags: --expose-internals --experimental-permission --allow-fs-read=* +// Flags: --expose-internals 
--permission --allow-fs-read=*
 
 const common = require('../common');
 const stream = require('stream');
diff --git a/tools/run-worker.js b/tools/run-worker.js
index 20f03f53e12184..f4ede8628e5fd4 100644
--- a/tools/run-worker.js
+++ b/tools/run-worker.js
@@ -7,7 +7,7 @@ if (typeof require === 'undefined') {
 const path = require('path');
 const { Worker } = require('worker_threads');
 
-// When --experimental-permission is enabled, the process
+// When --permission is enabled, the process
 // isn't able to spawn any worker unless --allow-worker is passed.
 // Therefore, we skip the permission tests for custom-suites-freestyle
 if (process.permission && !process.permission.has('worker')) {

From febd969c462e1a53b2bbd464cf54f951b0683da0 Mon Sep 17 00:00:00 2001
From: Vitaly Aminev
Date: Thu, 12 Dec 2024 06:05:11 -0800
Subject: [PATCH 46/88] http2: remove duplicate codeblock
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

PR-URL: https://github.com/nodejs/node/pull/55915
Reviewed-By: Yagiz Nizipli
Reviewed-By: Tim Perry
Reviewed-By: Luigi Pinca
Reviewed-By: Richard Lau
Reviewed-By: Juan José Arboleda
Reviewed-By: James M Snell
Reviewed-By: Jason Zhang
Reviewed-By: Matteo Collina
---
 src/node_http2.cc | 6 +-----
 1 file changed, 1 insertion(+), 5 deletions(-)

diff --git a/src/node_http2.cc b/src/node_http2.cc
index 888f70bd4df8a3..f9b5226aea50dc 100644
--- a/src/node_http2.cc
+++ b/src/node_http2.cc
@@ -1316,11 +1316,7 @@ int Http2Session::OnDataChunkReceived(nghttp2_session* handle,
   } else {
     memcpy(buf.base, data, avail);
   }
-  if (buf.base == nullptr) [[likely]] {
-    buf.base = reinterpret_cast<char*>(const_cast<uint8_t*>(data));
-  } else {
-    memcpy(buf.base, data, avail);
-  }
+
   data += avail;
   len -= avail;
   stream->EmitRead(avail, buf);

From 075b36b7b4ac4409adcf46ab8c592914415edfce Mon Sep 17 00:00:00 2001
From: Tim Perry <1526883+pimterry@users.noreply.github.com>
Date: Thu, 12 Dec 2024 16:43:10 +0000
Subject: [PATCH 47/88] http: add setDefaultHeaders option to http.request

This makes it possible to disable the various default headers directly
from the constructor. While this is possible for many use cases by
manually calling removeHeader on the request object instead, when
passing a raw header array to the request constructor the headers are
serialized and prepared to send immediately, so removeHeader cannot be
used afterwards.

With this change, it's now possible to fully control the sent request
headers by passing 'setDefaultHeaders: false' and a raw headers array
to http.request.

PR-URL: https://github.com/nodejs/node/pull/56112
Reviewed-By: Matteo Collina
Reviewed-By: Luigi Pinca
Reviewed-By: Yongsheng Zhang
Reviewed-By: Marco Ippolito
Reviewed-By: James M Snell
---
 doc/api/http.md                               |  7 +++-
 lib/_http_client.js                           |  8 ++++-
 ...ont-set-default-headers-with-set-header.js | 33 +++++++++++++++++++
 ...p-dont-set-default-headers-with-setHost.js | 23 +++++++++++++
 .../test-http-dont-set-default-headers.js     | 31 +++++++++++++++++
 5 files changed, 100 insertions(+), 2 deletions(-)
 create mode 100644 test/parallel/test-http-dont-set-default-headers-with-set-header.js
 create mode 100644 test/parallel/test-http-dont-set-default-headers-with-setHost.js
 create mode 100644 test/parallel/test-http-dont-set-default-headers.js

diff --git a/doc/api/http.md b/doc/api/http.md
index 9b9175a003f56c..886bbe26ecc95c 100644
--- a/doc/api/http.md
+++ b/doc/api/http.md
@@ -3848,8 +3848,13 @@ changes:
   * `port` {number} Port of remote server. **Default:** `defaultPort` if set,
     else `80`.
* `protocol` {string} Protocol to use. **Default:** `'http:'`. + * `setDefaultHeaders` {boolean}: Specifies whether or not to automatically add + default headers such as `Connection`, `Content-Length`, `Transfer-Encoding`, + and `Host`. If set to `false` then all necessary headers must be added + manually. Defaults to `true`. * `setHost` {boolean}: Specifies whether or not to automatically add the - `Host` header. Defaults to `true`. + `Host` header. If provided, this overrides `setDefaultHeaders`. Defaults to + `true`. * `signal` {AbortSignal}: An AbortSignal that may be used to abort an ongoing request. * `socketPath` {string} Unix domain socket. Cannot be used if one of `host` diff --git a/lib/_http_client.js b/lib/_http_client.js index 91ba264339fa4f..00b59f357fa45d 100644 --- a/lib/_http_client.js +++ b/lib/_http_client.js @@ -199,7 +199,13 @@ function ClientRequest(input, options, cb) { const host = optsWithoutSignal.host = validateHost(options.hostname, 'hostname') || validateHost(options.host, 'host') || 'localhost'; - const setHost = (options.setHost === undefined || Boolean(options.setHost)); + const setHost = options.setHost !== undefined ? + Boolean(options.setHost) : + options.setDefaultHeaders !== false; + + this._removedConnection = options.setDefaultHeaders === false; + this._removedContLen = options.setDefaultHeaders === false; + this._removedTE = options.setDefaultHeaders === false; this.socketPath = options.socketPath; diff --git a/test/parallel/test-http-dont-set-default-headers-with-set-header.js b/test/parallel/test-http-dont-set-default-headers-with-set-header.js new file mode 100644 index 00000000000000..bafdae5571e33c --- /dev/null +++ b/test/parallel/test-http-dont-set-default-headers-with-set-header.js @@ -0,0 +1,33 @@ +'use strict'; + +const common = require('../common'); +const assert = require('assert'); +const http = require('http'); + +const server = http.createServer(common.mustCall(function(req, res) { + assert.deepStrictEqual(req.rawHeaders, [ + 'test', 'value', + 'HOST', `127.0.0.1:${server.address().port}`, + 'foo', 'bar', + 'foo', 'baz', + 'connection', 'close', + ]); + + res.end('ok'); + server.close(); +})); +server.listen(0, common.localhostIPv4, function() { + const req = http.request({ + method: 'POST', + host: common.localhostIPv4, + port: this.address().port, + setDefaultHeaders: false, + }); + + req.setHeader('test', 'value'); + req.setHeader('HOST', `${common.localhostIPv4}:${server.address().port}`); + req.setHeader('foo', ['bar', 'baz']); + req.setHeader('connection', 'close'); + + req.end(); +}); diff --git a/test/parallel/test-http-dont-set-default-headers-with-setHost.js b/test/parallel/test-http-dont-set-default-headers-with-setHost.js new file mode 100644 index 00000000000000..e2a4e39c24b837 --- /dev/null +++ b/test/parallel/test-http-dont-set-default-headers-with-setHost.js @@ -0,0 +1,23 @@ +'use strict'; + +const common = require('../common'); +const assert = require('assert'); +const http = require('http'); + +const server = http.createServer(common.mustCall(function(req, res) { + assert.deepStrictEqual(req.rawHeaders, [ + 'Host', `${common.localhostIPv4}:${server.address().port}`, + ]); + + res.end('ok'); + server.close(); +})); +server.listen(0, common.localhostIPv4, function() { + http.request({ + method: 'POST', + host: common.localhostIPv4, + port: this.address().port, + setDefaultHeaders: false, + setHost: true + }).end(); +}); diff --git a/test/parallel/test-http-dont-set-default-headers.js 
b/test/parallel/test-http-dont-set-default-headers.js new file mode 100644 index 00000000000000..3f73c11e5112ee --- /dev/null +++ b/test/parallel/test-http-dont-set-default-headers.js @@ -0,0 +1,31 @@ +'use strict'; + +const common = require('../common'); +const assert = require('assert'); +const http = require('http'); + +const server = http.createServer(common.mustCall(function(req, res) { + assert.deepStrictEqual(req.rawHeaders, [ + 'host', `${common.localhostIPv4}:${server.address().port}`, + 'foo', 'bar', + 'test', 'value', + 'foo', 'baz', + ]); + + res.end('ok'); + server.close(); +})); +server.listen(0, common.localhostIPv4, function() { + http.request({ + method: 'POST', + host: common.localhostIPv4, + port: this.address().port, + setDefaultHeaders: false, + headers: [ + 'host', `${common.localhostIPv4}:${server.address().port}`, + 'foo', 'bar', + 'test', 'value', + 'foo', 'baz', + ] + }).end(); +}); From 4745798225f42a8251fee6f068cd268cae0c65c0 Mon Sep 17 00:00:00 2001 From: Colin Ihrig Date: Thu, 12 Dec 2024 11:48:12 -0500 Subject: [PATCH 48/88] sqlite: add support for custom functions This commit adds support to node:sqlite for defining custom functions that can be invoked from SQL. Fixes: https://github.com/nodejs/node/issues/54349 PR-URL: https://github.com/nodejs/node/pull/55985 Reviewed-By: Zeyu "Alex" Yang Reviewed-By: James M Snell --- doc/api/sqlite.md | 28 ++ src/node_sqlite.cc | 265 +++++++++++++ src/node_sqlite.h | 1 + test/parallel/test-sqlite-custom-functions.js | 373 ++++++++++++++++++ 4 files changed, 667 insertions(+) create mode 100644 test/parallel/test-sqlite-custom-functions.js diff --git a/doc/api/sqlite.md b/doc/api/sqlite.md index 29d9525d6ea4d8..94bbc5d5236077 100644 --- a/doc/api/sqlite.md +++ b/doc/api/sqlite.md @@ -160,6 +160,31 @@ This method allows one or more SQL statements to be executed without returning any results. This method is useful when executing SQL statements read from a file. This method is a wrapper around [`sqlite3_exec()`][]. +### `database.function(name[, options], function)` + + + +* `name` {string} The name of the SQLite function to create. +* `options` {Object} Optional configuration settings for the function. The + following properties are supported: + * `deterministic` {boolean} If `true`, the [`SQLITE_DETERMINISTIC`][] flag is + set on the created function. **Default:** `false`. + * `directOnly` {boolean} If `true`, the [`SQLITE_DIRECTONLY`][] flag is set on + the created function. **Default:** `false`. + * `useBigIntArguments` {boolean} If `true`, integer arguments to `function` + are converted to `BigInt`s. If `false`, integer arguments are passed as + JavaScript numbers. **Default:** `false`. + * `varargs` {boolean} If `true`, `function` can accept a variable number of + arguments. If `false`, `function` must be invoked with exactly + `function.length` arguments. **Default:** `false`. +* `function` {Function} The JavaScript function to call when the SQLite + function is invoked. + +This method is used to create SQLite user-defined functions. This method is a +wrapper around [`sqlite3_create_function_v2()`][]. 
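To make the new API concrete, here is a minimal usage sketch (not part of the patch), assuming a Node.js build that includes this change; the function name `add_integers` and the query are invented for illustration:

```mjs
import { DatabaseSync } from 'node:sqlite';

const db = new DatabaseSync(':memory:');

// Register a deterministic user-defined scalar function.
// Without `varargs: true`, SQL must call it with exactly two arguments.
db.function('add_integers', { deterministic: true }, (a, b) => a + b);

// The function can now be invoked from SQL.
const row = db.prepare('SELECT add_integers(2, 3) AS total').get();
console.log(row.total); // 5
```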
+ ### `database.open()` -```js +```mjs +import { performance, PerformanceObserver } from 'node:perf_hooks'; + +// Activate the observer +const obs = new PerformanceObserver((list) => { + const entries = list.getEntries(); + entries.forEach((entry) => { + console.log(`import('${entry[0]}')`, entry.duration); + }); + performance.clearMarks(); + performance.clearMeasures(); + obs.disconnect(); +}); +obs.observe({ entryTypes: ['function'], buffered: true }); + +const timedImport = performance.timerify(async (module) => { + return await import(module); +}); + +await timedImport('some-module'); +``` + +```cjs 'use strict'; const { performance, @@ -1906,7 +2167,28 @@ it means the time interval between starting the request and receiving the response, and for HTTP request, it means the time interval between receiving the request and sending the response: -```js +```mjs +import { PerformanceObserver } from 'node:perf_hooks'; +import { createServer, get } from 'node:http'; + +const obs = new PerformanceObserver((items) => { + items.getEntries().forEach((item) => { + console.log(item); + }); +}); + +obs.observe({ entryTypes: ['http'] }); + +const PORT = 8080; + +createServer((req, res) => { + res.end('ok'); +}).listen(PORT, () => { + get(`http://127.0.0.1:${PORT}`); +}); +``` + +```cjs 'use strict'; const { PerformanceObserver } = require('node:perf_hooks'); const http = require('node:http'); @@ -1930,7 +2212,25 @@ http.createServer((req, res) => { ### Measuring how long the `net.connect` (only for TCP) takes when the connection is successful -```js +```mjs +import { PerformanceObserver } from 'node:perf_hooks'; +import { connect, createServer } from 'node:net'; + +const obs = new PerformanceObserver((items) => { + items.getEntries().forEach((item) => { + console.log(item); + }); +}); +obs.observe({ entryTypes: ['net'] }); +const PORT = 8080; +createServer((socket) => { + socket.destroy(); +}).listen(PORT, () => { + connect(PORT); +}); +``` + +```cjs 'use strict'; const { PerformanceObserver } = require('node:perf_hooks'); const net = require('node:net'); @@ -1950,7 +2250,21 @@ net.createServer((socket) => { ### Measuring how long the DNS takes when the request is successful -```js +```mjs +import { PerformanceObserver } from 'node:perf_hooks'; +import { lookup, promises } from 'node:dns'; + +const obs = new PerformanceObserver((items) => { + items.getEntries().forEach((item) => { + console.log(item); + }); +}); +obs.observe({ entryTypes: ['dns'] }); +lookup('localhost', () => {}); +promises.resolve('localhost'); +``` + +```cjs 'use strict'; const { PerformanceObserver } = require('node:perf_hooks'); const dns = require('node:dns'); From 5d252b7a67f0b951b564e2a53479198838895f10 Mon Sep 17 00:00:00 2001 From: Michael Dawson Date: Thu, 12 Dec 2024 18:05:01 -0500 Subject: [PATCH 55/88] test: remove excludes for SEA tests on PPC The referenced issue is closed as having been fixed, so the tests should run ok. Unexclude them.
Signed-off-by: Michael Dawson PR-URL: https://github.com/nodejs/node/pull/56217 Reviewed-By: Yagiz Nizipli Reviewed-By: James M Snell Reviewed-By: Richard Lau Reviewed-By: Luigi Pinca --- test/sequential/sequential.status | 11 ----------- 1 file changed, 11 deletions(-) diff --git a/test/sequential/sequential.status b/test/sequential/sequential.status index 073b29cce8dbca..5f4445416d95fa 100644 --- a/test/sequential/sequential.status +++ b/test/sequential/sequential.status @@ -52,14 +52,3 @@ test-watch-mode-inspect: SKIP [$arch==s390x] # https://github.com/nodejs/node/issues/41286 test-performance-eventloopdelay: PASS, FLAKY - -[$system==linux && $arch==ppc64] -# https://github.com/nodejs/node/issues/50740 -test-single-executable-application-assets-raw: PASS, FLAKY -test-single-executable-application-assets: PASS, FLAKY -test-single-executable-application-disable-experimental-sea-warning: PASS, FLAKY -test-single-executable-application-empty: PASS, FLAKY -test-single-executable-application-snapshot-and-code-cache: PASS, FLAKY -test-single-executable-application-snapshot: PASS, FLAKY -test-single-executable-application-use-code-cache: PASS, FLAKY -test-single-executable-application: PASS, FLAKY From d25d16efebc367cccc9a8bd68b46e9ffa9f55a5c Mon Sep 17 00:00:00 2001 From: Luigi Pinca Date: Fri, 13 Dec 2024 10:10:08 +0100 Subject: [PATCH 56/88] Revert "tools: disable automated libuv updates" This reverts commit a492646ff1ec5ee14d31bb977de01abc3779ada5. Refs: https://github.com/nodejs/node/commit/d6175b35ad3ad5c8bf0a Refs: https://github.com/nodejs/node/commit/f97865fab436fba24b46 PR-URL: https://github.com/nodejs/node/pull/56223 Reviewed-By: Antoine du Hamel Reviewed-By: Marco Ippolito Reviewed-By: Richard Lau --- .github/workflows/tools.yml | 21 +++++++++------------ 1 file changed, 9 insertions(+), 12 deletions(-) diff --git a/.github/workflows/tools.yml b/.github/workflows/tools.yml index 04c46541546ece..ea8aa33868fdf9 100644 --- a/.github/workflows/tools.yml +++ b/.github/workflows/tools.yml @@ -27,7 +27,7 @@ on: - gyp-next - histogram - icu - # - libuv + - libuv - llhttp - minimatch - nbytes @@ -175,17 +175,14 @@ jobs: cat temp-output tail -n1 temp-output | grep "NEW_VERSION=" >> "$GITHUB_ENV" || true rm temp-output - # libuv update was disabled because of Feb 14, 2024 security release - # modified the bundled version of libuv, we cannot automatically update - # libuv without potentially undoing those changes. - # - id: libuv - # subsystem: deps - # label: dependencies - # run: | - # ./tools/dep_updaters/update-libuv.sh > temp-output - # cat temp-output - # tail -n1 temp-output | grep "NEW_VERSION=" >> "$GITHUB_ENV" || true - # rm temp-output + - id: libuv + subsystem: deps + label: dependencies + run: | + ./tools/dep_updaters/update-libuv.sh > temp-output + cat temp-output + tail -n1 temp-output | grep "NEW_VERSION=" >> "$GITHUB_ENV" || true + rm temp-output - id: llhttp subsystem: deps label: dependencies From 65bc8e847f3accf29d038d9e8f168cc44c6831dd Mon Sep 17 00:00:00 2001 From: Yuan-Ming Hsu <48866415+technic960183@users.noreply.github.com> Date: Fri, 13 Dec 2024 19:28:30 +0800 Subject: [PATCH 57/88] report: fix typos in report keys and bump the version Replace "kbytes" with "bytes" in `PrintSystemInformation()` in `src/node_report.cc`, as RLIMIT_DATA, RLIMIT_RSS, and RLIMIT_AS are given in bytes. The report version is bumped from 4 to 5. 
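As a quick illustration of the renamed keys (a sketch, assuming a Node.js build that includes this patch and a platform where `userLimits` is populated, such as Linux):

```js
// Inspect the diagnostic report programmatically.
const report = process.report.getReport();

console.log(report.header.reportVersion);            // 5 after this change
console.log(report.userLimits.data_seg_size_bytes);  // formerly data_seg_size_kbytes
console.log(report.userLimits.virtual_memory_bytes); // formerly virtual_memory_kbytes
```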
Refs: https://www.ibm.com/docs/en/aix/7.3?topic=k-kgetrlimit64-kernel-service PR-URL: https://github.com/nodejs/node/pull/56068 Reviewed-By: Chengzhong Wu Reviewed-By: Gireesh Punathil Reviewed-By: James M Snell --- doc/api/report.md | 43 +++++++++++++++++++++++++++++++++++++++---- src/node_report.cc | 8 ++++---- test/common/report.js | 8 ++++---- 3 files changed, 47 insertions(+), 12 deletions(-) diff --git a/doc/api/report.md b/doc/api/report.md index c74dc7e4c0880b..4895dde2b52077 100644 --- a/doc/api/report.md +++ b/doc/api/report.md @@ -35,7 +35,7 @@ is provided below for reference. ```json { "header": { - "reportVersion": 4, + "reportVersion": 5, "event": "exception", "trigger": "Exception", "filename": "report.20181221.005011.8974.0.001.json", @@ -392,7 +392,7 @@ is provided below for reference. "soft": "", "hard": "unlimited" }, - "data_seg_size_kbytes": { + "data_seg_size_bytes": { "soft": "unlimited", "hard": "unlimited" }, @@ -404,7 +404,7 @@ is provided below for reference. "soft": "unlimited", "hard": 65536 }, - "max_memory_size_kbytes": { + "max_memory_size_bytes": { "soft": "unlimited", "hard": "unlimited" }, @@ -424,7 +424,7 @@ is provided below for reference. "soft": "unlimited", "hard": 4127290 }, - "virtual_memory_kbytes": { + "virtual_memory_bytes": { "soft": "unlimited", "hard": "unlimited" } @@ -588,6 +588,41 @@ Report version definitions are consistent across LTS releases. ### Version history +#### Version 5 + + + +Replace the keys `data_seg_size_kbytes`, `max_memory_size_kbytes`, and `virtual_memory_kbytes` +with `data_seg_size_bytes`, `max_memory_size_bytes`, and `virtual_memory_bytes` +respectively in the `userLimits` section, as these values are given in bytes. + +```json +{ + "userLimits": { + // Skip some keys ... + "data_seg_size_bytes": { // replacing data_seg_size_kbytes + "soft": "unlimited", + "hard": "unlimited" + }, + // ... + "max_memory_size_bytes": { // replacing max_memory_size_kbytes + "soft": "unlimited", + "hard": "unlimited" + }, + // ... + "virtual_memory_bytes": { // replacing virtual_memory_kbytes + "soft": "unlimited", + "hard": "unlimited" + } + } +} +``` + #### Version 4 * {string\[]} @@ -28,8 +32,6 @@ added: A list of the names of all modules provided by Node.js. Can be used to verify if a module is maintained by a third party or not. -Note: the list doesn't contain [prefix-only modules][] like `node:test`. - `module` in this context isn't the same object that's provided by the [module wrapper][]. To access it, require the `Module` module: @@ -1723,7 +1725,6 @@ returned object contains the following keys: [load hook]: #loadurl-context-nextload [module compile cache]: #module-compile-cache [module wrapper]: modules.md#the-module-wrapper -[prefix-only modules]: modules.md#built-in-modules-with-mandatory-node-prefix [realm]: https://tc39.es/ecma262/#realm [resolve hook]: #resolvespecifier-context-nextresolve [source map include directives]: https://sourcemaps.info/spec.html#h.lmz475t4mvbx diff --git a/doc/api/modules.md b/doc/api/modules.md index f5f1cf7307f663..3900f6260a01a8 100644 --- a/doc/api/modules.md +++ b/doc/api/modules.md @@ -513,7 +513,7 @@ Some built-in modules are always preferentially loaded if their identifier is passed to `require()`. For instance, `require('http')` will always return the built-in HTTP module, even if there is a file by that name. The list of built-in modules that can be loaded without using the `node:` prefix is exposed -as [`module.builtinModules`][]. 
+in [`module.builtinModules`][], listed without the prefix. ### Built-in modules with mandatory `node:` prefix @@ -527,6 +527,8 @@ taken the name. Currently the built-in modules that requires the `node:` prefix * [`node:test`][] * [`node:test/reporters`][] +The list of these modules is exposed in [`module.builtinModules`][], including the prefix. + ## Cycles diff --git a/lib/internal/bootstrap/realm.js b/lib/internal/bootstrap/realm.js index c11f70dd6bf329..7e87f1ad1ab5b6 100644 --- a/lib/internal/bootstrap/realm.js +++ b/lib/internal/bootstrap/realm.js @@ -54,6 +54,7 @@ const { ArrayPrototypeIncludes, ArrayPrototypeMap, ArrayPrototypePush, + ArrayPrototypePushApply, ArrayPrototypeSlice, Error, ObjectDefineProperty, @@ -320,14 +321,16 @@ class BuiltinModule { ); } - static getCanBeRequiredByUsersWithoutSchemeList() { - return ArrayFrom(canBeRequiredByUsersWithoutSchemeList); - } - static getSchemeOnlyModuleNames() { return ArrayFrom(schemelessBlockList); } + static getAllBuiltinModuleIds() { + const allBuiltins = ArrayFrom(canBeRequiredByUsersWithoutSchemeList); + ArrayPrototypePushApply(allBuiltins, ArrayFrom(schemelessBlockList, (x) => `node:${x}`)); + return allBuiltins; + } + // Used by user-land module loaders to compile and load builtins. compileForPublicLoader() { if (!BuiltinModule.canBeRequiredByUsers(this.id)) { diff --git a/lib/internal/modules/cjs/loader.js b/lib/internal/modules/cjs/loader.js index dd3d205a2bd301..0779190e1c9070 100644 --- a/lib/internal/modules/cjs/loader.js +++ b/lib/internal/modules/cjs/loader.js @@ -434,8 +434,8 @@ Module.isBuiltin = BuiltinModule.isBuiltin; */ function initializeCJS() { // This need to be done at runtime in case --expose-internals is set. - const builtinModules = BuiltinModule.getCanBeRequiredByUsersWithoutSchemeList(); - Module.builtinModules = ObjectFreeze(builtinModules); + + Module.builtinModules = ObjectFreeze(BuiltinModule.getAllBuiltinModuleIds()); initializeCjsConditions(); diff --git a/lib/repl.js b/lib/repl.js index 37b34af2917643..904cd82dc78abe 100644 --- a/lib/repl.js +++ b/lib/repl.js @@ -130,7 +130,7 @@ const { shouldColorize } = require('internal/util/colors'); const CJSModule = require('internal/modules/cjs/loader').Module; let _builtinLibs = ArrayPrototypeFilter( CJSModule.builtinModules, - (e) => e[0] !== '_', + (e) => e[0] !== '_' && !StringPrototypeStartsWith(e, 'node:'), ); const nodeSchemeBuiltinLibs = ArrayPrototypeMap( _builtinLibs, (lib) => `node:${lib}`); diff --git a/test/parallel/test-internal-module-require.js b/test/parallel/test-internal-module-require.js index c6e2057d3da1ee..058273c7ea4304 100644 --- a/test/parallel/test-internal-module-require.js +++ b/test/parallel/test-internal-module-require.js @@ -87,6 +87,9 @@ if (process.argv[2] === 'child') { }); } else { require(id); + if (!id.startsWith('node:')) { + require(`node:${id}`); + } publicModules.add(id); } } diff --git a/test/parallel/test-process-get-builtin.mjs b/test/parallel/test-process-get-builtin.mjs index 4b840585f2ad33..3cf8179f7286bb 100644 --- a/test/parallel/test-process-get-builtin.mjs +++ b/test/parallel/test-process-get-builtin.mjs @@ -41,15 +41,19 @@ for (const id of publicBuiltins) { } // Check that import(id).default returns the same thing as process.getBuiltinModule(id). 
for (const id of publicBuiltins) { - const imported = await import(`node:${id}`); - assert.strictEqual(process.getBuiltinModule(id), imported.default); + if (!id.startsWith('node:')) { + const imported = await import(`node:${id}`); + assert.strictEqual(process.getBuiltinModule(id), imported.default); + } } // publicBuiltins does not include 'test' which requires the node: prefix. const ids = publicBuiltins.add('test'); // Check that import(id).default returns the same thing as process.getBuiltinModule(id). for (const id of ids) { - const prefixed = `node:${id}`; - const imported = await import(prefixed); - assert.strictEqual(process.getBuiltinModule(prefixed), imported.default); + if (!id.startsWith('node:')) { + const prefixed = `node:${id}`; + const imported = await import(prefixed); + assert.strictEqual(process.getBuiltinModule(prefixed), imported.default); + } } diff --git a/test/parallel/test-repl-tab-complete-import.js b/test/parallel/test-repl-tab-complete-import.js index e328d95db5986c..fe9f7a3d11795b 100644 --- a/test/parallel/test-repl-tab-complete-import.js +++ b/test/parallel/test-repl-tab-complete-import.js @@ -5,7 +5,7 @@ const ArrayStream = require('../common/arraystream'); const fixtures = require('../common/fixtures'); const assert = require('assert'); const { builtinModules } = require('module'); -const publicModules = builtinModules.filter((lib) => !lib.startsWith('_')); +const publicUnprefixedModules = builtinModules.filter((lib) => !lib.startsWith('_') && !lib.startsWith('node:')); if (!common.isMainThread) common.skip('process.chdir is not available in Workers'); @@ -31,7 +31,7 @@ testMe._domain.on('error', assert.ifError); // Tab complete provides built in libs for import() testMe.complete('import(\'', common.mustCall((error, data) => { assert.strictEqual(error, null); - publicModules.forEach((lib) => { + publicUnprefixedModules.forEach((lib) => { assert( data[0].includes(lib) && data[0].includes(`node:${lib}`), `${lib} not found`, @@ -55,7 +55,7 @@ testMe.complete("import\t( 'n", common.mustCall((error, data) => { // import(...) completions include `node:` URL modules: let lastIndex = -1; - publicModules.forEach((lib, index) => { + publicUnprefixedModules.forEach((lib, index) => { lastIndex = completions.indexOf(`node:${lib}`); assert.notStrictEqual(lastIndex, -1); }); diff --git a/test/parallel/test-repl-tab-complete.js b/test/parallel/test-repl-tab-complete.js index 57278f52ccf2c6..ff1e927078ddf5 100644 --- a/test/parallel/test-repl-tab-complete.js +++ b/test/parallel/test-repl-tab-complete.js @@ -275,7 +275,7 @@ testMe.complete('require(\'', common.mustCall(function(error, data) { assert.strictEqual(error, null); publicModules.forEach((lib) => { assert( - data[0].includes(lib) && data[0].includes(`node:${lib}`), + data[0].includes(lib) && (lib.startsWith('node:') || data[0].includes(`node:${lib}`)), `${lib} not found` ); }); @@ -295,7 +295,7 @@ testMe.complete("require\t( 'n", common.mustCall(function(error, data) { // require(...) 
completions include `node:`-prefixed modules: let lastIndex = -1; - publicModules.forEach((lib, index) => { + publicModules.filter((lib) => !lib.startsWith('node:')).forEach((lib, index) => { lastIndex = data[0].indexOf(`node:${lib}`); assert.notStrictEqual(lastIndex, -1); }); diff --git a/test/parallel/test-require-resolve.js b/test/parallel/test-require-resolve.js index a38a8e074ab85d..b69192635e6d79 100644 --- a/test/parallel/test-require-resolve.js +++ b/test/parallel/test-require-resolve.js @@ -61,10 +61,9 @@ require(fixtures.path('resolve-paths', 'default', 'verify-paths.js')); // builtinModules. builtinModules.forEach((mod) => { assert.strictEqual(require.resolve.paths(mod), null); - }); - - builtinModules.forEach((mod) => { - assert.strictEqual(require.resolve.paths(`node:${mod}`), null); + if (!mod.startsWith('node:')) { + assert.strictEqual(require.resolve.paths(`node:${mod}`), null); + } }); // node_modules. From cc7a7c391bce82d64f693c253c6197733e706e71 Mon Sep 17 00:00:00 2001 From: Selveter Senitro <107211156+selveter@users.noreply.github.com> Date: Sat, 14 Dec 2024 15:07:58 +0500 Subject: [PATCH 63/88] doc: fix 'which' to 'that' and add commas PR-URL: https://github.com/nodejs/node/pull/56216 Reviewed-By: James M Snell Reviewed-By: Chemi Atlow --- doc/contributing/technical-priorities.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/doc/contributing/technical-priorities.md b/doc/contributing/technical-priorities.md index 9e566f12ae6750..68ac6f8dd0d00a 100644 --- a/doc/contributing/technical-priorities.md +++ b/doc/contributing/technical-priorities.md @@ -21,11 +21,11 @@ on October 1st 2022. _Present in: 2021_ -Base HTTP support is a key component of modern cloud-native applications +Base HTTP support is a key component of modern cloud-native applications, and built-in support was part of what made Node.js a success in the first 10 years. The current implementation is hard to support and a common source of vulnerabilities. We must work towards an -implementation which is easier to support and makes it easier to integrate +implementation that is easier to support and makes it easier to integrate the new HTTP versions (HTTP3, QUIC) and to support efficient implementations of different versions concurrently. @@ -96,7 +96,7 @@ supported tools to implement those processes (logging, metrics and tracing). This includes support within the Node.js runtime itself (for example generating heap dumps, performance metrics, etc.) as well as support for applications on top of the runtime. In addition, it is also important to -clearly document the use cases, problem determination methods and best +clearly document the use cases, problem determination methods, and best practices for those tools. 
## Better multithreaded support From 1d8cc6179dbdde654abb4f08f0ee0f6e89065cf8 Mon Sep 17 00:00:00 2001 From: Rafael Gonzaga Date: Sat, 14 Dec 2024 12:06:38 -0300 Subject: [PATCH 64/88] test: use --permission over --experimental-permission PR-URL: https://github.com/nodejs/node/pull/56239 Reviewed-By: Marco Ippolito Reviewed-By: James M Snell Reviewed-By: Zeyu "Alex" Yang Reviewed-By: Yagiz Nizipli Reviewed-By: Luigi Pinca --- test/parallel/test-permission-sqlite-load-extension.js | 2 +- test/sqlite/test-sqlite-extensions.mjs | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/test/parallel/test-permission-sqlite-load-extension.js b/test/parallel/test-permission-sqlite-load-extension.js index ff13d05c84626e..28d750d0cd06b6 100644 --- a/test/parallel/test-permission-sqlite-load-extension.js +++ b/test/parallel/test-permission-sqlite-load-extension.js @@ -8,7 +8,7 @@ const db = new sqlite.DatabaseSync(':memory:', { allowExtension: true }); db.loadExtension('nonexistent');`.replace(/\n/g, ' '); childProcess.exec( - `${process.execPath} --experimental-permission -e "${code}"`, + `${process.execPath} --permission -e "${code}"`, {}, common.mustCall((err, _, stderr) => { assert.strictEqual(err.code, 1); diff --git a/test/sqlite/test-sqlite-extensions.mjs b/test/sqlite/test-sqlite-extensions.mjs index 128a4fa0b29d24..0e0acf2dc33d30 100644 --- a/test/sqlite/test-sqlite-extensions.mjs +++ b/test/sqlite/test-sqlite-extensions.mjs @@ -93,7 +93,7 @@ test('should throw error if permission is enabled', async () => { const db = new sqlite.DatabaseSync(':memory:', { allowExtension: true });`; return new Promise((resolve) => { childProcess.exec( - `${cmd} --experimental-permission -e "${code}"`, + `${cmd} --permission -e "${code}"`, { ...opts, }, From 13815417c72644f0567ae300af24f67dafe7714c Mon Sep 17 00:00:00 2001 From: Mert Can Altin Date: Sat, 14 Dec 2024 21:09:49 +0300 Subject: [PATCH 65/88] util: fix Latin1 decoding to return string output PR-URL: https://github.com/nodejs/node/pull/56222 Fixes: https://github.com/nodejs/node/issues/56219 Reviewed-By: Antoine du Hamel Reviewed-By: Ruben Bridgewater Reviewed-By: Daniel Lemire Reviewed-By: Yagiz Nizipli Reviewed-By: James M Snell --- src/encoding_binding.cc | 8 +++++--- test/cctest/test_encoding_binding.cc | 23 ++++++++++++++++++++++- test/parallel/test-util-text-decoder.js | 17 +++++++++++++++++ 3 files changed, 44 insertions(+), 4 deletions(-) create mode 100644 test/parallel/test-util-text-decoder.js diff --git a/src/encoding_binding.cc b/src/encoding_binding.cc index a132eeb62306c6..885a0d072312e9 100644 --- a/src/encoding_binding.cc +++ b/src/encoding_binding.cc @@ -286,9 +286,11 @@ void BindingData::DecodeLatin1(const FunctionCallbackInfo& args) { env->isolate(), "The encoded data was not valid for encoding latin1"); } - Local buffer_result = - node::Buffer::Copy(env, result.c_str(), written).ToLocalChecked(); - args.GetReturnValue().Set(buffer_result); + Local output = + String::NewFromUtf8( + env->isolate(), result.c_str(), v8::NewStringType::kNormal, written) + .ToLocalChecked(); + args.GetReturnValue().Set(output); } } // namespace encoding_binding diff --git a/test/cctest/test_encoding_binding.cc b/test/cctest/test_encoding_binding.cc index 06cc36d8f6ae34..d5d14c60fedf7e 100644 --- a/test/cctest/test_encoding_binding.cc +++ b/test/cctest/test_encoding_binding.cc @@ -26,7 +26,7 @@ bool RunDecodeLatin1(Environment* env, return false; } - *result = try_catch.Exception(); + *result = args[0]; return true; } @@ -151,5 +151,26 @@ 
TEST_F(EncodingBindingTest, DecodeLatin1_BOMPresent) { EXPECT_STREQ(*utf8_result, "Áéó"); } +TEST_F(EncodingBindingTest, DecodeLatin1_ReturnsString) { + Environment* env = CreateEnvironment(); + Isolate* isolate = env->isolate(); + HandleScope handle_scope(isolate); + + const uint8_t latin1_data[] = {0xC1, 0xE9, 0xF3}; + Local ab = ArrayBuffer::New(isolate, sizeof(latin1_data)); + memcpy(ab->GetBackingStore()->Data(), latin1_data, sizeof(latin1_data)); + + Local array = Uint8Array::New(ab, 0, sizeof(latin1_data)); + Local args[] = {array}; + + Local result; + ASSERT_TRUE(RunDecodeLatin1(env, args, false, false, &result)); + + ASSERT_TRUE(result->IsString()); + + String::Utf8Value utf8_result(isolate, result); + EXPECT_STREQ(*utf8_result, "Áéó"); +} + } // namespace encoding_binding } // namespace node diff --git a/test/parallel/test-util-text-decoder.js b/test/parallel/test-util-text-decoder.js new file mode 100644 index 00000000000000..0f6d0463f9da48 --- /dev/null +++ b/test/parallel/test-util-text-decoder.js @@ -0,0 +1,17 @@ +'use strict'; + +const common = require('../common'); + +const test = require('node:test'); +const assert = require('node:assert'); + +test('TextDecoder correctly decodes windows-1252 encoded data', { skip: !common.hasIntl }, () => { + const latin1Bytes = new Uint8Array([0xc1, 0xe9, 0xf3]); + + const expectedString = 'Áéó'; + + const decoder = new TextDecoder('windows-1252'); + const decodedString = decoder.decode(latin1Bytes); + + assert.strictEqual(decodedString, expectedString); +}); From 7a10ef88d93b7bd6f42670540c8514f1e4521dee Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Alfredo=20Gonz=C3=A1lez?= <12631491+mfdebian@users.noreply.github.com> Date: Sat, 14 Dec 2024 16:50:34 -0300 Subject: [PATCH 66/88] doc: add esm examples to node:readline PR-URL: https://github.com/nodejs/node/pull/55335 Reviewed-By: Luigi Pinca --- doc/api/readline.md | 147 ++++++++++++++++++++++++++++++++++++++------ 1 file changed, 127 insertions(+), 20 deletions(-) diff --git a/doc/api/readline.md b/doc/api/readline.md index bf0951fdd1b55c..cb1aa52605dc4a 100644 --- a/doc/api/readline.md +++ b/doc/api/readline.md @@ -703,9 +703,18 @@ added: v17.0.0 The `readlinePromises.createInterface()` method creates a new `readlinePromises.Interface` instance. -```js -const readlinePromises = require('node:readline/promises'); -const rl = readlinePromises.createInterface({ +```mjs +import { createInterface } from 'node:readline/promises'; +import { stdin, stdout } from 'node:process'; +const rl = createInterface({ + input: stdin, + output: stdout, +}); +``` + +```cjs +const { createInterface } = require('node:readline/promises'); +const rl = createInterface({ input: process.stdin, output: process.stdout, }); @@ -960,9 +969,18 @@ changes: The `readline.createInterface()` method creates a new `readline.Interface` instance. 
-```js -const readline = require('node:readline'); -const rl = readline.createInterface({ +```mjs +import { createInterface } from 'node:readline'; +import { stdin, stdout } from 'node:process'; +const rl = createInterface({ + input: stdin, + output: stdout, +}); +``` + +```cjs +const { createInterface } = require('node:readline'); +const rl = createInterface({ input: process.stdin, output: process.stdout, }); @@ -1098,9 +1116,36 @@ if (process.stdin.isTTY) The following example illustrates the use of `readline.Interface` class to implement a small command-line interface: -```js -const readline = require('node:readline'); -const rl = readline.createInterface({ +```mjs +import { createInterface } from 'node:readline'; +import { exit, stdin, stdout } from 'node:process'; +const rl = createInterface({ + input: stdin, + output: stdout, + prompt: 'OHAI> ', +}); + +rl.prompt(); + +rl.on('line', (line) => { + switch (line.trim()) { + case 'hello': + console.log('world!'); + break; + default: + console.log(`Say what? I might have heard '${line.trim()}'`); + break; + } + rl.prompt(); +}).on('close', () => { + console.log('Have a great day!'); + exit(0); +}); +``` + +```cjs +const { createInterface } = require('node:readline'); +const rl = createInterface({ input: process.stdin, output: process.stdout, prompt: 'OHAI> ', @@ -1130,14 +1175,37 @@ A common use case for `readline` is to consume an input file one line at a time. The easiest way to do so is leveraging the [`fs.ReadStream`][] API as well as a `for await...of` loop: -```js -const fs = require('node:fs'); -const readline = require('node:readline'); +```mjs +import { createReadStream } from 'node:fs'; +import { createInterface } from 'node:readline'; async function processLineByLine() { - const fileStream = fs.createReadStream('input.txt'); + const fileStream = createReadStream('input.txt'); - const rl = readline.createInterface({ + const rl = createInterface({ + input: fileStream, + crlfDelay: Infinity, + }); + // Note: we use the crlfDelay option to recognize all instances of CR LF + // ('\r\n') in input.txt as a single line break. + + for await (const line of rl) { + // Each line in input.txt will be successively available here as `line`. + console.log(`Line from file: ${line}`); + } +} + +processLineByLine(); +``` + +```cjs +const { createReadStream } = require('node:fs'); +const { createInterface } = require('node:readline'); + +async function processLineByLine() { + const fileStream = createReadStream('input.txt'); + + const rl = createInterface({ input: fileStream, crlfDelay: Infinity, }); @@ -1155,12 +1223,26 @@ processLineByLine(); Alternatively, one could use the [`'line'`][] event: -```js -const fs = require('node:fs'); -const readline = require('node:readline'); +```mjs +import { createReadStream } from 'node:fs'; +import { createInterface } from 'node:readline'; -const rl = readline.createInterface({ - input: fs.createReadStream('sample.txt'), +const rl = createInterface({ + input: createReadStream('sample.txt'), + crlfDelay: Infinity, +}); + +rl.on('line', (line) => { + console.log(`Line from file: ${line}`); +}); +``` + +```cjs +const { createReadStream } = require('node:fs'); +const { createInterface } = require('node:readline'); + +const rl = createInterface({ + input: createReadStream('sample.txt'), crlfDelay: Infinity, }); @@ -1172,7 +1254,32 @@ rl.on('line', (line) => { Currently, `for await...of` loop can be a bit slower. 
If `async` / `await` flow and speed are both essential, a mixed approach can be applied: -```js +```mjs +import { once } from 'node:events'; +import { createReadStream } from 'node:fs'; +import { createInterface } from 'node:readline'; + +(async function processLineByLine() { + try { + const rl = createInterface({ + input: createReadStream('big-file.txt'), + crlfDelay: Infinity, + }); + + rl.on('line', (line) => { + // Process the line. + }); + + await once(rl, 'close'); + + console.log('File processed.'); + } catch (err) { + console.error(err); + } +})(); +``` + +```cjs const { once } = require('node:events'); const { createReadStream } = require('node:fs'); const { createInterface } = require('node:readline'); From f94f21080b85cef86e0822e19b191f459fbe7679 Mon Sep 17 00:00:00 2001 From: =?UTF-8?q?Alfredo=20Gonz=C3=A1lez?= <12631491+mfdebian@users.noreply.github.com> Date: Sat, 14 Dec 2024 16:50:52 -0300 Subject: [PATCH 67/88] doc: add esm examples to node:repl PR-URL: https://github.com/nodejs/node/pull/55432 Reviewed-By: Tierney Cyren Reviewed-By: Luigi Pinca --- doc/api/repl.md | 166 +++++++++++++++++++++++++++++++++++++++++++----- 1 file changed, 150 insertions(+), 16 deletions(-) diff --git a/doc/api/repl.md b/doc/api/repl.md index 8d00cdeed3916a..a134c493a54812 100644 --- a/doc/api/repl.md +++ b/doc/api/repl.md @@ -10,7 +10,11 @@ The `node:repl` module provides a Read-Eval-Print-Loop (REPL) implementation that is available both as a standalone program or includible in other applications. It can be accessed using: -```js +```mjs +import repl from 'node:repl'; +``` + +```cjs const repl = require('node:repl'); ``` @@ -106,7 +110,14 @@ The default evaluator provides access to any variables that exist in the global scope. It is possible to expose a variable to the REPL explicitly by assigning it to the `context` object associated with each `REPLServer`: -```js +```mjs +import repl from 'node:repl'; +const msg = 'message'; + +repl.start('> ').context.m = msg; +``` + +```cjs const repl = require('node:repl'); const msg = 'message'; @@ -124,7 +135,19 @@ $ node repl_test.js Context properties are not read-only by default. To specify read-only globals, context properties must be defined using `Object.defineProperty()`: -```js +```mjs +import repl from 'node:repl'; +const msg = 'message'; + +const r = repl.start('> '); +Object.defineProperty(r.context, 'm', { + configurable: false, + enumerable: true, + value: msg, +}); +``` + +```cjs const repl = require('node:repl'); const msg = 'message'; @@ -280,20 +303,34 @@ When a new [`repl.REPLServer`][] is created, a custom evaluation function may be provided. This can be used, for instance, to implement fully customized REPL applications. 
-The following illustrates a hypothetical example of a REPL that performs -translation of text from one language to another: +The following illustrates an example of a REPL that squares a given number: -```js +```mjs +import repl from 'node:repl'; + +function byThePowerOfTwo(number) { + return number * number; +} + +function myEval(cmd, context, filename, callback) { + callback(null, byThePowerOfTwo(cmd)); +} + +repl.start({ prompt: 'Enter a number: ', eval: myEval }); +``` + +```cjs const repl = require('node:repl'); -const { Translator } = require('translator'); -const myTranslator = new Translator('en', 'fr'); +function byThePowerOfTwo(number) { + return number * number; +} function myEval(cmd, context, filename, callback) { - callback(null, myTranslator.translate(cmd)); + callback(null, byThePowerOfTwo(cmd)); } -repl.start({ prompt: '> ', eval: myEval }); +repl.start({ prompt: 'Enter a number: ', eval: myEval }); ``` #### Recoverable errors @@ -354,7 +391,21 @@ To fully customize the output of a [`repl.REPLServer`][] instance pass in a new function for the `writer` option on construction. The following example, for instance, simply converts any input text to upper case: -```js +```mjs +import repl from 'node:repl'; + +const r = repl.start({ prompt: '> ', eval: myEval, writer: myWriter }); + +function myEval(cmd, context, filename, callback) { + callback(null, cmd); +} + +function myWriter(output) { + return output.toUpperCase(); +} +``` + +```cjs const repl = require('node:repl'); const r = repl.start({ prompt: '> ', eval: myEval, writer: myWriter }); @@ -380,7 +431,16 @@ added: v0.1.91 Instances of `repl.REPLServer` are created using the [`repl.start()`][] method or directly using the JavaScript `new` keyword. -```js +```mjs +import repl from 'node:repl'; + +const options = { useColors: true }; + +const firstInstance = repl.start(options); +const secondInstance = new repl.REPLServer(options); +``` + +```cjs const repl = require('node:repl'); const options = { useColors: true }; @@ -424,7 +484,20 @@ reference to the `context` object as the only argument. This can be used primarily to re-initialize REPL context to some pre-defined state: -```js +```mjs +import repl from 'node:repl'; + +function initializeContext(context) { + context.m = 'test'; +} + +const r = repl.start({ prompt: '> ' }); +initializeContext(r.context); + +r.on('reset', initializeContext); +``` + +```cjs const repl = require('node:repl'); function initializeContext(context) { @@ -475,7 +548,25 @@ properties: The following example shows two new commands added to the REPL instance: -```js +```mjs +import repl from 'node:repl'; + +const replServer = repl.start({ prompt: '> ' }); +replServer.defineCommand('sayhello', { + help: 'Say hello', + action(name) { + this.clearBufferedCommand(); + console.log(`Hello, ${name}!`); + this.displayPrompt(); + }, +}); +replServer.defineCommand('saybye', function saybye() { + console.log('Goodbye!'); + this.close(); +}); +``` + +```cjs const repl = require('node:repl'); const replServer = repl.start({ prompt: '> ' }); @@ -637,7 +728,14 @@ The `repl.start()` method creates and starts a [`repl.REPLServer`][] instance. If `options` is a string, then it specifies the input prompt: -```js +```mjs +import repl from 'node:repl'; + +// a Unix style prompt +repl.start('$ '); +``` + +```cjs const repl = require('node:repl'); // a Unix style prompt @@ -709,7 +807,43 @@ separate I/O interfaces. 
The following example, for instance, provides separate REPLs on `stdin`, a Unix socket, and a TCP socket: -```js +```mjs +import net from 'node:net'; +import repl from 'node:repl'; +import process from 'node:process'; + +let connections = 0; + +repl.start({ + prompt: 'Node.js via stdin> ', + input: process.stdin, + output: process.stdout, +}); + +net.createServer((socket) => { + connections += 1; + repl.start({ + prompt: 'Node.js via Unix socket> ', + input: socket, + output: socket, + }).on('exit', () => { + socket.end(); + }); +}).listen('/tmp/node-repl-sock'); + +net.createServer((socket) => { + connections += 1; + repl.start({ + prompt: 'Node.js via TCP socket> ', + input: socket, + output: socket, + }).on('exit', () => { + socket.end(); + }); +}).listen(5001); +``` + +```cjs const net = require('node:net'); const repl = require('node:repl'); let connections = 0; From 59cae914657c66310fb1971a3e0a6b0279e54f13 Mon Sep 17 00:00:00 2001 From: theanarkh Date: Sun, 15 Dec 2024 22:19:27 +0800 Subject: [PATCH 68/88] dgram: support blocklist in udp PR-URL: https://github.com/nodejs/node/pull/56087 Reviewed-By: Luigi Pinca --- doc/api/dgram.md | 7 ++++ lib/dgram.js | 37 +++++++++++++++++++- test/parallel/test-dgram-blocklist.js | 49 +++++++++++++++++++++++++++ 3 files changed, 92 insertions(+), 1 deletion(-) create mode 100644 test/parallel/test-dgram-blocklist.js diff --git a/doc/api/dgram.md b/doc/api/dgram.md index 2243b6abdea9bc..4d2ef8dea164f9 100644 --- a/doc/api/dgram.md +++ b/doc/api/dgram.md @@ -957,6 +957,13 @@ changes: * `sendBufferSize` {number} Sets the `SO_SNDBUF` socket value. * `lookup` {Function} Custom lookup function. **Default:** [`dns.lookup()`][]. * `signal` {AbortSignal} An AbortSignal that may be used to close a socket. + * `receiveBlockList` {net.BlockList} `receiveBlockList` can be used for discarding + inbound datagram to specific IP addresses, IP ranges, or IP subnets. This does not + work if the server is behind a reverse proxy, NAT, etc. because the address + checked against the blocklist is the address of the proxy, or the one + specified by the NAT. + * `sendBlockList` {net.BlockList} `sendBlockList` can be used for disabling outbound + access to specific IP addresses, IP ranges, or IP subnets. * `callback` {Function} Attached as a listener for `'message'` events. Optional. 
* Returns: {dgram.Socket} diff --git a/lib/dgram.js b/lib/dgram.js index 09630b6c901181..b4c5db6439784a 100644 --- a/lib/dgram.js +++ b/lib/dgram.js @@ -41,6 +41,7 @@ const { ERR_BUFFER_OUT_OF_BOUNDS, ERR_INVALID_ARG_TYPE, ERR_INVALID_FD_TYPE, + ERR_IP_BLOCKED, ERR_MISSING_ARGS, ERR_SOCKET_ALREADY_BOUND, ERR_SOCKET_BAD_BUFFER_SIZE, @@ -55,6 +56,7 @@ const { _createSocketHandle, newHandle, } = require('internal/dgram'); +const { isIP } = require('internal/net'); const { isInt32, validateAbortSignal, @@ -99,12 +101,18 @@ let _cluster = null; function lazyLoadCluster() { return _cluster ??= require('cluster'); } +let _blockList = null; +function lazyLoadBlockList() { + return _blockList ??= require('internal/blocklist').BlockList; +} function Socket(type, listener) { FunctionPrototypeCall(EventEmitter, this); let lookup; let recvBufferSize; let sendBufferSize; + let receiveBlockList; + let sendBlockList; let options; if (type !== null && typeof type === 'object') { @@ -119,6 +127,18 @@ function Socket(type, listener) { } recvBufferSize = options.recvBufferSize; sendBufferSize = options.sendBufferSize; + if (options.receiveBlockList) { + if (!lazyLoadBlockList().isBlockList(options.receiveBlockList)) { + throw new ERR_INVALID_ARG_TYPE('options.receiveBlockList', 'net.BlockList', options.receiveBlockList); + } + receiveBlockList = options.receiveBlockList; + } + if (options.sendBlockList) { + if (!lazyLoadBlockList().isBlockList(options.sendBlockList)) { + throw new ERR_INVALID_ARG_TYPE('options.sendBlockList', 'net.BlockList', options.sendBlockList); + } + sendBlockList = options.sendBlockList; + } } const handle = newHandle(type, lookup); @@ -141,6 +161,8 @@ function Socket(type, listener) { ipv6Only: options?.ipv6Only, recvBufferSize, sendBufferSize, + receiveBlockList, + sendBlockList, }; if (options?.signal !== undefined) { @@ -439,7 +461,9 @@ function doConnect(ex, self, ip, address, port, callback) { const state = self[kStateSymbol]; if (!state.handle) return; - + if (!ex && state.sendBlockList?.check(ip, `ipv${isIP(ip)}`)) { + ex = new ERR_IP_BLOCKED(ip); + } if (!ex) { const err = state.handle.connect(ip, port); if (err) { @@ -703,6 +727,13 @@ function doSend(ex, self, ip, list, address, port, callback) { return; } + if (ip && state.sendBlockList?.check(ip, `ipv${isIP(ip)}`)) { + if (callback) { + process.nextTick(callback, new ERR_IP_BLOCKED(ip)); + } + return; + } + const req = new SendWrap(); req.list = list; // Keep reference alive. 
req.address = address; @@ -951,6 +982,10 @@ function onMessage(nread, handle, buf, rinfo) { if (nread < 0) { return self.emit('error', new ErrnoException(nread, 'recvmsg')); } + if (self[kStateSymbol]?.receiveBlockList?.check(rinfo.address, + rinfo.family?.toLocaleLowerCase())) { + return; + } rinfo.size = buf.length; // compatibility self.emit('message', buf, rinfo); } diff --git a/test/parallel/test-dgram-blocklist.js b/test/parallel/test-dgram-blocklist.js new file mode 100644 index 00000000000000..8af6522e7bd2d2 --- /dev/null +++ b/test/parallel/test-dgram-blocklist.js @@ -0,0 +1,49 @@ +'use strict'; +const common = require('../common'); +const assert = require('assert'); +const dgram = require('dgram'); +const net = require('net'); + +{ + const blockList = new net.BlockList(); + blockList.addAddress(common.localhostIPv4); + + const connectSocket = dgram.createSocket({ type: 'udp4', sendBlockList: blockList }); + connectSocket.connect(9999, common.localhostIPv4, common.mustCall((err) => { + assert.ok(err.code === 'ERR_IP_BLOCKED', err); + connectSocket.close(); + })); +} + +{ + const blockList = new net.BlockList(); + blockList.addAddress(common.localhostIPv4); + const sendSocket = dgram.createSocket({ type: 'udp4', sendBlockList: blockList }); + sendSocket.send('hello', 9999, common.localhostIPv4, common.mustCall((err) => { + assert.ok(err.code === 'ERR_IP_BLOCKED', err); + sendSocket.close(); + })); +} + +{ + const blockList = new net.BlockList(); + blockList.addAddress(common.localhostIPv4); + const receiveSocket = dgram.createSocket({ type: 'udp4', receiveBlockList: blockList }); + // Hack to close the socket + const check = blockList.check; + blockList.check = function() { + process.nextTick(() => { + receiveSocket.close(); + }); + return check.apply(this, arguments); + }; + receiveSocket.on('message', common.mustNotCall()); + receiveSocket.bind(0, common.localhostIPv4, common.mustCall(() => { + const addressInfo = receiveSocket.address(); + const client = dgram.createSocket('udp4'); + client.send('hello', addressInfo.port, addressInfo.address, common.mustCall((err) => { + assert.ok(!err); + client.close(); + })); + })); +} From f264dd6d203480ebf1fcf3515b6b3f9fb332059a Mon Sep 17 00:00:00 2001 From: Duncan Date: Sun, 15 Dec 2024 16:56:39 -0500 Subject: [PATCH 69/88] buffer: document concat zero-fill MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/55562 Reviewed-By: Luigi Pinca Reviewed-By: Ruben Bridgewater Reviewed-By: Ulises Gascón Reviewed-By: James M Snell Reviewed-By: Jason Zhang --- doc/api/buffer.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/doc/api/buffer.md b/doc/api/buffer.md index d72e8720c688fa..c07443601c8b67 100644 --- a/doc/api/buffer.md +++ b/doc/api/buffer.md @@ -1042,7 +1042,8 @@ in `list` by adding their lengths. If `totalLength` is provided, it is coerced to an unsigned integer. If the combined length of the `Buffer`s in `list` exceeds `totalLength`, the result is -truncated to `totalLength`. +truncated to `totalLength`. If the combined length of the `Buffer`s in `list` is +less than `totalLength`, the remaining space is filled with zeros. 
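A short sketch of the zero-fill behavior described above (illustrative values):

```mjs
import { Buffer } from 'node:buffer';

const bufs = [Buffer.from([0x01, 0x02]), Buffer.from([0x03])];

// Combined length is 3; with `totalLength` 6, the remaining
// three bytes of the result are zero-filled.
console.log(Buffer.concat(bufs, 6));
// Prints: <Buffer 01 02 03 00 00 00>
```

(The zero-fill itself predates this patch; the commit only documents it.)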
```mjs import { Buffer } from 'node:buffer'; From 0d08756d0c2d178cc15ac04462588298884283ee Mon Sep 17 00:00:00 2001 From: Kunal Kumar Date: Mon, 16 Dec 2024 03:26:47 +0530 Subject: [PATCH 70/88] doc: clarify util.aborted resource usage PR-URL: https://github.com/nodejs/node/pull/55780 Fixes: https://github.com/nodejs/node/issues/55340 Reviewed-By: Benjamin Gruenbaum Reviewed-By: Luigi Pinca Reviewed-By: James M Snell Reviewed-By: Jason Zhang Reviewed-By: Minwoo Jung --- doc/api/util.md | 34 +++++++++++++++++++++++++--------- 1 file changed, 25 insertions(+), 9 deletions(-) diff --git a/doc/api/util.md b/doc/api/util.md index eb06e93c503811..03c1b621358fea 100644 --- a/doc/api/util.md +++ b/doc/api/util.md @@ -2244,39 +2244,55 @@ added: > Stability: 1 - Experimental * `signal` {AbortSignal} -* `resource` {Object} Any non-null entity, reference to which is held weakly. +* `resource` {Object} Any non-null object tied to the abortable operation and held weakly. + If `resource` is garbage collected before the `signal` aborts, the promise remains pending, + allowing Node.js to stop tracking it. + This helps prevent memory leaks in long-running or non-cancelable operations. * Returns: {Promise} -Listens to abort event on the provided `signal` and -returns a promise that is fulfilled when the `signal` is -aborted. If the passed `resource` is garbage collected before the `signal` is -aborted, the returned promise shall remain pending indefinitely. +Listens to abort event on the provided `signal` and returns a promise that resolves when the `signal` is aborted. +If `resource` is provided, it weakly references the operation's associated object, +so if `resource` is garbage collected before the `signal` aborts, +then returned promise shall remain pending. +This prevents memory leaks in long-running or non-cancelable operations. ```cjs const { aborted } = require('node:util'); +// Obtain an object with an abortable signal, like a custom resource or operation. const dependent = obtainSomethingAbortable(); +// Pass `dependent` as the resource, indicating the promise should only resolve +// if `dependent` is still in memory when the signal is aborted. aborted(dependent.signal, dependent).then(() => { - // Do something when dependent is aborted. + + // This code runs when `dependent` is aborted. + console.log('Dependent resource was aborted.'); }); +// Simulate an event that triggers the abort. dependent.on('event', () => { - dependent.abort(); + dependent.abort(); // This will cause the `aborted` promise to resolve. }); ``` ```mjs import { aborted } from 'node:util'; +// Obtain an object with an abortable signal, like a custom resource or operation. const dependent = obtainSomethingAbortable(); +// Pass `dependent` as the resource, indicating the promise should only resolve +// if `dependent` is still in memory when the signal is aborted. aborted(dependent.signal, dependent).then(() => { - // Do something when dependent is aborted. + + // This code runs when `dependent` is aborted. + console.log('Dependent resource was aborted.'); }); +// Simulate an event that triggers the abort. dependent.on('event', () => { - dependent.abort(); + dependent.abort(); // This will cause the `aborted` promise to resolve. 
}); ``` From 25bb462bc2656d1c3e1ec36602206eaacc2691b7 Mon Sep 17 00:00:00 2001 From: Stefan Stojanovic Date: Mon, 16 Dec 2024 11:11:22 +0100 Subject: [PATCH 71/88] deps: define V8_PRESERVE_MOST as no-op on Windows MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit It's causing linker errors with node.lib in node-gyp and potentially breaks other 3rd party tools PR-URL: https://github.com/nodejs/node/pull/56238 Refs: https://github.com/nodejs/node/pull/55784 Reviewed-By: Michaël Zasso Reviewed-By: James M Snell Reviewed-By: Yagiz Nizipli Reviewed-By: Luigi Pinca --- common.gypi | 2 +- deps/v8/include/v8config.h | 4 ++++ 2 files changed, 5 insertions(+), 1 deletion(-) diff --git a/common.gypi b/common.gypi index 23196aae451f6a..a6a79adcc2fb4f 100644 --- a/common.gypi +++ b/common.gypi @@ -36,7 +36,7 @@ # Reset this number to 0 on major V8 upgrades. # Increment by one for each non-official patch applied to deps/v8. - 'v8_embedder_string': '-node.11', + 'v8_embedder_string': '-node.12', ##### V8 defaults for Node.js ##### diff --git a/deps/v8/include/v8config.h b/deps/v8/include/v8config.h index b6d087b958edc1..73a6a91d49bf0e 100644 --- a/deps/v8/include/v8config.h +++ b/deps/v8/include/v8config.h @@ -581,11 +581,15 @@ path. Add it with -I to the command line // functions. // Use like: // V8_NOINLINE V8_PRESERVE_MOST void UnlikelyMethod(); +#if V8_OS_WIN +# define V8_PRESERVE_MOST +#else #if V8_HAS_ATTRIBUTE_PRESERVE_MOST # define V8_PRESERVE_MOST __attribute__((preserve_most)) #else # define V8_PRESERVE_MOST /* NOT SUPPORTED */ #endif +#endif // A macro (V8_DEPRECATED) to mark classes or functions as deprecated. From 3e17a8e78e1455abadcedb5d6feafc76253a1962 Mon Sep 17 00:00:00 2001 From: Antoine du Hamel Date: Mon, 16 Dec 2024 23:33:08 +0100 Subject: [PATCH 72/88] util: harden more built-in classes against prototype pollution MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/56225 Reviewed-By: Jordan Harband Reviewed-By: Vinícius Lourenço Claro Cardoso --- lib/buffer.js | 10 +++++++- lib/internal/util/inspect.js | 20 ++++++++++++--- test/parallel/test-util-inspect.js | 41 ++++++++++++++++++++++++++++++ 3 files changed, 67 insertions(+), 4 deletions(-) diff --git a/lib/buffer.js b/lib/buffer.js index 756657e910893e..8f235e5f0dae6c 100644 --- a/lib/buffer.js +++ b/lib/buffer.js @@ -35,6 +35,7 @@ const { NumberMIN_SAFE_INTEGER, ObjectDefineProperties, ObjectDefineProperty, + ObjectPrototypeHasOwnProperty, ObjectSetPrototypeOf, RegExpPrototypeSymbolReplace, StringPrototypeCharCodeAt, @@ -910,7 +911,14 @@ Buffer.prototype[customInspectSymbol] = function inspect(recurseTimes, ctx) { }), 27, -2); } } - return `<${this.constructor.name} ${str}>`; + let constructorName = 'Buffer'; + try { + const { constructor } = this; + if (typeof constructor === 'function' && ObjectPrototypeHasOwnProperty(constructor, 'name')) { + constructorName = constructor.name; + } + } catch { /* Ignore error and use default name */ } + return `<${constructorName} ${str}>`; }; Buffer.prototype.inspect = Buffer.prototype[customInspectSymbol]; diff --git a/lib/internal/util/inspect.js b/lib/internal/util/inspect.js index 1f1be555c96e08..57586a888e191f 100644 --- a/lib/internal/util/inspect.js +++ b/lib/internal/util/inspect.js @@ -2,7 +2,10 @@ const { Array, + ArrayBuffer, + ArrayBufferPrototype, ArrayIsArray, + ArrayPrototype, ArrayPrototypeFilter, ArrayPrototypeForEach, ArrayPrototypeIncludes, @@ -29,6 
+32,8 @@ const { FunctionPrototypeSymbolHasInstance, FunctionPrototypeToString, JSONStringify, + Map, + MapPrototype, MapPrototypeEntries, MapPrototypeGetSize, MathFloor, @@ -68,6 +73,8 @@ const { SafeMap, SafeSet, SafeStringIterator, + Set, + SetPrototype, SetPrototypeGetSize, SetPrototypeValues, String, @@ -93,6 +100,8 @@ const { SymbolPrototypeValueOf, SymbolToPrimitive, SymbolToStringTag, + TypedArray, + TypedArrayPrototype, TypedArrayPrototypeGetLength, TypedArrayPrototypeGetSymbolToStringTag, Uint8Array, @@ -599,8 +608,13 @@ function isInstanceof(object, proto) { // Special-case for some builtin prototypes in case their `constructor` property has been tampered. const wellKnownPrototypes = new SafeMap(); -wellKnownPrototypes.set(ObjectPrototype, { name: 'Object', constructor: Object }); +wellKnownPrototypes.set(ArrayPrototype, { name: 'Array', constructor: Array }); +wellKnownPrototypes.set(ArrayBufferPrototype, { name: 'ArrayBuffer', constructor: ArrayBuffer }); wellKnownPrototypes.set(FunctionPrototype, { name: 'Function', constructor: Function }); +wellKnownPrototypes.set(MapPrototype, { name: 'Map', constructor: Map }); +wellKnownPrototypes.set(ObjectPrototype, { name: 'Object', constructor: Object }); +wellKnownPrototypes.set(SetPrototype, { name: 'Set', constructor: Set }); +wellKnownPrototypes.set(TypedArrayPrototype, { name: 'TypedArray', constructor: TypedArray }); function getConstructorName(obj, ctx, recurseTimes, protoProps) { let firstProto; @@ -825,12 +839,12 @@ function formatValue(ctx, value, recurseTimes, typedArray) { // Filter out the util module, its inspect function is special. maybeCustom !== inspect && // Also filter out any prototype objects using the circular check. - !(value.constructor && value.constructor.prototype === value)) { + ObjectGetOwnPropertyDescriptor(value, 'constructor')?.value?.prototype !== value) { // This makes sure the recurseTimes are reported as before while using // a counter internally. const depth = ctx.depth === null ? 
null : ctx.depth - recurseTimes;
       const isCrossContext =
-        proxy !== undefined || !(context instanceof Object);
+        proxy !== undefined || !FunctionPrototypeSymbolHasInstance(Object, context);
       const ret = FunctionPrototypeCall(
         maybeCustom,
         context,
diff --git a/test/parallel/test-util-inspect.js b/test/parallel/test-util-inspect.js
index 04fc82cfc1aa80..e69c0349dbafa9 100644
--- a/test/parallel/test-util-inspect.js
+++ b/test/parallel/test-util-inspect.js
@@ -3353,3 +3353,44 @@ assert.strictEqual(
 );
 Object.defineProperty(BuiltinPrototype, 'constructor', desc);
 }
+{
+  const prototypes = [
+    Array.prototype,
+    ArrayBuffer.prototype,
+    Buffer.prototype,
+    Function.prototype,
+    Map.prototype,
+    Object.prototype,
+    Reflect.getPrototypeOf(Uint8Array.prototype),
+    Set.prototype,
+    Uint8Array.prototype,
+  ];
+  const descriptors = new Map();
+  const buffer = Buffer.from('Hello');
+  const o = {
+    arrayBuffer: new ArrayBuffer(), buffer, typedArray: Uint8Array.from(buffer),
+    array: [], func() {}, set: new Set([1]), map: new Map(),
+  };
+  for (const BuiltinPrototype of prototypes) {
+    descriptors.set(BuiltinPrototype, Reflect.getOwnPropertyDescriptor(BuiltinPrototype, 'constructor'));
+    Object.defineProperty(BuiltinPrototype, 'constructor', {
+      get: () => BuiltinPrototype,
+      configurable: true,
+    });
+  }
+  assert.strictEqual(
+    util.inspect(o),
+    '{\n' +
+    '  arrayBuffer: ArrayBuffer { [Uint8Contents]: <>, byteLength: 0 },\n' +
+    '  buffer: <Buffer 48 65 6c 6c 6f>,\n' +
+    '  typedArray: TypedArray(5) [Uint8Array] [ 72, 101, 108, 108, 111 ],\n' +
+    '  array: [],\n' +
+    '  func: [Function: func],\n' +
+    '  set: Set(1) { 1 },\n' +
+    '  map: Map(0) {}\n' +
+    '}',
+  );
+  for (const [BuiltinPrototype, desc] of descriptors) {
+    Object.defineProperty(BuiltinPrototype, 'constructor', desc);
+  }
+}

From 445c8c7489081e53f6b5f9820a31411e266f05f3 Mon Sep 17 00:00:00 2001
From: Rafael Gonzaga
Date: Mon, 16 Dec 2024 20:51:05 -0300
Subject: [PATCH 73/88] build: add major release action

This action reminds collaborators of the upcoming major release date.
In the future, this action can also create and update the branches
(that's why the action name is generic).

PR-URL: https://github.com/nodejs/node/pull/56199
Refs: https://github.com/nodejs/node/pull/55732
Reviewed-By: Antoine du Hamel
---
 .github/workflows/major-release.yml | 48 +++++++++++++++++++++++++++++
 1 file changed, 48 insertions(+)
 create mode 100644 .github/workflows/major-release.yml

diff --git a/.github/workflows/major-release.yml b/.github/workflows/major-release.yml
new file mode 100644
index 00000000000000..a90be1798fac85
--- /dev/null
+++ b/.github/workflows/major-release.yml
@@ -0,0 +1,48 @@
+name: Major Release
+
+on:
+  schedule:
+    - cron: 0 0 15 2,8 *  # runs at midnight UTC on 15 February and 15 August
+
+permissions:
+  contents: read
+
+jobs:
+  create-issue:
+    runs-on: ubuntu-latest
+    permissions:
+      issues: write
+    steps:
+      - name: Check for release schedule
+        id: check-date
+        run: |
+          # Get the current month and day
+          MONTH=$(date +'%m')
+          DAY=$(date +'%d')
+          # We'll create the reminder issue two months prior to the release
+          if [[ "$MONTH" == "02" || "$MONTH" == "08" ]] && [[ "$DAY" == "15" ]]; then
+            echo "create_issue=true" >> "$GITHUB_ENV"
+          fi
+      - name: Retrieve next major release info from nodejs/Release
+        if: env.create_issue == 'true'
+        run: |
+          curl -L https://github.com/nodejs/Release/raw/HEAD/schedule.json | \
+          jq -r 'to_entries | map(select(.value.start | strptime("%Y-%m-%d") | mktime > now)) | first | "VERSION=" + .key + "\nRELEASE_DATE=" + .value.start' >> "$GITHUB_ENV"
+      - name: Compute max date for landing semver-major PRs
+        if: env.create_issue == 'true'
+        run: |
+          echo "PR_MAX_DATE=$(date -d "$RELEASE_DATE -1 month" +%Y-%m-%d)" >> "$GITHUB_ENV"
+      - name: Create release announcement issue
+        if: env.create_issue == 'true'
+        run: |
+          gh issue create --repo "${GITHUB_REPOSITORY}" \
+            --title "Upcoming Node.js Major Release ($VERSION)" \
+            --body-file -<

Date: Mon, 16 Dec 2024 20:02:29 -0500
Subject: [PATCH 74/88] deps: update c-ares to v1.34.4

PR-URL: https://github.com/nodejs/node/pull/56256
Reviewed-By: Luigi Pinca
Reviewed-By: Marco Ippolito
---
 deps/cares/CMakeLists.txt                     |   9 +-
 deps/cares/Makefile.am                        |  29 +-
 deps/cares/Makefile.in                        |  35 +-
 deps/cares/Makefile.msvc                      |  29 +-
 deps/cares/RELEASE-NOTES.md                   | 100 +--
 deps/cares/aclocal.m4                         |   4 +-
 deps/cares/aminclude_static.am                |   6 +-
 deps/cares/configure                          | 669 ++++++++++++------
 deps/cares/configure.ac                       |  43 +-
 deps/cares/docs/Makefile.in                   |   6 +-
 deps/cares/docs/ares_create_query.3           |   3 +
 deps/cares/docs/ares_mkquery.3                |   3 +-
 deps/cares/docs/ares_send.3                   |   3 +
 deps/cares/include/Makefile.in                |   6 +-
 deps/cares/include/ares.h                     |   2 +-
 deps/cares/include/ares_version.h             |   4 +-
 ...espace.m4 => ares_check_user_namespace.m4} |  12 +-
 ...mespace.m4 => ares_check_uts_namespace.m4} |  12 +-
 deps/cares/m4/ax_append_compile_flags.m4      |  47 +-
 deps/cares/m4/ax_append_flag.m4               |  61 +-
 deps/cares/m4/ax_check_compile_flag.m4        |  57 +-
 deps/cares/m4/ax_code_coverage.m4             |  13 +-
 deps/cares/m4/ax_cxx_compile_stdcxx.m4        |  88 ++-
 deps/cares/src/Makefile.in                    |   6 +-
 deps/cares/src/lib/CMakeLists.txt             |  14 +-
 deps/cares/src/lib/Makefile.in                |  12 +-
 deps/cares/src/lib/ares_config.h.cmake        |   3 +
 deps/cares/src/lib/ares_config.h.in           |   3 +
 deps/cares/src/lib/ares_private.h             |  17 +-
 .../cares/src/lib/ares_set_socket_functions.c |   4 +-
 deps/cares/src/lib/ares_socket.c              |   3 +-
 deps/cares/src/lib/ares_sysconfig.c           |  92 ++-
 deps/cares/src/lib/ares_sysconfig_files.c     |  89 ++-
 .../src/lib/event/ares_event_configchg.c      |  22 +-
 deps/cares/src/lib/include/ares_buf.h         |  20 +
 deps/cares/src/lib/include/ares_str.h         |  14 +
 .../src/lib/record/ares_dns_multistring.c     |  58 +-
deps/cares/src/lib/str/ares_buf.c | 66 ++ deps/cares/src/lib/str/ares_str.c | 17 + deps/cares/src/tools/Makefile.in | 6 +- 40 files changed, 1099 insertions(+), 588 deletions(-) rename deps/cares/m4/{ax_check_user_namespace.m4 => ares_check_user_namespace.m4} (82%) rename deps/cares/m4/{ax_check_uts_namespace.m4 => ares_check_uts_namespace.m4} (87%) diff --git a/deps/cares/CMakeLists.txt b/deps/cares/CMakeLists.txt index f6560d56b08ddd..139defd8ffd159 100644 --- a/deps/cares/CMakeLists.txt +++ b/deps/cares/CMakeLists.txt @@ -1,6 +1,6 @@ # Copyright (C) The c-ares project and its contributors # SPDX-License-Identifier: MIT -CMAKE_MINIMUM_REQUIRED (VERSION 3.5.0) +CMAKE_MINIMUM_REQUIRED (VERSION 3.5.0...3.10.0) list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_SOURCE_DIR}/cmake/") @@ -12,7 +12,7 @@ INCLUDE (CheckCSourceCompiles) INCLUDE (CheckStructHasMember) INCLUDE (CheckLibraryExists) -PROJECT (c-ares LANGUAGES C VERSION "1.34.3" ) +PROJECT (c-ares LANGUAGES C VERSION "1.34.4" ) # Set this version before release SET (CARES_VERSION "${PROJECT_VERSION}") @@ -30,7 +30,7 @@ INCLUDE (GNUInstallDirs) # include this *AFTER* PROJECT(), otherwise paths are w # For example, a version of 4:0:2 would generate output such as: # libname.so -> libname.so.2 # libname.so.2 -> libname.so.2.2.0 -SET (CARES_LIB_VERSIONINFO "21:2:19") +SET (CARES_LIB_VERSIONINFO "21:3:19") OPTION (CARES_STATIC "Build as a static library" OFF) @@ -271,6 +271,8 @@ ELSEIF (CMAKE_SYSTEM_NAME STREQUAL "AIX") LIST (APPEND SYSFLAGS -D_ALL_SOURCE -D_XOPEN_SOURCE=700 -D_USE_IRS) ELSEIF (CMAKE_SYSTEM_NAME STREQUAL "FreeBSD") # Don't define _XOPEN_SOURCE on FreeBSD, it actually reduces visibility instead of increasing it +ELSEIF (CMAKE_SYSTEM_NAME STREQUAL "QNX") + LIST (APPEND SYSFLAGS -D_QNX_SOURCE) ELSEIF (WIN32) LIST (APPEND SYSFLAGS -DWIN32_LEAN_AND_MEAN -D_CRT_SECURE_NO_DEPRECATE -D_CRT_NONSTDC_NO_DEPRECATE -D_WIN32_WINNT=0x0602) ENDIF () @@ -406,6 +408,7 @@ ENDIF () CHECK_STRUCT_HAS_MEMBER("struct sockaddr_in6" sin6_scope_id "${CMAKE_EXTRA_INCLUDE_FILES}" HAVE_STRUCT_SOCKADDR_IN6_SIN6_SCOPE_ID LANGUAGE C) +CHECK_SYMBOL_EXISTS (strnlen "${CMAKE_EXTRA_INCLUDE_FILES}" HAVE_STRNLEN) CHECK_SYMBOL_EXISTS (memmem "${CMAKE_EXTRA_INCLUDE_FILES}" HAVE_MEMMEM) CHECK_SYMBOL_EXISTS (closesocket "${CMAKE_EXTRA_INCLUDE_FILES}" HAVE_CLOSESOCKET) CHECK_SYMBOL_EXISTS (CloseSocket "${CMAKE_EXTRA_INCLUDE_FILES}" HAVE_CLOSESOCKET_CAMEL) diff --git a/deps/cares/Makefile.am b/deps/cares/Makefile.am index e99161a45f7883..51b5f6be32be78 100644 --- a/deps/cares/Makefile.am +++ b/deps/cares/Makefile.am @@ -3,17 +3,24 @@ # Copyright (C) the Massachusetts Institute of Technology. # Copyright (C) Daniel Stenberg # -# Permission to use, copy, modify, and distribute this -# software and its documentation for any purpose and without -# fee is hereby granted, provided that the above copyright -# notice appear in all copies and that both that copyright -# notice and this permission notice appear in supporting -# documentation, and that the name of M.I.T. not be used in -# advertising or publicity pertaining to distribution of the -# software without specific, written prior permission. -# M.I.T. makes no representations about the suitability of -# this software for any purpose. It is provided "as is" -# without express or implied warranty. 
+# Permission is hereby granted, free of charge, to any person obtaining a copy +# of this software and associated documentation files (the "Software"), to deal +# in the Software without restriction, including without limitation the rights +# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +# copies of the Software, and to permit persons to whom the Software is +# furnished to do so, subject to the following conditions: +# +# The above copyright notice and this permission notice (including the next +# paragraph) shall be included in all copies or substantial portions of the +# Software. +# +# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +# SOFTWARE. # # SPDX-License-Identifier: MIT # diff --git a/deps/cares/Makefile.in b/deps/cares/Makefile.in index ba78cb77cbe335..2342125d136526 100644 --- a/deps/cares/Makefile.in +++ b/deps/cares/Makefile.in @@ -19,17 +19,24 @@ # Copyright (C) the Massachusetts Institute of Technology. # Copyright (C) Daniel Stenberg # -# Permission to use, copy, modify, and distribute this -# software and its documentation for any purpose and without -# fee is hereby granted, provided that the above copyright -# notice appear in all copies and that both that copyright -# notice and this permission notice appear in supporting -# documentation, and that the name of M.I.T. not be used in -# advertising or publicity pertaining to distribution of the -# software without specific, written prior permission. -# M.I.T. makes no representations about the suitability of -# this software for any purpose. It is provided "as is" -# without express or implied warranty. +# Permission is hereby granted, free of charge, to any person obtaining a copy +# of this software and associated documentation files (the "Software"), to deal +# in the Software without restriction, including without limitation the rights +# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +# copies of the Software, and to permit persons to whom the Software is +# furnished to do so, subject to the following conditions: +# +# The above copyright notice and this permission notice (including the next +# paragraph) shall be included in all copies or substantial portions of the +# Software. +# +# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +# SOFTWARE. # # SPDX-License-Identifier: MIT # @@ -111,7 +118,9 @@ build_triplet = @build@ host_triplet = @host@ subdir = . 
ACLOCAL_M4 = $(top_srcdir)/aclocal.m4 -am__aclocal_m4_deps = $(top_srcdir)/m4/ax_ac_append_to_file.m4 \ +am__aclocal_m4_deps = $(top_srcdir)/m4/ares_check_user_namespace.m4 \ + $(top_srcdir)/m4/ares_check_uts_namespace.m4 \ + $(top_srcdir)/m4/ax_ac_append_to_file.m4 \ $(top_srcdir)/m4/ax_ac_print_to_file.m4 \ $(top_srcdir)/m4/ax_add_am_macro_static.m4 \ $(top_srcdir)/m4/ax_am_macros_static.m4 \ @@ -121,8 +130,6 @@ am__aclocal_m4_deps = $(top_srcdir)/m4/ax_ac_append_to_file.m4 \ $(top_srcdir)/m4/ax_check_compile_flag.m4 \ $(top_srcdir)/m4/ax_check_gnu_make.m4 \ $(top_srcdir)/m4/ax_check_link_flag.m4 \ - $(top_srcdir)/m4/ax_check_user_namespace.m4 \ - $(top_srcdir)/m4/ax_check_uts_namespace.m4 \ $(top_srcdir)/m4/ax_code_coverage.m4 \ $(top_srcdir)/m4/ax_compiler_vendor.m4 \ $(top_srcdir)/m4/ax_cxx_compile_stdcxx.m4 \ diff --git a/deps/cares/Makefile.msvc b/deps/cares/Makefile.msvc index 8395d1a7d67728..3266db415e09fe 100644 --- a/deps/cares/Makefile.msvc +++ b/deps/cares/Makefile.msvc @@ -1,17 +1,24 @@ # Copyright (C) 2009-2013 by Daniel Stenberg # -# Permission to use, copy, modify, and distribute this -# software and its documentation for any purpose and without -# fee is hereby granted, provided that the above copyright -# notice appear in all copies and that both that copyright -# notice and this permission notice appear in supporting -# documentation, and that the name of M.I.T. not be used in -# advertising or publicity pertaining to distribution of the -# software without specific, written prior permission. -# M.I.T. makes no representations about the suitability of -# this software for any purpose. It is provided "as is" -# without express or implied warranty. +# Permission is hereby granted, free of charge, to any person obtaining a copy +# of this software and associated documentation files (the "Software"), to deal +# in the Software without restriction, including without limitation the rights +# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +# copies of the Software, and to permit persons to whom the Software is +# furnished to do so, subject to the following conditions: +# +# The above copyright notice and this permission notice (including the next +# paragraph) shall be included in all copies or substantial portions of the +# Software. +# +# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +# SOFTWARE. # # SPDX-License-Identifier: MIT diff --git a/deps/cares/RELEASE-NOTES.md b/deps/cares/RELEASE-NOTES.md index f9d58d278432f1..19a204b3ea96bd 100644 --- a/deps/cares/RELEASE-NOTES.md +++ b/deps/cares/RELEASE-NOTES.md @@ -1,97 +1,25 @@ -## c-ares version 1.34.3 - November 9 2024 +## c-ares version 1.34.4 - December 14 2024 This is a bugfix release. Changes: -* Build the release package in an automated way so we can provide - provenance as per [SLSA3](https://slsa.dev/). - [PR #906](https://github.com/c-ares/c-ares/pull/906) +* QNX Port: Port to QNX 8, add primary config reading support, add CI build. 
[PR #934](https://github.com/c-ares/c-ares/pull/934), [PR #937](https://github.com/c-ares/c-ares/pull/937), [PR #938](https://github.com/c-ares/c-ares/pull/938) Bugfixes: -* Some upstream servers are non-compliant with EDNS options, resend queries - without EDNS. [Issue #911](https://github.com/c-ares/c-ares/issues/911) -* Android: <=7 needs sys/system_properties.h - [a70637c](https://github.com/c-ares/c-ares/commit/a70637c) -* Android: CMake needs `-D_GNU_SOURCE` and others. - [PR #915](https://github.com/c-ares/c-ares/pull/914) -* TSAN warns on missing lock, but lock isn't actually necessary. - [PR #915](https://github.com/c-ares/c-ares/pull/915) -* `ares_getaddrinfo()` for `AF_UNSPEC` should retry IPv4 if only IPv6 is - received. [765d558](https://github.com/c-ares/c-ares/commit/765d558) -* `ares_send()` shouldn't return `ARES_EBADRESP`, its `ARES_EBADQUERY`. - [91519e7](https://github.com/c-ares/c-ares/commit/91519e7) -* Fix typos in man pages. [PR #905](https://github.com/c-ares/c-ares/pull/905) +* Empty TXT records were not being preserved. [PR #922](https://github.com/c-ares/c-ares/pull/922) +* docs: update deprecation notices for `ares_create_query()` and `ares_mkquery()`. [PR #910](https://github.com/c-ares/c-ares/pull/910) +* license: some files weren't properly updated. [PR #920](https://github.com/c-ares/c-ares/pull/920) +* Fix bind local device regression from 1.34.0. [PR #929](https://github.com/c-ares/c-ares/pull/929), [PR #931](https://github.com/c-ares/c-ares/pull/931), [PR #935](https://github.com/c-ares/c-ares/pull/935) +* CMake: set policy version to prevent deprecation warnings. [PR #932](https://github.com/c-ares/c-ares/pull/932) +* CMake: shared and static library names should be the same on unix platforms like autotools uses. [PR #933](https://github.com/c-ares/c-ares/pull/933) +* Update to latest autoconf archive macros for enhanced system compatibility. [PR #936](https://github.com/c-ares/c-ares/pull/936) Thanks go to these friendly people for their efforts and contributions for this release: * Brad House (@bradh352) -* Jiwoo Park (@jimmy-park) - - -## c-ares version 1.34.2 - October 15 2024 - -This release contains a fix for downstream packages detecting the c-ares -version based on the contents of the header file rather than the -distributed pkgconf or cmake files. - -## c-ares version 1.34.1 - October 9 2024 - -This release fixes a packaging issue. - - -## c-ares version 1.34.0 - October 9 2024 - -This is a feature and bugfix release. - -Features: -* adig: read arguments from adigrc. - [PR #856](https://github.com/c-ares/c-ares/pull/856) -* Add new pending write callback optimization via `ares_set_pending_write_cb`. - [PR #857](https://github.com/c-ares/c-ares/pull/857) -* New function `ares_process_fds()`. - [PR #875](https://github.com/c-ares/c-ares/pull/875) -* Failed servers should be probed rather than redirecting queries which could - cause unexpected latency. - [PR #877](https://github.com/c-ares/c-ares/pull/877) -* adig: rework command line arguments to mimic dig from bind. - [PR #890](https://github.com/c-ares/c-ares/pull/890) -* Add new method for overriding network functions - `ares_set_socket_function_ex()` to properly support all new functionality. - [PR #894](https://github.com/c-ares/c-ares/pull/894) -* Fix regression with custom socket callbacks due to DNS cookie support. 
- [PR #895](https://github.com/c-ares/c-ares/pull/895) -* ares_socket: set IP_BIND_ADDRESS_NO_PORT on ares_set_local_ip* tcp sockets - [PR #887](https://github.com/c-ares/c-ares/pull/887) -* URI parser/writer for ares_set_servers_csv()/ares_get_servers_csv(). - [PR #882](https://github.com/c-ares/c-ares/pull/882) - -Changes: -* Connection handling modularization. - [PR #857](https://github.com/c-ares/c-ares/pull/857), - [PR #876](https://github.com/c-ares/c-ares/pull/876) -* Expose library/utility functions to tools. - [PR #860](https://github.com/c-ares/c-ares/pull/860) -* Remove `ares__` prefix, just use `ares_` for internal functions. - [PR #872](https://github.com/c-ares/c-ares/pull/872) - - -Bugfixes: -* fix: potential WIN32_LEAN_AND_MEAN redefinition. - [PR #869](https://github.com/c-ares/c-ares/pull/869) -* Fix googletest v1.15 compatibility. - [PR #874](https://github.com/c-ares/c-ares/pull/874) -* Fix pkgconfig thread dependencies. - [PR #884](https://github.com/c-ares/c-ares/pull/884) - - -Thanks go to these friendly people for their efforts and contributions for this -release: - -* Brad House (@bradh352) -* Cristian Rodríguez (@crrodriguez) -* Georg (@tacerus) -* @lifenjoiner -* Shelley Vohr (@codebytere) -* 前进,前进,进 (@leleliu008) - +* Daniel Stenberg (@bagder) +* Gregor Jasny (@gjasny) +* @marcovsz +* Nikolaos Chatzikonstantinou (@createyourpersonalaccount) +* @vlasovsoft1979 diff --git a/deps/cares/aclocal.m4 b/deps/cares/aclocal.m4 index ce7ad1c8a86a43..04f8786c9c0c89 100644 --- a/deps/cares/aclocal.m4 +++ b/deps/cares/aclocal.m4 @@ -1221,6 +1221,8 @@ AC_SUBST([am__tar]) AC_SUBST([am__untar]) ]) # _AM_PROG_TAR +m4_include([m4/ares_check_user_namespace.m4]) +m4_include([m4/ares_check_uts_namespace.m4]) m4_include([m4/ax_ac_append_to_file.m4]) m4_include([m4/ax_ac_print_to_file.m4]) m4_include([m4/ax_add_am_macro_static.m4]) @@ -1231,8 +1233,6 @@ m4_include([m4/ax_append_link_flags.m4]) m4_include([m4/ax_check_compile_flag.m4]) m4_include([m4/ax_check_gnu_make.m4]) m4_include([m4/ax_check_link_flag.m4]) -m4_include([m4/ax_check_user_namespace.m4]) -m4_include([m4/ax_check_uts_namespace.m4]) m4_include([m4/ax_code_coverage.m4]) m4_include([m4/ax_compiler_vendor.m4]) m4_include([m4/ax_cxx_compile_stdcxx.m4]) diff --git a/deps/cares/aminclude_static.am b/deps/cares/aminclude_static.am index b83549f81adde4..ec7a86a43e6829 100644 --- a/deps/cares/aminclude_static.am +++ b/deps/cares/aminclude_static.am @@ -1,6 +1,6 @@ # aminclude_static.am generated automatically by Autoconf -# from AX_AM_MACROS_STATIC on Sat Nov 9 17:40:37 UTC 2024 +# from AX_AM_MACROS_STATIC on Sat Dec 14 15:15:44 UTC 2024 # Code coverage @@ -66,7 +66,7 @@ code_coverage_v_lcov_cap_ = $(code_coverage_v_lcov_cap_$(AM_DEFAULT_VERBOSITY)) code_coverage_v_lcov_cap_0 = @echo " LCOV --capture" $(CODE_COVERAGE_OUTPUT_FILE); code_coverage_v_lcov_ign = $(code_coverage_v_lcov_ign_$(V)) code_coverage_v_lcov_ign_ = $(code_coverage_v_lcov_ign_$(AM_DEFAULT_VERBOSITY)) -code_coverage_v_lcov_ign_0 = @echo " LCOV --remove /tmp/*" $(CODE_COVERAGE_IGNORE_PATTERN); +code_coverage_v_lcov_ign_0 = @echo " LCOV --remove" "$(CODE_COVERAGE_OUTPUT_FILE).tmp" $(CODE_COVERAGE_IGNORE_PATTERN); code_coverage_v_genhtml = $(code_coverage_v_genhtml_$(V)) code_coverage_v_genhtml_ = $(code_coverage_v_genhtml_$(AM_DEFAULT_VERBOSITY)) code_coverage_v_genhtml_0 = @echo " GEN " "$(CODE_COVERAGE_OUTPUT_DIRECTORY)"; @@ -85,7 +85,7 @@ check-code-coverage: # Capture code coverage data code-coverage-capture: code-coverage-capture-hook 
$(code_coverage_v_lcov_cap)$(LCOV) $(code_coverage_quiet) $(addprefix --directory ,$(CODE_COVERAGE_DIRECTORY)) --capture --output-file "$(CODE_COVERAGE_OUTPUT_FILE).tmp" --test-name "$(call code_coverage_sanitize,$(PACKAGE_NAME)-$(PACKAGE_VERSION))" --no-checksum --compat-libtool $(CODE_COVERAGE_LCOV_SHOPTS) $(CODE_COVERAGE_LCOV_OPTIONS) - $(code_coverage_v_lcov_ign)$(LCOV) $(code_coverage_quiet) $(addprefix --directory ,$(CODE_COVERAGE_DIRECTORY)) --remove "$(CODE_COVERAGE_OUTPUT_FILE).tmp" "/tmp/*" $(CODE_COVERAGE_IGNORE_PATTERN) --output-file "$(CODE_COVERAGE_OUTPUT_FILE)" $(CODE_COVERAGE_LCOV_SHOPTS) $(CODE_COVERAGE_LCOV_RMOPTS) + $(code_coverage_v_lcov_ign)$(LCOV) $(code_coverage_quiet) $(addprefix --directory ,$(CODE_COVERAGE_DIRECTORY)) --remove "$(CODE_COVERAGE_OUTPUT_FILE).tmp" $(CODE_COVERAGE_IGNORE_PATTERN) --output-file "$(CODE_COVERAGE_OUTPUT_FILE)" $(CODE_COVERAGE_LCOV_SHOPTS) $(CODE_COVERAGE_LCOV_RMOPTS) -@rm -f "$(CODE_COVERAGE_OUTPUT_FILE).tmp" $(code_coverage_v_genhtml)LANG=C $(GENHTML) $(code_coverage_quiet) $(addprefix --prefix ,$(CODE_COVERAGE_DIRECTORY)) --output-directory "$(CODE_COVERAGE_OUTPUT_DIRECTORY)" --title "$(PACKAGE_NAME)-$(PACKAGE_VERSION) Code Coverage" --legend --show-details "$(CODE_COVERAGE_OUTPUT_FILE)" $(CODE_COVERAGE_GENHTML_OPTIONS) @echo "file://$(abs_builddir)/$(CODE_COVERAGE_OUTPUT_DIRECTORY)/index.html" diff --git a/deps/cares/configure b/deps/cares/configure index 76b0ddf39c136a..d02f117d2f0b64 100755 --- a/deps/cares/configure +++ b/deps/cares/configure @@ -1,6 +1,6 @@ #! /bin/sh # Guess values for system-dependent variables and create Makefiles. -# Generated by GNU Autoconf 2.71 for c-ares 1.34.3. +# Generated by GNU Autoconf 2.71 for c-ares 1.34.4. # # Report bugs to . # @@ -621,8 +621,8 @@ MAKEFLAGS= # Identity of this package. PACKAGE_NAME='c-ares' PACKAGE_TARNAME='c-ares' -PACKAGE_VERSION='1.34.3' -PACKAGE_STRING='c-ares 1.34.3' +PACKAGE_VERSION='1.34.4' +PACKAGE_STRING='c-ares 1.34.4' PACKAGE_BUGREPORT='c-ares mailing list: http://lists.haxx.se/listinfo/c-ares' PACKAGE_URL='' @@ -853,6 +853,7 @@ with_gcov enable_code_coverage enable_largefile enable_libgcc +enable_tests_crossbuild ' ac_precious_vars='build_alias host_alias @@ -1423,7 +1424,7 @@ if test "$ac_init_help" = "long"; then # Omit some internal or obsolete options to make the list less imposing. # This message is too long to be a string in the A/UX 3.1 sh. cat <<_ACEOF -\`configure' configures c-ares 1.34.3 to adapt to many kinds of systems. +\`configure' configures c-ares 1.34.4 to adapt to many kinds of systems. Usage: $0 [OPTION]... [VAR=VALUE]... @@ -1494,7 +1495,7 @@ fi if test -n "$ac_init_help"; then case $ac_init_help in - short | recursive ) echo "Configuration of c-ares 1.34.3:";; + short | recursive ) echo "Configuration of c-ares 1.34.4:";; esac cat <<\_ACEOF @@ -1525,6 +1526,8 @@ Optional Features: --enable-code-coverage Whether to enable code coverage support --disable-largefile omit support for large files --enable-libgcc use libgcc when linking + --enable-tests-crossbuild + Enable test building even when cross building Optional Packages: --with-PACKAGE[=ARG] use PACKAGE [ARG=yes] @@ -1634,7 +1637,7 @@ fi test -n "$ac_init_help" && exit $ac_status if $ac_init_version; then cat <<\_ACEOF -c-ares configure 1.34.3 +c-ares configure 1.34.4 generated by GNU Autoconf 2.71 Copyright (C) 2021 Free Software Foundation, Inc. 
@@ -2258,7 +2261,7 @@ cat >config.log <<_ACEOF This file contains any messages produced by compilers while running configure, to aid debugging if configure makes a mistake. -It was created by c-ares $as_me 1.34.3, which was +It was created by c-ares $as_me 1.34.4, which was generated by GNU Autoconf 2.71. Invocation command line was $ $0$ac_configure_args_raw @@ -3232,7 +3235,7 @@ ac_compiler_gnu=$ac_cv_c_compiler_gnu -CARES_VERSION_INFO="21:2:19" +CARES_VERSION_INFO="21:3:19" @@ -4891,7 +4894,17 @@ else $as_nop // MSVC always sets __cplusplus to 199711L in older versions; newer versions // only set it correctly if /Zc:__cplusplus is specified as well as a // /std:c++NN switch: +// // https://devblogs.microsoft.com/cppblog/msvc-now-correctly-reports-__cplusplus/ +// +// The value __cplusplus ought to have is available in _MSVC_LANG since +// Visual Studio 2015 Update 3: +// +// https://learn.microsoft.com/en-us/cpp/preprocessor/predefined-macros +// +// This was also the first MSVC version to support C++14 so we can't use the +// value of either __cplusplus or _MSVC_LANG to quickly rule out MSVC having +// C++11 or C++14 support, but we can check _MSVC_LANG for C++17 and later. #elif __cplusplus < 201103L && !defined _MSC_VER #error "This is not a C++11 compiler" @@ -5914,7 +5927,7 @@ fi # Define the identity of the package. PACKAGE='c-ares' - VERSION='1.34.3' + VERSION='1.34.4' printf "%s\n" "#define PACKAGE \"$PACKAGE\"" >>confdefs.h @@ -19525,10 +19538,52 @@ then : fi + { printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking for _gcov_init in -lgcov" >&5 +printf %s "checking for _gcov_init in -lgcov... " >&6; } +if test ${ac_cv_lib_gcov__gcov_init+y} +then : + printf %s "(cached) " >&6 +else $as_nop + ac_check_lib_save_LIBS=$LIBS +LIBS="-lgcov $LIBS" +cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +/* Override any GCC internal prototype to avoid an error. + Use char because int might match the return type of a GCC + builtin and then its argument prototype would still apply. */ +char _gcov_init (); +int +main (void) +{ +return _gcov_init (); + ; + return 0; +} +_ACEOF +if ac_fn_c_try_link "$LINENO" +then : + ac_cv_lib_gcov__gcov_init=yes +else $as_nop + ac_cv_lib_gcov__gcov_init=no +fi +rm -f core conftest.err conftest.$ac_objext conftest.beam \ + conftest$ac_exeext conftest.$ac_ext +LIBS=$ac_check_lib_save_LIBS +fi +{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_gcov__gcov_init" >&5 +printf "%s\n" "$ac_cv_lib_gcov__gcov_init" >&6; } +if test "x$ac_cv_lib_gcov__gcov_init" = xyes +then : + CODE_COVERAGE_LIBS="-lgcov" +else $as_nop + CODE_COVERAGE_LIBS="" +fi + + CODE_COVERAGE_CPPFLAGS="-DNDEBUG" CODE_COVERAGE_CFLAGS="-O0 -g -fprofile-arcs -ftest-coverage" CODE_COVERAGE_CXXFLAGS="-O0 -g -fprofile-arcs -ftest-coverage" - CODE_COVERAGE_LIBS="-lgcov" @@ -19805,27 +19860,37 @@ eval ac_res=\$$as_CACHEVAR printf "%s\n" "$ac_res" >&6; } if eval test \"x\$"$as_CACHEVAR"\" = x"yes" then : - if test ${LDFLAGS+y} + +if test ${LDFLAGS+y} then : - case " $LDFLAGS " in - *" $flag "*) - { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : LDFLAGS already contains \$flag"; } >&5 + + case " $LDFLAGS " in #( + *" $flag "*) : + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : LDFLAGS already contains \$flag"; } >&5 (: LDFLAGS already contains $flag) 2>&5 ac_status=$? printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? 
= $ac_status" >&5 - test $ac_status = 0; } - ;; - *) - { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : LDFLAGS=\"\$LDFLAGS \$flag\""; } >&5 - (: LDFLAGS="$LDFLAGS $flag") 2>&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append LDFLAGS " $flag" + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : LDFLAGS=\"\$LDFLAGS\""; } >&5 + (: LDFLAGS="$LDFLAGS") 2>&5 ac_status=$? printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 test $ac_status = 0; } - LDFLAGS="$LDFLAGS $flag" - ;; - esac + ;; +esac + else $as_nop - LDFLAGS="$flag" + + LDFLAGS=$flag + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : LDFLAGS=\"\$LDFLAGS\""; } >&5 + (: LDFLAGS="$LDFLAGS") 2>&5 + ac_status=$? + printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + fi else $as_nop @@ -19870,27 +19935,37 @@ if test "x$enable_shared" = "xno" -a "x$enable_static" = "xyes" ; then { printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether we need CARES_STATICLIB definition" >&5 printf %s "checking whether we need CARES_STATICLIB definition... " >&6; } if test "$ac_cv_native_windows" = "yes" ; then - if test ${AM_CPPFLAGS+y} + +if test ${AM_CPPFLAGS+y} then : - case " $AM_CPPFLAGS " in - *" -DCARES_STATICLIB "*) - { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CPPFLAGS already contains -DCARES_STATICLIB"; } >&5 + + case " $AM_CPPFLAGS " in #( + *" -DCARES_STATICLIB "*) : + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CPPFLAGS already contains -DCARES_STATICLIB"; } >&5 (: AM_CPPFLAGS already contains -DCARES_STATICLIB) 2>&5 ac_status=$? printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 - test $ac_status = 0; } - ;; - *) - { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CPPFLAGS=\"\$AM_CPPFLAGS -DCARES_STATICLIB\""; } >&5 - (: AM_CPPFLAGS="$AM_CPPFLAGS -DCARES_STATICLIB") 2>&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append AM_CPPFLAGS " -DCARES_STATICLIB" + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CPPFLAGS=\"\$AM_CPPFLAGS\""; } >&5 + (: AM_CPPFLAGS="$AM_CPPFLAGS") 2>&5 ac_status=$? printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 test $ac_status = 0; } - AM_CPPFLAGS="$AM_CPPFLAGS -DCARES_STATICLIB" - ;; - esac + ;; +esac + else $as_nop - AM_CPPFLAGS="-DCARES_STATICLIB" + + AM_CPPFLAGS=-DCARES_STATICLIB + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CPPFLAGS=\"\$AM_CPPFLAGS\""; } >&5 + (: AM_CPPFLAGS="$AM_CPPFLAGS") 2>&5 + ac_status=$? + printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + fi PKGCONFIG_CFLAGS="-DCARES_STATICLIB" @@ -19910,57 +19985,24 @@ if test "$symbol_hiding" != "no" ; then else case "$ax_cv_c_compiler_vendor" in clang|gnu|intel) - { printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether C compiler accepts " >&5 -printf %s "checking whether C compiler accepts ... " >&6; } -if test ${ax_cv_check_cflags__+y} -then : - printf %s "(cached) " >&6 -else $as_nop - ax_check_save_flags=$CFLAGS - CFLAGS="$CFLAGS " - cat confdefs.h - <<_ACEOF >conftest.$ac_ext -/* end confdefs.h. 
*/ - -int -main (void) -{ - - ; - return 0; -} -_ACEOF -if ac_fn_c_try_compile "$LINENO" -then : - ax_cv_check_cflags__=yes -else $as_nop - ax_cv_check_cflags__=no -fi -rm -f core conftest.err conftest.$ac_objext conftest.beam conftest.$ac_ext - CFLAGS=$ax_check_save_flags -fi -{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: result: $ax_cv_check_cflags__" >&5 -printf "%s\n" "$ax_cv_check_cflags__" >&6; } -if test x"$ax_cv_check_cflags__" = xyes -then : - : -else $as_nop - : -fi for flag in -fvisibility=hidden; do as_CACHEVAR=`printf "%s\n" "ax_cv_check_cflags__$flag" | $as_tr_sh` -{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether C compiler accepts $flag" >&5 -printf %s "checking whether C compiler accepts $flag... " >&6; } +{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether the C compiler accepts $flag" >&5 +printf %s "checking whether the C compiler accepts $flag... " >&6; } if eval test \${$as_CACHEVAR+y} then : printf %s "(cached) " >&6 else $as_nop ax_check_save_flags=$CFLAGS - CFLAGS="$CFLAGS $flag" + if test x"$GCC" = xyes ; then + add_gnu_werror="-Werror" + fi + CFLAGS="$CFLAGS $flag $add_gnu_werror" cat confdefs.h - <<_ACEOF >conftest.$ac_ext /* end confdefs.h. */ @@ -19984,29 +20026,39 @@ fi eval ac_res=\$$as_CACHEVAR { printf "%s\n" "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 printf "%s\n" "$ac_res" >&6; } -if test x"`eval 'as_val=${'$as_CACHEVAR'};printf "%s\n" "$as_val"'`" = xyes +if eval test \"x\$"$as_CACHEVAR"\" = x"yes" then : - if test ${CARES_SYMBOL_HIDING_CFLAG+y} + +if test ${CARES_SYMBOL_HIDING_CFLAG+y} then : - case " $CARES_SYMBOL_HIDING_CFLAG " in - *" $flag "*) - { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : CARES_SYMBOL_HIDING_CFLAG already contains \$flag"; } >&5 + + case " $CARES_SYMBOL_HIDING_CFLAG " in #( + *" $flag "*) : + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : CARES_SYMBOL_HIDING_CFLAG already contains \$flag"; } >&5 (: CARES_SYMBOL_HIDING_CFLAG already contains $flag) 2>&5 ac_status=$? printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 - test $ac_status = 0; } - ;; - *) - { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : CARES_SYMBOL_HIDING_CFLAG=\"\$CARES_SYMBOL_HIDING_CFLAG \$flag\""; } >&5 - (: CARES_SYMBOL_HIDING_CFLAG="$CARES_SYMBOL_HIDING_CFLAG $flag") 2>&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append CARES_SYMBOL_HIDING_CFLAG " $flag" + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : CARES_SYMBOL_HIDING_CFLAG=\"\$CARES_SYMBOL_HIDING_CFLAG\""; } >&5 + (: CARES_SYMBOL_HIDING_CFLAG="$CARES_SYMBOL_HIDING_CFLAG") 2>&5 ac_status=$? printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 test $ac_status = 0; } - CARES_SYMBOL_HIDING_CFLAG="$CARES_SYMBOL_HIDING_CFLAG $flag" - ;; - esac + ;; +esac + else $as_nop - CARES_SYMBOL_HIDING_CFLAG="$flag" + + CARES_SYMBOL_HIDING_CFLAG=$flag + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : CARES_SYMBOL_HIDING_CFLAG=\"\$CARES_SYMBOL_HIDING_CFLAG\""; } >&5 + (: CARES_SYMBOL_HIDING_CFLAG="$CARES_SYMBOL_HIDING_CFLAG") 2>&5 + ac_status=$? + printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + fi else $as_nop @@ -20022,17 +20074,22 @@ done sun) + + for flag in -xldscope=hidden; do as_CACHEVAR=`printf "%s\n" "ax_cv_check_cflags__$flag" | $as_tr_sh` -{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether C compiler accepts $flag" >&5 -printf %s "checking whether C compiler accepts $flag... 
" >&6; } +{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether the C compiler accepts $flag" >&5 +printf %s "checking whether the C compiler accepts $flag... " >&6; } if eval test \${$as_CACHEVAR+y} then : printf %s "(cached) " >&6 else $as_nop ax_check_save_flags=$CFLAGS - CFLAGS="$CFLAGS $flag" + if test x"$GCC" = xyes ; then + add_gnu_werror="-Werror" + fi + CFLAGS="$CFLAGS $flag $add_gnu_werror" cat confdefs.h - <<_ACEOF >conftest.$ac_ext /* end confdefs.h. */ @@ -20056,29 +20113,39 @@ fi eval ac_res=\$$as_CACHEVAR { printf "%s\n" "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 printf "%s\n" "$ac_res" >&6; } -if test x"`eval 'as_val=${'$as_CACHEVAR'};printf "%s\n" "$as_val"'`" = xyes +if eval test \"x\$"$as_CACHEVAR"\" = x"yes" then : - if test ${CARES_SYMBOL_HIDING_CFLAG+y} + +if test ${CARES_SYMBOL_HIDING_CFLAG+y} then : - case " $CARES_SYMBOL_HIDING_CFLAG " in - *" $flag "*) - { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : CARES_SYMBOL_HIDING_CFLAG already contains \$flag"; } >&5 + + case " $CARES_SYMBOL_HIDING_CFLAG " in #( + *" $flag "*) : + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : CARES_SYMBOL_HIDING_CFLAG already contains \$flag"; } >&5 (: CARES_SYMBOL_HIDING_CFLAG already contains $flag) 2>&5 ac_status=$? printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 - test $ac_status = 0; } - ;; - *) - { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : CARES_SYMBOL_HIDING_CFLAG=\"\$CARES_SYMBOL_HIDING_CFLAG \$flag\""; } >&5 - (: CARES_SYMBOL_HIDING_CFLAG="$CARES_SYMBOL_HIDING_CFLAG $flag") 2>&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append CARES_SYMBOL_HIDING_CFLAG " $flag" + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : CARES_SYMBOL_HIDING_CFLAG=\"\$CARES_SYMBOL_HIDING_CFLAG\""; } >&5 + (: CARES_SYMBOL_HIDING_CFLAG="$CARES_SYMBOL_HIDING_CFLAG") 2>&5 ac_status=$? printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 test $ac_status = 0; } - CARES_SYMBOL_HIDING_CFLAG="$CARES_SYMBOL_HIDING_CFLAG $flag" - ;; - esac + ;; +esac + else $as_nop - CARES_SYMBOL_HIDING_CFLAG="$flag" + + CARES_SYMBOL_HIDING_CFLAG=$flag + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : CARES_SYMBOL_HIDING_CFLAG=\"\$CARES_SYMBOL_HIDING_CFLAG\""; } >&5 + (: CARES_SYMBOL_HIDING_CFLAG="$CARES_SYMBOL_HIDING_CFLAG") 2>&5 + ac_status=$? + printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + fi else $as_nop @@ -20120,17 +20187,22 @@ fi if test "$enable_warnings" = "yes"; then + + for flag in -Wall -Wextra -Waggregate-return -Wcast-align -Wcast-qual -Wconversion -Wdeclaration-after-statement -Wdouble-promotion -Wfloat-equal -Wformat-security -Winit-self -Wjump-misses-init -Wlogical-op -Wmissing-braces -Wmissing-declarations -Wmissing-format-attribute -Wmissing-include-dirs -Wmissing-prototypes -Wnested-externs -Wno-coverage-mismatch -Wold-style-definition -Wpacked -Wpedantic -Wpointer-arith -Wredundant-decls -Wshadow -Wsign-conversion -Wstrict-overflow -Wstrict-prototypes -Wtrampolines -Wundef -Wunreachable-code -Wunused -Wvariadic-macros -Wvla -Wwrite-strings -Werror=implicit-int -Werror=implicit-function-declaration -Werror=partial-availability -Wno-long-long ; do as_CACHEVAR=`printf "%s\n" "ax_cv_check_cflags_-Werror_$flag" | $as_tr_sh` -{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether C compiler accepts $flag" >&5 -printf %s "checking whether C compiler accepts $flag... 
" >&6; } +{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether the C compiler accepts $flag" >&5 +printf %s "checking whether the C compiler accepts $flag... " >&6; } if eval test \${$as_CACHEVAR+y} then : printf %s "(cached) " >&6 else $as_nop ax_check_save_flags=$CFLAGS - CFLAGS="$CFLAGS -Werror $flag" + if test x"$GCC" = xyes ; then + add_gnu_werror="-Werror" + fi + CFLAGS="$CFLAGS -Werror $flag $add_gnu_werror" cat confdefs.h - <<_ACEOF >conftest.$ac_ext /* end confdefs.h. */ @@ -20154,29 +20226,39 @@ fi eval ac_res=\$$as_CACHEVAR { printf "%s\n" "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 printf "%s\n" "$ac_res" >&6; } -if test x"`eval 'as_val=${'$as_CACHEVAR'};printf "%s\n" "$as_val"'`" = xyes +if eval test \"x\$"$as_CACHEVAR"\" = x"yes" then : - if test ${AM_CFLAGS+y} + +if test ${AM_CFLAGS+y} then : - case " $AM_CFLAGS " in - *" $flag "*) - { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS already contains \$flag"; } >&5 + + case " $AM_CFLAGS " in #( + *" $flag "*) : + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS already contains \$flag"; } >&5 (: AM_CFLAGS already contains $flag) 2>&5 ac_status=$? printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 - test $ac_status = 0; } - ;; - *) - { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS=\"\$AM_CFLAGS \$flag\""; } >&5 - (: AM_CFLAGS="$AM_CFLAGS $flag") 2>&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append AM_CFLAGS " $flag" + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS=\"\$AM_CFLAGS\""; } >&5 + (: AM_CFLAGS="$AM_CFLAGS") 2>&5 ac_status=$? printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 test $ac_status = 0; } - AM_CFLAGS="$AM_CFLAGS $flag" - ;; - esac + ;; +esac + else $as_nop - AM_CFLAGS="$flag" + + AM_CFLAGS=$flag + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS=\"\$AM_CFLAGS\""; } >&5 + (: AM_CFLAGS="$AM_CFLAGS") 2>&5 + ac_status=$? + printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + fi else $as_nop @@ -20185,22 +20267,28 @@ fi done +fi + +case $host_os in + *qnx*|*android*) + - case $host_os in - *android*) for flag in -std=c99; do as_CACHEVAR=`printf "%s\n" "ax_cv_check_cflags_-Werror_$flag" | $as_tr_sh` -{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether C compiler accepts $flag" >&5 -printf %s "checking whether C compiler accepts $flag... " >&6; } +{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether the C compiler accepts $flag" >&5 +printf %s "checking whether the C compiler accepts $flag... " >&6; } if eval test \${$as_CACHEVAR+y} then : printf %s "(cached) " >&6 else $as_nop ax_check_save_flags=$CFLAGS - CFLAGS="$CFLAGS -Werror $flag" + if test x"$GCC" = xyes ; then + add_gnu_werror="-Werror" + fi + CFLAGS="$CFLAGS -Werror $flag $add_gnu_werror" cat confdefs.h - <<_ACEOF >conftest.$ac_ext /* end confdefs.h. 
*/ @@ -20224,29 +20312,39 @@ fi eval ac_res=\$$as_CACHEVAR { printf "%s\n" "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 printf "%s\n" "$ac_res" >&6; } -if test x"`eval 'as_val=${'$as_CACHEVAR'};printf "%s\n" "$as_val"'`" = xyes +if eval test \"x\$"$as_CACHEVAR"\" = x"yes" then : - if test ${AM_CFLAGS+y} + +if test ${AM_CFLAGS+y} then : - case " $AM_CFLAGS " in - *" $flag "*) - { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS already contains \$flag"; } >&5 + + case " $AM_CFLAGS " in #( + *" $flag "*) : + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS already contains \$flag"; } >&5 (: AM_CFLAGS already contains $flag) 2>&5 ac_status=$? printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 - test $ac_status = 0; } - ;; - *) - { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS=\"\$AM_CFLAGS \$flag\""; } >&5 - (: AM_CFLAGS="$AM_CFLAGS $flag") 2>&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append AM_CFLAGS " $flag" + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS=\"\$AM_CFLAGS\""; } >&5 + (: AM_CFLAGS="$AM_CFLAGS") 2>&5 ac_status=$? printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 test $ac_status = 0; } - AM_CFLAGS="$AM_CFLAGS $flag" - ;; - esac + ;; +esac + else $as_nop - AM_CFLAGS="$flag" + + AM_CFLAGS=$flag + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS=\"\$AM_CFLAGS\""; } >&5 + (: AM_CFLAGS="$AM_CFLAGS") 2>&5 + ac_status=$? + printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + fi else $as_nop @@ -20255,21 +20353,26 @@ fi done - ;; - *) + ;; + *) + + for flag in -std=c90; do as_CACHEVAR=`printf "%s\n" "ax_cv_check_cflags_-Werror_$flag" | $as_tr_sh` -{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether C compiler accepts $flag" >&5 -printf %s "checking whether C compiler accepts $flag... " >&6; } +{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether the C compiler accepts $flag" >&5 +printf %s "checking whether the C compiler accepts $flag... " >&6; } if eval test \${$as_CACHEVAR+y} then : printf %s "(cached) " >&6 else $as_nop ax_check_save_flags=$CFLAGS - CFLAGS="$CFLAGS -Werror $flag" + if test x"$GCC" = xyes ; then + add_gnu_werror="-Werror" + fi + CFLAGS="$CFLAGS -Werror $flag $add_gnu_werror" cat confdefs.h - <<_ACEOF >conftest.$ac_ext /* end confdefs.h. */ @@ -20293,29 +20396,39 @@ fi eval ac_res=\$$as_CACHEVAR { printf "%s\n" "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 printf "%s\n" "$ac_res" >&6; } -if test x"`eval 'as_val=${'$as_CACHEVAR'};printf "%s\n" "$as_val"'`" = xyes +if eval test \"x\$"$as_CACHEVAR"\" = x"yes" then : - if test ${AM_CFLAGS+y} + +if test ${AM_CFLAGS+y} then : - case " $AM_CFLAGS " in - *" $flag "*) - { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS already contains \$flag"; } >&5 + + case " $AM_CFLAGS " in #( + *" $flag "*) : + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS already contains \$flag"; } >&5 (: AM_CFLAGS already contains $flag) 2>&5 ac_status=$? printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 - test $ac_status = 0; } - ;; - *) - { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS=\"\$AM_CFLAGS \$flag\""; } >&5 - (: AM_CFLAGS="$AM_CFLAGS $flag") 2>&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append AM_CFLAGS " $flag" + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS=\"\$AM_CFLAGS\""; } >&5 + (: AM_CFLAGS="$AM_CFLAGS") 2>&5 ac_status=$? printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? 
= $ac_status" >&5 test $ac_status = 0; } - AM_CFLAGS="$AM_CFLAGS $flag" - ;; - esac + ;; +esac + else $as_nop - AM_CFLAGS="$flag" + + AM_CFLAGS=$flag + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS=\"\$AM_CFLAGS\""; } >&5 + (: AM_CFLAGS="$AM_CFLAGS") 2>&5 + ac_status=$? + printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + fi else $as_nop @@ -20324,24 +20437,115 @@ fi done - ;; - esac + ;; +esac + +case $host_os in + *qnx*) + + + + +for flag in -D_QNX_SOURCE; do + as_CACHEVAR=`printf "%s\n" "ax_cv_check_cflags_-Werror_$flag" | $as_tr_sh` +{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether the C compiler accepts $flag" >&5 +printf %s "checking whether the C compiler accepts $flag... " >&6; } +if eval test \${$as_CACHEVAR+y} +then : + printf %s "(cached) " >&6 +else $as_nop + + ax_check_save_flags=$CFLAGS + if test x"$GCC" = xyes ; then + add_gnu_werror="-Werror" + fi + CFLAGS="$CFLAGS -Werror $flag $add_gnu_werror" + cat confdefs.h - <<_ACEOF >conftest.$ac_ext +/* end confdefs.h. */ + +int +main (void) +{ + + ; + return 0; +} +_ACEOF +if ac_fn_c_try_compile "$LINENO" +then : + eval "$as_CACHEVAR=yes" +else $as_nop + eval "$as_CACHEVAR=no" +fi +rm -f core conftest.err conftest.$ac_objext conftest.beam conftest.$ac_ext + CFLAGS=$ax_check_save_flags +fi +eval ac_res=\$$as_CACHEVAR + { printf "%s\n" "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 +printf "%s\n" "$ac_res" >&6; } +if eval test \"x\$"$as_CACHEVAR"\" = x"yes" +then : + +if test ${AM_CPPFLAGS+y} +then : + + case " $AM_CPPFLAGS " in #( + *" $flag "*) : + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CPPFLAGS already contains \$flag"; } >&5 + (: AM_CPPFLAGS already contains $flag) 2>&5 + ac_status=$? + printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append AM_CPPFLAGS " $flag" + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CPPFLAGS=\"\$AM_CPPFLAGS\""; } >&5 + (: AM_CPPFLAGS="$AM_CPPFLAGS") 2>&5 + ac_status=$? + printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + ;; +esac + +else $as_nop + + AM_CPPFLAGS=$flag + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CPPFLAGS=\"\$AM_CPPFLAGS\""; } >&5 + (: AM_CPPFLAGS="$AM_CPPFLAGS") 2>&5 + ac_status=$? + printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + +fi + +else $as_nop + : fi +done + + ;; +esac + if test "$ax_cv_c_compiler_vendor" = "intel"; then + + for flag in -shared-intel; do as_CACHEVAR=`printf "%s\n" "ax_cv_check_cflags__$flag" | $as_tr_sh` -{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether C compiler accepts $flag" >&5 -printf %s "checking whether C compiler accepts $flag... " >&6; } +{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether the C compiler accepts $flag" >&5 +printf %s "checking whether the C compiler accepts $flag... " >&6; } if eval test \${$as_CACHEVAR+y} then : printf %s "(cached) " >&6 else $as_nop ax_check_save_flags=$CFLAGS - CFLAGS="$CFLAGS $flag" + if test x"$GCC" = xyes ; then + add_gnu_werror="-Werror" + fi + CFLAGS="$CFLAGS $flag $add_gnu_werror" cat confdefs.h - <<_ACEOF >conftest.$ac_ext /* end confdefs.h. 
*/ @@ -20365,29 +20569,39 @@ fi eval ac_res=\$$as_CACHEVAR { printf "%s\n" "$as_me:${as_lineno-$LINENO}: result: $ac_res" >&5 printf "%s\n" "$ac_res" >&6; } -if test x"`eval 'as_val=${'$as_CACHEVAR'};printf "%s\n" "$as_val"'`" = xyes +if eval test \"x\$"$as_CACHEVAR"\" = x"yes" then : - if test ${AM_CFLAGS+y} + +if test ${AM_CFLAGS+y} then : - case " $AM_CFLAGS " in - *" $flag "*) - { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS already contains \$flag"; } >&5 + + case " $AM_CFLAGS " in #( + *" $flag "*) : + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS already contains \$flag"; } >&5 (: AM_CFLAGS already contains $flag) 2>&5 ac_status=$? printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 - test $ac_status = 0; } - ;; - *) - { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS=\"\$AM_CFLAGS \$flag\""; } >&5 - (: AM_CFLAGS="$AM_CFLAGS $flag") 2>&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append AM_CFLAGS " $flag" + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS=\"\$AM_CFLAGS\""; } >&5 + (: AM_CFLAGS="$AM_CFLAGS") 2>&5 ac_status=$? printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 test $ac_status = 0; } - AM_CFLAGS="$AM_CFLAGS $flag" - ;; - esac + ;; +esac + else $as_nop - AM_CFLAGS="$flag" + + AM_CFLAGS=$flag + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : AM_CFLAGS=\"\$AM_CFLAGS\""; } >&5 + (: AM_CFLAGS="$AM_CFLAGS") 2>&5 + ac_status=$? + printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + fi else $as_nop @@ -20708,27 +20922,37 @@ eval ac_res=\$$as_CACHEVAR printf "%s\n" "$ac_res" >&6; } if eval test \"x\$"$as_CACHEVAR"\" = x"yes" then : - if test ${XNET_LIBS+y} + +if test ${XNET_LIBS+y} then : - case " $XNET_LIBS " in - *" $flag "*) - { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : XNET_LIBS already contains \$flag"; } >&5 + + case " $XNET_LIBS " in #( + *" $flag "*) : + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : XNET_LIBS already contains \$flag"; } >&5 (: XNET_LIBS already contains $flag) 2>&5 ac_status=$? printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 - test $ac_status = 0; } - ;; - *) - { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : XNET_LIBS=\"\$XNET_LIBS \$flag\""; } >&5 - (: XNET_LIBS="$XNET_LIBS $flag") 2>&5 + test $ac_status = 0; } ;; #( + *) : + + as_fn_append XNET_LIBS " $flag" + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : XNET_LIBS=\"\$XNET_LIBS\""; } >&5 + (: XNET_LIBS="$XNET_LIBS") 2>&5 ac_status=$? printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 test $ac_status = 0; } - XNET_LIBS="$XNET_LIBS $flag" - ;; - esac + ;; +esac + else $as_nop - XNET_LIBS="$flag" + + XNET_LIBS=$flag + { { printf "%s\n" "$as_me:${as_lineno-$LINENO}: : XNET_LIBS=\"\$XNET_LIBS\""; } >&5 + (: XNET_LIBS="$XNET_LIBS") 2>&5 + ac_status=$? + printf "%s\n" "$as_me:${as_lineno-$LINENO}: \$? = $ac_status" >&5 + test $ac_status = 0; } + fi else $as_nop @@ -22131,6 +22355,14 @@ fi +ac_fn_check_decl "$LINENO" "strnlen" "ac_cv_have_decl_strnlen" "$cares_all_includes +" "$ac_c_undeclared_builtin_options" "CFLAGS" +if test "x$ac_cv_have_decl_strnlen" = xyes +then : + +printf "%s\n" "#define HAVE_STRNLEN 1" >>confdefs.h + +fi ac_fn_check_decl "$LINENO" "memmem" "ac_cv_have_decl_memmem" "$cares_all_includes " "$ac_c_undeclared_builtin_options" "CFLAGS" if test "x$ac_cv_have_decl_memmem" = xyes @@ -23708,6 +23940,15 @@ printf "%s\n" "$as_me: WARNING: cannot build tests when cross compiling" >&2;} as_fn_error $? 
"*** Tests not supported when cross compiling" "$LINENO" 5 fi fi + +# Check whether --enable-tests-crossbuild was given. +if test ${enable_tests_crossbuild+y} +then : + enableval=$enable_tests_crossbuild; build_tests="$enableval" + +fi + + if test "x$build_tests" != "xno" ; then @@ -23993,7 +24234,7 @@ fi if test "x$have_gmock_v112" = "xyes" ; then { printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether user namespaces are supported" >&5 printf %s "checking whether user namespaces are supported... " >&6; } -if test ${ax_cv_user_namespace+y} +if test ${ares_cv_user_namespace+y} then : printf %s "(cached) " >&6 else $as_nop @@ -24006,7 +24247,7 @@ ac_compiler_gnu=$ac_cv_c_compiler_gnu if test "$cross_compiling" = yes then : - ax_cv_user_namespace=no + ares_cv_user_namespace=no else $as_nop cat confdefs.h - <<_ACEOF >conftest.$ac_ext /* end confdefs.h. */ @@ -24046,9 +24287,9 @@ int main() { _ACEOF if ac_fn_c_try_run "$LINENO" then : - ax_cv_user_namespace=yes + ares_cv_user_namespace=yes else $as_nop - ax_cv_user_namespace=no + ares_cv_user_namespace=no fi rm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \ conftest.$ac_objext conftest.beam conftest.$ac_ext @@ -24062,9 +24303,9 @@ ac_compiler_gnu=$ac_cv_c_compiler_gnu fi -{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: result: $ax_cv_user_namespace" >&5 -printf "%s\n" "$ax_cv_user_namespace" >&6; } - if test "$ax_cv_user_namespace" = yes; then +{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: result: $ares_cv_user_namespace" >&5 +printf "%s\n" "$ares_cv_user_namespace" >&6; } + if test "$ares_cv_user_namespace" = yes; then printf "%s\n" "#define HAVE_USER_NAMESPACE 1" >>confdefs.h @@ -24072,7 +24313,7 @@ printf "%s\n" "#define HAVE_USER_NAMESPACE 1" >>confdefs.h { printf "%s\n" "$as_me:${as_lineno-$LINENO}: checking whether UTS namespaces are supported" >&5 printf %s "checking whether UTS namespaces are supported... " >&6; } -if test ${ax_cv_uts_namespace+y} +if test ${ares_cv_uts_namespace+y} then : printf %s "(cached) " >&6 else $as_nop @@ -24085,7 +24326,7 @@ ac_compiler_gnu=$ac_cv_c_compiler_gnu if test "$cross_compiling" = yes then : - ax_cv_uts_namespace=no + ares_cv_uts_namespace=no else $as_nop cat confdefs.h - <<_ACEOF >conftest.$ac_ext /* end confdefs.h. 
*/ @@ -24145,9 +24386,9 @@ int main() { _ACEOF if ac_fn_c_try_run "$LINENO" then : - ax_cv_uts_namespace=yes + ares_cv_uts_namespace=yes else $as_nop - ax_cv_uts_namespace=no + ares_cv_uts_namespace=no fi rm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \ conftest.$ac_objext conftest.beam conftest.$ac_ext @@ -24161,9 +24402,9 @@ ac_compiler_gnu=$ac_cv_c_compiler_gnu fi -{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: result: $ax_cv_uts_namespace" >&5 -printf "%s\n" "$ax_cv_uts_namespace" >&6; } - if test "$ax_cv_uts_namespace" = yes; then +{ printf "%s\n" "$as_me:${as_lineno-$LINENO}: result: $ares_cv_uts_namespace" >&5 +printf "%s\n" "$ares_cv_uts_namespace" >&6; } + if test "$ares_cv_uts_namespace" = yes; then printf "%s\n" "#define HAVE_UTS_NAMESPACE 1" >>confdefs.h @@ -24218,7 +24459,17 @@ else $as_nop // MSVC always sets __cplusplus to 199711L in older versions; newer versions // only set it correctly if /Zc:__cplusplus is specified as well as a // /std:c++NN switch: +// // https://devblogs.microsoft.com/cppblog/msvc-now-correctly-reports-__cplusplus/ +// +// The value __cplusplus ought to have is available in _MSVC_LANG since +// Visual Studio 2015 Update 3: +// +// https://learn.microsoft.com/en-us/cpp/preprocessor/predefined-macros +// +// This was also the first MSVC version to support C++14 so we can't use the +// value of either __cplusplus or _MSVC_LANG to quickly rule out MSVC having +// C++11 or C++14 support, but we can check _MSVC_LANG for C++17 and later. #elif __cplusplus < 201103L && !defined _MSC_VER #error "This is not a C++11 compiler" @@ -26007,7 +26258,7 @@ cat >>$CONFIG_STATUS <<\_ACEOF || ac_write_fail=1 # report actual input values of CONFIG_FILES etc. instead of their # values after options handling. ac_log=" -This file was extended by c-ares $as_me 1.34.3, which was +This file was extended by c-ares $as_me 1.34.4, which was generated by GNU Autoconf 2.71. Invocation command line was CONFIG_FILES = $CONFIG_FILES @@ -26075,7 +26326,7 @@ ac_cs_config_escaped=`printf "%s\n" "$ac_cs_config" | sed "s/^ //; s/'/'\\\\\\\\ cat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1 ac_cs_config='$ac_cs_config_escaped' ac_cs_version="\\ -c-ares config.status 1.34.3 +c-ares config.status 1.34.4 configured by $0, generated by GNU Autoconf 2.71, with options \\"\$ac_cs_config\\" diff --git a/deps/cares/configure.ac b/deps/cares/configure.ac index 5f848c28598a95..9dacf1fb2e4a40 100644 --- a/deps/cares/configure.ac +++ b/deps/cares/configure.ac @@ -2,10 +2,10 @@ dnl Copyright (C) The c-ares project and its contributors dnl SPDX-License-Identifier: MIT AC_PREREQ([2.69]) -AC_INIT([c-ares], [1.34.3], +AC_INIT([c-ares], [1.34.4], [c-ares mailing list: http://lists.haxx.se/listinfo/c-ares]) -CARES_VERSION_INFO="21:2:19" +CARES_VERSION_INFO="21:3:19" dnl This flag accepts an argument of the form current[:revision[:age]]. So, dnl passing -version-info 3:12:1 sets current to 3, revision to 12, and age to dnl 1. 
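To make the version-info arithmetic above concrete, here is a worked example, assuming the standard libtool mapping that the project's own CMakeLists.txt comment illustrates with 4:0:2 -> libname.so.2 -> libname.so.2.2.0: the SONAME major number is current minus age, and the real file is named (current-age).age.revision. The bump above from 21:2:19 to 21:3:19 therefore keeps the SONAME libcares.so.2 and only moves the real file from libcares.so.2.19.2 to libcares.so.2.19.3, a revision-only change, which is what a bugfix release should produce.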
@@ -245,18 +245,25 @@ AC_SUBST(CARES_SYMBOL_HIDING_CFLAG) if test "$enable_warnings" = "yes"; then AX_APPEND_COMPILE_FLAGS([-Wall -Wextra -Waggregate-return -Wcast-align -Wcast-qual -Wconversion -Wdeclaration-after-statement -Wdouble-promotion -Wfloat-equal -Wformat-security -Winit-self -Wjump-misses-init -Wlogical-op -Wmissing-braces -Wmissing-declarations -Wmissing-format-attribute -Wmissing-include-dirs -Wmissing-prototypes -Wnested-externs -Wno-coverage-mismatch -Wold-style-definition -Wpacked -Wpedantic -Wpointer-arith -Wredundant-decls -Wshadow -Wsign-conversion -Wstrict-overflow -Wstrict-prototypes -Wtrampolines -Wundef -Wunreachable-code -Wunused -Wvariadic-macros -Wvla -Wwrite-strings -Werror=implicit-int -Werror=implicit-function-declaration -Werror=partial-availability -Wno-long-long ], [AM_CFLAGS], [-Werror]) - - dnl Android requires c99, all others should use c90 - case $host_os in - *android*) - AX_APPEND_COMPILE_FLAGS([-std=c99], [AM_CFLAGS], [-Werror]) - ;; - *) - AX_APPEND_COMPILE_FLAGS([-std=c90], [AM_CFLAGS], [-Werror]) - ;; - esac fi +dnl Android and QNX require c99, all others should use c90 +case $host_os in + *qnx*|*android*) + AX_APPEND_COMPILE_FLAGS([-std=c99], [AM_CFLAGS], [-Werror]) + ;; + *) + AX_APPEND_COMPILE_FLAGS([-std=c90], [AM_CFLAGS], [-Werror]) + ;; +esac + +dnl QNX needs -D_QNX_SOURCE +case $host_os in + *qnx*) + AX_APPEND_COMPILE_FLAGS([-D_QNX_SOURCE], [AM_CPPFLAGS], [-Werror]) + ;; +esac + if test "$ax_cv_c_compiler_vendor" = "intel"; then AX_APPEND_COMPILE_FLAGS([-shared-intel], [AM_CFLAGS]) fi @@ -543,6 +550,7 @@ dnl https://mailman.videolan.org/pipermail/vlc-devel/2015-March/101802.html dnl which would require we check each individually and provide function arguments dnl for the test. +AC_CHECK_DECL(strnlen, [AC_DEFINE([HAVE_STRNLEN], 1, [Define to 1 if you have `strnlen`] )], [], $cares_all_includes) AC_CHECK_DECL(memmem, [AC_DEFINE([HAVE_MEMMEM], 1, [Define to 1 if you have `memmem`] )], [], $cares_all_includes) AC_CHECK_DECL(recv, [AC_DEFINE([HAVE_RECV], 1, [Define to 1 if you have `recv`] )], [], $cares_all_includes) AC_CHECK_DECL(recvfrom, [AC_DEFINE([HAVE_RECVFROM], 1, [Define to 1 if you have `recvfrom`] )], [], $cares_all_includes) @@ -813,6 +821,13 @@ if test "x$build_tests" != "xno" -a "x$cross_compiling" = "xyes" ; then AC_MSG_ERROR([*** Tests not supported when cross compiling]) fi fi + +dnl Forces compiling of tests even when cross-compiling. 
+AC_ARG_ENABLE(tests-crossbuild, + AS_HELP_STRING([--enable-tests-crossbuild], [Enable test building even when cross building]), + [build_tests="$enableval"] +) + if test "x$build_tests" != "xno" ; then PKG_CHECK_MODULES([GMOCK], [gmock], [ have_gmock=yes ], [ have_gmock=no ]) if test "x$have_gmock" = "xno" ; then @@ -825,8 +840,8 @@ if test "x$build_tests" != "xno" ; then else PKG_CHECK_MODULES([GMOCK112], [gmock >= 1.12.0], [ have_gmock_v112=yes ], [ have_gmock_v112=no ]) if test "x$have_gmock_v112" = "xyes" ; then - AX_CHECK_USER_NAMESPACE - AX_CHECK_UTS_NAMESPACE + ARES_CHECK_USER_NAMESPACE + ARES_CHECK_UTS_NAMESPACE fi fi fi diff --git a/deps/cares/docs/Makefile.in b/deps/cares/docs/Makefile.in index 6b7bb8e30d1a20..0d1873c9662c92 100644 --- a/deps/cares/docs/Makefile.in +++ b/deps/cares/docs/Makefile.in @@ -92,7 +92,9 @@ build_triplet = @build@ host_triplet = @host@ subdir = docs ACLOCAL_M4 = $(top_srcdir)/aclocal.m4 -am__aclocal_m4_deps = $(top_srcdir)/m4/ax_ac_append_to_file.m4 \ +am__aclocal_m4_deps = $(top_srcdir)/m4/ares_check_user_namespace.m4 \ + $(top_srcdir)/m4/ares_check_uts_namespace.m4 \ + $(top_srcdir)/m4/ax_ac_append_to_file.m4 \ $(top_srcdir)/m4/ax_ac_print_to_file.m4 \ $(top_srcdir)/m4/ax_add_am_macro_static.m4 \ $(top_srcdir)/m4/ax_am_macros_static.m4 \ @@ -102,8 +104,6 @@ am__aclocal_m4_deps = $(top_srcdir)/m4/ax_ac_append_to_file.m4 \ $(top_srcdir)/m4/ax_check_compile_flag.m4 \ $(top_srcdir)/m4/ax_check_gnu_make.m4 \ $(top_srcdir)/m4/ax_check_link_flag.m4 \ - $(top_srcdir)/m4/ax_check_user_namespace.m4 \ - $(top_srcdir)/m4/ax_check_uts_namespace.m4 \ $(top_srcdir)/m4/ax_code_coverage.m4 \ $(top_srcdir)/m4/ax_compiler_vendor.m4 \ $(top_srcdir)/m4/ax_cxx_compile_stdcxx.m4 \ diff --git a/deps/cares/docs/ares_create_query.3 b/deps/cares/docs/ares_create_query.3 index a54eec3e2a6bd1..3af6ba4cc3dc5b 100644 --- a/deps/cares/docs/ares_create_query.3 +++ b/deps/cares/docs/ares_create_query.3 @@ -19,6 +19,9 @@ int ares_create_query(const char *\fIname\fP, int \fImax_udp_size\fP) .fi .SH DESCRIPTION +This function is deprecated as of c-ares 1.22, please use +\fIares_dns_record_create(3)\fP instead. + The \fIares_create_query(3)\fP function composes a DNS query with a single question. The parameter \fIname\fP gives the query name as a NUL-terminated C string of period-separated labels optionally ending with a period; periods and diff --git a/deps/cares/docs/ares_mkquery.3 b/deps/cares/docs/ares_mkquery.3 index 0e7b5edbb89353..2f42d169210fef 100644 --- a/deps/cares/docs/ares_mkquery.3 +++ b/deps/cares/docs/ares_mkquery.3 @@ -14,7 +14,8 @@ int ares_mkquery(const char *\fIname\fP, int \fIdnsclass\fP, int \fItype\fP, int *\fIbuflen\fP) .fi .SH DESCRIPTION -Deprecated function. See \fIares_create_query(3)\fP instead! +This function is deprecated as of c-ares 1.10, please use +\fIares_dns_record_create(3)\fP instead. The .B ares_mkquery diff --git a/deps/cares/docs/ares_send.3 b/deps/cares/docs/ares_send.3 index f6ea9140e2510c..df3e3bbe4136b0 100644 --- a/deps/cares/docs/ares_send.3 +++ b/deps/cares/docs/ares_send.3 @@ -113,6 +113,9 @@ is being destroyed; the query will not be completed. .B ARES_ENOSERVER The query will not be completed because no DNS servers were configured on the channel. +.TP 19 +.B ARES_EBADQUERY +Misformatted DNS query. 
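The man-page hunks above deprecate `ares_create_query(3)` and `ares_mkquery(3)` in favor of `ares_dns_record_create(3)`, and document `ARES_EBADQUERY` as a completion status for `ares_send(3)`. A minimal sketch of a legacy-style completion callback that distinguishes the newly documented status — `on_result` is an illustrative name, and the channel and query buffer are assumed to be set up elsewhere:

```c
#include <ares.h>
#include <stdio.h>

/* Legacy ares_callback signature; invoked once the query completes.
 * ARES_EBADQUERY (documented above) reports a misformatted query. */
static void on_result(void *arg, int status, int timeouts,
                      unsigned char *abuf, int alen)
{
  (void)arg; (void)timeouts;
  if (status == ARES_EBADQUERY) {
    fprintf(stderr, "query buffer was misformatted\n");
    return;
  }
  if (status != ARES_SUCCESS) {
    fprintf(stderr, "query failed: %s\n", ares_strerror(status));
    return;
  }
  printf("received %d bytes of DNS response\n", alen);
  (void)abuf; /* parse with the ares_dns_record APIs in real code */
}
```

A caller would hand this to `ares_send(channel, qbuf, qlen, on_result, NULL)`, with `qbuf`/`qlen` ideally produced by the non-deprecated record builder rather than the functions deprecated above.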
.PP The callback argument diff --git a/deps/cares/include/Makefile.in b/deps/cares/include/Makefile.in index 0beee44a22bb22..7dc40eb08fab9c 100644 --- a/deps/cares/include/Makefile.in +++ b/deps/cares/include/Makefile.in @@ -90,7 +90,9 @@ build_triplet = @build@ host_triplet = @host@ subdir = include ACLOCAL_M4 = $(top_srcdir)/aclocal.m4 -am__aclocal_m4_deps = $(top_srcdir)/m4/ax_ac_append_to_file.m4 \ +am__aclocal_m4_deps = $(top_srcdir)/m4/ares_check_user_namespace.m4 \ + $(top_srcdir)/m4/ares_check_uts_namespace.m4 \ + $(top_srcdir)/m4/ax_ac_append_to_file.m4 \ $(top_srcdir)/m4/ax_ac_print_to_file.m4 \ $(top_srcdir)/m4/ax_add_am_macro_static.m4 \ $(top_srcdir)/m4/ax_am_macros_static.m4 \ @@ -100,8 +102,6 @@ am__aclocal_m4_deps = $(top_srcdir)/m4/ax_ac_append_to_file.m4 \ $(top_srcdir)/m4/ax_check_compile_flag.m4 \ $(top_srcdir)/m4/ax_check_gnu_make.m4 \ $(top_srcdir)/m4/ax_check_link_flag.m4 \ - $(top_srcdir)/m4/ax_check_user_namespace.m4 \ - $(top_srcdir)/m4/ax_check_uts_namespace.m4 \ $(top_srcdir)/m4/ax_code_coverage.m4 \ $(top_srcdir)/m4/ax_compiler_vendor.m4 \ $(top_srcdir)/m4/ax_cxx_compile_stdcxx.m4 \ diff --git a/deps/cares/include/ares.h b/deps/cares/include/ares.h index 139c6d66ee90df..7fe3ec78f4e651 100644 --- a/deps/cares/include/ares.h +++ b/deps/cares/include/ares.h @@ -74,7 +74,7 @@ #if defined(_AIX) || defined(__NOVELL_LIBC__) || defined(__NetBSD__) || \ defined(__minix) || defined(__SYMBIAN32__) || defined(__INTEGRITY) || \ defined(ANDROID) || defined(__ANDROID__) || defined(__OpenBSD__) || \ - defined(__QNXNTO__) || defined(__MVS__) || defined(__HAIKU__) + defined(__QNX__) || defined(__MVS__) || defined(__HAIKU__) # include #endif diff --git a/deps/cares/include/ares_version.h b/deps/cares/include/ares_version.h index 9cb8084dd56bc9..782046bd79d844 100644 --- a/deps/cares/include/ares_version.h +++ b/deps/cares/include/ares_version.h @@ -32,8 +32,8 @@ #define ARES_VERSION_MAJOR 1 #define ARES_VERSION_MINOR 34 -#define ARES_VERSION_PATCH 3 -#define ARES_VERSION_STR "1.34.3" +#define ARES_VERSION_PATCH 4 +#define ARES_VERSION_STR "1.34.4" /* NOTE: We cannot make the version string a C preprocessor stringify operation * due to assumptions made by integrators that aren't properly using diff --git a/deps/cares/m4/ax_check_user_namespace.m4 b/deps/cares/m4/ares_check_user_namespace.m4 similarity index 82% rename from deps/cares/m4/ax_check_user_namespace.m4 rename to deps/cares/m4/ares_check_user_namespace.m4 index aca721626f2e89..a26b384fda5c54 100644 --- a/deps/cares/m4/ax_check_user_namespace.m4 +++ b/deps/cares/m4/ares_check_user_namespace.m4 @@ -2,7 +2,7 @@ # SYNOPSIS # -# AX_CHECK_USER_NAMESPACE +# ARES_CHECK_USER_NAMESPACE # # DESCRIPTION # @@ -12,9 +12,9 @@ # Copyright (C) The c-ares team # SPDX-License-Identifier: MIT -AC_DEFUN([AX_CHECK_USER_NAMESPACE],[dnl +AC_DEFUN([ARES_CHECK_USER_NAMESPACE],[dnl AC_CACHE_CHECK([whether user namespaces are supported], - ax_cv_user_namespace,[ + ares_cv_user_namespace,[ AC_LANG_PUSH([C]) AC_RUN_IFELSE([AC_LANG_SOURCE([[ #define _GNU_SOURCE @@ -48,10 +48,10 @@ int main() { if (!WIFEXITED(status)) return 1; return WEXITSTATUS(status); } - ]])],[ax_cv_user_namespace=yes],[ax_cv_user_namespace=no],[ax_cv_user_namespace=no]) + ]])],[ares_cv_user_namespace=yes],[ares_cv_user_namespace=no],[ares_cv_user_namespace=no]) AC_LANG_POP([C]) ]) - if test "$ax_cv_user_namespace" = yes; then + if test "$ares_cv_user_namespace" = yes; then AC_DEFINE([HAVE_USER_NAMESPACE],[1],[Whether user namespaces are available]) fi -]) # 
AX_CHECK_USER_NAMESPACE +]) # ARES_CHECK_USER_NAMESPACE diff --git a/deps/cares/m4/ax_check_uts_namespace.m4 b/deps/cares/m4/ares_check_uts_namespace.m4 similarity index 87% rename from deps/cares/m4/ax_check_uts_namespace.m4 rename to deps/cares/m4/ares_check_uts_namespace.m4 index 5708acf1b9f376..0aeefe4a9b7b8b 100644 --- a/deps/cares/m4/ax_check_uts_namespace.m4 +++ b/deps/cares/m4/ares_check_uts_namespace.m4 @@ -2,7 +2,7 @@ # SYNOPSIS # -# AX_CHECK_UTS_NAMESPACE +# ARES_CHECK_UTS_NAMESPACE # # DESCRIPTION # @@ -14,9 +14,9 @@ # Copyright (C) The c-ares team # SPDX-License-Identifier: MIT -AC_DEFUN([AX_CHECK_UTS_NAMESPACE],[dnl +AC_DEFUN([ARES_CHECK_UTS_NAMESPACE],[dnl AC_CACHE_CHECK([whether UTS namespaces are supported], - ax_cv_uts_namespace,[ + ares_cv_uts_namespace,[ AC_LANG_PUSH([C]) AC_RUN_IFELSE([AC_LANG_SOURCE([[ #define _GNU_SOURCE @@ -70,10 +70,10 @@ int main() { return WEXITSTATUS(status); } ]]) - ],[ax_cv_uts_namespace=yes],[ax_cv_uts_namespace=no],[ax_cv_uts_namespace=no]) + ],[ares_cv_uts_namespace=yes],[ares_cv_uts_namespace=no],[ares_cv_uts_namespace=no]) AC_LANG_POP([C]) ]) - if test "$ax_cv_uts_namespace" = yes; then + if test "$ares_cv_uts_namespace" = yes; then AC_DEFINE([HAVE_UTS_NAMESPACE],[1],[Whether UTS namespaces are available]) fi -]) # AX_CHECK_UTS_NAMESPACE +]) # ARES_CHECK_UTS_NAMESPACE diff --git a/deps/cares/m4/ax_append_compile_flags.m4 b/deps/cares/m4/ax_append_compile_flags.m4 index 1f8e70845c20d9..9c856356c0cda6 100644 --- a/deps/cares/m4/ax_append_compile_flags.m4 +++ b/deps/cares/m4/ax_append_compile_flags.m4 @@ -1,10 +1,10 @@ -# =========================================================================== -# http://www.gnu.org/software/autoconf-archive/ax_append_compile_flags.html -# =========================================================================== +# ============================================================================ +# https://www.gnu.org/software/autoconf-archive/ax_append_compile_flags.html +# ============================================================================ # # SYNOPSIS # -# AX_APPEND_COMPILE_FLAGS([FLAG1 FLAG2 ...], [FLAGS-VARIABLE], [EXTRA-FLAGS]) +# AX_APPEND_COMPILE_FLAGS([FLAG1 FLAG2 ...], [FLAGS-VARIABLE], [EXTRA-FLAGS], [INPUT]) # # DESCRIPTION # @@ -20,6 +20,8 @@ # the flags: "CFLAGS EXTRA-FLAGS FLAG". This can for example be used to # force the compiler to issue an error when a bad flag is given. # +# INPUT gives an alternative input source to AC_COMPILE_IFELSE. +# # NOTE: This macro depends on the AX_APPEND_FLAG and # AX_CHECK_COMPILE_FLAG. Please keep this macro in sync with # AX_APPEND_LINK_FLAGS. @@ -28,38 +30,17 @@ # # Copyright (c) 2011 Maarten Bosmans # -# This program is free software: you can redistribute it and/or modify it -# under the terms of the GNU General Public License as published by the -# Free Software Foundation, either version 3 of the License, or (at your -# option) any later version. -# -# This program is distributed in the hope that it will be useful, but -# WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General -# Public License for more details. -# -# You should have received a copy of the GNU General Public License along -# with this program. If not, see . -# -# As a special exception, the respective Autoconf Macro's copyright owner -# gives unlimited permission to copy, distribute and modify the configure -# scripts that are the output of Autoconf when processing the Macro. 
You -# need not follow the terms of the GNU General Public License when using -# or distributing such scripts, even though portions of the text of the -# Macro appear in them. The GNU General Public License (GPL) does govern -# all other use of the material that constitutes the Autoconf Macro. -# -# This special exception to the GPL applies to versions of the Autoconf -# Macro released by the Autoconf Archive. When you make and distribute a -# modified version of the Autoconf Macro, you may extend this special -# exception to the GPL to apply to your modified version as well. +# Copying and distribution of this file, with or without modification, are +# permitted in any medium without royalty provided the copyright notice +# and this notice are preserved. This file is offered as-is, without any +# warranty. -#serial 3 +#serial 7 AC_DEFUN([AX_APPEND_COMPILE_FLAGS], -[AC_REQUIRE([AX_CHECK_COMPILE_FLAG]) -AC_REQUIRE([AX_APPEND_FLAG]) +[AX_REQUIRE_DEFINED([AX_CHECK_COMPILE_FLAG]) +AX_REQUIRE_DEFINED([AX_APPEND_FLAG]) for flag in $1; do - AX_CHECK_COMPILE_FLAG([$flag], [AX_APPEND_FLAG([$flag], [$2])], [], [$3]) + AX_CHECK_COMPILE_FLAG([$flag], [AX_APPEND_FLAG([$flag], [$2])], [], [$3], [$4]) done ])dnl AX_APPEND_COMPILE_FLAGS diff --git a/deps/cares/m4/ax_append_flag.m4 b/deps/cares/m4/ax_append_flag.m4 index 1d38b76fb8e157..dd6d8b61406c32 100644 --- a/deps/cares/m4/ax_append_flag.m4 +++ b/deps/cares/m4/ax_append_flag.m4 @@ -1,5 +1,5 @@ # =========================================================================== -# http://www.gnu.org/software/autoconf-archive/ax_append_flag.html +# https://www.gnu.org/software/autoconf-archive/ax_append_flag.html # =========================================================================== # # SYNOPSIS @@ -23,47 +23,28 @@ # Copyright (c) 2008 Guido U. Draheim # Copyright (c) 2011 Maarten Bosmans # -# This program is free software: you can redistribute it and/or modify it -# under the terms of the GNU General Public License as published by the -# Free Software Foundation, either version 3 of the License, or (at your -# option) any later version. -# -# This program is distributed in the hope that it will be useful, but -# WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General -# Public License for more details. -# -# You should have received a copy of the GNU General Public License along -# with this program. If not, see . -# -# As a special exception, the respective Autoconf Macro's copyright owner -# gives unlimited permission to copy, distribute and modify the configure -# scripts that are the output of Autoconf when processing the Macro. You -# need not follow the terms of the GNU General Public License when using -# or distributing such scripts, even though portions of the text of the -# Macro appear in them. The GNU General Public License (GPL) does govern -# all other use of the material that constitutes the Autoconf Macro. -# -# This special exception to the GPL applies to versions of the Autoconf -# Macro released by the Autoconf Archive. When you make and distribute a -# modified version of the Autoconf Macro, you may extend this special -# exception to the GPL to apply to your modified version as well. +# Copying and distribution of this file, with or without modification, are +# permitted in any medium without royalty provided the copyright notice +# and this notice are preserved. This file is offered as-is, without any +# warranty. 
-#serial 2 +#serial 8 AC_DEFUN([AX_APPEND_FLAG], -[AC_PREREQ(2.59)dnl for _AC_LANG_PREFIX -AS_VAR_PUSHDEF([FLAGS], [m4_default($2,_AC_LANG_PREFIX[FLAGS])])dnl -AS_VAR_SET_IF(FLAGS, - [case " AS_VAR_GET(FLAGS) " in - *" $1 "*) - AC_RUN_LOG([: FLAGS already contains $1]) - ;; - *) - AC_RUN_LOG([: FLAGS="$FLAGS $1"]) - AS_VAR_SET(FLAGS, ["AS_VAR_GET(FLAGS) $1"]) - ;; - esac], - [AS_VAR_SET(FLAGS,["$1"])]) +[dnl +AC_PREREQ(2.64)dnl for _AC_LANG_PREFIX and AS_VAR_SET_IF +AS_VAR_PUSHDEF([FLAGS], [m4_default($2,_AC_LANG_PREFIX[FLAGS])]) +AS_VAR_SET_IF(FLAGS,[ + AS_CASE([" AS_VAR_GET(FLAGS) "], + [*" $1 "*], [AC_RUN_LOG([: FLAGS already contains $1])], + [ + AS_VAR_APPEND(FLAGS,[" $1"]) + AC_RUN_LOG([: FLAGS="$FLAGS"]) + ]) + ], + [ + AS_VAR_SET(FLAGS,[$1]) + AC_RUN_LOG([: FLAGS="$FLAGS"]) + ]) AS_VAR_POPDEF([FLAGS])dnl ])dnl AX_APPEND_FLAG diff --git a/deps/cares/m4/ax_check_compile_flag.m4 b/deps/cares/m4/ax_check_compile_flag.m4 index c3a8d695a1bcda..54191c55353ee5 100644 --- a/deps/cares/m4/ax_check_compile_flag.m4 +++ b/deps/cares/m4/ax_check_compile_flag.m4 @@ -1,10 +1,10 @@ # =========================================================================== -# http://www.gnu.org/software/autoconf-archive/ax_check_compile_flag.html +# https://www.gnu.org/software/autoconf-archive/ax_check_compile_flag.html # =========================================================================== # # SYNOPSIS # -# AX_CHECK_COMPILE_FLAG(FLAG, [ACTION-SUCCESS], [ACTION-FAILURE], [EXTRA-FLAGS]) +# AX_CHECK_COMPILE_FLAG(FLAG, [ACTION-SUCCESS], [ACTION-FAILURE], [EXTRA-FLAGS], [INPUT]) # # DESCRIPTION # @@ -19,6 +19,8 @@ # the flags: "CFLAGS EXTRA-FLAGS FLAG". This can for example be used to # force the compiler to issue an error when a bad flag is given. # +# INPUT gives an alternative input source to AC_COMPILE_IFELSE. +# # NOTE: Implementation based on AX_CFLAGS_GCC_OPTION. Please keep this # macro in sync with AX_CHECK_{PREPROC,LINK}_FLAG. # @@ -27,45 +29,34 @@ # Copyright (c) 2008 Guido U. Draheim # Copyright (c) 2011 Maarten Bosmans # -# This program is free software: you can redistribute it and/or modify it -# under the terms of the GNU General Public License as published by the -# Free Software Foundation, either version 3 of the License, or (at your -# option) any later version. -# -# This program is distributed in the hope that it will be useful, but -# WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General -# Public License for more details. -# -# You should have received a copy of the GNU General Public License along -# with this program. If not, see . -# -# As a special exception, the respective Autoconf Macro's copyright owner -# gives unlimited permission to copy, distribute and modify the configure -# scripts that are the output of Autoconf when processing the Macro. You -# need not follow the terms of the GNU General Public License when using -# or distributing such scripts, even though portions of the text of the -# Macro appear in them. The GNU General Public License (GPL) does govern -# all other use of the material that constitutes the Autoconf Macro. -# -# This special exception to the GPL applies to versions of the Autoconf -# Macro released by the Autoconf Archive. When you make and distribute a -# modified version of the Autoconf Macro, you may extend this special -# exception to the GPL to apply to your modified version as well. 
+# Copying and distribution of this file, with or without modification, are +# permitted in any medium without royalty provided the copyright notice +# and this notice are preserved. This file is offered as-is, without any +# warranty. -#serial 2 +#serial 11 AC_DEFUN([AX_CHECK_COMPILE_FLAG], -[AC_PREREQ(2.59)dnl for _AC_LANG_PREFIX +[AC_PREREQ(2.64)dnl for _AC_LANG_PREFIX and AS_VAR_IF AS_VAR_PUSHDEF([CACHEVAR],[ax_cv_check_[]_AC_LANG_ABBREV[]flags_$4_$1])dnl -AC_CACHE_CHECK([whether _AC_LANG compiler accepts $1], CACHEVAR, [ +AC_CACHE_CHECK([whether the _AC_LANG compiler accepts $1], CACHEVAR, [ ax_check_save_flags=$[]_AC_LANG_PREFIX[]FLAGS - _AC_LANG_PREFIX[]FLAGS="$[]_AC_LANG_PREFIX[]FLAGS $4 $1" - AC_COMPILE_IFELSE([AC_LANG_PROGRAM()], + if test x"m4_case(_AC_LANG, + [C], [$GCC], + [C++], [$GXX], + [Fortran], [$GFC], + [Fortran 77], [$G77], + [Objective C], [$GOBJC], + [Objective C++], [$GOBJCXX], + [no])" = xyes ; then + add_gnu_werror="-Werror" + fi + _AC_LANG_PREFIX[]FLAGS="$[]_AC_LANG_PREFIX[]FLAGS $4 $1 $add_gnu_werror" + AC_COMPILE_IFELSE([m4_default([$5],[AC_LANG_PROGRAM()])], [AS_VAR_SET(CACHEVAR,[yes])], [AS_VAR_SET(CACHEVAR,[no])]) _AC_LANG_PREFIX[]FLAGS=$ax_check_save_flags]) -AS_IF([test x"AS_VAR_GET(CACHEVAR)" = xyes], +AS_VAR_IF(CACHEVAR,yes, [m4_default([$2], :)], [m4_default([$3], :)]) AS_VAR_POPDEF([CACHEVAR])dnl diff --git a/deps/cares/m4/ax_code_coverage.m4 b/deps/cares/m4/ax_code_coverage.m4 index ad4063305ebcdd..216708a41f10c9 100644 --- a/deps/cares/m4/ax_code_coverage.m4 +++ b/deps/cares/m4/ax_code_coverage.m4 @@ -74,7 +74,7 @@ # You should have received a copy of the GNU Lesser General Public License # along with this program. If not, see . -#serial 34 +#serial 37 m4_define(_AX_CODE_COVERAGE_RULES,[ AX_ADD_AM_MACRO_STATIC([ @@ -144,7 +144,7 @@ code_coverage_v_lcov_cap_ = \$(code_coverage_v_lcov_cap_\$(AM_DEFAULT_VERBOSITY) code_coverage_v_lcov_cap_0 = @echo \" LCOV --capture\" \$(CODE_COVERAGE_OUTPUT_FILE); code_coverage_v_lcov_ign = \$(code_coverage_v_lcov_ign_\$(V)) code_coverage_v_lcov_ign_ = \$(code_coverage_v_lcov_ign_\$(AM_DEFAULT_VERBOSITY)) -code_coverage_v_lcov_ign_0 = @echo \" LCOV --remove /tmp/*\" \$(CODE_COVERAGE_IGNORE_PATTERN); +code_coverage_v_lcov_ign_0 = @echo \" LCOV --remove\" \"\$(CODE_COVERAGE_OUTPUT_FILE).tmp\" \$(CODE_COVERAGE_IGNORE_PATTERN); code_coverage_v_genhtml = \$(code_coverage_v_genhtml_\$(V)) code_coverage_v_genhtml_ = \$(code_coverage_v_genhtml_\$(AM_DEFAULT_VERBOSITY)) code_coverage_v_genhtml_0 = @echo \" GEN \" \"\$(CODE_COVERAGE_OUTPUT_DIRECTORY)\"; @@ -163,7 +163,7 @@ check-code-coverage: # Capture code coverage data code-coverage-capture: code-coverage-capture-hook \$(code_coverage_v_lcov_cap)\$(LCOV) \$(code_coverage_quiet) \$(addprefix --directory ,\$(CODE_COVERAGE_DIRECTORY)) --capture --output-file \"\$(CODE_COVERAGE_OUTPUT_FILE).tmp\" --test-name \"\$(call code_coverage_sanitize,\$(PACKAGE_NAME)-\$(PACKAGE_VERSION))\" --no-checksum --compat-libtool \$(CODE_COVERAGE_LCOV_SHOPTS) \$(CODE_COVERAGE_LCOV_OPTIONS) - \$(code_coverage_v_lcov_ign)\$(LCOV) \$(code_coverage_quiet) \$(addprefix --directory ,\$(CODE_COVERAGE_DIRECTORY)) --remove \"\$(CODE_COVERAGE_OUTPUT_FILE).tmp\" \"/tmp/*\" \$(CODE_COVERAGE_IGNORE_PATTERN) --output-file \"\$(CODE_COVERAGE_OUTPUT_FILE)\" \$(CODE_COVERAGE_LCOV_SHOPTS) \$(CODE_COVERAGE_LCOV_RMOPTS) + \$(code_coverage_v_lcov_ign)\$(LCOV) \$(code_coverage_quiet) \$(addprefix --directory ,\$(CODE_COVERAGE_DIRECTORY)) --remove \"\$(CODE_COVERAGE_OUTPUT_FILE).tmp\" \$(CODE_COVERAGE_IGNORE_PATTERN) 
--output-file \"\$(CODE_COVERAGE_OUTPUT_FILE)\" \$(CODE_COVERAGE_LCOV_SHOPTS) \$(CODE_COVERAGE_LCOV_RMOPTS) -@rm -f \"\$(CODE_COVERAGE_OUTPUT_FILE).tmp\" \$(code_coverage_v_genhtml)LANG=C \$(GENHTML) \$(code_coverage_quiet) \$(addprefix --prefix ,\$(CODE_COVERAGE_DIRECTORY)) --output-directory \"\$(CODE_COVERAGE_OUTPUT_DIRECTORY)\" --title \"\$(PACKAGE_NAME)-\$(PACKAGE_VERSION) Code Coverage\" --legend --show-details \"\$(CODE_COVERAGE_OUTPUT_FILE)\" \$(CODE_COVERAGE_GENHTML_OPTIONS) @echo \"file://\$(abs_builddir)/\$(CODE_COVERAGE_OUTPUT_DIRECTORY)/index.html\" @@ -206,14 +206,14 @@ code-coverage-capture-hook: ]) AC_DEFUN([_AX_CODE_COVERAGE_ENABLED],[ - AX_CHECK_GNU_MAKE([],AC_MSG_ERROR([not using GNU make that is needed for coverage])) + AX_CHECK_GNU_MAKE([],[AC_MSG_ERROR([not using GNU make that is needed for coverage])]) AC_REQUIRE([AX_ADD_AM_MACRO_STATIC]) # check for gcov AC_CHECK_TOOL([GCOV], [$_AX_CODE_COVERAGE_GCOV_PROG_WITH], [:]) AS_IF([test "X$GCOV" = "X:"], - AC_MSG_ERROR([gcov is needed to do coverage])) + [AC_MSG_ERROR([gcov is needed to do coverage])]) AC_SUBST([GCOV]) dnl Check if gcc is being used @@ -232,12 +232,13 @@ AC_DEFUN([_AX_CODE_COVERAGE_ENABLED],[ AC_MSG_ERROR([Could not find genhtml from the lcov package]) ]) + AC_CHECK_LIB([gcov], [_gcov_init], [CODE_COVERAGE_LIBS="-lgcov"], [CODE_COVERAGE_LIBS=""]) + dnl Build the code coverage flags dnl Define CODE_COVERAGE_LDFLAGS for backwards compatibility CODE_COVERAGE_CPPFLAGS="-DNDEBUG" CODE_COVERAGE_CFLAGS="-O0 -g -fprofile-arcs -ftest-coverage" CODE_COVERAGE_CXXFLAGS="-O0 -g -fprofile-arcs -ftest-coverage" - CODE_COVERAGE_LIBS="-lgcov" AC_SUBST([CODE_COVERAGE_CPPFLAGS]) AC_SUBST([CODE_COVERAGE_CFLAGS]) diff --git a/deps/cares/m4/ax_cxx_compile_stdcxx.m4 b/deps/cares/m4/ax_cxx_compile_stdcxx.m4 index 8edf5152ec7a91..fe6ae17e6c4d32 100644 --- a/deps/cares/m4/ax_cxx_compile_stdcxx.m4 +++ b/deps/cares/m4/ax_cxx_compile_stdcxx.m4 @@ -10,8 +10,8 @@ # # Check for baseline language coverage in the compiler for the specified # version of the C++ standard. If necessary, add switches to CXX and -# CXXCPP to enable support. VERSION may be '11', '14', '17', or '20' for -# the respective C++ standard version. +# CXXCPP to enable support. VERSION may be '11', '14', '17', '20', or +# '23' for the respective C++ standard version. # # The second argument, if specified, indicates whether you insist on an # extended mode (e.g. -std=gnu++11) or a strict conformance mode (e.g. @@ -36,14 +36,15 @@ # Copyright (c) 2016, 2018 Krzesimir Nowak # Copyright (c) 2019 Enji Cooper # Copyright (c) 2020 Jason Merrill -# Copyright (c) 2021 Jörn Heusipp +# Copyright (c) 2021, 2024 Jörn Heusipp +# Copyright (c) 2015, 2022, 2023, 2024 Olly Betts # # Copying and distribution of this file, with or without modification, are # permitted in any medium without royalty provided the copyright notice # and this notice are preserved. This file is offered as-is, without any # warranty. -#serial 18 +#serial 25 dnl This macro is based on the code from the AX_CXX_COMPILE_STDCXX_11 macro dnl (serial version number 13). 
@@ -53,6 +54,7 @@ AC_DEFUN([AX_CXX_COMPILE_STDCXX], [dnl [$1], [14], [ax_cxx_compile_alternatives="14 1y"], [$1], [17], [ax_cxx_compile_alternatives="17 1z"], [$1], [20], [ax_cxx_compile_alternatives="20"], + [$1], [23], [ax_cxx_compile_alternatives="23"], [m4_fatal([invalid first argument `$1' to AX_CXX_COMPILE_STDCXX])])dnl m4_if([$2], [], [], [$2], [ext], [], @@ -159,31 +161,41 @@ AC_DEFUN([AX_CXX_COMPILE_STDCXX], [dnl dnl Test body for checking C++11 support m4_define([_AX_CXX_COMPILE_STDCXX_testbody_11], - _AX_CXX_COMPILE_STDCXX_testbody_new_in_11 + [_AX_CXX_COMPILE_STDCXX_testbody_new_in_11] ) dnl Test body for checking C++14 support m4_define([_AX_CXX_COMPILE_STDCXX_testbody_14], - _AX_CXX_COMPILE_STDCXX_testbody_new_in_11 - _AX_CXX_COMPILE_STDCXX_testbody_new_in_14 + [_AX_CXX_COMPILE_STDCXX_testbody_new_in_11 + _AX_CXX_COMPILE_STDCXX_testbody_new_in_14] ) dnl Test body for checking C++17 support m4_define([_AX_CXX_COMPILE_STDCXX_testbody_17], - _AX_CXX_COMPILE_STDCXX_testbody_new_in_11 - _AX_CXX_COMPILE_STDCXX_testbody_new_in_14 - _AX_CXX_COMPILE_STDCXX_testbody_new_in_17 + [_AX_CXX_COMPILE_STDCXX_testbody_new_in_11 + _AX_CXX_COMPILE_STDCXX_testbody_new_in_14 + _AX_CXX_COMPILE_STDCXX_testbody_new_in_17] ) dnl Test body for checking C++20 support m4_define([_AX_CXX_COMPILE_STDCXX_testbody_20], - _AX_CXX_COMPILE_STDCXX_testbody_new_in_11 - _AX_CXX_COMPILE_STDCXX_testbody_new_in_14 - _AX_CXX_COMPILE_STDCXX_testbody_new_in_17 - _AX_CXX_COMPILE_STDCXX_testbody_new_in_20 + [_AX_CXX_COMPILE_STDCXX_testbody_new_in_11 + _AX_CXX_COMPILE_STDCXX_testbody_new_in_14 + _AX_CXX_COMPILE_STDCXX_testbody_new_in_17 + _AX_CXX_COMPILE_STDCXX_testbody_new_in_20] +) + +dnl Test body for checking C++23 support + +m4_define([_AX_CXX_COMPILE_STDCXX_testbody_23], + [_AX_CXX_COMPILE_STDCXX_testbody_new_in_11 + _AX_CXX_COMPILE_STDCXX_testbody_new_in_14 + _AX_CXX_COMPILE_STDCXX_testbody_new_in_17 + _AX_CXX_COMPILE_STDCXX_testbody_new_in_20 + _AX_CXX_COMPILE_STDCXX_testbody_new_in_23] ) @@ -201,7 +213,17 @@ m4_define([_AX_CXX_COMPILE_STDCXX_testbody_new_in_11], [[ // MSVC always sets __cplusplus to 199711L in older versions; newer versions // only set it correctly if /Zc:__cplusplus is specified as well as a // /std:c++NN switch: +// // https://devblogs.microsoft.com/cppblog/msvc-now-correctly-reports-__cplusplus/ +// +// The value __cplusplus ought to have is available in _MSVC_LANG since +// Visual Studio 2015 Update 3: +// +// https://learn.microsoft.com/en-us/cpp/preprocessor/predefined-macros +// +// This was also the first MSVC version to support C++14 so we can't use the +// value of either __cplusplus or _MSVC_LANG to quickly rule out MSVC having +// C++11 or C++14 support, but we can check _MSVC_LANG for C++17 and later. #elif __cplusplus < 201103L && !defined _MSC_VER #error "This is not a C++11 compiler" @@ -617,7 +639,7 @@ m4_define([_AX_CXX_COMPILE_STDCXX_testbody_new_in_17], [[ #error "This is not a C++ compiler" -#elif __cplusplus < 201703L && !defined _MSC_VER +#elif (defined _MSVC_LANG ? _MSVC_LANG : __cplusplus) < 201703L #error "This is not a C++17 compiler" @@ -983,7 +1005,7 @@ namespace cxx17 } // namespace cxx17 -#endif // __cplusplus < 201703L && !defined _MSC_VER +#endif // (defined _MSVC_LANG ? _MSVC_LANG : __cplusplus) < 201703L ]]) @@ -996,7 +1018,7 @@ m4_define([_AX_CXX_COMPILE_STDCXX_testbody_new_in_20], [[ #error "This is not a C++ compiler" -#elif __cplusplus < 202002L && !defined _MSC_VER +#elif (defined _MSVC_LANG ? 
_MSVC_LANG : __cplusplus) < 202002L #error "This is not a C++20 compiler" @@ -1013,6 +1035,36 @@ namespace cxx20 } // namespace cxx20 -#endif // __cplusplus < 202002L && !defined _MSC_VER +#endif // (defined _MSVC_LANG ? _MSVC_LANG : __cplusplus) < 202002L + +]]) + + +dnl Tests for new features in C++23 + +m4_define([_AX_CXX_COMPILE_STDCXX_testbody_new_in_23], [[ + +#ifndef __cplusplus + +#error "This is not a C++ compiler" + +#elif (defined _MSVC_LANG ? _MSVC_LANG : __cplusplus) < 202302L + +#error "This is not a C++23 compiler" + +#else + +#include + +namespace cxx23 +{ + +// As C++23 supports feature test macros in the standard, there is no +// immediate need to actually test for feature availability on the +// Autoconf side. + +} // namespace cxx23 + +#endif // (defined _MSVC_LANG ? _MSVC_LANG : __cplusplus) < 202302L ]]) diff --git a/deps/cares/src/Makefile.in b/deps/cares/src/Makefile.in index 0c3c0864d4460a..1f286880247aa1 100644 --- a/deps/cares/src/Makefile.in +++ b/deps/cares/src/Makefile.in @@ -89,7 +89,9 @@ build_triplet = @build@ host_triplet = @host@ subdir = src ACLOCAL_M4 = $(top_srcdir)/aclocal.m4 -am__aclocal_m4_deps = $(top_srcdir)/m4/ax_ac_append_to_file.m4 \ +am__aclocal_m4_deps = $(top_srcdir)/m4/ares_check_user_namespace.m4 \ + $(top_srcdir)/m4/ares_check_uts_namespace.m4 \ + $(top_srcdir)/m4/ax_ac_append_to_file.m4 \ $(top_srcdir)/m4/ax_ac_print_to_file.m4 \ $(top_srcdir)/m4/ax_add_am_macro_static.m4 \ $(top_srcdir)/m4/ax_am_macros_static.m4 \ @@ -99,8 +101,6 @@ am__aclocal_m4_deps = $(top_srcdir)/m4/ax_ac_append_to_file.m4 \ $(top_srcdir)/m4/ax_check_compile_flag.m4 \ $(top_srcdir)/m4/ax_check_gnu_make.m4 \ $(top_srcdir)/m4/ax_check_link_flag.m4 \ - $(top_srcdir)/m4/ax_check_user_namespace.m4 \ - $(top_srcdir)/m4/ax_check_uts_namespace.m4 \ $(top_srcdir)/m4/ax_code_coverage.m4 \ $(top_srcdir)/m4/ax_compiler_vendor.m4 \ $(top_srcdir)/m4/ax_cxx_compile_stdcxx.m4 \ diff --git a/deps/cares/src/lib/CMakeLists.txt b/deps/cares/src/lib/CMakeLists.txt index 9956fd625b2ad6..9d4e10924d0adb 100644 --- a/deps/cares/src/lib/CMakeLists.txt +++ b/deps/cares/src/lib/CMakeLists.txt @@ -92,11 +92,23 @@ IF (CARES_STATIC) SET_TARGET_PROPERTIES (${LIBNAME} PROPERTIES EXPORT_NAME cares${STATIC_SUFFIX} - OUTPUT_NAME cares${STATIC_SUFFIX} COMPILE_PDB_NAME cares${STATIC_SUFFIX} C_STANDARD 90 ) + # On Windows, the output name should have a static suffix since otherwise + # we would have conflicting output names (libcares.lib) for the link + # library. + # However on Unix-like systems, we typically have something like + # libcares.so for shared libraries and libcares.a for static + # libraries, so these don't conflict. 
+ # This behavior better emulates what happens with autotools builds + IF (WIN32) + SET_TARGET_PROPERTIES(${LIBNAME} PROPERTIES OUTPUT_NAME cares${STATIC_SUFFIX}) + ELSE () + SET_TARGET_PROPERTIES(${LIBNAME} PROPERTIES OUTPUT_NAME cares) + ENDIF() + IF (ANDROID) SET_TARGET_PROPERTIES (${LIBNAME} PROPERTIES C_STANDARD 99) ENDIF () diff --git a/deps/cares/src/lib/Makefile.in b/deps/cares/src/lib/Makefile.in index 4aff043b26a310..a45fc10b544755 100644 --- a/deps/cares/src/lib/Makefile.in +++ b/deps/cares/src/lib/Makefile.in @@ -15,7 +15,7 @@ @SET_MAKE@ # aminclude_static.am generated automatically by Autoconf -# from AX_AM_MACROS_STATIC on Sat Nov 9 17:40:37 UTC 2024 +# from AX_AM_MACROS_STATIC on Sat Dec 14 15:15:44 UTC 2024 # Copyright (C) The c-ares project and its contributors # SPDX-License-Identifier: MIT @@ -100,7 +100,9 @@ host_triplet = @host@ subdir = src/lib SUBDIRS = ACLOCAL_M4 = $(top_srcdir)/aclocal.m4 -am__aclocal_m4_deps = $(top_srcdir)/m4/ax_ac_append_to_file.m4 \ +am__aclocal_m4_deps = $(top_srcdir)/m4/ares_check_user_namespace.m4 \ + $(top_srcdir)/m4/ares_check_uts_namespace.m4 \ + $(top_srcdir)/m4/ax_ac_append_to_file.m4 \ $(top_srcdir)/m4/ax_ac_print_to_file.m4 \ $(top_srcdir)/m4/ax_add_am_macro_static.m4 \ $(top_srcdir)/m4/ax_am_macros_static.m4 \ @@ -110,8 +112,6 @@ am__aclocal_m4_deps = $(top_srcdir)/m4/ax_ac_append_to_file.m4 \ $(top_srcdir)/m4/ax_check_compile_flag.m4 \ $(top_srcdir)/m4/ax_check_gnu_make.m4 \ $(top_srcdir)/m4/ax_check_link_flag.m4 \ - $(top_srcdir)/m4/ax_check_user_namespace.m4 \ - $(top_srcdir)/m4/ax_check_uts_namespace.m4 \ $(top_srcdir)/m4/ax_code_coverage.m4 \ $(top_srcdir)/m4/ax_compiler_vendor.m4 \ $(top_srcdir)/m4/ax_cxx_compile_stdcxx.m4 \ @@ -629,7 +629,7 @@ libcares_la_CPPFLAGS_EXTRA = -DCARES_BUILDING_LIBRARY $(am__append_3) \ @CODE_COVERAGE_ENABLED_TRUE@code_coverage_v_lcov_cap_0 = @echo " LCOV --capture" $(CODE_COVERAGE_OUTPUT_FILE); @CODE_COVERAGE_ENABLED_TRUE@code_coverage_v_lcov_ign = $(code_coverage_v_lcov_ign_$(V)) @CODE_COVERAGE_ENABLED_TRUE@code_coverage_v_lcov_ign_ = $(code_coverage_v_lcov_ign_$(AM_DEFAULT_VERBOSITY)) -@CODE_COVERAGE_ENABLED_TRUE@code_coverage_v_lcov_ign_0 = @echo " LCOV --remove /tmp/*" $(CODE_COVERAGE_IGNORE_PATTERN); +@CODE_COVERAGE_ENABLED_TRUE@code_coverage_v_lcov_ign_0 = @echo " LCOV --remove" "$(CODE_COVERAGE_OUTPUT_FILE).tmp" $(CODE_COVERAGE_IGNORE_PATTERN); @CODE_COVERAGE_ENABLED_TRUE@code_coverage_v_genhtml = $(code_coverage_v_genhtml_$(V)) @CODE_COVERAGE_ENABLED_TRUE@code_coverage_v_genhtml_ = $(code_coverage_v_genhtml_$(AM_DEFAULT_VERBOSITY)) @CODE_COVERAGE_ENABLED_TRUE@code_coverage_v_genhtml_0 = @echo " GEN " "$(CODE_COVERAGE_OUTPUT_DIRECTORY)"; @@ -2328,7 +2328,7 @@ uninstall-am: uninstall-libLTLIBRARIES # Capture code coverage data @CODE_COVERAGE_ENABLED_TRUE@code-coverage-capture: code-coverage-capture-hook @CODE_COVERAGE_ENABLED_TRUE@ $(code_coverage_v_lcov_cap)$(LCOV) $(code_coverage_quiet) $(addprefix --directory ,$(CODE_COVERAGE_DIRECTORY)) --capture --output-file "$(CODE_COVERAGE_OUTPUT_FILE).tmp" --test-name "$(call code_coverage_sanitize,$(PACKAGE_NAME)-$(PACKAGE_VERSION))" --no-checksum --compat-libtool $(CODE_COVERAGE_LCOV_SHOPTS) $(CODE_COVERAGE_LCOV_OPTIONS) -@CODE_COVERAGE_ENABLED_TRUE@ $(code_coverage_v_lcov_ign)$(LCOV) $(code_coverage_quiet) $(addprefix --directory ,$(CODE_COVERAGE_DIRECTORY)) --remove "$(CODE_COVERAGE_OUTPUT_FILE).tmp" "/tmp/*" $(CODE_COVERAGE_IGNORE_PATTERN) --output-file "$(CODE_COVERAGE_OUTPUT_FILE)" $(CODE_COVERAGE_LCOV_SHOPTS) $(CODE_COVERAGE_LCOV_RMOPTS) 
+@CODE_COVERAGE_ENABLED_TRUE@ $(code_coverage_v_lcov_ign)$(LCOV) $(code_coverage_quiet) $(addprefix --directory ,$(CODE_COVERAGE_DIRECTORY)) --remove "$(CODE_COVERAGE_OUTPUT_FILE).tmp" $(CODE_COVERAGE_IGNORE_PATTERN) --output-file "$(CODE_COVERAGE_OUTPUT_FILE)" $(CODE_COVERAGE_LCOV_SHOPTS) $(CODE_COVERAGE_LCOV_RMOPTS) @CODE_COVERAGE_ENABLED_TRUE@ -@rm -f "$(CODE_COVERAGE_OUTPUT_FILE).tmp" @CODE_COVERAGE_ENABLED_TRUE@ $(code_coverage_v_genhtml)LANG=C $(GENHTML) $(code_coverage_quiet) $(addprefix --prefix ,$(CODE_COVERAGE_DIRECTORY)) --output-directory "$(CODE_COVERAGE_OUTPUT_DIRECTORY)" --title "$(PACKAGE_NAME)-$(PACKAGE_VERSION) Code Coverage" --legend --show-details "$(CODE_COVERAGE_OUTPUT_FILE)" $(CODE_COVERAGE_GENHTML_OPTIONS) @CODE_COVERAGE_ENABLED_TRUE@ @echo "file://$(abs_builddir)/$(CODE_COVERAGE_OUTPUT_DIRECTORY)/index.html" diff --git a/deps/cares/src/lib/ares_config.h.cmake b/deps/cares/src/lib/ares_config.h.cmake index 051b97f494fd32..51744fe143868c 100644 --- a/deps/cares/src/lib/ares_config.h.cmake +++ b/deps/cares/src/lib/ares_config.h.cmake @@ -257,6 +257,9 @@ /* Define to 1 if you have the header file. */ #cmakedefine HAVE_SIGNAL_H 1 +/* Define to 1 if you have the strnlen function. */ +#cmakedefine HAVE_STRNLEN 1 + /* Define to 1 if your struct sockaddr_in6 has sin6_scope_id. */ #cmakedefine HAVE_STRUCT_SOCKADDR_IN6_SIN6_SCOPE_ID 1 diff --git a/deps/cares/src/lib/ares_config.h.in b/deps/cares/src/lib/ares_config.h.in index d1f09d694db68e..a62e17089358aa 100644 --- a/deps/cares/src/lib/ares_config.h.in +++ b/deps/cares/src/lib/ares_config.h.in @@ -309,6 +309,9 @@ /* Define to 1 if you have `strnicmp` */ #undef HAVE_STRNICMP +/* Define to 1 if you have `strnlen` */ +#undef HAVE_STRNLEN + /* Define to 1 if the system has the type `struct addrinfo'. 
*/ #undef HAVE_STRUCT_ADDRINFO diff --git a/deps/cares/src/lib/ares_private.h b/deps/cares/src/lib/ares_private.h index ce8c3f2ddc2f6c..e6d44e8b8640f9 100644 --- a/deps/cares/src/lib/ares_private.h +++ b/deps/cares/src/lib/ares_private.h @@ -388,8 +388,23 @@ ares_status_t ares_sysconfig_set_options(ares_sysconfig_t *sysconfig, ares_status_t ares_init_by_environment(ares_sysconfig_t *sysconfig); + +typedef ares_status_t (*ares_sysconfig_line_cb_t)(const ares_channel_t *channel, + ares_sysconfig_t *sysconfig, + ares_buf_t *line); + +ares_status_t ares_sysconfig_parse_resolv_line(const ares_channel_t *channel, + ares_sysconfig_t *sysconfig, + ares_buf_t *line); + +ares_status_t ares_sysconfig_process_buf(const ares_channel_t *channel, + ares_sysconfig_t *sysconfig, + ares_buf_t *buf, + ares_sysconfig_line_cb_t cb); + ares_status_t ares_init_sysconfig_files(const ares_channel_t *channel, - ares_sysconfig_t *sysconfig); + ares_sysconfig_t *sysconfig, + ares_bool_t process_resolvconf); #ifdef __APPLE__ ares_status_t ares_init_sysconfig_macos(const ares_channel_t *channel, ares_sysconfig_t *sysconfig); diff --git a/deps/cares/src/lib/ares_set_socket_functions.c b/deps/cares/src/lib/ares_set_socket_functions.c index 143c491174fdba..7216ffa933fc07 100644 --- a/deps/cares/src/lib/ares_set_socket_functions.c +++ b/deps/cares/src/lib/ares_set_socket_functions.c @@ -288,7 +288,9 @@ static int default_asetsockopt(ares_socket_t sock, ares_socket_opt_t opt, return setsockopt(sock, SOL_SOCKET, SO_RCVBUF, val, val_size); case ARES_SOCKET_OPT_BIND_DEVICE: - if (!ares_str_isprint(val, (size_t)val_size)) { + /* Count the number of characters before NULL terminator then + * validate those are all printable */ + if (!ares_str_isprint(val, ares_strnlen(val, (size_t)val_size))) { SET_SOCKERRNO(EINVAL); return -1; } diff --git a/deps/cares/src/lib/ares_socket.c b/deps/cares/src/lib/ares_socket.c index df02fd61b60b14..516852a84abfb8 100644 --- a/deps/cares/src/lib/ares_socket.c +++ b/deps/cares/src/lib/ares_socket.c @@ -263,7 +263,8 @@ ares_status_t ares_socket_configure(ares_channel_t *channel, int family, * compatibility */ (void)channel->sock_funcs.asetsockopt( fd, ARES_SOCKET_OPT_BIND_DEVICE, channel->local_dev_name, - sizeof(channel->local_dev_name), channel->sock_func_cb_data); + (ares_socklen_t)ares_strlen(channel->local_dev_name), + channel->sock_func_cb_data); } /* Bind to ip address if configured */ diff --git a/deps/cares/src/lib/ares_sysconfig.c b/deps/cares/src/lib/ares_sysconfig.c index 9f0d7e5061ffe0..286db60328f45b 100644 --- a/deps/cares/src/lib/ares_sysconfig.c +++ b/deps/cares/src/lib/ares_sysconfig.c @@ -260,6 +260,94 @@ static ares_status_t ares_init_sysconfig_android(const ares_channel_t *channel, } #endif +#if defined(__QNX__) +static ares_status_t + ares_init_sysconfig_qnx(const ares_channel_t *channel, + ares_sysconfig_t *sysconfig) +{ + /* QNX: + * 1. use confstr(_CS_RESOLVE, ...) as primary resolv.conf data, replacing + * "_" with " ". If that is empty, then do normal /etc/resolv.conf + * processing. + * 2. We want to process /etc/nsswitch.conf as normal. + * 3. if confstr(_CS_DOMAIN, ...) this is the domain name. Use this as + * preference over anything else found. + */ + ares_buf_t *buf = ares_buf_create(); + unsigned char *data = NULL; + size_t data_size = 0; + ares_bool_t process_resolvconf = ARES_TRUE; + ares_status_t status = ARES_SUCCESS; + + /* Prefer confstr(_CS_RESOLVE, ...) 
*/ + buf = ares_buf_create(); + if (buf == NULL) { + status = ARES_ENOMEM; + goto done; + } + + data_size = 1024; + data = ares_buf_append_start(buf, &data_size); + if (data == NULL) { + status = ARES_ENOMEM; + goto done; + } + + data_size = confstr(_CS_RESOLVE, (char *)data, data_size); + if (data_size > 1) { + /* confstr returns byte for NULL terminator, strip */ + data_size--; + + ares_buf_append_finish(buf, data_size); + /* Its odd, this uses _ instead of " " between keywords, otherwise the + * format is the same as resolv.conf, replace. */ + ares_buf_replace(buf, (const unsigned char *)"_", 1, + (const unsigned char *)" ", 1); + + status = ares_sysconfig_process_buf(channel, sysconfig, buf, + ares_sysconfig_parse_resolv_line); + if (status != ARES_SUCCESS) { + /* ENOMEM is really the only error we'll get here */ + goto done; + } + + /* don't read resolv.conf if we processed *any* nameservers */ + if (ares_llist_len(sysconfig->sconfig) != 0) { + process_resolvconf = ARES_FALSE; + } + } + + /* Process files */ + status = ares_init_sysconfig_files(channel, sysconfig, process_resolvconf); + if (status != ARES_SUCCESS) { + goto done; + } + + /* Read confstr(_CS_DOMAIN, ...), but if we had a search path specified with + * more than one domain, lets prefer that instead. Its not exactly clear + * the best way to handle this. */ + if (sysconfig->ndomains <= 1) { + char domain[256]; + size_t domain_len; + + domain_len = confstr(_CS_DOMAIN, domain, sizeof(domain_len)); + if (domain_len != 0) { + ares_strsplit_free(sysconfig->domains, sysconfig->ndomains); + sysconfig->domains = ares_strsplit(domain, ", ", &sysconfig->ndomains); + if (sysconfig->domains == NULL) { + status = ARES_ENOMEM; + goto done; + } + } + } + +done: + ares_buf_destroy(buf); + + return status; +} +#endif + #if defined(CARES_USE_LIBRESOLV) static ares_status_t ares_init_sysconfig_libresolv(const ares_channel_t *channel, @@ -516,8 +604,10 @@ ares_status_t ares_init_by_sysconfig(ares_channel_t *channel) status = ares_init_sysconfig_macos(channel, &sysconfig); #elif defined(CARES_USE_LIBRESOLV) status = ares_init_sysconfig_libresolv(channel, &sysconfig); +#elif defined(__QNX__) + status = ares_init_sysconfig_qnx(channel, &sysconfig); #else - status = ares_init_sysconfig_files(channel, &sysconfig); + status = ares_init_sysconfig_files(channel, &sysconfig, ARES_TRUE); #endif if (status != ARES_SUCCESS) { diff --git a/deps/cares/src/lib/ares_sysconfig_files.c b/deps/cares/src/lib/ares_sysconfig_files.c index 49bc330d9d346d..a6c2a8e62bb34f 100644 --- a/deps/cares/src/lib/ares_sysconfig_files.c +++ b/deps/cares/src/lib/ares_sysconfig_files.c @@ -549,9 +549,9 @@ ares_status_t ares_init_by_environment(ares_sysconfig_t *sysconfig) /* This function will only return ARES_SUCCESS or ARES_ENOMEM. Any other * conditions are ignored. Users may mess up config files, but we want to * process anything we can. 
*/ -static ares_status_t parse_resolvconf_line(const ares_channel_t *channel, - ares_sysconfig_t *sysconfig, - ares_buf_t *line) +ares_status_t ares_sysconfig_parse_resolv_line(const ares_channel_t *channel, + ares_sysconfig_t *sysconfig, + ares_buf_t *line) { char option[32]; char value[512]; @@ -726,9 +726,38 @@ static ares_status_t parse_svcconf_line(const ares_channel_t *channel, return status; } -typedef ares_status_t (*line_callback_t)(const ares_channel_t *channel, - ares_sysconfig_t *sysconfig, - ares_buf_t *line); + +ares_status_t ares_sysconfig_process_buf(const ares_channel_t *channel, + ares_sysconfig_t *sysconfig, + ares_buf_t *buf, + ares_sysconfig_line_cb_t cb) +{ + ares_array_t *lines = NULL; + size_t num; + size_t i; + ares_status_t status; + + status = ares_buf_split(buf, (const unsigned char *)"\n", 1, + ARES_BUF_SPLIT_TRIM, 0, &lines); + if (status != ARES_SUCCESS) { + goto done; + } + + num = ares_array_len(lines); + for (i = 0; i < num; i++) { + ares_buf_t **bufptr = ares_array_at(lines, i); + ares_buf_t *line = *bufptr; + + status = cb(channel, sysconfig, line); + if (status != ARES_SUCCESS) { + goto done; + } + } + +done: + ares_array_destroy(lines); + return status; +} /* Should only return: * ARES_ENOTFOUND - file not found @@ -737,16 +766,13 @@ typedef ares_status_t (*line_callback_t)(const ares_channel_t *channel, * ARES_SUCCESS - file processed, doesn't necessarily mean it was a good * file, but we're not erroring out if we can't parse * something (or anything at all) */ -static ares_status_t process_config_lines(const ares_channel_t *channel, - const char *filename, - ares_sysconfig_t *sysconfig, - line_callback_t cb) +static ares_status_t process_config_lines(const ares_channel_t *channel, + const char *filename, + ares_sysconfig_t *sysconfig, + ares_sysconfig_line_cb_t cb) { ares_status_t status = ARES_SUCCESS; - ares_array_t *lines = NULL; ares_buf_t *buf = NULL; - size_t num; - size_t i; buf = ares_buf_create(); if (buf == NULL) { @@ -759,43 +785,30 @@ static ares_status_t process_config_lines(const ares_channel_t *channel, goto done; } - status = ares_buf_split(buf, (const unsigned char *)"\n", 1, - ARES_BUF_SPLIT_TRIM, 0, &lines); - if (status != ARES_SUCCESS) { - goto done; - } - - num = ares_array_len(lines); - for (i = 0; i < num; i++) { - ares_buf_t **bufptr = ares_array_at(lines, i); - ares_buf_t *line = *bufptr; - - status = cb(channel, sysconfig, line); - if (status != ARES_SUCCESS) { - goto done; - } - } + status = ares_sysconfig_process_buf(channel, sysconfig, buf, cb); done: ares_buf_destroy(buf); - ares_array_destroy(lines); return status; } ares_status_t ares_init_sysconfig_files(const ares_channel_t *channel, - ares_sysconfig_t *sysconfig) + ares_sysconfig_t *sysconfig, + ares_bool_t process_resolvconf) { ares_status_t status = ARES_SUCCESS; /* Resolv.conf */ - status = process_config_lines(channel, - (channel->resolvconf_path != NULL) - ? channel->resolvconf_path - : PATH_RESOLV_CONF, - sysconfig, parse_resolvconf_line); - if (status != ARES_SUCCESS && status != ARES_ENOTFOUND) { - goto done; + if (process_resolvconf) { + status = process_config_lines(channel, + (channel->resolvconf_path != NULL) + ? 
channel->resolvconf_path + : PATH_RESOLV_CONF, + sysconfig, ares_sysconfig_parse_resolv_line); + if (status != ARES_SUCCESS && status != ARES_ENOTFOUND) { + goto done; + } } /* Nsswitch.conf */ diff --git a/deps/cares/src/lib/event/ares_event_configchg.c b/deps/cares/src/lib/event/ares_event_configchg.c index e3e665bd165523..5ecc6888ab719f 100644 --- a/deps/cares/src/lib/event/ares_event_configchg.c +++ b/deps/cares/src/lib/event/ares_event_configchg.c @@ -558,14 +558,24 @@ static ares_status_t config_change_check(ares_htable_strvp_t *filestat, const char *resolvconf_path) { size_t i; - const char *configfiles[5]; + const char *configfiles[16]; ares_bool_t changed = ARES_FALSE; + size_t cnt = 0; - configfiles[0] = resolvconf_path; - configfiles[1] = "/etc/nsswitch.conf"; - configfiles[2] = "/etc/netsvc.conf"; - configfiles[3] = "/etc/svc.conf"; - configfiles[4] = NULL; + memset(configfiles, 0, sizeof(configfiles)); + + configfiles[cnt++] = resolvconf_path; + configfiles[cnt++] = "/etc/nsswitch.conf"; +#ifdef _AIX + configfiles[cnt++] = "/etc/netsvc.conf"; +#endif +#ifdef __osf /* Tru64 */ + configfiles[cnt++] = "/etc/svc.conf"; +#endif +#ifdef __QNX__ + configfiles[cnt++] = "/etc/net.cfg"; +#endif + configfiles[cnt++] = NULL; for (i = 0; configfiles[i] != NULL; i++) { fileinfo_t *fi = ares_htable_strvp_get_direct(filestat, configfiles[i]); diff --git a/deps/cares/src/lib/include/ares_buf.h b/deps/cares/src/lib/include/ares_buf.h index 7836a313e066d1..10d29eaf83bd8e 100644 --- a/deps/cares/src/lib/include/ares_buf.h +++ b/deps/cares/src/lib/include/ares_buf.h @@ -219,6 +219,26 @@ CARES_EXTERN unsigned char *ares_buf_finish_bin(ares_buf_t *buf, size_t *len); */ CARES_EXTERN char *ares_buf_finish_str(ares_buf_t *buf, size_t *len); +/*! Replace the given search byte sequence with the replacement byte sequence. + * This is only valid for allocated buffers, not const buffers. Will replace + * all byte sequences starting at the current offset to the end of the buffer. + * + * \param[in] buf Initialized buffer object. Can not be a "const" buffer. + * \param[in] srch Search byte sequence, must not be NULL. + * \param[in] srch_size Size of byte sequence, must not be zero. + * \param[in] rplc Byte sequence to use as replacement. May be NULL if + * rplc_size is zero. + * \param[in] rplc_size Size of replacement byte sequence, may be 0. + * \return ARES_SUCCESS on success, otherwise on may return failure only on + * memory allocation failure or misuse. Will not return indication + * if any replacements occurred + */ +CARES_EXTERN ares_status_t ares_buf_replace(ares_buf_t *buf, + const unsigned char *srch, + size_t srch_size, + const unsigned char *rplc, + size_t rplc_size); + /*! Tag a position to save in the buffer in case parsing needs to rollback, * such as if insufficient data is available, but more data may be added in * the future. Only a single tag can be set per buffer object. Setting a diff --git a/deps/cares/src/lib/include/ares_str.h b/deps/cares/src/lib/include/ares_str.h index ea75b3b3e7441d..4ee339510bf026 100644 --- a/deps/cares/src/lib/include/ares_str.h +++ b/deps/cares/src/lib/include/ares_str.h @@ -29,6 +29,20 @@ CARES_EXTERN char *ares_strdup(const char *s1); +/*! Scan up to maxlen bytes for the first NULL character and return + * its index, or maxlen if not found. The function only returns + * maxlen if the first maxlen bytes were not NULL characters; it + * makes no guarantee for what \c str[maxlen] (if defined) is, and + * does not access it. 
It is behaving like the POSIX \c strnlen() + * function, except that it returns 0 if the \p str pointer is \c + * NULL. + * + * \param[in] str The string to scan for the NULL character + * \param[in] maxlen The maximum number of bytes to scan + * \return Index of first NULL byte. Between 0 and maxlen (inclusive). + */ +CARES_EXTERN size_t ares_strnlen(const char *str, size_t maxlen); + CARES_EXTERN size_t ares_strlen(const char *str); /*! Copy string from source to destination with destination buffer size diff --git a/deps/cares/src/lib/record/ares_dns_multistring.c b/deps/cares/src/lib/record/ares_dns_multistring.c index 57c0d1c0a803ec..44fcaccd65bb6a 100644 --- a/deps/cares/src/lib/record/ares_dns_multistring.c +++ b/deps/cares/src/lib/record/ares_dns_multistring.c @@ -146,6 +146,18 @@ ares_status_t ares_dns_multistring_add_own(ares_dns_multistring_t *strs, return status; } + /* Issue #921, ares_dns_multistring_get() doesn't have a way to indicate + * success or fail on a zero-length string which is actually valid. So we + * are going to allocate a 1-byte buffer to use as a placeholder in this + * case */ + if (str == NULL) { + str = ares_malloc_zero(1); + if (str == NULL) { + ares_array_remove_last(strs->strs); + return ARES_ENOMEM; + } + } + data->data = str; data->len = len; @@ -252,36 +264,38 @@ ares_status_t ares_dns_multistring_parse_buf(ares_buf_t *buf, break; /* LCOV_EXCL_LINE: DefensiveCoding */ } - if (len) { - /* When used by the _str() parser, it really needs to be validated to - * be a valid printable ascii string. Do that here */ - if (validate_printable && ares_buf_len(buf) >= len) { - size_t mylen; - const char *data = (const char *)ares_buf_peek(buf, &mylen); - if (!ares_str_isprint(data, len)) { - status = ARES_EBADSTR; - break; - } + + /* When used by the _str() parser, it really needs to be validated to + * be a valid printable ascii string. 
Do that here */ + if (len && validate_printable && ares_buf_len(buf) >= len) { + size_t mylen; + const char *data = (const char *)ares_buf_peek(buf, &mylen); + if (!ares_str_isprint(data, len)) { + status = ARES_EBADSTR; + break; } + } - if (strs != NULL) { - unsigned char *data = NULL; + if (strs != NULL) { + unsigned char *data = NULL; + if (len) { status = ares_buf_fetch_bytes_dup(buf, len, ARES_TRUE, &data); if (status != ARES_SUCCESS) { break; } - status = ares_dns_multistring_add_own(*strs, data, len); - if (status != ARES_SUCCESS) { - ares_free(data); - break; - } - } else { - status = ares_buf_consume(buf, len); - if (status != ARES_SUCCESS) { - break; - } + } + status = ares_dns_multistring_add_own(*strs, data, len); + if (status != ARES_SUCCESS) { + ares_free(data); + break; + } + } else { + status = ares_buf_consume(buf, len); + if (status != ARES_SUCCESS) { + break; } } + } if (status != ARES_SUCCESS && strs != NULL) { diff --git a/deps/cares/src/lib/str/ares_buf.c b/deps/cares/src/lib/str/ares_buf.c index 69e6b38aac849e..63acc6cf7714d3 100644 --- a/deps/cares/src/lib/str/ares_buf.c +++ b/deps/cares/src/lib/str/ares_buf.c @@ -1104,6 +1104,72 @@ const unsigned char *ares_buf_peek(const ares_buf_t *buf, size_t *len) return ares_buf_fetch(buf, len); } +ares_status_t ares_buf_replace(ares_buf_t *buf, const unsigned char *srch, + size_t srch_size, const unsigned char *rplc, + size_t rplc_size) +{ + size_t processed_len = 0; + ares_status_t status; + + if (buf->alloc_buf == NULL || srch == NULL || srch_size == 0 || + (rplc == NULL && rplc_size != 0)) { + return ARES_EFORMERR; + } + + while (1) { + unsigned char *ptr = buf->alloc_buf + buf->offset + processed_len; + size_t remaining_len = buf->data_len - buf->offset - processed_len; + size_t found_offset = 0; + size_t move_data_len; + + /* Find pattern */ + ptr = ares_memmem(ptr, remaining_len, srch, srch_size); + if (ptr == NULL) { + break; + } + + /* Store the offset this was found because our actual pointer might be + * switched out from under us by the call to ensure_space() if the + * replacement pattern is larger than the search pattern */ + found_offset = (size_t)(ptr - (size_t)(buf->alloc_buf + buf->offset)); + if (rplc_size > srch_size) { + status = ares_buf_ensure_space(buf, rplc_size - srch_size); + if (status != ARES_SUCCESS) { + return status; + } + } + + /* Impossible, but silence clang */ + if (buf->alloc_buf == NULL) { + return ARES_ENOMEM; + } + + /* Recalculate actual pointer */ + ptr = buf->alloc_buf + buf->offset + found_offset; + + /* Move the data */ + move_data_len = buf->data_len - buf->offset - found_offset - srch_size; + memmove(ptr + rplc_size, + ptr + srch_size, + move_data_len); + + /* Copy in the replacement data */ + if (rplc != NULL && rplc_size > 0) { + memcpy(ptr, rplc, rplc_size); + } + + if (rplc_size > srch_size) { + buf->data_len += rplc_size - srch_size; + } else { + buf->data_len -= srch_size - rplc_size; + } + + processed_len = found_offset + rplc_size; + } + + return ARES_SUCCESS; +} + ares_status_t ares_buf_peek_byte(const ares_buf_t *buf, unsigned char *b) { size_t remaining_len = 0; diff --git a/deps/cares/src/lib/str/ares_str.c b/deps/cares/src/lib/str/ares_str.c index f6bfabf11f4467..0eda1ab9f15783 100644 --- a/deps/cares/src/lib/str/ares_str.c +++ b/deps/cares/src/lib/str/ares_str.c @@ -32,6 +32,23 @@ # include #endif +size_t ares_strnlen(const char *str, size_t maxlen) { + const char *p = NULL; + if (str == NULL) { + return 0; + } +#ifdef HAVE_STRNLEN + (void)p; + return strnlen(str, 
maxlen); +#else + if ((p = memchr(str, 0, maxlen)) == NULL) { + return maxlen; + } else { + return (size_t)(p - str); + } +#endif /* HAVE_STRNLEN */ +} + size_t ares_strlen(const char *str) { if (str == NULL) { diff --git a/deps/cares/src/tools/Makefile.in b/deps/cares/src/tools/Makefile.in index 9a96a74fa6957d..19e99a253378c7 100644 --- a/deps/cares/src/tools/Makefile.in +++ b/deps/cares/src/tools/Makefile.in @@ -91,7 +91,9 @@ host_triplet = @host@ noinst_PROGRAMS = $(am__EXEEXT_1) subdir = src/tools ACLOCAL_M4 = $(top_srcdir)/aclocal.m4 -am__aclocal_m4_deps = $(top_srcdir)/m4/ax_ac_append_to_file.m4 \ +am__aclocal_m4_deps = $(top_srcdir)/m4/ares_check_user_namespace.m4 \ + $(top_srcdir)/m4/ares_check_uts_namespace.m4 \ + $(top_srcdir)/m4/ax_ac_append_to_file.m4 \ $(top_srcdir)/m4/ax_ac_print_to_file.m4 \ $(top_srcdir)/m4/ax_add_am_macro_static.m4 \ $(top_srcdir)/m4/ax_am_macros_static.m4 \ @@ -101,8 +103,6 @@ am__aclocal_m4_deps = $(top_srcdir)/m4/ax_ac_append_to_file.m4 \ $(top_srcdir)/m4/ax_check_compile_flag.m4 \ $(top_srcdir)/m4/ax_check_gnu_make.m4 \ $(top_srcdir)/m4/ax_check_link_flag.m4 \ - $(top_srcdir)/m4/ax_check_user_namespace.m4 \ - $(top_srcdir)/m4/ax_check_uts_namespace.m4 \ $(top_srcdir)/m4/ax_code_coverage.m4 \ $(top_srcdir)/m4/ax_compiler_vendor.m4 \ $(top_srcdir)/m4/ax_cxx_compile_stdcxx.m4 \ From 78743b15337b27e36387e8eba3448ea5e7d23331 Mon Sep 17 00:00:00 2001 From: Mert Can Altin Date: Tue, 17 Dec 2024 13:03:09 +0300 Subject: [PATCH 75/88] tools: add REPLACEME check to workflow PR-URL: https://github.com/nodejs/node/pull/56251 Reviewed-By: Antoine du Hamel Reviewed-By: Yagiz Nizipli --- .github/workflows/lint-release-proposal.yml | 3 +++ 1 file changed, 3 insertions(+) diff --git a/.github/workflows/lint-release-proposal.yml b/.github/workflows/lint-release-proposal.yml index 5f0f9a87329b17..1ea2b4b1b173e2 100644 --- a/.github/workflows/lint-release-proposal.yml +++ b/.github/workflows/lint-release-proposal.yml @@ -57,3 +57,6 @@ jobs: - name: Verify NODE_VERSION_IS_RELEASE bit is correctly set run: | grep -q '^#define NODE_VERSION_IS_RELEASE 1$' src/node_version.h + - name: Check for placeholders in documentation + run: | + ! 
grep "REPLACEME" doc/api/*.md From ea9a675f56a2ac08bcda70dd20b6c0475f54438b Mon Sep 17 00:00:00 2001 From: Pietro Marchini Date: Tue, 17 Dec 2024 12:10:17 +0100 Subject: [PATCH 76/88] test_runner: exclude test files from coverage by default PR-URL: https://github.com/nodejs/node/pull/56060 Reviewed-By: Colin Ihrig Reviewed-By: Matteo Collina Reviewed-By: Moshe Atlow --- doc/api/cli.md | 3 + doc/api/test.md | 7 +- lib/internal/fs/glob.js | 23 ++++ lib/internal/test_runner/coverage.js | 17 ++- lib/internal/test_runner/utils.js | 5 + lib/path.js | 27 +--- .../coverage-default-exclusion/file-test.js | 7 ++ .../coverage-default-exclusion/file.test.mjs | 7 ++ .../coverage-default-exclusion/file.test.ts | 7 ++ .../coverage-default-exclusion/logic-file.js | 9 ++ .../coverage-default-exclusion/test.cjs | 7 ++ .../test/not-matching-test-name.js | 7 ++ .../test-runner/output/lcov_reporter.js | 11 +- ...test-runner-coverage-default-exclusion.mjs | 116 ++++++++++++++++++ .../test-runner-coverage-source-map.js | 6 +- .../test-runner-coverage-thresholds.js | 6 + test/parallel/test-runner-coverage.js | 89 +++++++++++--- test/parallel/test-runner-output.mjs | 29 +++-- 18 files changed, 328 insertions(+), 55 deletions(-) create mode 100644 test/fixtures/test-runner/coverage-default-exclusion/file-test.js create mode 100644 test/fixtures/test-runner/coverage-default-exclusion/file.test.mjs create mode 100644 test/fixtures/test-runner/coverage-default-exclusion/file.test.ts create mode 100644 test/fixtures/test-runner/coverage-default-exclusion/logic-file.js create mode 100644 test/fixtures/test-runner/coverage-default-exclusion/test.cjs create mode 100644 test/fixtures/test-runner/coverage-default-exclusion/test/not-matching-test-name.js create mode 100644 test/parallel/test-runner-coverage-default-exclusion.mjs diff --git a/doc/api/cli.md b/doc/api/cli.md index faade390886e9b..18c16aba1e7604 100644 --- a/doc/api/cli.md +++ b/doc/api/cli.md @@ -2267,6 +2267,9 @@ This option may be specified multiple times to exclude multiple glob patterns. If both `--test-coverage-exclude` and `--test-coverage-include` are provided, files must meet **both** criteria to be included in the coverage report. +By default all the matching test files are excluded from the coverage report. +Specifying this option will override the default behavior. + ### `--test-coverage-functions=threshold` + +* {Object} + +An object containing commonly used constants for SQLite operations. + +### SQLite constants -### SQLite Session constants +The following constants are exported by the `sqlite.constants` object. #### Conflict-resolution constants @@ -497,7 +505,7 @@ The following constants are meant for use with [`database.applyChangeset()`](#da SQLITE_CHANGESET_ABORT - Abort when a change encounters a conflict and roll back databsase. + Abort when a change encounters a conflict and roll back database. 
diff --git a/src/node_sqlite.cc b/src/node_sqlite.cc index 1238643b764415..7f5e2f89ce9dba 100644 --- a/src/node_sqlite.cc +++ b/src/node_sqlite.cc @@ -1658,6 +1658,12 @@ void Session::Delete() { session_ = nullptr; } +void DefineConstants(Local target) { + NODE_DEFINE_CONSTANT(target, SQLITE_CHANGESET_OMIT); + NODE_DEFINE_CONSTANT(target, SQLITE_CHANGESET_REPLACE); + NODE_DEFINE_CONSTANT(target, SQLITE_CHANGESET_ABORT); +} + static void Initialize(Local target, Local unused, Local context, @@ -1668,6 +1674,9 @@ static void Initialize(Local target, NewFunctionTemplate(isolate, DatabaseSync::New); db_tmpl->InstanceTemplate()->SetInternalFieldCount( DatabaseSync::kInternalFieldCount); + Local constants = Object::New(isolate); + + DefineConstants(constants); SetProtoMethod(isolate, db_tmpl, "open", DatabaseSync::Open); SetProtoMethod(isolate, db_tmpl, "close", DatabaseSync::Close); @@ -1690,9 +1699,7 @@ static void Initialize(Local target, "StatementSync", StatementSync::GetConstructorTemplate(env)); - NODE_DEFINE_CONSTANT(target, SQLITE_CHANGESET_OMIT); - NODE_DEFINE_CONSTANT(target, SQLITE_CHANGESET_REPLACE); - NODE_DEFINE_CONSTANT(target, SQLITE_CHANGESET_ABORT); + target->Set(context, OneByteString(isolate, "constants"), constants).Check(); } } // namespace sqlite diff --git a/test/parallel/test-sqlite-session.js b/test/parallel/test-sqlite-session.js index 306f439939e2e0..617c0c2aa71181 100644 --- a/test/parallel/test-sqlite-session.js +++ b/test/parallel/test-sqlite-session.js @@ -3,9 +3,7 @@ require('../common'); const { DatabaseSync, - SQLITE_CHANGESET_OMIT, - SQLITE_CHANGESET_REPLACE, - SQLITE_CHANGESET_ABORT + constants, } = require('node:sqlite'); const { test, suite } = require('node:test'); @@ -165,7 +163,7 @@ suite('conflict resolution', () => { test('database.applyChangeset() - conflict with SQLITE_CHANGESET_ABORT', (t) => { const { database2, changeset } = prepareConflict(); const result = database2.applyChangeset(changeset, { - onConflict: SQLITE_CHANGESET_ABORT + onConflict: constants.SQLITE_CHANGESET_ABORT }); // When changeset is aborted due to a conflict, applyChangeset should return false t.assert.strictEqual(result, false); @@ -177,7 +175,7 @@ suite('conflict resolution', () => { test('database.applyChangeset() - conflict with SQLITE_CHANGESET_REPLACE', (t) => { const { database2, changeset } = prepareConflict(); const result = database2.applyChangeset(changeset, { - onConflict: SQLITE_CHANGESET_REPLACE + onConflict: constants.SQLITE_CHANGESET_REPLACE }); // Not aborted due to conflict, so should return true t.assert.strictEqual(result, true); @@ -189,7 +187,7 @@ suite('conflict resolution', () => { test('database.applyChangeset() - conflict with SQLITE_CHANGESET_OMIT', (t) => { const { database2, changeset } = prepareConflict(); const result = database2.applyChangeset(changeset, { - onConflict: SQLITE_CHANGESET_OMIT + onConflict: constants.SQLITE_CHANGESET_OMIT }); // Not aborted due to conflict, so should return true t.assert.strictEqual(result, true); @@ -199,12 +197,6 @@ suite('conflict resolution', () => { }); }); -test('session related constants are defined', (t) => { - t.assert.strictEqual(SQLITE_CHANGESET_OMIT, 0); - t.assert.strictEqual(SQLITE_CHANGESET_REPLACE, 1); - t.assert.strictEqual(SQLITE_CHANGESET_ABORT, 2); -}); - test('database.createSession() - filter changes', (t) => { const database1 = new DatabaseSync(':memory:'); const database2 = new DatabaseSync(':memory:'); diff --git a/test/parallel/test-sqlite.js b/test/parallel/test-sqlite.js index 
825e44fb2965f7..87162526ffadcd 100644 --- a/test/parallel/test-sqlite.js +++ b/test/parallel/test-sqlite.js @@ -2,7 +2,7 @@ const { spawnPromisified } = require('../common'); const tmpdir = require('../common/tmpdir'); const { join } = require('node:path'); -const { DatabaseSync } = require('node:sqlite'); +const { DatabaseSync, constants } = require('node:sqlite'); const { suite, test } = require('node:test'); let cnt = 0; @@ -85,6 +85,12 @@ test('in-memory databases are supported', (t) => { ); }); +test('sqlite constants are defined', (t) => { + t.assert.strictEqual(constants.SQLITE_CHANGESET_OMIT, 0); + t.assert.strictEqual(constants.SQLITE_CHANGESET_REPLACE, 1); + t.assert.strictEqual(constants.SQLITE_CHANGESET_ABORT, 2); +}); + test('PRAGMAs are supported', (t) => { const db = new DatabaseSync(nextDb()); t.after(() => { db.close(); }); diff --git a/typings/internalBinding/constants.d.ts b/typings/internalBinding/constants.d.ts index 89d2a53aae2118..dc4657080ba54b 100644 --- a/typings/internalBinding/constants.d.ts +++ b/typings/internalBinding/constants.d.ts @@ -130,6 +130,11 @@ export interface ConstantsBinding { PRIORITY_HIGHEST: -20; }; }; + sqlite: { + SQLITE_CHANGESET_OMIT: 0; + SQLITE_CHANGESET_REPLACE: 1; + SQLITE_CHANGESET_ABORT: 2; + }; fs: { UV_FS_SYMLINK_DIR: 1; UV_FS_SYMLINK_JUNCTION: 2; From 72f79b44edaab34df6c7a1e3a05810d9e3300a5f Mon Sep 17 00:00:00 2001 From: Rafael Gonzaga Date: Tue, 17 Dec 2024 17:22:52 -0300 Subject: [PATCH 84/88] doc: stabilize util.styleText MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit PR-URL: https://github.com/nodejs/node/pull/56265 Reviewed-By: Yagiz Nizipli Reviewed-By: Marco Ippolito Reviewed-By: Adrian Estrada Reviewed-By: Vinícius Lourenço Claro Cardoso Reviewed-By: Moshe Atlow --- doc/api/util.md | 5 ++++- 1 file changed, 4 insertions(+), 1 deletion(-) diff --git a/doc/api/util.md b/doc/api/util.md index 03c1b621358fea..a3ea6c96dd83db 100644 --- a/doc/api/util.md +++ b/doc/api/util.md @@ -1915,13 +1915,16 @@ console.log(util.stripVTControlCharacters('\u001B[4mvalue\u001B[0m')); ## `util.styleText(format, text[, options])` -> Stability: 1.1 - Active development +> Stability: 2 - Stable. @@ -2630,7 +2630,7 @@ i.e. invoking `process.exit()`. Prints information about usage of [Loading ECMAScript modules using `require()`][]. diff --git a/doc/api/errors.md b/doc/api/errors.md index f2bfa7e087f4ec..d59a51329a8bfa 100644 --- a/doc/api/errors.md +++ b/doc/api/errors.md @@ -2177,7 +2177,7 @@ signaling a short circuit. ### `ERR_LOAD_SQLITE_EXTENSION` An error occurred while loading a SQLite extension. diff --git a/doc/api/module.md b/doc/api/module.md index 175bc6ab3475f7..40615c39728cab 100644 --- a/doc/api/module.md +++ b/doc/api/module.md @@ -22,7 +22,7 @@ added: - v8.10.0 - v6.13.0 changes: - - version: REPLACEME + - version: v23.5.0 pr-url: https://github.com/nodejs/node/pull/56185 description: The list now also contains prefix-only modules. --> @@ -204,7 +204,7 @@ resolution and loading behavior. See [Customization hooks][]. ### `module.registerHooks(options)` > Stability: 1.1 - Active development @@ -529,7 +529,7 @@ added: v22.8.0 > Stability: 1.1 - Active development @@ -991,7 +991,7 @@ register('./path-to-my-hooks.js', { diff --git a/doc/api/sqlite.md b/doc/api/sqlite.md index 9576f112b2ec41..1f5054cd65e26d 100644 --- a/doc/api/sqlite.md +++ b/doc/api/sqlite.md @@ -127,7 +127,7 @@ open. This method is a wrapper around [`sqlite3_close_v2()`][]. 
### `database.loadExtension(path)` * `path` {string} The path to the shared library to load. @@ -139,7 +139,7 @@ around [`sqlite3_load_extension()`][]. It is required to enable the ### `database.enableLoadExtension(allow)` * `allow` {boolean} Whether to allow loading extensions. @@ -163,7 +163,7 @@ file. This method is a wrapper around [`sqlite3_exec()`][]. ### `database.function(name[, options], function)` * `name` {string} The name of the SQLite function to create. @@ -475,7 +475,7 @@ exception. ## `sqlite.constants` * {Object} diff --git a/doc/api/util.md b/doc/api/util.md index a3ea6c96dd83db..7299d23ab67de3 100644 --- a/doc/api/util.md +++ b/doc/api/util.md @@ -1922,7 +1922,7 @@ added: - v21.7.0 - v20.12.0 changes: - - version: REPLACEME + - version: v23.5.0 pr-url: https://github.com/nodejs/node/pull/56265 description: styleText is now stable. - version: diff --git a/doc/api/webcrypto.md b/doc/api/webcrypto.md index b0e3257a8a7100..147b93d4a8a682 100644 --- a/doc/api/webcrypto.md +++ b/doc/api/webcrypto.md @@ -2,7 +2,7 @@
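The `database.loadExtension()` and `database.enableLoadExtension()` methods whose docs gain version metadata above gate extension loading behind an explicit opt-in at construction time. A minimal sketch of that flow, assuming the semantics described in the sqlite.md docs; the extension path is hypothetical, not a real artifact:

```js
const { DatabaseSync } = require('node:sqlite');

// Extension loading must be allowed when the database is constructed;
// it cannot be enabled later for security reasons.
const db = new DatabaseSync(':memory:', { allowExtension: true });

// Hypothetical shared library path; substitute a real SQLite extension.
db.loadExtension('./my-extension.so');

// Disables both the loadExtension() method and the SQL-level
// load_extension() function for the rest of this connection's lifetime.
db.enableLoadExtension(false);
```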