Releases · datacontract/datacontract-cli
v0.10.21
Added
- `datacontract export --format custom`: Export to a custom format with Jinja (see the example below)
- `datacontract api` can now be protected with an API key
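For example, a custom export driven by a Jinja template might be invoked like this (file names are placeholders; pairing `--format custom` with the `--template` option is an assumption based on the option mentioned further down in these notes):

```bash
# Hypothetical invocation: render the data contract through a custom Jinja template
datacontract export --format custom --template my_template.sql.jinja datacontract.yaml
```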
Changed
- `datacontract serve` renamed to `datacontract api`
Fixed
- Fix error `'dict object' has no attribute 'model_extra'` when trying to use `type: string` with `enum` values inside an array (#619)
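A minimal contract exercising that shape looks roughly like this (identifiers are illustrative; a sketch, not a canonical example):

```bash
# Write a minimal contract with a string enum inside an array, then lint it
cat > datacontract.yaml <<'EOF'
dataContractSpecification: 1.1.0
id: enum-in-array-example
info:
  title: Enum in array example
  version: 1.0.0
models:
  events:
    type: table
    fields:
      status_history:
        type: array
        items:
          type: string
          enum: [OPEN, CLOSED]
EOF
datacontract lint datacontract.yaml
```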
v0.10.20
Added
- `datacontract serve`: now has a route for testing data contracts
- `datacontract serve`: now has OpenAPI documentation at the root
Changed
- The FastAPI endpoint has moved to the extra "web"
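To keep the API server available after this change, install the package with that extra (assuming an installation from PyPI):

```bash
# Install the CLI together with the optional FastAPI/web dependencies
pip install "datacontract-cli[web]"
```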
Fixed
- API keys for Data Mesh Manager are now also applied to on-premise installations
v0.10.19
Added
- `datacontract import --format csv`
- The `publish` command now also supports publishing the ODCS format
- Option to set a separate physical table name for a model via a config option (#270)
Changed
- JSON Schemas are now bundled with the application (#598)
- `datacontract export --format html`: The model title is now shown if it differs from the model name (#585) (see the example below)
- `datacontract export --format html`: Custom model attributes are now shown (#585)
- `datacontract export --format html`: The composite primary key is now shown (#591)
- `datacontract export --format html`: Examples are now rendered in the model and definition (#497)
- `datacontract export --format sql`: Create arrays and structs for Databricks (#467)
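The HTML export mentioned above can be produced with a command along these lines (the output path is a placeholder; `--output` is assumed to be supported for export, as it is for import in the notes further down):

```bash
# Render the data contract as a self-contained HTML page
datacontract export --format html --output datacontract.html datacontract.yaml
```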
Fixed
- `datacontract lint`: Linter 'Field references existing field' raised "too many values to unpack (expected 2)" (#586)
- `datacontract test` (Azure): Error querying Delta tables from Azure storage (#458)
- `datacontract export --format data-caterer`: Use `fields` instead of `schema`
- `datacontract export --format data-caterer`: Use `options` instead of `generator.options`
- `datacontract export --format data-caterer`: Capture the array type length option and the inner data type
- Fixed `schemas/datacontract-1.1.0.init.yaml` not being included in the build and `--template` not resolving the file
v0.10.18
Fixed
- Fixed an issue when resolving the project's dependencies when all extras are installed.
- Definitions referenced by nested fields were not validated correctly (#595)
- Replaced the deprecated `primary` field with `primaryKey` in exporters, importers, examples, and Jinja templates for backward compatibility. Fixes #518. (See the sketch below.)
- Cannot execute test on column of type record (BigQuery) (#597)
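As a sketch of the renamed key (a fragment, not a complete contract; model and field names are illustrative):

```bash
# Field definition using the current primaryKey key; the comment shows
# the deprecated spelling it replaces
cat > fragment.yaml <<'EOF'
models:
  orders:
    fields:
      order_id:
        type: string
        primaryKey: true   # formerly: primary: true
EOF
```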
v0.10.16
[0.10.16] - 2024-12-19
Added
- Support for exporting a Data Contract to an Iceberg schema definition.
- When importing in dbt format, add the dbt `not_null` information as a data contract `required` field (#547)
Changed
- Type conversion when importing contracts into dbt and exporting contracts from dbt (#534)
- Ensure 'name' is the first column when exporting in dbt format, considering column attributes (#541)
- Rename dbt's `tests` to `data_tests` (#548)
Fixed
- Modify the arguments to narrow down the import target with `--dbt-model` (#532) (see the example below)
- SodaCL: Prevent `KeyError: 'fail'` from happening when testing with SodaCL
- Populate database and schema values for BigQuery in exported dbt sources (#543)
- Fixed the options for importing from and exporting to standard output (#544)
- Fixed the data quality name for model-level and field-level quality tests
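A dbt import narrowed to a single model might look like this (the manifest path and model name are placeholders; `--source` is an assumption about the import syntax):

```bash
# Hypothetical: import only the "orders" model from a dbt manifest
datacontract import --format dbt --source manifest.json --dbt-model orders
```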
v0.10.15
[0.10.15] - 2024-12-02
Added
- Support for model import from parquet file metadata.
- Great Expectations export: add optional args (#496)
  - `suite_name`: the name of the expectation suite to export
  - `engine`: used to run checks
  - `sql_server_type`: to define the type of SQL server to use when engine is `sql`
- Changelog support for `Info` and `Terms` blocks.
- `datacontract import` now has an `--output` option for saving the data contract to a file
- Enhance JSON file validation (local and S3) to return the first error for each JSON object; the maximum number of total errors can be configured via the environment variable `DATACONTRACT_MAX_ERRORS` (see the example below). Furthermore, the `primaryKey` is additionally added to the error message.
- Fixes an issue where records with no fields create an invalid BigQuery schema.
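Capping the number of reported errors might look like this (assuming the variable is read during `datacontract test`, where JSON files are validated):

```bash
# Limit the number of reported JSON validation errors to 10 for this run
DATACONTRACT_MAX_ERRORS=10 datacontract test datacontract.yaml
```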
Changed
- Changelog support for custom extension keys in `Models` and `Fields` blocks.
- `datacontract catalog --files '*.yaml'` now also checks any subfolders for such files.
- Optimize the test output table on the console if tests fail
Fixed
- Raise a valid exception in `DataContractSpecification.from_file` if the file does not exist
- Fix importing JSON Schemas containing deeply nested objects without a `required` array
- SodaCL: Only add data quality tests for executable queries
v0.10.14
[0.10.14] - 2024-10-26
Data Contract CLI now supports the Open Data Contract Standard (ODCS) v3.0.0.
Added
- `datacontract test` now also supports the ODCS v3 data contract format
- `datacontract export --format odcs_v3`: Export to Open Data Contract Standard v3.0.0 (#460)
- `datacontract test` now also supports ODCS v3 and Data Contract SQL quality checks on field and model level
- Support for import from Iceberg table definitions.
- Support for decimal logical type on Avro export.
- Support for custom Trino types
Changed
- `datacontract import --format odcs`: Now supports ODCS v3.0.0 files (#474)
- `datacontract export --format odcs`: Now creates v3.0.0 Open Data Contract Standard files (alias to `odcs_v3`). The old version is still available as format `odcs_v2`. (#460)
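For example (file names are placeholders; `--source` is an assumption about the import syntax):

```bash
# Export a data contract as ODCS v3.0.0 (plain odcs now aliases odcs_v3)
datacontract export --format odcs_v3 datacontract.yaml

# The previous major version remains available explicitly
datacontract export --format odcs_v2 datacontract.yaml

# Import an ODCS v3.0.0 file into a data contract
datacontract import --format odcs --source my-odcs-contract.yaml
```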
Fixed
- Fix timestamp serialization from Parquet to DuckDB (#472)
v0.10.13
Prepare 0.10.13
v0.10.12
Added
- Support for import of DBML models (#379)
- `datacontract export --format sqlalchemy`: Export to SQLAlchemy ORM models (#399)
- Support for varchar max length in Glue import (#351)
- `datacontract publish` now also accepts the `DATACONTRACT_MANAGER_API_KEY` as an environment variable (see the example below)
- Support required fields for Avro schema export (#390)
- Support the data type map in Spark import and export (#408)
- Support for enum on export to Avro
- Support for enum title on Avro import
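Publishing with the key supplied via the environment might look like this (placeholder key; a sketch, assuming `publish` takes the contract path as an argument):

```bash
# Authenticate the publish step via the environment instead of a config file
export DATACONTRACT_MANAGER_API_KEY="<your-api-key>"
datacontract publish datacontract.yaml
```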
Changed
- Delta Lake now uses DuckDB's native Delta Lake support (#258). The `deltalake` extra was removed.
- When dumping to YAML (import), the alias name is used instead of the Pythonic name. (#373)
Fixed
- Fix an issue where the datacontract cli fails if installed without any extras (#400)
- Fix an issue where Glue database without a location creates invalid data contract (#351)
- Fix bigint -> long data type mapping (#351)
- Fix an issue where column description for Glue partition key column is ignored (#351)
- Corrected the name of the table parameter for BigQuery import (#377)
- Fix a failure to connect to the S3 server (#384)
- Fix a model bug mismatching the specification (`definitions.fields`) (#375)
- Fix array type management in Spark import (#408)
v0.10.11
Added
- Support data type map in Glue import. (#340)
- Basic HTML export for the new `keys` and `values` fields (see the sketch after this list)
- Support for recognition of 1-to-1 relationships when exporting to DBML.
- Added support for arrays in JSON schema import (#305)
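A map field using those `keys` and `values` blocks might look like this fragment (illustrative names; not a complete contract):

```bash
# Fragment of a model with a map field defined via keys and values
cat > fragment.yaml <<'EOF'
fields:
  labels:
    type: map
    keys:
      type: string
    values:
      type: string
EOF
```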
Changed
- Aligned JSON schema import and export of required properties
Fixed
- Fix required field handling in JSON schema import
- Fix an issue where the quality and definition `$ref` are not always resolved.
- Fix an issue where the JSON schema validation fails for a field with type `string` and format `uuid`
- Fix an issue where common DBML renderers may not be able to parse parts of an exported file.