Merge branch 'current' into linting
mirnawong1 authored Dec 3, 2024
2 parents 05a160d + 75c062e commit 16875a5
Showing 17 changed files with 74 additions and 62 deletions.
2 changes: 1 addition & 1 deletion website/docs/docs/build/incremental-microbatch.md
@@ -61,7 +61,7 @@ models:
```
</File>
-We run the `sessions` model on October 1, 2024, and then again on October 2. It produces the following queries:
+We run the `sessions` model for October 1, 2024, and then again for October 2. It produces the following queries:

<Tabs>

6 changes: 4 additions & 2 deletions website/docs/docs/build/incremental-models.md
@@ -156,15 +156,17 @@ Building this model incrementally without the `unique_key` parameter would resul
## How do I rebuild an incremental model?
If your incremental model logic has changed, the transformations on your new rows of data may diverge from the historical transformations, which are stored in your target table. In this case, you should rebuild your incremental model.

To force dbt to rebuild the entire incremental model from scratch, use the `--full-refresh` flag on the command line. This flag will cause dbt to drop the existing target table in the database before rebuilding it for all time.

```bash
$ dbt run --full-refresh --select my_incremental_model+
```

It's also advisable to rebuild any downstream models, as indicated by the trailing `+`.

You can optionally use the [`full_refresh`](/reference/resource-configs/full_refresh) config to set a resource to always or never full-refresh at the project or resource level. If specified as `true` or `false`, the `full_refresh` config will take precedence over the presence or absence of the `--full-refresh` flag.
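
For orientation, a minimal sketch of what that looks like in `dbt_project.yml` (the project and folder names here are placeholders):

```yaml
# dbt_project.yml -- hypothetical project and folder names
models:
  my_project:
    staging:
      # Models in this folder never rebuild from scratch, even when
      # `dbt run --full-refresh` is passed.
      +full_refresh: false
```

Setting it to `true` has the opposite effect: the affected models are rebuilt from scratch on every run.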

For detailed usage instructions, check out the [dbt run](/reference/commands/run) documentation.

## What if the columns of my incremental model change?

2 changes: 1 addition & 1 deletion website/docs/docs/build/incremental-strategy.md
@@ -30,7 +30,7 @@ Click the name of the adapter in the below table for more information about supp
| [dbt-redshift](/reference/resource-configs/redshift-configs#incremental-materialization-strategies) |||| ||
| [dbt-bigquery](/reference/resource-configs/bigquery-configs#merge-behavior-incremental-models) | || |||
| [dbt-spark](/reference/resource-configs/spark-configs#incremental-models) ||| |||
| [dbt-databricks](/reference/resource-configs/databricks-configs#incremental-models) ||| || |
| [dbt-snowflake](/reference/resource-configs/snowflake-configs#merge-behavior-incremental-models) |||| ||
| [dbt-trino](/reference/resource-configs/trino-configs#incremental) |||| | |
| [dbt-fabric](/reference/resource-configs/fabric-configs#incremental) |||| | |
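
As a rough illustration of how a strategy from this table is applied (project name and strategy are placeholders; choose one your adapter supports), the config can be set in `dbt_project.yml`:

```yaml
# dbt_project.yml -- illustrative only
models:
  my_project:
    marts:
      +materialized: incremental
      +incremental_strategy: merge
```

The same setting can also be made per model in its own config block.
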
5 changes: 3 additions & 2 deletions website/docs/docs/build/python-models.md
@@ -641,7 +641,8 @@ In their initial launch, Python models are supported on three of the most popula
**Installing packages:** Snowpark supports several popular packages via Anaconda. Refer to the [complete list](https://repo.anaconda.com/pkgs/snowflake/) for more details. Packages are installed when your model is run. Different models can have different package dependencies. If you use third-party packages, Snowflake recommends using a dedicated virtual warehouse for best performance rather than one with many concurrent users.
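
As a reference point, package requirements can also be declared in YAML config rather than inline in `dbt.config()`; a hypothetical example follows (the model name is a placeholder, and any version pin must exist in the Anaconda channel noted above):

```yaml
# models/schema.yml -- sketch of package configuration for a Python model
version: 2

models:
  - name: my_python_model
    config:
      materialized: table
      packages:
        - numpy
        - "scikit-learn==1.1.1"  # pinned versions are allowed if available in the channel
```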

**Python version:** To specify a different python version, use the following configuration:

```python
def model(dbt, session):
    dbt.config(
        materialized = "table",
@@ -653,7 +654,7 @@ def model(dbt, session):

**External access integrations and secrets**: To query external APIs within dbt Python models, use Snowflake’s [external access](https://docs.snowflake.com/en/developer-guide/external-network-access/external-network-access-overview) together with [secrets](https://docs.snowflake.com/en/developer-guide/external-network-access/secret-api-reference). Here are some additional configurations you can use:

```python
import pandas
import snowflake.snowpark as snowpark
2 changes: 1 addition & 1 deletion website/docs/docs/build/snapshots.md
@@ -487,7 +487,7 @@ Snapshot results:

For information about configuring snapshots in dbt versions 1.8 and earlier, select **1.8** from the documentation version picker, and it will appear in this section.

-To configure snapshots in versions 1.9 and later, refer to [Configuring snapshots](#configuring-snapshots). The latest versions use an updated snapshot configuration syntax that optimizes performance.
+To configure snapshots in versions 1.9 and later, refer to [Configuring snapshots](#configuring-snapshots). The latest versions use a more ergonomic snapshot configuration syntax that also speeds up parsing and compilation.
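
For a rough sense of the newer YAML-based definition, a dbt 1.9+ snapshot might look like the following sketch (source, column, and schema names are placeholders):

```yaml
# snapshots/orders_snapshot.yml -- illustrative dbt 1.9+ snapshot definition
snapshots:
  - name: orders_snapshot
    relation: source('jaffle_shop', 'orders')
    config:
      schema: snapshots
      unique_key: id
      strategy: timestamp
      updated_at: updated_at
```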

</VersionBlock>

@@ -11,7 +11,7 @@ The following are the required fields for setting up a connection with a [Starbu
| **Host** | The hostname of your cluster. Don't include the HTTP protocol prefix. | `mycluster.mydomain.com` |
| **Port** | The port to connect to your cluster. By default, it's 443 for TLS enabled clusters. | `443` |
| **User** | The username (of the account) to log in to your cluster. When connecting to Starburst Galaxy clusters, you must include the role of the user as a suffix to the username.<br/><br/> | Format for Starburst Enterprise or Trino depends on your configured authentication method. <br/>Format for Starburst Galaxy:<br/> <ul><li>`[email protected]/role`</li></ul> |
-| **Password** | The user's password. | |
+| **Password** | The user's password. | - |
| **Database** | The name of a catalog in your cluster. | `example_catalog` |
| **Schema** | The name of a schema that exists within the specified catalog.  | `example_schema` |

@@ -11,7 +11,12 @@ sidebar_label: "Connect BigQuery"

:::info Uploading a service account JSON keyfile

While the fields in a BigQuery connection can be specified manually, we recommend uploading a service account <Term id="json" /> keyfile to quickly and accurately configure a connection to BigQuery.

You can provide the JSON keyfile in one of two formats:

- JSON keyfile upload &mdash; Upload the keyfile directly in its normal JSON format.
- Base64-encoded string &mdash; Provide the keyfile as a base64-encoded string. When you provide a base64-encoded string, dbt decodes it automatically and populates the necessary fields.

:::

4 changes: 2 additions & 2 deletions website/docs/docs/cloud/git/connect-gitlab.md
@@ -61,8 +61,8 @@ In GitLab, when creating your Group Application, input the following:
| ------ | ----- |
| **Name** | dbt Cloud |
| **Redirect URI** | `https://YOUR_ACCESS_URL/complete/gitlab` |
-| **Confidential** | ✔️ |
-| **Scopes** | ✔️ api |
+| **Confidential** | |
+| **Scopes** | api |

Replace `YOUR_ACCESS_URL` with the [appropriate Access URL](/docs/cloud/about-cloud/access-regions-ip-addresses) for your region and plan.

46 changes: 23 additions & 23 deletions website/docs/docs/cloud/manage-access/self-service-permissions.md
@@ -52,33 +52,33 @@ The following tables outline the access that users have if they are assigned a D

| Account-level permission| Owner | Member | Read-only license| IT license |
|:------------------------|:-----:|:------:|:----------------:|:------------:|
-| Account settings | W | W | | W |
-| Billing | W | | | W |
-| Invitations | W | W | | W |
-| Licenses | W | R | | W |
-| Users | W | R | | W |
-| Project (create) | W | W | | W |
-| Connections | W | W | | W |
-| Service tokens | W | | | W |
-| Webhooks | W | W | | |
+| Account settings | W | W | - | W |
+| Billing | W | - | - | W |
+| Invitations | W | W | - | W |
+| Licenses | W | R | - | W |
+| Users | W | R | - | W |
+| Project (create) | W | W | - | W |
+| Connections | W | W | - | W |
+| Service tokens | W | - | - | W |
+| Webhooks | W | W | - | - |

#### Project permissions for account roles

|Project-level permission | Owner | Member | Read-only | IT license |
|:------------------------|:-----:|:-------:|:---------:|:----------:|
-| Adapters | W | W | R | |
-| Connections | W | W | R | |
-| Credentials | W | W | R | |
-| Custom env. variables | W | W | R | |
-| Develop (IDE or dbt Cloud CLI)| W | W | | |
-| Environments | W | W | R | |
-| Jobs | W | W | R | |
-| dbt Explorer | W | W | R | |
-| Permissions | W | R | | |
-| Profile | W | W | R | |
-| Projects | W | W | R | |
-| Repositories | W | W | R | |
-| Runs | W | W | R | |
-| Semantic Layer Config | W | W | R | |
+| Adapters | W | W | R | - |
+| Connections | W | W | R | - |
+| Credentials | W | W | R | - |
+| Custom env. variables | W | W | R | - |
+| Develop (IDE or dbt Cloud CLI)| W | W | - | - |
+| Environments | W | W | R | - |
+| Jobs | W | W | R | - |
+| dbt Explorer | W | W | R | - |
+| Permissions | W | R | - | - |
+| Profile | W | W | R | - |
+| Projects | W | W | R | - |
+| Repositories | W | W | R | - |
+| Runs | W | W | R | - |
+| Semantic Layer Config | W | W | R | - |


14 changes: 6 additions & 8 deletions website/docs/docs/collaborate/govern/model-contracts.md
@@ -205,13 +205,11 @@ At the same time, for models with many columns, we understand that this can mean

When comparing to a previous project state, dbt will look for breaking changes that could impact downstream consumers. If breaking changes are detected, dbt will present a contract error.

-Breaking changes include:
-- Removing an existing column.
-- Changing the `data_type` of an existing column.
-- Removing or modifying one of the `constraints` on an existing column (dbt v1.6 or higher).
-- Removing a contracted model by deleting, renaming, or disabling it (dbt v1.9 or higher).
-  - versioned models will raise an error.
-  - unversioned models will raise a warning.
+import BreakingChanges from '/snippets/_versions-contracts.md';

+<BreakingChanges
+  value="Removing a contracted model by deleting, renaming, or disabling it (dbt v1.9 or higher)."
+  value2="versioned models will raise an error. unversioned models will raise a warning."
+/>

More details are available in the [contract reference](/reference/resource-configs/contract#detecting-breaking-changes).
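
For context, the kind of contracted model these checks protect might be declared like this (model, column, and type names are illustrative):

```yaml
# models/marts/schema.yml -- a hypothetical enforced contract
models:
  - name: dim_customers
    config:
      contract:
        enforced: true
    columns:
      - name: customer_id
        data_type: int
        constraints:
          - type: not_null
      - name: customer_name
        data_type: varchar
```

Dropping `customer_name`, changing a declared `data_type`, or removing the `not_null` constraint would each count as a breaking change under the rules listed above.
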
@@ -34,7 +34,7 @@ The following profile fields are always required except for `user`, which is als

| Field | Example | Description |
| --------- | ------- | ----------- |
-| `host` | `mycluster.mydomain.com` | The hostname of your cluster.<br/><br/>Don't include the `http://` or `https://` prefix. |
+| `host` | `mycluster.mydomain.com`<br/><br/>Format for Starburst Galaxy:<br/><ul><li>`mygalaxyaccountname-myclustername.trino.galaxy.starburst.io`</li></ul> | The hostname of your cluster.<br/><br/>Don't include the `http://` or `https://` prefix. |
| `database` | `my_postgres_catalog` | The name of a catalog in your cluster. |
| `schema` | `my_schema` | The name of a schema within your cluster's catalog. <br/><br/>It's _not recommended_ to use schema names that have upper case or mixed case letters. |
| `port` | `443` | The port to connect to your cluster. By default, it's 443 for TLS enabled clusters. |
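
Pulling those fields together, a minimal `profiles.yml` entry might look like the sketch below. The profile name is a placeholder, and authentication settings (`method`, `user`, `password`) are omitted because they depend on how your cluster is configured:

```yaml
# profiles.yml -- illustrative values only
my_starburst_project:
  target: dev
  outputs:
    dev:
      type: trino   # assumption: the dbt-trino/Starburst adapter type
      host: mycluster.mydomain.com
      port: 443
      database: my_postgres_catalog
      schema: my_schema
```
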
2 changes: 1 addition & 1 deletion website/docs/docs/dbt-cloud-apis/user-tokens.md
@@ -8,7 +8,7 @@ pagination_next: "docs/dbt-cloud-apis/service-tokens"

:::warning

-User API tokens have been deprecated and will no longer work. [Migrate](#migrate-from-user-api-keys-to-personal-access-tokens) to personal access tokens to resume services.
+User API tokens have been deprecated and will no longer work. [Migrate](#migrate-deprecated-user-api-keys-to-personal-access-tokens) to personal access tokens to resume services.

:::

7 changes: 4 additions & 3 deletions website/docs/guides/bigquery-qs.md
@@ -85,13 +85,14 @@ In order to let dbt connect to your warehouse, you'll need to generate a keyfile
3. Create a service account key for your new project from the [Service accounts page](https://console.cloud.google.com/iam-admin/serviceaccounts?walkthrough_id=iam--create-service-account-keys&start_index=1#step_index=1). For more information, refer to [Create a service account key](https://cloud.google.com/iam/docs/creating-managing-service-account-keys#creating) in the Google Cloud docs. When downloading the JSON file, make sure to use a filename you can easily remember. For example, `dbt-user-creds.json`. For security reasons, dbt Labs recommends that you protect this JSON file like you would your identity credentials; for example, don't check the JSON file into your version control software.

## Connect dbt Cloud to BigQuery​
-1. Create a new project in [dbt Cloud](/docs/cloud/about-cloud/access-regions-ip-addresses). Navigate to **Account settings** (by clicking on your account name in the left side menu), and click **+ New Project**.
+1. Create a new project in [dbt Cloud](/docs/cloud/about-cloud/access-regions-ip-addresses). Navigate to **Account settings** (by clicking on your account name in the left side menu), and click **+ New project**.
2. Enter a project name and click **Continue**.
3. For the warehouse, click **BigQuery** then **Next** to set up your connection.
4. Click **Upload a Service Account JSON File** in settings.
5. Select the JSON file you downloaded in [Generate BigQuery credentials](#generate-bigquery-credentials) and dbt Cloud will fill in all the necessary fields.
-6. Click **Test Connection**. This verifies that dbt Cloud can access your BigQuery account.
-7. Click **Next** if the test succeeded. If it failed, you might need to go back and regenerate your BigQuery credentials.
+6. Optional &mdash; dbt Cloud Enterprise plans can configure developer OAuth with BigQuery, providing an additional layer of security. For more information, refer to [Set up BigQuery OAuth](/docs/cloud/manage-access/set-up-bigquery-oauth).
+7. Click **Test Connection**. This verifies that dbt Cloud can access your BigQuery account.
+8. Click **Next** if the test succeeded. If it failed, you might need to go back and regenerate your BigQuery credentials.


## Set up a dbt Cloud managed repository
13 changes: 5 additions & 8 deletions website/docs/reference/resource-configs/database.md
@@ -79,22 +79,19 @@ This results in the generated relation being located in the `snapshots` database

<TabItem value="test" label="Tests">

-Configure a database in your `dbt_project.yml` file.
+Customize the database for storing test results in your `dbt_project.yml` file.

-For example, to load a test into a database called `reporting` instead of the target database, you can configure it like this:
+For example, to save test results in a specific database, you can configure it like this:

<File name='dbt_project.yml'>

```yml
tests:
-  - my_not_null_test:
-      column_name: order_id
-      type: not_null
-      +database: reporting
+  +store_failures: true
+  +database: test_results
```

-This would result in the generated relation being located in the `reporting` database, so the full relation name would be `reporting.finance.my_not_null_test`.
+This would result in the test results being stored in the `test_results` database.
</File>
</TabItem>
</Tabs>
14 changes: 7 additions & 7 deletions website/docs/reference/resource-properties/versions.md
@@ -73,13 +73,13 @@ Note that the value of `defined_in` and the `alias` configuration of a model are

When you use the `state:modified` selection method in Slim CI, dbt will detect changes to versioned model contracts, and raise an error if any of those changes could be breaking for downstream consumers.

-Breaking changes include:
-- Removing an existing column
-- Changing the `data_type` of an existing column
-- Removing or modifying one of the `constraints` on an existing column (dbt v1.6 or higher)
-- Changing unversioned, contracted models.
-  - dbt also warns if a model has or had a contract but isn't versioned
+import BreakingChanges from '/snippets/_versions-contracts.md';

+<BreakingChanges
+  value="Changing unversioned, contracted models."
+  value2="dbt also warns if a model has or had a contract but isn't versioned."
+/>
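
As a sketch of the kind of model this check applies to, a versioned, contracted model might be declared like this (names are placeholders; `defined_in` points at the file for that version, as noted above):

```yaml
# models/schema.yml -- a hypothetical versioned model
models:
  - name: dim_customers
    latest_version: 2
    config:
      contract:
        enforced: true
    columns:
      - name: customer_id
        data_type: int
    versions:
      - v: 1
      - v: 2
        defined_in: dim_customers_v2
```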

<Tabs>

<TabItem value="unversioned" label="Example message for unversioned models">
1 change: 1 addition & 0 deletions website/docs/reference/snapshot-configs.md
@@ -347,6 +347,7 @@ The following examples demonstrate how to configure snapshots using the `dbt_pro
{{
    config(
      unique_key='id',
      target_schema='snapshots',
      strategy='timestamp',
      updated_at='updated_at'
    )
7 changes: 7 additions & 0 deletions website/snippets/_versions-contracts.md
@@ -0,0 +1,7 @@
Breaking changes include:

- Removing an existing column
- Changing the data_type of an existing column
- Removing or modifying one of the `constraints` on an existing column (dbt v1.6 or higher)
- {props.value}
- {props.value2}
