[Feature] Support running multiple dbt incremental models on the same table in parallel #1362
This code snippet proposes improvements to https://github.com/dbt-labs/dbt-bigquery/blob/main/dbt/include/bigquery/macros/materializations/incremental.sql by introducing a new table configuration.
Is this your first time submitting a feature request?
Describe the feature
Context
dbt refreshes incremental models on BigQuery by creating a temporary table and then running a MERGE statement against the target. See https://discourse.getdbt.com/t/bigquery-dbt-incremental-changes/982 for more details.
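Simplified, that flow looks like the following. This is an illustrative sketch only: the dataset, table, and column names (`analytics.orders`, `order_id`, `updated_at`, `status`) are made up, and dbt generates the real statements from its materialization macros.

```sql
-- 1. Stage new/changed rows produced by the model's SELECT in a temp table.
--    Note the fixed "__dbt_tmp" suffix, which is the subject of this issue.
CREATE TEMP TABLE orders__dbt_tmp AS
SELECT *
FROM raw.orders
WHERE updated_at > (SELECT MAX(updated_at) FROM analytics.orders);

-- 2. Merge the staged rows into the target on the model's unique key.
MERGE INTO analytics.orders AS target
USING orders__dbt_tmp AS source
ON target.order_id = source.order_id
WHEN MATCHED THEN
  UPDATE SET target.status = source.status
WHEN NOT MATCHED THEN
  INSERT ROW;
```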
Current Status
In the current implementation, dbt automatically generates the temporary table name by appending
__dbt_tmp
to the target table name. This prevents parallel execution of incremental models that target the same table. See https://github.com/dbt-labs/dbt-bigquery/blob/main/dbt/include/bigquery/macros/materializations/incremental.sql#L80C26-L80C44 and https://github.com/dbt-labs/dbt-adapters/blob/main/dbt/include/global_project/macros/adapters/relation.sql#L9C12-L9C31.
Request
Provide a setting that generates unique temporary table names, so that multiple incremental models targeting the same table can run in parallel. dbt-athena has a similar feature; see the "unique_tmp_table_suffix" section in https://github.com/dbt-labs/dbt-athena/blob/main/README.md.
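The requested behavior can be sketched in Python. This is a hypothetical helper, not dbt's actual implementation: the function name and flag are made up here, with the flag name borrowed from dbt-athena's `unique_tmp_table_suffix` option.

```python
import uuid


def tmp_relation_name(target_name: str, unique_tmp_table_suffix: bool = False) -> str:
    """Build the temp table name used for the incremental merge.

    With the flag off, mimic the current fixed "__dbt_tmp" suffix.
    With it on, append a unique suffix so parallel runs against the
    same target table do not collide.
    """
    if unique_tmp_table_suffix:
        # uuid4().hex yields only letters and digits, keeping the
        # name valid as a BigQuery table identifier.
        return f"{target_name}__dbt_tmp_{uuid.uuid4().hex}"
    return f"{target_name}__dbt_tmp"
```

With the flag off every run collides on `orders__dbt_tmp`; with it on, two concurrent runs each get their own staging table.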
Describe alternatives you've considered
No response
Who will this benefit?
Developers using GCP BigQuery
Are you interested in contributing this feature?
Yes
Anything else?
No response