Adapt get_url_parameter to work with SparkSQL #11
Comments
Just transferred this to the spark-utils repo, since I think we'll want to contribute the fix here rather than on dbt utils!
@clrcrl Thanks for transferring! @foundinblank I think this could be fixed by the improvements to
Describe the bug

The `get_url_parameter()` macro breaks on Spark SQL (Databricks). I've come up with a replacement macro that works on Spark SQL and am wondering if I could contribute that fix.

Steps to reproduce
This was triggered when setting up Google Ads, which uses `get_url_parameter()` macros: https:/fivetran/dbt_google_ads_source/blob/master/models/stg_google_ads__final_url_performance.sql#L30-L34.

Expected results
I expected no errors to be thrown and UTM parameters to be parsed out per the model definition.
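For reference, Spark SQL has a built-in `parse_url` function that can extract a single query-string parameter; the URL and values below are illustrative:

```sql
-- Spark SQL (Databricks): parse_url pulls one parameter out of the query string.
select
    parse_url('https://example.com/landing?utm_source=google&utm_medium=cpc',
              'QUERY', 'utm_source') as utm_source;
-- utm_source = 'google'
```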
Actual results
Model fails to build with the error message:
It passes when using this local macro as a replacement (stored in our `/macros` folder), which overrides the dbt_utils macro:

System information
packages.yml

Which database are you using dbt with?

The output of `dbt --version`:

Are you interested in contributing the fix?
I'm happy to contribute my macro, which works on Spark SQL. If there's a way for `dbt_utils` to know which database or adapter it's running on, it could pick the appropriate macro?
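dbt does expose an adapter-aware mechanism for exactly this: `adapter.dispatch` resolves to an adapter-prefixed implementation when one exists. A minimal sketch of that idea follows; the macro bodies are illustrative (they assume Spark SQL's built-in `parse_url`), not the contributed fix itself:

```sql
{# Default entry point: adapter.dispatch resolves to an adapter-specific
   implementation (e.g. spark__get_url_parameter on Spark) when one exists. #}
{% macro get_url_parameter(field, url_parameter) -%}
    {{ return(adapter.dispatch('get_url_parameter')(field, url_parameter)) }}
{%- endmacro %}

{# Spark override: parse_url extracts a single query-string parameter;
   nullif turns an empty match into null, mirroring the default macro's behavior. #}
{% macro spark__get_url_parameter(field, url_parameter) -%}
    nullif(parse_url({{ field }}, 'QUERY', '{{ url_parameter }}'), '')
{%- endmacro %}
```

With this wiring, models keep calling `get_url_parameter(...)` unchanged and each adapter gets a working implementation.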