This is already possible without a macro. The compute definitions are per output in your profile, so you can use the same compute name with a different definition in each environment.
ex:
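The example itself is missing from this thread, but a profile along these lines would match the description: each output defines the same compute name, backed by a differently sized warehouse. This is a sketch; the profile name, hosts, warehouse IDs, and the compute name small_tables are all placeholders.

```yaml
# profiles.yml -- a sketch; hosts, warehouse IDs, and names are placeholders
my_dbt_project:
  target: dev
  outputs:
    dev:
      type: databricks
      schema: dev_schema
      host: dev-workspace.cloud.databricks.com
      http_path: /sql/1.0/warehouses/dev_default_id    # default: 2X-Small
      token: "{{ env_var('DATABRICKS_TOKEN') }}"
      compute:
        small_tables:
          http_path: /sql/1.0/warehouses/dev_small_id  # 2X-Small
    prod:
      type: databricks
      schema: prod_schema
      host: prod-workspace.cloud.databricks.com
      http_path: /sql/1.0/warehouses/prod_default_id   # default: X-Large
      token: "{{ env_var('DATABRICKS_TOKEN') }}"
      compute:
        small_tables:
          http_path: /sql/1.0/warehouses/prod_small_id # smaller prod warehouse
```

A model configured with databricks_compute: small_tables then resolves to whichever http_path the active target defines under that name, so dev and prod can back the same compute name with differently sized warehouses without any macro logic.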
Thanks for the answer, it is really helpful. Just for any future use case, is there a way to access the compute definitions from a macro? Would it be something like target.compute.small_tables.http_path?
Hi,
I am trying to create a macro that defines different SQL warehouses for different models and environments. For example, dev would almost always use a 2X-Small warehouse and production an X-Large; different models may also require different warehouses depending on their size.

I am looking for something similar to dbt-labs/dbt-snowflake#103 (comment), but for the Databricks adapter. I know we can set compute manually, per https://docs.getdbt.com/reference/resource-configs/databricks-configs#selecting-compute-per-model, but I am looking for a way to automate it based on environment first, and on table size in a later iteration.

I could not find any documentation about what information can be accessed via the target variable. It would be helpful if you could point me to that.
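For the environment-based automation described here, one common pattern (mirroring the linked dbt-snowflake comment, which selects a warehouse from a macro) is a macro that picks a compute name based on target.name. This is a sketch, not a confirmed recipe: the macro name pick_compute, the size argument, and the compute names are all hypothetical and must match entries under compute: in the active output of profiles.yml.

```sql
-- macros/pick_compute.sql (hypothetical macro; the names it returns must
-- exist under the active output's `compute:` block in profiles.yml)
{% macro pick_compute(size='small') %}
    {% if target.name == 'prod' %}
        {# prod: route to a size-specific warehouse, e.g. prod_x_large #}
        {{ return('prod_' ~ size) }}
    {% else %}
        {# dev and other environments: everything on one small warehouse #}
        {{ return('dev_default') }}
    {% endif %}
{% endmacro %}
```

A model would then opt in with {{ config(databricks_compute=pick_compute('x_large')) }}; since target is available when model configs are rendered, the same model resolves to a different warehouse in each environment.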