[BUG] Stage macro fails on ghost record creation in Databricks for binary datatype columns #299
Comments
Hi @kafkaguru and thanks for the clear description of the error you encountered! I could reproduce and also fix it on my side. If you point your packages.yml at the fix branch (see the sketch below), you could check if it also works for you. Best Regards
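A minimal sketch of what such a packages.yml entry could look like, assuming a Git install of the project's GitHub repository (the repository URL is an assumption; the branch name is taken from the follow-up comment):

```yaml
# packages.yml -- install the package from the fix branch instead of a released version
packages:
  - git: "https://github.com/ScalefreeCOM/datavault4dbt.git"  # assumed repository URL
    revision: fix-databricks-ghost-records-binary-defaults     # branch carrying the fix
```

After editing packages.yml, run `dbt deps` to pull the branch.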
Hi @kafkaguru, True, since these fixes were on separate branches, they were independent from each other. If you install from the main branch (see the sketch below), you should be able to access both fixes. Later in the day we plan to publish another release; after that you can fall back to the default installation method of specifying a version number. Edit to add: Both fixes are now available in the branch fix-databricks-ghost-records-binary-defaults as well!
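A sketch of the two installation options mentioned above (the repository URL, hub package name, and version range are assumptions for illustration):

```yaml
# packages.yml -- until the release is published, install from main
packages:
  - git: "https://github.com/ScalefreeCOM/datavault4dbt.git"  # assumed repository URL
    revision: main

# after the release, switch back to pinning a version, e.g.:
# packages:
#   - package: ScalefreeCOM/datavault4dbt   # assumed hub package name
#     version: [">=1.0.0"]                  # hypothetical version range
```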
FYI @kafkaguru: As this is merged into main now, the branch fix-databricks-ghost-records-binary-defaults is no longer needed. Thanks again for your contribution! Best Regards
Describe the bug
When creating a stage view with `enable_ghost_records=true`, the resulting SQL is invalid and the stage macro fails to run.

Environment
To Reproduce
1. Create a table in Databricks with the columns `id: string, data: binary`.
2. Create a `source.yml` file and add the table to it.
3. Create a `stg_data.sql` file with the stage macro code (a sketch follows below).

When validating the resulting compiled SQL, the SQL is invalid.
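A minimal sketch of what the two files could look like, assuming the datavault4dbt stage macro; the source name, table name, and macro parameters are illustrative assumptions, not the reporter's original code:

```yaml
# models/staging/source.yml -- declares the raw table (names are illustrative)
sources:
  - name: raw
    tables:
      - name: my_data   # columns: id: string, data: binary
```

```sql
-- models/staging/stg_data.sql -- stages the raw table; parameters are illustrative
{{ datavault4dbt.stage(source_model={'raw': 'my_data'},
                       ldts='current_timestamp()',
                       rsrc='!my_source') }}
```

With `enable_ghost_records=true`, compiling this model produces the invalid SQL described above.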
Reading the documentation, it seems that we should be able to configure unknown values for a binary datatype (this was also tried with BINARY in all caps).
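A hypothetical sketch of such a configuration in dbt_project.yml; the var name and value are assumptions for illustration, and the exact keys should be taken from the package's ghost record documentation:

```yaml
# dbt_project.yml -- var name below is hypothetical, not confirmed against the docs
vars:
  datavault4dbt.ghost_record_unknown_values:
    binary: "unhex('00')"   # a BINARY-typed expression that is valid on Databricks
```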
Excluding binary columns from the input model or setting `enable_ghost_records=false` both resolve the issue, but neither is ideal.

Expected behavior
Ghost records are created with unknown values for binary datatypes.
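For reference, a minimal sketch of Databricks expressions that yield non-NULL typed defaults, as one possible shape of a valid ghost record row (illustrative only, not necessarily the macro's actual output):

```sql
-- ghost record 'unknown' row: every column gets a typed default value
SELECT
  'UNKNOWN'   AS id,    -- string default
  unhex('00') AS data   -- unhex() returns BINARY on Databricks, giving a non-NULL binary default
```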