[Feature] Introduced Terraform Plugin Framework #3751
Conversation
…ricks into introduce-plugin-framework-poc
…tabricks_storage_credential` (#3704)
* add isolation mode support for external location & storage credential
* add doc & test
* Add periodic triggers
* Add acceptance test for periodic triggers
* Fix typo
Bumps [github.com/hashicorp/hcl/v2](https://github.com/hashicorp/hcl) from 2.20.1 to 2.21.0.
- [Release notes](https://github.com/hashicorp/hcl/releases)
- [Changelog](https://github.com/hashicorp/hcl/blob/main/CHANGELOG.md)
- [Commits](hashicorp/hcl@v2.20.1...v2.21.0)

updated-dependencies:
- dependency-name: github.com/hashicorp/hcl/v2
  dependency-type: direct:production
  update-type: version-update:semver-minor

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
* relax cluster check
* fix
* fix
* refactor `databricks_cluster` data source to Go SDK
* refactor `databricks_clusters` data source to Go SDK
…ce_binding` (#3703)
* rename resource
* fix test
* Exporter: fix generation of `run_as` blocks in `databricks_job`. Because `run_as` was marked as `computed`, it was ignored when generating the code.
* Ignore `run_as` for the current user
* data_volume
* data_volume unit and acceptance tests
* docs
* WorkspaceDataWithCustomParams test
* fixed formatting
* Removing unnecessary changes to resource.go
* refactored data_volume
* making change for consistency with Go SDK v0.35.0
* Update catalog/data_volume.go
* Update catalog/data_volume.go
* data source as nested structure
* review comments addressed
* acceptance test

Co-authored-by: Alex Ott <alexey.ott@databricks.com>
Co-authored-by: vuong-nguyen <44292934+nkvuong@users.noreply.github.com>
…to specific workspaces (#3678)
* add isolation mode
* rename
* doc
* fix doc
* add tests
* add acceptance tests
* add computed
* typo
* add tests
* use correct isolation_mode
* fix test
…rraform-provider-databricks into tanmay/introduce-plugin-framework
```go
c.SetRequired("assets_dir")
c.SetRequired("output_schema_name")
c.SetRequired("table_name")
// TODO: Uncomment this once SetReadOnly is supported in the plugin framework
```
I don't know how to actually achieve this. As I saw in the provider schema types, all of the isReadOnly functions are hard-coded to return false.
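The limitation under discussion can be sketched with a small, self-contained model of the schema-customization helpers. These types are illustrative stand-ins, not the provider's actual schema types, and `monitor_version` is just a hypothetical example of an attribute one might want read-only:

```go
package main

import "fmt"

// attr is a simplified stand-in for a provider schema attribute; it only
// models the two flags relevant here.
type attr struct {
	Required bool
	Computed bool // a read-only attribute is computed-only
}

// customizer mimics the customization helpers used in the diff:
// SetRequired exists today, SetReadOnly is what the TODO is waiting on.
type customizer struct {
	attrs map[string]*attr
}

func (c *customizer) SetRequired(name string) {
	c.attrs[name].Required = true
}

// SetReadOnly marks an attribute as computed-only. This is the behavior the
// TODO comment wants once the plugin framework port supports it.
func (c *customizer) SetReadOnly(name string) {
	c.attrs[name].Required = false
	c.attrs[name].Computed = true
}

// customizeMonitorSchema applies the customizations shown in the diff, plus
// the read-only step the TODO would uncomment.
func customizeMonitorSchema() *customizer {
	c := &customizer{attrs: map[string]*attr{
		"assets_dir":         {},
		"output_schema_name": {},
		"table_name":         {},
		"monitor_version":    {},
	}}
	c.SetRequired("assets_dir")
	c.SetRequired("output_schema_name")
	c.SetRequired("table_name")
	c.SetReadOnly("monitor_version")
	return c
}

func main() {
	c := customizeMonitorSchema()
	fmt.Println(c.attrs["monitor_version"].Computed) // prints "true"
}
```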
LGTM, thanks!
## Changes
This PR introduces the plugin framework to the Databricks Terraform Provider:
- Introduce the plugin framework
- Add the provider schema
- Add support for configuring the Databricks client
- Add support for provider fixtures
- Add support for getting the raw config, to be used in provider tests and later in the unit testing framework
- Fix bugs in configuring the client, plus other miscellaneous bugs

Note: This is a subpart of #3751. The rest will be added once support for schema customization and SDK converters is in place.

## Tests
Unit tests and nightly tests were run over the above PRs.
- [ ] `make test` run locally
- [ ] relevant change in `docs/` folder
- [ ] covered with integration tests in `internal/acceptance`
- [ ] relevant acceptance tests are passing
- [ ] using Go SDK
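The "add support to configure databricks client" step boils down to validating the provider configuration and producing a client that resources and data sources share. A minimal, self-contained sketch of that flow, where `Config` and `Client` are hypothetical stand-ins for the provider's real types:

```go
package main

import (
	"errors"
	"fmt"
)

// Config is a simplified stand-in for the provider schema introduced in this
// PR; the real schema has many more attributes and auth methods.
type Config struct {
	Host  string
	Token string
}

// Client is a placeholder for the configured Databricks client.
type Client struct {
	Host string
}

// configureClient models the configure step: reject incomplete provider
// configuration, otherwise hand back a client for resources to use.
func configureClient(c Config) (*Client, error) {
	if c.Host == "" || c.Token == "" {
		return nil, errors.New("both host and token must be configured")
	}
	return &Client{Host: c.Host}, nil
}

func main() {
	cl, err := configureClient(Config{
		Host:  "https://example.cloud.databricks.com",
		Token: "dapi-example", // hypothetical value
	})
	if err != nil {
		panic(err)
	}
	fmt.Println("configured client for", cl.Host)
}
```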
…s_volumes` data source to plugin framework (#3958)

## Changes
Add support for the Quality Monitor resource and the Volumes data source in the plugin framework. Also add a utility method to get the workspace and account clients with diagnostics.

Subpart of #3751. Also addresses comments from: https://github.com/databricks/terraform-provider-databricks/pull/3893/files#diff-8edec0f6289f63bfb61c7c6fb76fc1a61eec69c33fc2ba3a4ee6e77a8103f85b

Note: Flipping `databricks_quality_monitor_pluginframework` -> `databricks_quality_monitor` in the plugin framework, along with making the sdkv2 version legacy (i.e. `databricks_quality_monitor` -> `databricks_quality_monitor_legacy`), will be done in a separate PR. Same for `databricks_volumes`.

## Tests
Existing unit tests. Integration and unit tests for this resource will be merged after this PR is merged.
- [ ] `make test` run locally
- [ ] relevant change in `docs/` folder
- [ ] covered with integration tests in `internal/acceptance`
- [ ] relevant acceptance tests are passing
- [ ] using Go SDK

Co-authored-by: Miles Yucht <miles@databricks.com>
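A utility that fetches a client "with diagnostics" follows the plugin framework convention of recording failures on a diagnostics collection rather than returning an error, so a Read method can append the diagnostic and return early. A simplified, self-contained sketch of that pattern; the types here are illustrative stand-ins, not the framework's actual `diag.Diagnostics` or the Databricks SDK client:

```go
package main

import "fmt"

// Diagnostics is a minimal model of a plugin-framework-style diagnostics
// collection: errors are accumulated rather than returned.
type Diagnostics []string

func (d *Diagnostics) AddError(summary string) { *d = append(*d, summary) }
func (d Diagnostics) HasError() bool           { return len(d) > 0 }

// WorkspaceClient is a placeholder for the configured workspace client.
type WorkspaceClient struct {
	Host string
}

// getWorkspaceClient models the utility added in this PR: on failure it
// records a diagnostic and returns nil, so the caller can check
// diags.HasError() and bail out of the Read method.
func getWorkspaceClient(host string, diags *Diagnostics) *WorkspaceClient {
	if host == "" {
		diags.AddError("failed to get workspace client: no host configured")
		return nil
	}
	return &WorkspaceClient{Host: host}
}

func main() {
	var diags Diagnostics
	c := getWorkspaceClient("https://example.cloud.databricks.com", &diags)
	if diags.HasError() {
		fmt.Println("diagnostics:", diags)
		return
	}
	fmt.Println("got client for", c.Host)
}
```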
Note: We don't have unit and integration test support as of now, so we are going ahead with committing the `.tf` files for these resources so we can test the changes in the meantime. Once we have proper support, these can be converted to examples or removed.