Commit

Merge branch 'main' of github.com:dgomez04/terraform-provider-databricks into feature/3950-databricks_notification_destination
dgomez04 committed Oct 9, 2024
2 parents 8784d4c + 4e5951e commit 14b618d
Showing 307 changed files with 9,147 additions and 3,137 deletions.
2 changes: 1 addition & 1 deletion .codegen/_openapi_sha
@@ -1 +1 @@
-3eae49b444cac5a0118a3503e5b7ecef7f96527a
+0c86ea6dbd9a730c24ff0d4e509603e476955ac5
106 changes: 106 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,111 @@
# Version changelog

## [Release] Release v1.53.0

### New Features and Improvements

* Add `databricks_budget` resource ([#3955](https://github.com/databricks/terraform-provider-databricks/pull/3955)).
* Add `databricks_mlflow_models` data source ([#3874](https://github.com/databricks/terraform-provider-databricks/pull/3874)).
* Add computed attribute `table_serving_url` to `databricks_online_table` ([#4048](https://github.com/databricks/terraform-provider-databricks/pull/4048)).
* Add support for Identity Column in `databricks_sql_table` ([#4035](https://github.com/databricks/terraform-provider-databricks/pull/4035)).


### Bug Fixes

* Add Sufficient Network Privileges to the Databricks Default Cross Account Policy ([#4027](https://github.com/databricks/terraform-provider-databricks/pull/4027)).
* Ignore presence or absence of `/Workspace` prefix for dashboard resource ([#4061](https://github.com/databricks/terraform-provider-databricks/pull/4061)).
* Refactor `databricks_permissions` and allow the current user to set their own permissions ([#3956](https://github.com/databricks/terraform-provider-databricks/pull/3956)).
* Set ID for online table resource if creation succeeds but it isn't available yet ([#4072](https://github.com/databricks/terraform-provider-databricks/pull/4072)).


### Documentation

* Update CONTRIBUTING guide for plugin framework resources ([#4078](https://github.com/databricks/terraform-provider-databricks/pull/4078)).
* Add guide for OIDC authentication ([#4016](https://github.com/databricks/terraform-provider-databricks/pull/4016)).
* Correctly use native markdown callouts supported by TF Registry ([#4073](https://github.com/databricks/terraform-provider-databricks/pull/4073)).
* Fixing links to `databricks_service_principal` in TF guides ([#4020](https://github.com/databricks/terraform-provider-databricks/pull/4020)).


### Internal Changes

* Fix Permissions Dashboard Test ([#4071](https://github.com/databricks/terraform-provider-databricks/pull/4071)).
* Bump Go SDK to latest and generate TF structs ([#4062](https://github.com/databricks/terraform-provider-databricks/pull/4062)).
* Skip Budget tests on GCP ([#4070](https://github.com/databricks/terraform-provider-databricks/pull/4070)).
* Update to latest OpenAPI spec and bump Go SDK ([#4069](https://github.com/databricks/terraform-provider-databricks/pull/4069)).


## [Release] Release v1.52.0

### New Features and Improvements

* Add support for filters in `databricks_clusters` data source ([#4014](https://github.com/databricks/terraform-provider-databricks/pull/4014)).
* Added `no_wait` option for clusters to skip waiting to start on cluster creation ([#3953](https://github.com/databricks/terraform-provider-databricks/pull/3953)).
* Introduced Plugin Framework ([#3920](https://github.com/databricks/terraform-provider-databricks/pull/3920)).


### Bug Fixes

* Add suppress diff for `azure_attributes.spot_bid_max_price` in `databricks_instance_pool` ([#3970](https://github.com/databricks/terraform-provider-databricks/pull/3970)).
* Correctly send workload_type fields in `databricks_cluster` to allow users to disable usage in certain contexts ([#3972](https://github.com/databricks/terraform-provider-databricks/pull/3972)).
* Fix `databricks_sql_table` treatment of properties ([#3925](https://github.com/databricks/terraform-provider-databricks/pull/3925)).
* Force send fields for settings resources ([#3978](https://github.com/databricks/terraform-provider-databricks/pull/3978)).
* Handle cluster deletion in `databricks_library` read ([#3909](https://github.com/databricks/terraform-provider-databricks/pull/3909)).
* Make subscriptions optional for SqlAlertTask ([#3983](https://github.com/databricks/terraform-provider-databricks/pull/3983)).
* Permanently delete `ERROR` and `TERMINATED` state clusters if their creation fails ([#4021](https://github.com/databricks/terraform-provider-databricks/pull/4021)).


### Documentation

* Add troubleshooting guide for Provider "registry.terraform.io/databricks/databricks" planned an invalid value ([#3961](https://github.com/databricks/terraform-provider-databricks/pull/3961)).
* Adopt official naming of Mosaic AI Vector Search ([#3971](https://github.com/databricks/terraform-provider-databricks/pull/3971)).
* Document Terraform 1.0 as minimum version ([#3952](https://github.com/databricks/terraform-provider-databricks/pull/3952)).
* Mention Salesforce as supported type in `databricks_connection` ([#3949](https://github.com/databricks/terraform-provider-databricks/pull/3949)).
* Reimplement Azure Databricks deployment guide to use VNet injection & NPIP ([#3986](https://github.com/databricks/terraform-provider-databricks/pull/3986)).
* Resolves [#3127](https://github.com/databricks/terraform-provider-databricks/pull/3127): Remove deprecated account_id field from mws_credentials resource ([#3974](https://github.com/databricks/terraform-provider-databricks/pull/3974)).
* Small Grammar Corrections in Docs ([#4006](https://github.com/databricks/terraform-provider-databricks/pull/4006)).
* Update `databricks_vector_search_index` docs to match latest SDK ([#4008](https://github.com/databricks/terraform-provider-databricks/pull/4008)).
* Update aws_unity_catalog_assume_role_policy.md ([#3968](https://github.com/databricks/terraform-provider-databricks/pull/3968)).
* Update documentation regarding authentication with Azure-managed Service Principal using GITHUB OIDC ([#3932](https://github.com/databricks/terraform-provider-databricks/pull/3932)).
* Update metastore_assignment.md to properly reflect possible usage ([#3967](https://github.com/databricks/terraform-provider-databricks/pull/3967)).
* Update minimum supported terraform version to 1.1.5 ([#3965](https://github.com/databricks/terraform-provider-databricks/pull/3965)).
* Update resources diagram to include newer resources ([#3962](https://github.com/databricks/terraform-provider-databricks/pull/3962)).
* Update workspace_binding import command ([#3944](https://github.com/databricks/terraform-provider-databricks/pull/3944)).
* fix possible values for `securable_type` in `databricks_workspace_binding` ([#3942](https://github.com/databricks/terraform-provider-databricks/pull/3942)).


### Internal Changes

* Add `AddPlanModifer` method for AttributeBuilder ([#4009](https://github.com/databricks/terraform-provider-databricks/pull/4009)).
* Add integration tests for volumes and quality monitor plugin framework ([#3975](https://github.com/databricks/terraform-provider-databricks/pull/3975)).
* Add support for `computed` tag in TfSDK Structs ([#4005](https://github.com/databricks/terraform-provider-databricks/pull/4005)).
* Added `databricks_quality_monitor` resource and `databricks_volumes` data source to plugin framework ([#3958](https://github.com/databricks/terraform-provider-databricks/pull/3958)).
* Allow vector search tests to fail ([#3959](https://github.com/databricks/terraform-provider-databricks/pull/3959)).
* Clean up comments in library resource ([#4015](https://github.com/databricks/terraform-provider-databricks/pull/4015)).
* Fix irregularities in plugin framework converter function errors ([#4010](https://github.com/databricks/terraform-provider-databricks/pull/4010)).
* Make test utils public and move integration test for quality monitor ([#3993](https://github.com/databricks/terraform-provider-databricks/pull/3993)).
* Migrate Share resource to Go SDK ([#3916](https://github.com/databricks/terraform-provider-databricks/pull/3916)).
* Migrate `databricks_cluster` data source to plugin framework ([#3988](https://github.com/databricks/terraform-provider-databricks/pull/3988)).
* Migrate imports for terraform plugin framework + update init test provider factory ([#3943](https://github.com/databricks/terraform-provider-databricks/pull/3943)).
* Move volumes test next to plugin framework data source ([#3995](https://github.com/databricks/terraform-provider-databricks/pull/3995)).
* Refactor provider and related packages ([#3940](https://github.com/databricks/terraform-provider-databricks/pull/3940)).
* Support import in acceptance test + adding import state for quality monitor ([#3994](https://github.com/databricks/terraform-provider-databricks/pull/3994)).
* Library plugin framework migration ([#3979](https://github.com/databricks/terraform-provider-databricks/pull/3979)).
* Fix `TestAccClusterResource_WorkloadType` ([#3989](https://github.com/databricks/terraform-provider-databricks/pull/3989)).


### Dependency Updates

* Bump github.com/hashicorp/hcl/v2 from 2.21.0 to 2.22.0 ([#3948](https://github.com/databricks/terraform-provider-databricks/pull/3948)).
* Update Go SDK to 0.46.0 ([#4007](https://github.com/databricks/terraform-provider-databricks/pull/4007)).


### Exporter

* Don't generate instance pools if the pool name is empty ([#3960](https://github.com/databricks/terraform-provider-databricks/pull/3960)).
* Expand list of non-interactive clusters ([#4023](https://github.com/databricks/terraform-provider-databricks/pull/4023)).
* Ignore databricks_artifact_allowlist with zero artifact_matcher blocks ([#4019](https://github.com/databricks/terraform-provider-databricks/pull/4019)).


## [Release] Release v1.51.0

### Breaking Changes
66 changes: 64 additions & 2 deletions CONTRIBUTING.md
@@ -109,13 +109,75 @@ provider_installation {

After installing the software needed to build the provider from source, you should be able to run `make coverage` to run the tests and see the coverage report.

-## Package organization for Providers
+## Developing Resources or Data Sources using Plugin Framework

### Package organization for Providers
We are migrating resources from SDKv2 to the Plugin Framework, so both implementations currently exist in the codebase. For uniform code conventions, readability, and ease of development, they are organized in the `internal/providers` directory under the repository root as follows:
- `providers`: Contains code that depends on both the `internal/providers/sdkv2` and `internal/providers/pluginfw` packages, e.g. `GetProviderServer`.
- `common`: Contains code used by both the `internal/providers/sdkv2` and `internal/providers/pluginfw` packages, e.g. `ProviderName`.
- `pluginfw`: Contains code specific to the Plugin Framework. This package shouldn't depend on `sdkv2` or `common`.
- `sdkv2`: Contains code specific to SDKv2. This package shouldn't depend on `pluginfw` or `common`.

### Adding a new resource
1. Check if the directory for this particular resource exists under `internal/providers/pluginfw/resources`, if not create the directory eg: `cluster`, `volume` etc... Please note: Resources and Data sources are organized under the same package for that service.
2. Create a file with resource_resource-name.go and write the CRUD methods, schema for that resource. For reference, please take a look at existing resources eg: `resource_quality_monitor.go`
3. Create a file with `resource_resource-name_acc_test.go` and add integration tests here.
4. Create a file with `resource_resource-name_test.go` and add unit tests here. Note: Please make sure to abstract specific method of the resource so they are unit test friendly and not testing internal part of terraform plugin framework library. You can compare the diagnostics, for example: please take a look at: `data_cluster_test.go`
5. Add the resource under `internal/providers/pluginfw/pluginfw.go` in `Resources()` method. Please update the list so that it stays in alphabetically sorted order.
6. Create a PR and send it for review.
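A minimal sketch of the resource file from step 2 might look like the following. This is illustrative, not the exact Databricks implementation: the package, type, and attribute names are invented, and the method bodies are left as comments.

```golang
package example

import (
	"context"

	"github.com/hashicorp/terraform-plugin-framework/resource"
	"github.com/hashicorp/terraform-plugin-framework/resource/schema"
)

// Compile-time check that the resource implements the expected interface.
var _ resource.ResourceWithConfigure = &ExampleResource{}

type ExampleResource struct {
	// Typically holds the provider-configured Databricks client.
}

func (r *ExampleResource) Metadata(ctx context.Context, req resource.MetadataRequest, resp *resource.MetadataResponse) {
	resp.TypeName = "databricks_example"
}

func (r *ExampleResource) Schema(ctx context.Context, req resource.SchemaRequest, resp *resource.SchemaResponse) {
	resp.Schema = schema.Schema{
		Attributes: map[string]schema.Attribute{
			"name": schema.StringAttribute{Required: true},
			"id":   schema.StringAttribute{Computed: true},
		},
	}
}

func (r *ExampleResource) Configure(ctx context.Context, req resource.ConfigureRequest, resp *resource.ConfigureResponse) {
	// Store the client passed in req.ProviderData for use in CRUD methods.
}

func (r *ExampleResource) Create(ctx context.Context, req resource.CreateRequest, resp *resource.CreateResponse) {
	// Read the plan, call the Databricks API, then save the result into resp.State.
}

func (r *ExampleResource) Read(ctx context.Context, req resource.ReadRequest, resp *resource.ReadResponse) {
	// Fetch the remote object and refresh resp.State.
}

func (r *ExampleResource) Update(ctx context.Context, req resource.UpdateRequest, resp *resource.UpdateResponse) {
	// Apply the planned changes.
}

func (r *ExampleResource) Delete(ctx context.Context, req resource.DeleteRequest, resp *resource.DeleteResponse) {
	// Delete the remote object.
}
```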
### Adding a new data source
1. Check whether a directory for this data source already exists under `internal/providers/pluginfw/resources`; if not, create one (e.g. `cluster`, `volume`). Note: resources and data sources for a given service are organized under the same package.
2. Create a file named `data_<resource-name>.go` and implement the read logic and schema for the data source. For reference, see an existing data source such as `data_cluster.go`.
3. Create a file named `data_<resource-name>_acc_test.go` and add integration tests there.
4. Create a file named `data_<resource-name>_test.go` and add unit tests there. Note: abstract the data source's methods so they are unit-test friendly, rather than testing internals of the Terraform Plugin Framework library. Comparing diagnostics works well; for an example, see `data_cluster_test.go`.
5. Register the data source in the `DataSources()` method in `internal/providers/pluginfw/pluginfw.go`, keeping the list alphabetically sorted.
6. Create a PR and send it for review.
### Migrating resource to plugin framework
Ideally there shouldn't be any behaviour change when migrating a resource or data source to either the Go SDK or the Plugin Framework.
- Make sure there are no breaking schema differences by running `make diff-schema`.
- Integration tests shouldn't require any major changes.
### Code Organization
Each resource and data source should be defined in a package under `internal/providers/pluginfw/resources/<resource>`; e.g. the `internal/providers/pluginfw/resources/volume` package contains the volume resource, data sources, and other utilities specific to volumes. Tests (both unit and integration) also live in this package.
Note: only docs stay under the root `docs/` directory.
### Code Conventions
1. Make sure the resource or data source implements the correct interface:
```golang
var _ resource.ResourceWithConfigure = &QualityMonitorResource{}
var _ datasource.DataSourceWithConfigure = &VolumesDataSource{}
```
2. To obtain a Databricks client, use `(*common.DatabricksClient).GetWorkspaceClient()` or `(*common.DatabricksClient).GetAccountClient()` rather than calling the underlying `WorkspaceClient()` or `AccountClient()` functions directly.
3. Any method that returns only diagnostics should be called inline, appending its diagnostics to the response in the same statement. Example:
```golang
resp.Diagnostics.Append(req.Plan.Get(ctx, &monitorInfoTfSDK)...)
if resp.Diagnostics.HasError() {
return
}
```
is preferred over the following:
```golang
diags := req.Plan.Get(ctx, &monitorInfoTfSDK)
if diags.HasError() {
resp.Diagnostics.Append(diags...)
return
}
```
4. Any method returning an error should be immediately followed by appending that error to the diagnostics:
```golang
err := method()
if err != nil {
resp.Diagnostics.AddError("message", err.Error())
return
}
```
5. Any method that returns a value alongside diagnostics should likewise be immediately followed by appending those diagnostics to the response.
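The call shapes in conventions 3 and 4 can be illustrated with a self-contained toy model. Note that `Diagnostics` below is a simplified stand-in for the framework's `diag.Diagnostics` type, and `planGet`/`sdkCall` are invented placeholders for framework and SDK calls:

```golang
package main

import (
	"errors"
	"fmt"
)

// Diagnostics is a simplified stand-in for the framework's diag.Diagnostics.
type Diagnostics []string

func (d *Diagnostics) Append(more ...string) { *d = append(*d, more...) }
func (d Diagnostics) HasError() bool         { return len(d) > 0 }

// planGet models a method that returns only diagnostics (like req.Plan.Get).
func planGet() Diagnostics { return Diagnostics{`attribute "name" is required`} }

// sdkCall models a method that returns a plain error.
func sdkCall() error { return errors.New("quality monitor not found") }

// collectDiagnostics applies conventions 3 and 4 in one place.
func collectDiagnostics() Diagnostics {
	var resp struct{ Diagnostics Diagnostics }

	// Convention 3: call the diagnostics-returning method inline and
	// append its result in the same statement.
	resp.Diagnostics.Append(planGet()...)

	// Convention 4: an error return is immediately converted into a diagnostic.
	if err := sdkCall(); err != nil {
		resp.Diagnostics.Append("failed to read resource: " + err.Error())
	}
	return resp.Diagnostics
}

func main() {
	// Two diagnostics are collected: one from planGet, one from sdkCall.
	fmt.Println(len(collectDiagnostics()))
}
```

In a real resource method, each `HasError()` check would `return` early; the toy continues so both conventions appear in one function.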
## Debugging
@@ -287,7 +349,7 @@ func TestExampleResourceCreate(t *testing.T) {
```go
func TestAccSecretAclResource(t *testing.T) {
-	workspaceLevel(t, step{
+	WorkspaceLevel(t, Step{
Template: `
resource "databricks_group" "ds" {
display_name = "data-scientists-{var.RANDOM}"
```
4 changes: 4 additions & 0 deletions aws/data_aws_crossaccount_policy.go
@@ -103,6 +103,10 @@ func DataAwsCrossaccountPolicy() common.Resource {
// additional permissions for Databricks-managed VPC policy
if data.PolicyType == "managed" {
actions = append(actions, []string{
+			"ec2:AttachInternetGateway",
+			"ec2:AllocateAddress",
+			"ec2:AssociateDhcpOptions",
+			"ec2:AssociateRouteTable",
"ec2:CreateDhcpOptions",
"ec2:CreateInternetGateway",
"ec2:CreateNatGateway",
