Merge branch 'main' into feature/app
nkvuong authored Nov 14, 2024
2 parents 2d7b38c + 714e78c commit b733ee9
Showing 89 changed files with 2,994 additions and 1,462 deletions.
2 changes: 1 addition & 1 deletion .codegen/_openapi_sha
@@ -1 +1 @@
-cf9c61453990df0f9453670f2fe68e1b128647a2
+d25296d2f4aa7bd6195c816fdf82e0f960f775da
56 changes: 56 additions & 0 deletions .github/workflows/external-message.yml
@@ -0,0 +1,56 @@
name: PR Comment

# WARNING:
# THIS WORKFLOW ALWAYS RUNS FOR EXTERNAL CONTRIBUTORS WITHOUT ANY APPROVAL.
# THIS WORKFLOW RUNS FROM MAIN BRANCH, NOT FROM THE PR BRANCH.
# DO NOT PULL THE PR OR EXECUTE ANY CODE FROM THE PR.

on:
  pull_request_target:
    types: [opened, reopened, synchronize]
    branches:
      - main

jobs:
  comment-on-pr:
    runs-on: ubuntu-latest
    permissions:
      pull-requests: write

    steps:
      - uses: actions/checkout@v4

      - name: Delete old comments
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          # Delete previous comment if it exists
          previous_comment_ids=$(gh api "repos/${{ github.repository }}/issues/${{ github.event.pull_request.number }}/comments" \
            --jq '.[] | select(.body | startswith("<!-- INTEGRATION_TESTS_MANUAL -->")) | .id')
          echo "Previous comment IDs: $previous_comment_ids"
          # Iterate over each comment ID and delete the comment
          if [ ! -z "$previous_comment_ids" ]; then
            echo "$previous_comment_ids" | while read -r comment_id; do
              echo "Deleting comment with ID: $comment_id"
              gh api "repos/${{ github.repository }}/issues/comments/$comment_id" -X DELETE
            done
          fi

      - name: Comment on PR
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          COMMIT_SHA: ${{ github.event.pull_request.head.sha }}
        run: |
          gh pr comment ${{ github.event.pull_request.number }} --body \
          "<!-- INTEGRATION_TESTS_MANUAL -->
          If integration tests don't run automatically, an authorized user can run them manually by following the instructions below:
          Trigger:
          [go/deco-tests-run/terraform](https://go/deco-tests-run/terraform)
          Inputs:
          * PR number: ${{github.event.pull_request.number}}
          * Commit SHA: \`${{ env.COMMIT_SHA }}\`
          Checks will be approved automatically on success.
          "
21 changes: 20 additions & 1 deletion .github/workflows/integration-tests.yml
@@ -9,10 +9,29 @@ on:


 jobs:
+  check-token:
+    name: Check secrets access
+    runs-on: ubuntu-latest
+    environment: "test-trigger-is"
+    outputs:
+      has_token: ${{ steps.set-token-status.outputs.has_token }}
+    steps:
+      - name: Check if DECO_WORKFLOW_TRIGGER_APP_ID is set
+        id: set-token-status
+        run: |
+          if [ -z "${{ secrets.DECO_WORKFLOW_TRIGGER_APP_ID }}" ]; then
+            echo "DECO_WORKFLOW_TRIGGER_APP_ID is empty. User has no access to secrets."
+            echo "has_token=false" >> "$GITHUB_OUTPUT"
+          else
+            echo "DECO_WORKFLOW_TRIGGER_APP_ID is set. User has access to secrets."
+            echo "has_token=true" >> "$GITHUB_OUTPUT"
+          fi
+
   trigger-tests:
-    if: github.event_name == 'pull_request'
     name: Trigger Tests
     runs-on: ubuntu-latest
+    needs: check-token
+    if: github.event_name == 'pull_request' && needs.check-token.outputs.has_token == 'true'
     environment: "test-trigger-is"
 
     steps:
52 changes: 52 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,57 @@
# Version changelog

## [Release] Release v1.58.0

### Bug Fixes

* Always fill `cluster_name` in `databricks_cluster` data source ([#4197](https://github.com/databricks/terraform-provider-databricks/pull/4197)).
* Suppress equal fold diff for DLT pipeline resource ([#4196](https://github.com/databricks/terraform-provider-databricks/pull/4196)).
* Upload content `databricks_workspace_file` using raw format ([#4200](https://github.com/databricks/terraform-provider-databricks/pull/4200)).


### Internal Changes

* Update to latest OpenAPI spec and bump Go SDK ([#4199](https://github.com/databricks/terraform-provider-databricks/pull/4199)).


### Dependency Updates

* Bump github.com/golang-jwt/jwt/v4 from 4.5.0 to 4.5.1 ([#4191](https://github.com/databricks/terraform-provider-databricks/pull/4191)).


## [Release] Release v1.57.0

### New Features and Improvements

* Added `databricks_functions` data source ([#4154](https://github.com/databricks/terraform-provider-databricks/pull/4154)).


### Bug Fixes

* Handle edge case for `effective_properties` in `databricks_sql_table` ([#4153](https://github.com/databricks/terraform-provider-databricks/pull/4153)).
* Provide more prescriptive error when users fail to create a single node cluster ([#4168](https://github.com/databricks/terraform-provider-databricks/pull/4168)).


### Internal Changes

* Add test instructions for external contributors ([#4169](https://github.com/databricks/terraform-provider-databricks/pull/4169)).
* Always write message for manual test integration ([#4188](https://github.com/databricks/terraform-provider-databricks/pull/4188)).
* Make `Read` after `Create`/`Update` configurable ([#4190](https://github.com/databricks/terraform-provider-databricks/pull/4190)).
* Migrate Share Data Source to Plugin Framework ([#4161](https://github.com/databricks/terraform-provider-databricks/pull/4161)).
* Migrate Share Resource to Plugin Framework ([#4047](https://github.com/databricks/terraform-provider-databricks/pull/4047)).
* Rollout Plugin Framework ([#4134](https://github.com/databricks/terraform-provider-databricks/pull/4134)).


### Dependency Updates

* Bump Go SDK to v0.50.0 ([#4178](https://github.com/databricks/terraform-provider-databricks/pull/4178)).


### Exporter

* Allow matching resource names by regular expression ([#4177](https://github.com/databricks/terraform-provider-databricks/pull/4177)).


## [Release] Release v1.56.0

### Bug Fixes
17 changes: 17 additions & 0 deletions aws/constants.go
@@ -0,0 +1,17 @@
package aws

var AwsConfig = map[string]map[string]string{
	"aws": {
		"accountId":            "414351767826",
		"logDeliveryIamArn":    "arn:aws:iam::414351767826:role/SaasUsageDeliveryRole-prod-IAMRole-3PLHICCRR1TK",
		"unityCatalogueIamArn": "arn:aws:iam::414351767826:role/unity-catalog-prod-UCMasterRole-14S5ZJVKOTYTL",
	},
	"aws-us-gov": {
		"accountId":            "044793339203",
		"logDeliveryIamArn":    "arn:aws-us-gov:iam::044793339203:role/SaasUsageDeliveryRole-prod-aws-gov-IAMRole-L4QM0RCHYQ1G",
		"unityCatalogueIamArn": "arn:aws-us-gov:iam::044793339203:role/unity-catalog-prod-UCMasterRole-1QRFA8SGY15OJ",
	},
}

var AwsPartitions = []string{"aws", "aws-us-gov"}
var AwsPartitionsValidationError = "aws_partition must be either 'aws' or 'aws-us-gov'"
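As an aside, the per-partition constants above can be exercised with a small standalone sketch. The `partitionAccountARN` helper below is hypothetical (not part of the provider); it mirrors what the schema's `StringInSlice` validator plus the ARN formatting in the data sources accomplish together:

```go
package main

import (
	"fmt"
	"slices"
)

// Mirrors aws.AwsConfig above, trimmed to the field used here.
var awsConfig = map[string]map[string]string{
	"aws":        {"accountId": "414351767826"},
	"aws-us-gov": {"accountId": "044793339203"},
}

var awsPartitions = []string{"aws", "aws-us-gov"}

// partitionAccountARN is a hypothetical helper: it rejects unknown
// partitions, then builds the root-principal ARN for the Databricks
// account in that partition.
func partitionAccountARN(partition string) (string, error) {
	if !slices.Contains(awsPartitions, partition) {
		return "", fmt.Errorf("aws_partition must be either 'aws' or 'aws-us-gov'")
	}
	return fmt.Sprintf("arn:%s:iam::%s:root", partition, awsConfig[partition]["accountId"]), nil
}

func main() {
	for _, p := range awsPartitions {
		arn, _ := partitionAccountARN(p)
		fmt.Println(arn)
	}
}
```

Note how the partition string appears twice: once as the map key selecting the account ID, and once inside the ARN itself (`arn:aws-us-gov:…` vs `arn:aws:…`).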
26 changes: 19 additions & 7 deletions aws/data_aws_assume_role_policy.go
@@ -7,6 +7,7 @@ import (

 	"github.com/databricks/terraform-provider-databricks/common"
 	"github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema"
+	"github.com/hashicorp/terraform-plugin-sdk/v2/helper/validation"
 )

type awsIamPolicy struct {
@@ -31,6 +32,13 @@ func DataAwsAssumeRolePolicy() common.Resource {
 	return common.Resource{
 		Read: func(ctx context.Context, d *schema.ResourceData, m *common.DatabricksClient) error {
 			externalID := d.Get("external_id").(string)
+			awsPartition := d.Get("aws_partition").(string)
+			databricksAwsAccountId := d.Get("databricks_account_id").(string)
+
+			if databricksAwsAccountId == "" {
+				databricksAwsAccountId = AwsConfig[awsPartition]["accountId"]
+			}
+
 			policy := awsIamPolicy{
 				Version: "2012-10-17",
 				Statements: []*awsIamPolicyStatement{
@@ -43,16 +51,14 @@
 					},
 				},
 				Principal: map[string]string{
-					"AWS": fmt.Sprintf("arn:aws:iam::%s:root", d.Get("databricks_account_id").(string)),
+					"AWS": fmt.Sprintf("arn:%s:iam::%s:root", awsPartition, databricksAwsAccountId),
 				},
 			},
 		},
 	}
 	if v, ok := d.GetOk("for_log_delivery"); ok {
 		if v.(bool) {
-			// this is production UsageDelivery IAM role, that is considered a constant
-			logDeliveryARN := "arn:aws:iam::414351767826:role/SaasUsageDeliveryRole-prod-IAMRole-3PLHICCRR1TK"
-			policy.Statements[0].Principal["AWS"] = logDeliveryARN
+			policy.Statements[0].Principal["AWS"] = AwsConfig[awsPartition]["logDeliveryIamArn"]
 		}
 	}
 	policyJSON, err := json.MarshalIndent(policy, "", " ")
@@ -65,10 +71,16 @@
 			return nil
 		},
 		Schema: map[string]*schema.Schema{
+			"aws_partition": {
+				Type:         schema.TypeString,
+				Optional:     true,
+				ValidateFunc: validation.StringInSlice(AwsPartitions, false),
+				Default:      "aws",
+			},
 			"databricks_account_id": {
-				Type:     schema.TypeString,
-				Default:  "414351767826",
-				Optional: true,
+				Type:       schema.TypeString,
+				Optional:   true,
+				Deprecated: "databricks_account_id will be removed in the next major release.",
 			},
 			"for_log_delivery": {
 				Type: schema.TypeBool,
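The partition-aware trust-policy construction in the hunk above can be sketched in isolation. The struct and helper names below are illustrative stand-ins, not the provider's (unexported) types:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Trimmed stand-ins for the provider's policy structs.
type statement struct {
	Effect    string            `json:"Effect"`
	Action    string            `json:"Action"`
	Principal map[string]string `json:"Principal"`
}

type policy struct {
	Version   string       `json:"Version"`
	Statement []*statement `json:"Statement"`
}

// principalARN mirrors the provider's
// fmt.Sprintf("arn:%s:iam::%s:root", awsPartition, databricksAwsAccountId).
func principalARN(partition, accountID string) string {
	return fmt.Sprintf("arn:%s:iam::%s:root", partition, accountID)
}

// trustPolicyJSON builds a minimal assume-role trust policy and renders it,
// analogous to the json.MarshalIndent call in DataAwsAssumeRolePolicy.
func trustPolicyJSON(partition, accountID string) string {
	p := policy{
		Version: "2012-10-17",
		Statement: []*statement{{
			Effect:    "Allow",
			Action:    "sts:AssumeRole",
			Principal: map[string]string{"AWS": principalARN(partition, accountID)},
		}},
	}
	out, _ := json.MarshalIndent(p, "", "  ")
	return string(out)
}

func main() {
	fmt.Println(trustPolicyJSON("aws-us-gov", "044793339203"))
}
```

With partition `"aws-us-gov"` the principal becomes `arn:aws-us-gov:iam::044793339203:root`, which is why the test fixtures below expect different JSON lengths per partition.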
49 changes: 49 additions & 0 deletions aws/data_aws_assume_role_policy_test.go
@@ -19,3 +19,52 @@ func TestDataAwsAssumeRolePolicy(t *testing.T) {
	j := d.Get("json")
	assert.Lenf(t, j, 299, "Strange length for policy: %s", j)
}

func TestDataAwsAssumeRolePolicyGov(t *testing.T) {
	d, err := qa.ResourceFixture{
		Read:        true,
		Resource:    DataAwsAssumeRolePolicy(),
		NonWritable: true,
		ID:          ".",
		HCL: `
		aws_partition = "aws-us-gov"
		external_id = "abc"
		`,
	}.Apply(t)
	assert.NoError(t, err)
	j := d.Get("json")
	assert.Lenf(t, j, 306, "Strange length for policy: %s", j)
}

func TestDataAwsAssumeRolePolicyLogDelivery(t *testing.T) {
	d, err := qa.ResourceFixture{
		Read:        true,
		Resource:    DataAwsAssumeRolePolicy(),
		NonWritable: true,
		ID:          ".",
		HCL: `
		external_id = "abc"
		for_log_delivery = true
		`,
	}.Apply(t)
	assert.NoError(t, err)
	j := d.Get("json")
	assert.Lenf(t, j, 347, "Strange length for policy: %s", j)
}

func TestDataAwsAssumeRolePolicyLogDeliveryGov(t *testing.T) {
	d, err := qa.ResourceFixture{
		Read:        true,
		Resource:    DataAwsAssumeRolePolicy(),
		NonWritable: true,
		ID:          ".",
		HCL: `
		aws_partition = "aws-us-gov"
		external_id = "abc"
		for_log_delivery = true
		`,
	}.Apply(t)
	assert.NoError(t, err)
	j := d.Get("json")
	assert.Lenf(t, j, 362, "Strange length for policy: %s", j)
}
25 changes: 19 additions & 6 deletions aws/data_aws_bucket_policy.go
@@ -16,6 +16,13 @@ func DataAwsBucketPolicy() common.Resource {
 	return common.Resource{
 		Read: func(ctx context.Context, d *schema.ResourceData, c *common.DatabricksClient) error {
 			bucket := d.Get("bucket").(string)
+			awsPartition := d.Get("aws_partition").(string)
+			databricksAwsAccountId := d.Get("databricks_account_id").(string)
+
+			if databricksAwsAccountId == "" {
+				databricksAwsAccountId = AwsConfig[awsPartition]["accountId"]
+			}
+
 			policy := awsIamPolicy{
 				Version: "2012-10-17",
 				Statements: []*awsIamPolicyStatement{
@@ -30,11 +37,11 @@
 					"s3:GetBucketLocation",
 				},
 				Resources: []string{
-					fmt.Sprintf("arn:aws:s3:::%s/*", bucket),
-					fmt.Sprintf("arn:aws:s3:::%s", bucket),
+					fmt.Sprintf("arn:%s:s3:::%s/*", awsPartition, bucket),
+					fmt.Sprintf("arn:%s:s3:::%s", awsPartition, bucket),
 				},
 				Principal: map[string]string{
-					"AWS": fmt.Sprintf("arn:aws:iam::%s:root", d.Get("databricks_account_id").(string)),
+					"AWS": fmt.Sprintf("arn:%s:iam::%s:root", awsPartition, databricksAwsAccountId),
 				},
 			},
 		},
@@ -60,10 +67,16 @@
 			return nil
 		},
 		Schema: map[string]*schema.Schema{
+			"aws_partition": {
+				Type:         schema.TypeString,
+				Optional:     true,
+				ValidateFunc: validation.StringInSlice(AwsPartitions, false),
+				Default:      "aws",
+			},
 			"databricks_account_id": {
-				Type:     schema.TypeString,
-				Default:  "414351767826",
-				Optional: true,
+				Type:       schema.TypeString,
+				Optional:   true,
+				Deprecated: "databricks_account_id will be removed in the next major release.",
 			},
 			"databricks_e2_account_id": {
 				Type: schema.TypeString,
16 changes: 16 additions & 0 deletions aws/data_aws_bucket_policy_test.go
@@ -53,3 +53,19 @@ func TestDataAwsBucketPolicyConfusedDeputyProblem(t *testing.T) {
	j := d.Get("json")
	assert.Lenf(t, j, 575, "Strange length for policy: %s", j)
}

func TestDataAwsBucketPolicyPartitionGov(t *testing.T) {
	d, err := qa.ResourceFixture{
		Read:        true,
		Resource:    DataAwsBucketPolicy(),
		NonWritable: true,
		ID:          ".",
		HCL: `
		bucket = "abc"
		aws_partition = "aws-us-gov"
		`,
	}.Apply(t)
	assert.NoError(t, err)
	j := d.Get("json")
	assert.Lenf(t, j, 461, "Strange length for policy: %s", j)
}
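The partition-aware S3 resource ARNs that `DataAwsBucketPolicy` builds can likewise be sketched standalone (the `bucketARNs` helper name is illustrative, not part of the provider):

```go
package main

import "fmt"

// bucketARNs mirrors the two Resources entries built in DataAwsBucketPolicy:
// first all objects in the bucket ("/*" suffix), then the bucket itself.
func bucketARNs(partition, bucket string) []string {
	return []string{
		fmt.Sprintf("arn:%s:s3:::%s/*", partition, bucket),
		fmt.Sprintf("arn:%s:s3:::%s", partition, bucket),
	}
}

func main() {
	for _, arn := range bucketARNs("aws-us-gov", "abc") {
		fmt.Println(arn)
	}
}
```

For the GovCloud test fixture above (`bucket = "abc"`, `aws_partition = "aws-us-gov"`), these come out as `arn:aws-us-gov:s3:::abc/*` and `arn:aws-us-gov:s3:::abc`.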