[ISSUE] cannot read sql global config: failed to unmarshal response body: invalid character '<' looking for beginning of value. This is likely a bug in the Databricks SDK for Go or the underlying REST API. (may be duplicate of #959?)
#1096 · Closed
santiazgo1 opened this issue Nov 29, 2024 · 0 comments
Description
I cannot create or read a SQL Global Configuration in my Azure environment using Terraform. I want to apply the SQL global settings and then build a serverless SQL Warehouse within my Databricks Workspace. I have also tried deploying the SQL Endpoint Terraform resource without the global configuration, but that fails as well. The issue occurs during terraform apply; terraform plan succeeds and plans to create both resources.
Once the SQL Global Config resource is written to my .tfstate, the pipeline stops working: the apply fails at creation but still writes the resource block into the state file. I cannot destroy that resource, but if I remove the block from the .tfstate manually, the pipeline starts working again without any problems and keeps creating/reading other Databricks provider resources.
I can successfully deploy the Warehouse instance from the Databricks UI, so it is actually supported by the Workspace. The rest of my Databricks resources are created and read as expected with this same provider (e.g. NCCs and their private endpoint rules).
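For reference, the manual .tfstate surgery described above can also be done with Terraform's own state command instead of hand-editing the file; a minimal sketch, assuming the resource address shown in the plan below:

terraform state rm 'module.data_platform.databricks_sql_global_config.dataptfm_dbr_global_config'

This only forgets the resource from state and does not call the provider, which matches the observed behaviour: the pipeline unblocks even though the resource itself can neither be read nor destroyed.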
Reproduction
tf plan:
Terraform will perform the following actions:
# module.data_platform.databricks_sql_endpoint.dataptfm_dbr_sql_endpoint will be created
+ resource "databricks_sql_endpoint" "dataptfm_dbr_sql_endpoint" {
+ auto_stop_mins = 30
+ cluster_size = "2X-Small"
+ creator_name = (known after apply)
+ data_source_id = (known after apply)
+ enable_photon = true
+ enable_serverless_compute = true
+ health = (known after apply)
+ id = (known after apply)
+ jdbc_url = (known after apply)
+ max_num_clusters = 2
+ min_num_clusters = 1
+ name = "dataprfm-dbr-serverless-sql-endpoint"
+ num_active_sessions = (known after apply)
+ num_clusters = (known after apply)
+ odbc_params = (known after apply)
+ spot_instance_policy = "COST_OPTIMIZED"
+ state = (known after apply)
+ warehouse_type = "PRO"
}
# module.data_platform.databricks_sql_global_config.dataptfm_dbr_global_config will be created
+ resource "databricks_sql_global_config" "dataptfm_dbr_global_config" {
+ enable_serverless_compute = true
+ id = (known after apply)
+ security_policy = "DATA_ACCESS_CONTROL"
}
Plan: 2 to add, 0 to change, 0 to destroy.
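For completeness, a minimal configuration sketch that would produce the plan above. The resource names and argument values are taken from the plan output; the module wrapper, variable wiring, and the explicit ordering between the two resources are assumptions:

resource "databricks_sql_global_config" "dataptfm_dbr_global_config" {
  security_policy           = "DATA_ACCESS_CONTROL"
  enable_serverless_compute = true
}

resource "databricks_sql_endpoint" "dataptfm_dbr_sql_endpoint" {
  name                      = "dataprfm-dbr-serverless-sql-endpoint"
  cluster_size              = "2X-Small"
  min_num_clusters          = 1
  max_num_clusters          = 2
  auto_stop_mins            = 30
  enable_photon             = true
  enable_serverless_compute = true
  spot_instance_policy      = "COST_OPTIMIZED"
  warehouse_type            = "PRO"

  # Assumed ordering: apply the global config before creating the serverless warehouse.
  depends_on = [databricks_sql_global_config.dataptfm_dbr_global_config]
}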
Expected behavior
The SQL Global Config settings are created and applied successfully for my Databricks environment, and the serverless SQL Warehouse instance is created in the Workspace.
Debug Logs
tf apply:
│ Error: cannot read sql global config: failed to unmarshal response body: invalid character '<' looking for beginning of value. This is likely a bug in the Databricks SDK for Go or the underlying REST API. Please report this issue with the following debugging information to the SDK issue tracker at https://github.com/databricks/databricks-sdk-go/issues. Request log:
│
│ GET /login?next_url=/api/2.0/sql/config/warehouses
│ > * Host:
│ > * Accept: application/json
│ > * Authorization: REDACTED
│ > * Referer: https://accounts.azuredatabricks.net/api/2.0/sql/config/warehouses
│ > * Traceparent: 00-34a0e23596fad1afa298e44bf25a476f-af841ca01d30361c-01
│ > * User-Agent: databricks-tf-provider/1.58.0 databricks-sdk-go/0.51.0 go/1.22.8 os/linux terraform/1.10.0 sdk/sdkv2 resource/sql_global_config auth/azure-client-secret cicd/bitbucket
│ > * X-Databricks-Azure-Sp-Management-Token: REDACTED
│ < HTTP/2.0 200 OK
│ < * Cache-Control: no-cache, no-store, must-revalidate
│ < * Content-Type: text/html; charset=utf-8
│ < * Date: Fri, 29 Nov 2024 14:21:00 GMT
│ < * Server: databricks
│ < * Set-Cookie: enable-armeria-server-for-ui-flags=false; Max-Age=1800; Expires=Fri, 29 Nov 2024 14:51:00 GMT; Secure; HTTPOnly; SameSite=Strict
│ < * Set-Cookie: enable-armeria-workspace-server-for-ui-flags=true; Max-Age=1800; Expires=Fri, 29 Nov 2024 14:51:00 GMT; Secure; HTTPOnly; SameSite=Strict
│ < * Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
│ < * X-Content-Type-Options: nosniff
│ < * X-Request-Id: 9596aa77-d439-4fb8-b3c2-f7f221224ee1
│ < * X-Ui-Svc: true
│ < <!doctype html>
│ < <html lang="en">
│ < <head>
│ < <meta charset="utf-8">
│ < <meta name="viewport" content="width=device-width,initial-scale=1">
│ < <meta name="theme-color" content="#000000">
│ < <meta name="description" content="Databricks Sign in">
│ < <title>Databricks - Sign in</title>
│ < <script>window.__DATABRICKS_CONFIG__={"isCuttingEdge":false,"publicPath":{"accounts-console":"https://databricks-ui-assets.azureedge.net/"},"isSpogDomain":false,"enablePrPreview":"","enableGitBisect":""}</script>
│ < <link rel="icon" href="https://databricks-ui-assets.azureedge.net/favicon.ico">
│ < <script defer src="https://databricks-ui-assets.azureedge.net/static/js/6449.41bde32f.js"></script>
│ < <script defer src="https://databricks-ui-assets.azureedge.net/static/js/4518.82915a87.js"></script>
│ < <script defer src="https://databricks-ui-assets.azureedge.net/static/js/1040.2529a837.js"></script>
│ < <script>
│ < function setNoCdnAndReload() {
│ < document.cookie = `x-databricks-cdn-inaccessible=true; path=/; max-age=86400`;
│ < const metric = 'cdnFallbackOccurred';
│ < const browserUserAgent = navigator.userAgent;
│ < const browserTabId = window.browserTabId;
│ < const performanceEntry = performance.getEntriesByType('resource').filter(e => e.initiatorType === 'script').slice(-1)[0]
│ < sessionStorage.setItem('databricks-cdn-fallback-telemetry-key', JSON.stringify({ tags: { browserUserAgent, browserTabId }, performanceEntry}));
│ < window.location.reload();
│ < }
│ < </script>
│ < <script>
│ < // Set a manual timeout for dropped packets to CDN
│ < function loadScriptWithTimeout(src, timeout) {
│ < return new Promise((resolve, reject) => {
│ < const script = document.createElement('script');
│ < script.defer = true;
│ < script.src = src;
│ < script.onload = resolve;
│ < script.onerror = reject;
│ < document.head.appendChild(script);
│ < setTimeout(() => {
│ < reject(new Error('Script load timeout'));
│ < }, timeout);
│ < });
│ < }
│ < loadScriptWithTimeout('https://databricks-ui-assets.azureedge.net/static/js/login.be675ee8.js', 10000).catch(setNoCdnAndReload);
│ < </script>
│ < <link href="https://databricks-ui-assets.azureedge.net/static/css/6449.31dcf9da.css" rel="stylesheet">
│ < <link href="https://databricks-ui-assets.azureedge.net/static/css/1040.e8881155.css" rel="stylesheet">
│ < <link href="https://databricks-ui-assets.azureedge.net/static/css/login.cf7f0b55.css" rel="stylesheet">
│ < </head>
│ < <body>
│ < <noscript>
│ < You need to enable JavaScript to run this app.
│ < </noscript>
│ < <div id="login"></div>
│ < <script>const telemetryEndpoint="/telemetry-unauth?t=",uiModuleName="accountsConsoleLogin";function shouldIgnoreError(e){return!1}function generateUuidV4(){const e=window.crypto?.randomUUID?.();return e||"xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx".replace(/[xy]/g,(e=>{const n=16*Math.random()|0;return("x"===e?n:3&n|8).toString(16)}))}function networkConnectivityTags(){const e=window.navigator.onLine,n=window.navigator.connection?.rtt??-1,t=window.navigator.connection?.downlink??-1;return{browserNavigatorOnline:e,browserConnectionEstimatedRtt:n,browserConnectionEstimatedDownlink:t,browserConnected:e&&n>0&&t>0}}function getParams(e){let n;const t=/\+/g,o=/([^&=]+)=?([^&]*)/g,r=function(e){return decodeURIComponent(e.replace(t," "))},i={};for(n=o.exec(e);n;)i[r(n[1])]=r(n[2]),n=o.exec(e);return i}function getWorkspaceParam(){const e=getParams(window.location.search.substr(1)),n=/^\d+/.exec(e?.o??"");return null===n?void 0:n[0]}function createTelemetryRequestBody(e,n={},t=null){const o=Date.now(),r={eventId:generateUuidV4(),metric:e,tags:{...n,...networkConnectivityTags(),browserTabId:window.browserTabId,browserUserAgent:navigator.userAgent},ts:o};return t&&(r.blob=t),JSON.stringify({uploadTime:o,items:[JSON.stringify(r)]})}function recordTelemetry(e,n={},t=""){const o=getWorkspaceParam(),r=o?{"Content-Type":"application/json","X-Databricks-Org-Id":o.toString()}:{"Content-Type":"application/json"},i={method:"POST",credentials:"include",body:createTelemetryRequestBody(e,n,t),headers:r};fetch(telemetryEndpoint+Date.now(),i)}window.__databricks_networkConnectivityTags=networkConnectivityTags,Object.defineProperty(window,"browserTabId",{value:generateUuidV4()}),window.recordTelemetry=recordTelemetry,recordTelemetry("uiInit",{uiModule:uiModuleName,eventId:"init",eventClientSource:uiModuleName,eventType:"init"});let logCount=0;function error_handler(e,n,t,o,r){logCount++>4||shouldIgnoreError(e)||recordTelemetry("uncaughtJsException",{eventType:"jsExceptionV3",jsExceptionMessage:e,jsExceptionSource:n,jsExceptionLineno:t,jsExceptionColno:o,jsExceptionBeforeInit:!0},r&&r.stack&&r.stack.toString())}function sendBeaconOnPageExit(e){if(navigator.sendBeacon){const n=e&&e.type||"unknown",t=(Date.now(),createTelemetryRequestBody("uiInit",{eventType:"pageExitBeforeAppInitComplete",eventName:n,eventClientSource:uiModuleName}));navigator.sendBeacon(telemetryEndpoint+Date.now(),t)}}window.onerror=error_handler,window.onunhandledrejection=function(e){error_handler(String(e.reason),null,null,null,e.reason)},window.addEventListener("beforeunload",sendBeaconOnPageExit),window.addEventListener("unload",sendBeaconOnPageExit),window.addEventListener("pagehide",sendBeaconOnPageExit),window.cleanupAfterAppInit=()=>{window.removeEventListener("beforeunload",sendBeaconOnPageExit),window.removeEventListener("unload",sendBeaconOnPageExit),window.removeEventListener("pagehide",sendBeaconOnPageExit)}</script>
│ < </body>
│ < </html>
│
Other Information
Additional context
I've seen similar error messages in #1082 and #959, but neither deals with this specific resource type.