---
subcategory: "Workspace"
---
This resource allows you to manage Databricks notebooks. You can also work with the databricks_notebook and databricks_notebook_paths data sources.
You can declare a Terraform-managed notebook by specifying the `source` attribute of a corresponding local file. Only the `.scala`, `.py`, `.sql`, and `.r` extensions are supported if you would like to omit the `language` attribute.
```hcl
data "databricks_current_user" "me" {
}

resource "databricks_notebook" "ddl" {
  source = "${path.module}/DDLgen.py"
  path   = "${data.databricks_current_user.me.home}/AA/BB/CC"
}
```
You can also create a managed notebook with inline source through the `content_base64` and `language` attributes:
```hcl
resource "databricks_notebook" "notebook" {
  content_base64 = base64encode(<<-EOT
    # created from ${abspath(path.module)}
    display(spark.range(10))
    EOT
  )
  path     = "/Shared/Demo"
  language = "PYTHON"
}
```
-> Note A notebook in the Databricks workspace is only changed if the Terraform state changed. This means that manual changes to a managed notebook won't be overwritten by Terraform if there's no local change to the notebook source. Notebooks are identified by their path, so renaming a notebook manually in the workspace and then applying the Terraform state would result in the notebook being recreated from the Terraform state.
The size of a notebook's source code must not exceed a few megabytes.

The following arguments are supported:
- `path` - (Required) The absolute path of the notebook or directory, beginning with "/", e.g. "/Demo".
- `source` - Path to a notebook in source code format on the local filesystem. Conflicts with `content_base64`.
- `content_base64` - The base64-encoded notebook source code. Conflicts with `source`. Use of `content_base64` is discouraged, as it increases the memory footprint of the Terraform state and should only be used in exceptional circumstances, like creating a notebook with configuration properties for a data pipeline.
- `language` - (required with `content_base64`) One of `SCALA`, `PYTHON`, `SQL`, `R`.
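As a sketch of the exceptional case mentioned above, a notebook carrying configuration properties for a data pipeline, a `content_base64` usage might look like the following (resource names, paths, and the config values are illustrative, not part of the provider API):

```hcl
# Hypothetical example: a generated notebook that embeds pipeline
# configuration rendered at plan time via jsonencode().
resource "databricks_notebook" "pipeline_config" {
  path     = "/Shared/pipeline-config"
  language = "PYTHON"
  content_base64 = base64encode(<<-EOT
    # generated by Terraform - do not edit by hand
    config = ${jsonencode({ input_path = "/mnt/raw", batch_size = 500 })}
    EOT
  )
}
```

Because the notebook body lives in state, keep such generated notebooks small.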
In addition to all arguments above, the following attributes are exported:
- `id` - Path of the notebook in the workspace.
- `url` - Routable URL of the notebook.
- `object_id` - Unique identifier for a NOTEBOOK.
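The exported attributes can be referenced elsewhere in the configuration, for example to surface the notebook's URL as an output (assuming the `databricks_notebook.ddl` resource from the first example above):

```hcl
output "ddl_notebook_url" {
  value = databricks_notebook.ddl.url
}
```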
- databricks_permissions can control which groups or individual users can access notebooks or folders.
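A minimal sketch of pairing this resource with databricks_permissions, granting a group read-only access to the notebook from the first example (the `users` group name is illustrative; see the databricks_permissions documentation for the available permission levels):

```hcl
resource "databricks_permissions" "notebook_usage" {
  notebook_path = databricks_notebook.ddl.path

  access_control {
    group_name       = "users"
    permission_level = "CAN_READ"
  }
}
```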
The resource notebook can be imported using the notebook path:

```bash
$ terraform import databricks_notebook.this /path/to/notebook
```
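With Terraform 1.5 and later, the same import can also be expressed declaratively with an `import` block instead of the CLI command (the resource address and path below mirror the CLI example):

```hcl
import {
  to = databricks_notebook.this
  id = "/path/to/notebook"
}
```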