common

package
v1.41.0
Published: Apr 24, 2024 License: Apache-2.0 Imports: 23 Imported by: 0

Documentation

Index

Constants

This section is empty.

Variables

var (

	// ResourceName is the resource name without the databricks_ prefix
	ResourceName contextKey = 1
	// Provider is the current instance of the provider
	Provider contextKey = 2
	// Current is the current name of the integration test
	Current contextKey = 3
	// IsData indicates whether the current resource is a data source
	IsData contextKey = 4
	// Api is the API version
	Api contextKey = 5
)
var NoAuth string = "default auth: cannot configure default credentials, " +
	"please check https://docs.databricks.com/en/dev-tools/auth.html#databricks-client-unified-authentication " +
	"to configure credentials for your preferred authentication method"

Functions

func AddAccountIdField added in v1.35.0

func AddAccountIdField(s map[string]*schema.Schema) map[string]*schema.Schema

func AddContextToAllResources

func AddContextToAllResources(p *schema.Provider, prefix string)

AddContextToAllResources ...

func DataToReflectValue

func DataToReflectValue(d *schema.ResourceData, s map[string]*schema.Schema, rv reflect.Value) error

DataToReflectValue populates the given reflect value from resource data

func DataToStructPointer

func DataToStructPointer(d *schema.ResourceData, scm map[string]*schema.Schema, result any)

DataToStructPointer reads resource data with the given schema onto the result pointer. Panics on error.
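A minimal sketch of decoding resource state into a struct, assuming d is the *schema.ResourceData passed to a handler; the struct name is illustrative and the schema is derived from the same struct:

type sampleSettings struct {
	Name string `json:"name"`
}

s := common.StructToSchema(sampleSettings{}, common.NoCustomize)
var settings sampleSettings
common.DataToStructPointer(d, s, &settings) // panics if decoding fails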

func DiffToStructPointer

func DiffToStructPointer(d attributeGetter, scm map[string]*schema.Schema, result any)

DiffToStructPointer reads a resource diff with the given schema onto the result pointer. Panics on error.

func EqualFoldDiffSuppress added in v1.14.3

func EqualFoldDiffSuppress(k, old, new string, d *schema.ResourceData) bool

func GetTerraformVersionFromContext added in v1.34.0

func GetTerraformVersionFromContext(ctx context.Context) string

func IsExporter added in v1.34.0

func IsExporter(ctx context.Context) bool

func IsRequestEmpty added in v1.32.0

func IsRequestEmpty(v any) (bool, error)

func MustCompileKeyRE

func MustCompileKeyRE(name string) *regexp.Regexp

func MustInt64 added in v1.41.0

func MustInt64(s string) int64

func MustSchemaMap added in v1.40.0

func MustSchemaMap(s map[string]*schema.Schema, path ...string) map[string]*schema.Schema

func MustSchemaPath

func MustSchemaPath(s map[string]*schema.Schema, path ...string) *schema.Schema

func NoCustomize added in v1.29.0

func NoCustomize(m map[string]*schema.Schema) map[string]*schema.Schema

func OwnerRollbackError added in v1.33.0

func OwnerRollbackError(err error, rollbackErr error, oldOwner string, newOwner string) error

func RegisterResourceProvider added in v1.39.0

func RegisterResourceProvider(v any, r ResourceProvider)

RegisterResourceProvider registers a ResourceProvider for a given struct into resourceProviderRegistry. This function should be called from the init() function of packages that define a ResourceProvider. Example:

func init() {
	common.RegisterResourceProvider(jobs.JobSettings{}, JobSettingsResource{})
}

func SchemaMap added in v1.40.0

func SchemaMap(s map[string]*schema.Schema, path ...string) (map[string]*schema.Schema, error)

func SchemaPath

func SchemaPath(s map[string]*schema.Schema, path ...string) (*schema.Schema, error)

SchemaPath navigates a schema map along the given path and returns the schema found there, or an error.

func SetDefault added in v1.34.0

func SetDefault(v *schema.Schema, value any)

SetDefault sets the default value for a schema.

func SetForceSendFields added in v1.35.0

func SetForceSendFields(req any, d attributeGetter, fields []string)

SetForceSendFields adds any fields specified in the `fields` parameter to the ForceSendFields field of the request structure if they are present in the resource state. The provided fields must match the JSON tag of some field in the request structure. This ensures that fields explicitly set to the zero value of their type (e.g. `0` for an `int`) are serialized and sent to the platform.

This function requires that the request structure has a `ForceSendFields` field of type `[]string`. If not, it panics with an appropriate error message.
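For illustration, a hedged sketch of forcing zero-valued fields into an update request, assuming d is the resource data handed to the handler; the request type, clusterSchema variable, and field names are assumptions, not taken from this package:

var req compute.EditCluster
common.DataToStructPointer(d, clusterSchema, &req)
// Make sure explicitly configured zero values (e.g. num_workers = 0) are still serialized.
common.SetForceSendFields(&req, d, []string{"num_workers", "autotermination_minutes"})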

func SetReadOnly added in v1.34.0

func SetReadOnly(v *schema.Schema)

SetReadOnly sets the schema to be read-only (i.e. computed, non-optional). This should be used for fields that are not user-configurable but are returned by the platform.

func SetRequired added in v1.34.0

func SetRequired(v *schema.Schema)

SetRequired sets the schema to be required.

func StringIsUUID added in v1.28.0

func StringIsUUID(s string) bool

func StructToData

func StructToData(result any, s map[string]*schema.Schema, d *schema.ResourceData) error

StructToData writes the result onto resource data using the given schema

func StructToSchema

func StructToSchema(v any, customize func(map[string]*schema.Schema) map[string]*schema.Schema) map[string]*schema.Schema

StructToSchema builds a schema from a struct type and applies customizations from the given callback
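A minimal sketch, assuming a struct annotated with json and tf tags (names are illustrative):

type sampleConfig struct {
	Name    string `json:"name"`
	Enabled bool   `json:"enabled,omitempty" tf:"computed"`
}

s := common.StructToSchema(sampleConfig{}, func(m map[string]*schema.Schema) map[string]*schema.Schema {
	m["name"].ForceNew = true // customizations happen inside the callback
	return m
})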

func SuppressDiffWhitespaceChange added in v1.37.0

func SuppressDiffWhitespaceChange(k, old, new string, d *schema.ResourceData) bool

func Version

func Version() string

Version returns the version of the provider

Types

type ApiVersion

type ApiVersion string
const (
	API_1_2 ApiVersion = "1.2"
	API_2_0 ApiVersion = "2.0"
	API_2_1 ApiVersion = "2.1"
)

type BindResource

type BindResource struct {
	ReadContext   func(ctx context.Context, left, right string, c *DatabricksClient) error
	CreateContext func(ctx context.Context, left, right string, c *DatabricksClient) error
	DeleteContext func(ctx context.Context, left, right string, c *DatabricksClient) error
}

BindResource defines a resource with simplified functions

type CommandExecutor

type CommandExecutor interface {
	Execute(clusterID, language, commandStr string) CommandResults
}

CommandExecutor creates a Spark context, executes a command, and then closes the context

type CommandMock

type CommandMock func(commandStr string) CommandResults

CommandMock mocks the execution of command

type CommandResults

type CommandResults struct {
	ResultType   string `json:"resultType,omitempty"`
	Summary      string `json:"summary,omitempty"`
	Cause        string `json:"cause,omitempty"`
	Data         any    `json:"data,omitempty"`
	Schema       any    `json:"schema,omitempty"`
	Truncated    bool   `json:"truncated,omitempty"`
	IsJSONSchema bool   `json:"isJsonSchema,omitempty"`
	// contains filtered or unexported fields
}

CommandResults captures results of a command

func (*CommandResults) Err

func (cr *CommandResults) Err() error

Err returns the command execution error, if any

func (*CommandResults) Error

func (cr *CommandResults) Error() string

Error returns the error in a friendlier way

func (*CommandResults) Failed

func (cr *CommandResults) Failed() bool

Failed reports whether the command execution failed

func (*CommandResults) Scan

func (cr *CommandResults) Scan(dest ...any) bool

Scan scans the results into the given destination pointers

func (*CommandResults) Text

func (cr *CommandResults) Text() string

Text returns plain text results

type CustomizableSchema added in v1.36.0

type CustomizableSchema struct {
	Schema *schema.Schema
	// contains filtered or unexported fields
}

func CustomizeSchemaPath added in v1.36.0

func CustomizeSchemaPath(s map[string]*schema.Schema, path ...string) *CustomizableSchema

func (*CustomizableSchema) AddNewField added in v1.36.0

func (s *CustomizableSchema) AddNewField(key string, newField *schema.Schema) *CustomizableSchema

func (*CustomizableSchema) GetSchemaMap added in v1.41.0

func (s *CustomizableSchema) GetSchemaMap() map[string]*schema.Schema

func (*CustomizableSchema) SchemaPath added in v1.41.0

func (s *CustomizableSchema) SchemaPath(path ...string) *CustomizableSchema

func (*CustomizableSchema) SetAtLeastOneOf added in v1.36.0

func (s *CustomizableSchema) SetAtLeastOneOf(value []string) *CustomizableSchema

func (*CustomizableSchema) SetComputed added in v1.36.0

func (s *CustomizableSchema) SetComputed() *CustomizableSchema

func (*CustomizableSchema) SetConflictsWith added in v1.36.0

func (s *CustomizableSchema) SetConflictsWith(value []string) *CustomizableSchema

func (*CustomizableSchema) SetCustomSuppressDiff added in v1.36.0

func (s *CustomizableSchema) SetCustomSuppressDiff(suppressor func(k, old, new string, d *schema.ResourceData) bool) *CustomizableSchema

func (*CustomizableSchema) SetDefault added in v1.36.0

func (s *CustomizableSchema) SetDefault(value any) *CustomizableSchema

func (*CustomizableSchema) SetDeprecated added in v1.36.0

func (s *CustomizableSchema) SetDeprecated(reason string) *CustomizableSchema

func (*CustomizableSchema) SetExactlyOneOf added in v1.36.0

func (s *CustomizableSchema) SetExactlyOneOf(value []string) *CustomizableSchema

func (*CustomizableSchema) SetForceNew added in v1.36.0

func (s *CustomizableSchema) SetForceNew() *CustomizableSchema

func (*CustomizableSchema) SetMaxItems added in v1.36.0

func (s *CustomizableSchema) SetMaxItems(value int) *CustomizableSchema

func (*CustomizableSchema) SetMinItems added in v1.36.0

func (s *CustomizableSchema) SetMinItems(value int) *CustomizableSchema

func (*CustomizableSchema) SetOptional added in v1.36.0

func (s *CustomizableSchema) SetOptional() *CustomizableSchema

func (*CustomizableSchema) SetReadOnly added in v1.36.0

func (s *CustomizableSchema) SetReadOnly() *CustomizableSchema

SetReadOnly sets the schema to be read-only (i.e. computed, non-optional). This should be used for fields that are not user-configurable but are returned by the platform.

func (*CustomizableSchema) SetRequired added in v1.36.0

func (s *CustomizableSchema) SetRequired() *CustomizableSchema

SetRequired sets the schema to be required.

func (*CustomizableSchema) SetRequiredWith added in v1.36.0

func (s *CustomizableSchema) SetRequiredWith(value []string) *CustomizableSchema

func (*CustomizableSchema) SetSensitive added in v1.36.0

func (s *CustomizableSchema) SetSensitive() *CustomizableSchema

func (*CustomizableSchema) SetSliceSet added in v1.40.0

func (s *CustomizableSchema) SetSliceSet() *CustomizableSchema

func (*CustomizableSchema) SetSuppressDiff added in v1.36.0

func (s *CustomizableSchema) SetSuppressDiff() *CustomizableSchema

func (*CustomizableSchema) SetSuppressDiffWithDefault added in v1.40.0

func (s *CustomizableSchema) SetSuppressDiffWithDefault(dv any) *CustomizableSchema

SetSuppressDiffWithDefault suppresses the diff if the new value (i.e. the value from the HCL config) is not set and the old value (i.e. the value from state / platform) is equal to the default value.

Often Databricks HTTP APIs will return values for fields that were not set by the author in their Terraform configuration. This function allows us to suppress the diff in these cases.
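For example, a sketch that suppresses the diff for a server-supplied default, assuming s is an existing schema map that contains a timeout_seconds attribute; the path and default value are illustrative:

// Suppress the diff when the API returns the default of 120 and the field is unset in HCL.
common.CustomizeSchemaPath(s, "timeout_seconds").SetSuppressDiffWithDefault(120)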

func (*CustomizableSchema) SetValidateDiagFunc added in v1.36.0

func (s *CustomizableSchema) SetValidateDiagFunc(validate func(interface{}, cty.Path) diag.Diagnostics) *CustomizableSchema

func (*CustomizableSchema) SetValidateFunc added in v1.36.0

func (s *CustomizableSchema) SetValidateFunc(validate func(interface{}, string) ([]string, []error)) *CustomizableSchema

type DatabricksClient

type DatabricksClient struct {
	*client.DatabricksClient
	// contains filtered or unexported fields
}

DatabricksClient holds properties needed for authentication and HTTP client setup. Fields with `name` struct tags become Terraform provider attributes. The `env` struct tag can hold one or more comma-separated environment variable names to look up a value if it is not specified directly. The `auth` struct tag describes the type of conflicting authentication used.

func CommonEnvironmentClient

func CommonEnvironmentClient() *DatabricksClient

func (*DatabricksClient) AccountClient added in v1.21.0

func (c *DatabricksClient) AccountClient() (*databricks.AccountClient, error)

func (*DatabricksClient) AccountClientWithAccountIdFromConfig added in v1.35.0

func (c *DatabricksClient) AccountClientWithAccountIdFromConfig(d *schema.ResourceData) (*databricks.AccountClient, error)

func (*DatabricksClient) AccountClientWithAccountIdFromPair added in v1.35.0

func (c *DatabricksClient) AccountClientWithAccountIdFromPair(d *schema.ResourceData, p *Pair) (*databricks.AccountClient, string, error)

func (*DatabricksClient) AccountOrWorkspaceRequest added in v1.24.0

func (c *DatabricksClient) AccountOrWorkspaceRequest(accCallback func(*databricks.AccountClient) error, wsCallback func(*databricks.WorkspaceClient) error) error

func (*DatabricksClient) ClientForHost

func (c *DatabricksClient) ClientForHost(ctx context.Context, url string) (*DatabricksClient, error)

ClientForHost creates a new DatabricksClient instance with the same auth parameters, but for the given host. Authentication has to be reinitialized, as Google OIDC has different authorizers depending on whether we're talking to the workspace or the Accounts API.

func (*DatabricksClient) CommandExecutor

func (c *DatabricksClient) CommandExecutor(ctx context.Context) CommandExecutor

CommandExecutor service

func (*DatabricksClient) Delete

func (c *DatabricksClient) Delete(ctx context.Context, path string, request any) error

Delete on path. Ignores successful responses from the server.

func (*DatabricksClient) DeleteWithResponse added in v1.30.0

func (c *DatabricksClient) DeleteWithResponse(ctx context.Context, path string, request any, response any) error

Delete on path. Deserializes the response into the response parameter.

func (*DatabricksClient) FormatURL

func (c *DatabricksClient) FormatURL(strs ...string) string

FormatURL creates a URL from the client host and additional strings

func (*DatabricksClient) Get

func (c *DatabricksClient) Get(ctx context.Context, path string, request any, response any) error

Get on path
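A hedged sketch of a raw GET call via the client, assuming c is a *common.DatabricksClient and ctx a context; the path and response shape are illustrative:

var response map[string]any
err := c.Get(ctx, "/clusters/list", nil, &response)
if err != nil {
	return err
}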

func (*DatabricksClient) GetAzureJwtProperty

func (aa *DatabricksClient) GetAzureJwtProperty(key string) (any, error)

func (*DatabricksClient) IsAws

func (c *DatabricksClient) IsAws() bool

IsAws returns true if client is configured for AWS

func (*DatabricksClient) IsAzure

func (c *DatabricksClient) IsAzure() bool

IsAzure returns true if client is configured for Azure Databricks - either by using AAD auth or with host+token combination

func (*DatabricksClient) IsGcp

func (c *DatabricksClient) IsGcp() bool

IsGcp returns true if client is configured for GCP

func (*DatabricksClient) Patch

func (c *DatabricksClient) Patch(ctx context.Context, path string, request any) error

Patch on path. Ignores successful responses from the server.

func (*DatabricksClient) PatchWithResponse added in v1.30.0

func (c *DatabricksClient) PatchWithResponse(ctx context.Context, path string, request any, response any) error

Patch on path. Deserializes the response into the response parameter.

func (*DatabricksClient) Post

func (c *DatabricksClient) Post(ctx context.Context, path string, request any, response any) error

Post on path

func (*DatabricksClient) Put

func (c *DatabricksClient) Put(ctx context.Context, path string, request any) error

Put on path

func (*DatabricksClient) Scim

func (c *DatabricksClient) Scim(ctx context.Context, method, path string, request any, response any) error

Scim performs an HTTP request with SCIM headers set

func (*DatabricksClient) SetAccountClient added in v1.35.0

func (c *DatabricksClient) SetAccountClient(a *databricks.AccountClient)

Set the cached account client.

func (*DatabricksClient) SetWorkspaceClient added in v1.35.0

func (c *DatabricksClient) SetWorkspaceClient(w *databricks.WorkspaceClient)

Set the cached workspace client.

func (*DatabricksClient) WithCommandExecutor

func (c *DatabricksClient) WithCommandExecutor(cef func(context.Context, *DatabricksClient) CommandExecutor)

WithCommandExecutor sets command executor implementation to use

func (*DatabricksClient) WithCommandMock

func (c *DatabricksClient) WithCommandMock(mock CommandMock)

WithCommandMock mocks all command executions for this client
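A minimal sketch for tests, assuming client is the *common.DatabricksClient under test; the mock returns a canned result for every executed command:

client.WithCommandMock(func(commandStr string) common.CommandResults {
	return common.CommandResults{
		ResultType: "text",
		Data:       "mocked output",
	}
})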

func (*DatabricksClient) WorkspaceClient added in v1.10.0

func (c *DatabricksClient) WorkspaceClient() (*databricks.WorkspaceClient, error)

type Pair

type Pair struct {
	// contains filtered or unexported fields
}

Pair defines an ID pair

func NewPairID

func NewPairID(left, right string) *Pair

NewPairID creates new ID pair

func NewPairSeparatedID

func NewPairSeparatedID(left, right, separator string) *Pair

NewPairSeparatedID creates new ID pair with a custom separator

func (*Pair) BindResource

func (p *Pair) BindResource(pr BindResource) Resource

BindResource creates a resource that relies on a binding ID pair, with a simple schema and importer
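A hedged sketch of a binding resource built from an ID pair; the attribute names and the API calls hinted at in the comments are illustrative:

p := common.NewPairID("cluster_id", "policy_id")
return p.BindResource(common.BindResource{
	CreateContext: func(ctx context.Context, clusterID, policyID string, c *common.DatabricksClient) error {
		// attach the policy to the cluster via the API
		return nil
	},
	ReadContext: func(ctx context.Context, clusterID, policyID string, c *common.DatabricksClient) error {
		// verify the binding still exists
		return nil
	},
	DeleteContext: func(ctx context.Context, clusterID, policyID string, c *common.DatabricksClient) error {
		// detach the policy from the cluster
		return nil
	},
})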

func (*Pair) Pack

func (p *Pair) Pack(d *schema.ResourceData)

Pack packs the two data attributes into the resource ID

func (*Pair) Schema

func (p *Pair) Schema(do func(map[string]*schema.Schema) map[string]*schema.Schema) *Pair

Schema sets custom schema

func (*Pair) Unpack

func (p *Pair) Unpack(d *schema.ResourceData) (string, string, error)

Unpack splits the ID into two strings and sets the corresponding data attributes

type RecursiveResourceProvider added in v1.39.0

type RecursiveResourceProvider interface {
	ResourceProvider
	MaxDepthForTypes() map[string]int
}

Interface for ResourceProvider instances that have recursive references in their schema. The function MaxDepthForTypes allows us to specify the maximum recursion depth for a specific field

Example:

func (JobSettings) MaxDepthForTypes() map[string]int {
    return map[string]int{"for_each_task": 2}
}

type Resource

type Resource struct {
	Create             func(ctx context.Context, d *schema.ResourceData, c *DatabricksClient) error
	Read               func(ctx context.Context, d *schema.ResourceData, c *DatabricksClient) error
	Update             func(ctx context.Context, d *schema.ResourceData, c *DatabricksClient) error
	Delete             func(ctx context.Context, d *schema.ResourceData, c *DatabricksClient) error
	CustomizeDiff      func(ctx context.Context, d *schema.ResourceDiff) error
	StateUpgraders     []schema.StateUpgrader
	Schema             map[string]*schema.Schema
	SchemaVersion      int
	Timeouts           *schema.ResourceTimeout
	DeprecationMessage string
	Importer           *schema.ResourceImporter
}

Resource aims to simplify things like error handling and the handling of deleted entities
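A hedged sketch of assembling a Resource and converting it for the Terraform plugin SDK, reusing the hypothetical sampleConfig struct from the StructToSchema example above; handler bodies are omitted and names are illustrative:

func ResourceSample() common.Resource {
	return common.Resource{
		Schema: common.StructToSchema(sampleConfig{}, common.NoCustomize),
		Create: func(ctx context.Context, d *schema.ResourceData, c *common.DatabricksClient) error {
			// create the object via the API, then call d.SetId(...)
			return nil
		},
		Read: func(ctx context.Context, d *schema.ResourceData, c *common.DatabricksClient) error {
			// fetch the object and write it back with common.StructToData
			return nil
		},
		Delete: func(ctx context.Context, d *schema.ResourceData, c *common.DatabricksClient) error {
			return nil
		},
	}
}

// later: ResourceSample().ToResource() yields a *schema.Resource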

func AccountData added in v1.21.0

func AccountData[T any](read func(context.Context, *T, *databricks.AccountClient) error) Resource

AccountData is a generic way to define account-level data sources in the Terraform provider.

Example usage:

type metastoresData struct {
	Ids map[string]string `json:"ids,omitempty" tf:"computed"`
}
return common.AccountData(func(ctx context.Context, d *metastoresData, acc *databricks.AccountClient) error {
	metastores, err := acc.Metastores.List(ctx)
	...
})

func AccountDataWithParams added in v1.34.0

func AccountDataWithParams[T, P any](read func(context.Context, P, *databricks.AccountClient) (*T, error)) Resource

AccountDataWithParams defines a data source that can be used to read data from the accounts API. It differs from AccountData in that it allows extra attributes to be provided as a separate argument, so the original type used to define the resource can also be used to define the data source.

The first type parameter is the type of the resource. This can be a type directly from the SDK, or a custom type defined in the provider that embeds the SDK type.

The second type parameter is the type of the extra attributes that should be provided to the data source. These are the attributes that the user can specify in the data source configuration, but are not part of the resource type. If there are no extra attributes, this should be `struct{}`. If there are any fields with the same JSON name as fields in the resource type, these fields will override the values from the resource type.

The single argument is a function that will be called to read the data from the accounts API, returning the requested resource. The function should return an error if the data cannot be read or the resource cannot be found.

Example usage:

 type MwsWorkspace struct { ... }

 type MwsWorkspaceDataParams struct {
	     Id   string `json:"id" tf:"computed,optional"`
	     Name string `json:"name" tf:"computed,optional"`
 }

 AccountDataWithParams(
     func(ctx context.Context, data MwsWorkspaceDataParams, a *databricks.AccountClient) (*MwsWorkspace, error) {
         // User-provided attributes are present in the `data` parameter.
         // The requested resource should be returned.
         ...
	  })

func DataResource deprecated

func DataResource(sc any, read func(context.Context, any, *DatabricksClient) error) Resource

Deprecated: migrate to WorkspaceData

func WorkspaceData added in v1.10.1

func WorkspaceData[T any](read func(context.Context, *T, *databricks.WorkspaceClient) error) Resource

WorkspaceData is a generic way to define workspace-level data sources in the Terraform provider.

Example usage:

type catalogsData struct {
	Ids []string `json:"ids,omitempty" tf:"computed,slice_set"`
}
return common.WorkspaceData(func(ctx context.Context, data *catalogsData, w *databricks.WorkspaceClient) error {
	catalogs, err := w.Catalogs.ListAll(ctx)
	...
})

func WorkspaceDataWithParams added in v1.34.0

func WorkspaceDataWithParams[T, P any](read func(context.Context, P, *databricks.WorkspaceClient) (*T, error)) Resource

WorkspaceDataWithParams defines a data source that can be used to read data from the workspace API. It differs from WorkspaceData in that it separates the definition of the computed fields (the resource type) from the definition of the user-supplied parameters.

The first type parameter is the type of the resource. This can be a type directly from the SDK, or a custom type defined in the provider that embeds the SDK type.

The second type parameter is the type representing parameters that a user may provide to the data source. These are the attributes that the user can specify in the data source configuration, but are not part of the resource type. If there are no extra attributes, this should be `struct{}`. If there are any fields with the same JSON name as fields in the resource type, these fields will override the values from the resource type.

The single argument is a function that will be called to read the data from the workspace API, returning the value of the resource type. The function should return an error if the data cannot be read or the resource cannot be found.

Example usage:

 type SqlWarehouse struct { ... }

 type SqlWarehouseDataParams struct {
	     Id   string `json:"id" tf:"computed,optional"`
	     Name string `json:"name" tf:"computed,optional"`
 }

 WorkspaceDataWithParams(
     func(ctx context.Context, data SqlWarehouseDataParams, w *databricks.WorkspaceClient) (*SqlWarehouse, error) {
         // User-provided attributes are present in the `data` parameter.
         // The resource should be returned.
         ...
     })

func (Resource) ToResource

func (r Resource) ToResource() *schema.Resource

ToResource converts to a Terraform resource definition

type ResourceProvider added in v1.36.3

type ResourceProvider interface {
	CustomizeSchema(*CustomizableSchema) *CustomizableSchema
}

Generic interface for ResourceProvider. The CustomizeSchema function is used to keep track of additional information on top of the generated go-sdk struct.
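A minimal sketch of an implementation; the embedded SDK struct and the field paths being customized are illustrative:

type JobSettingsResource struct {
	jobs.JobSettings
}

func (JobSettingsResource) CustomizeSchema(s *common.CustomizableSchema) *common.CustomizableSchema {
	s.SchemaPath("name").SetRequired()
	s.SchemaPath("tags").SetOptional()
	return s
}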

type ResourceProviderWithAlias added in v1.39.0

type ResourceProviderWithAlias interface {
	ResourceProvider
	// Aliases() returns a two dimensional map where the top level key is the name of the struct, the second level key is the name of the field,
	// the values are the alias for the corresponding field under the specified struct.
	// Example:
	//
	//	{
	//	    "compute.ClusterSpec": {
	//	        "libraries": "library"
	//	    }
	//	}
	// Note: if the struct is derived from another struct, for example:
	// type LibraryList compute.InstallLibraries
	// the top-level key is still the name of the derived struct, i.e. LibraryList in this example.
	Aliases() map[string]map[string]string
}

Interface for ResourceProvider instances that need aliases for fields.
