
Commit d7d0625

add new files for module

Adds a test for the regulated workspace module and commits the Terraform lock file.
1 parent bb16fdc commit d7d0625

39 files changed: +235 −61 lines

Diff for: .gitignore

+40 −36

@@ -1,40 +1,44 @@
-./aws/base/.terraform/*
-./aws/base/.terraform.lock.hcl
-./aws/base/terraform.tfstate
-./aws/base/terraform.tfstate.backup
-./aws/base_customer_vpc/.terraform/*
-./aws/base_customer_vpc/.terraform.lock.hcl
-./aws/base_customer_vpc/terraform.tfstate
-./aws/base_customer_vpc/terraform.tfstate.backup
-./aws/security/.terraform/*
-./aws/security/.terraform.lock.hcl
-./aws/security/terraform.tfstate
-./aws/security/terraform.tfstate.backup
-./aws/fs_lakehouse/.terraform/*
-./aws/fs_lakehouse/.terraform.lock.hcl
-./aws/fs_lakehouse/terraform.tfstate
-./aws/fs_lakehouse/terraform.tfstate.backup
-./azure/base/.terraform/*
-./azure/base/.terraform.lock.hcl
-./azure/base/terraform.tfstate
-./azure/base/terraform.tfstate.backup
-./azure/managed_vnet/.terraform/*
-./azure/managed_vnet/.terraform.lock.hcl
-./azure/managed_vnet/terraform.tfstate
-./azure/managed_vnet/terraform.tfstate.backup
-./gcp/.terraform/*
-./gcp/.idea/*
-./gcp/terraform.tfstate
-./gcp/terraform.tfstate.backup
-./gcp/.terraform.lock.hcl
-./azure/.idea/*
-./azure/terraform.tfstate
-./azure/terraform.tfstate.backup
-./azure/.terraform.lock.hcl
+# MacOS
+.DS_Store
 
+# IntelliJ
+.idea/
+
+# Include override files you do wish to add to version control using negated pattern
+# !example_override.tf
+
+# Include tfplan files to ignore the plan output of command: terraform plan -out=tfplan
+# example: *tfplan*
+
+# Local .terraform directories
+**/.terraform/*
+
+# .tfstate files
+*.tfstate
+*.tfstate.*
+
+# Crash log files
+crash.log
+crash.*.log
+
+# Exclude all .tfvars files, which are likely to contain sensitive data, such as
+# password, private keys, and other secrets. These should not be part of version
+# control as they are data points which are potentially sensitive and subject
+# to change depending on the environment.
+*.tfvars
+*.tfvars.json
+
+# Ignore override files as they are usually used to override resources locally and so
+# are not checked in
+override.tf
+override.tf.json
+*_override.tf
+*_override.tf.json
+
+# Ignore CLI configuration files
+.terraformrc
+terraform.rc
 
 .idea
 ./.idea
-./.idea/*
-/azure/managed_vnet/.terraform/
-/azure/.terraform/
+./.idea/*

Diff for: aws/fs_lakehouse/data/pii_data.csv

-4
This file was deleted.

Diff for: aws/fs_lakehouse/init.tf

-18
This file was deleted.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.

Diff for: aws/base_customer_vpc/README.md renamed to modules/aws_customer_managed_vpc/README.md

+27 −3

@@ -2,10 +2,11 @@
 page_title: "Provisioning Secure Databricks Workspaces on AWS with Terraform"
 ---
 
-# How to Deploy a Lakehouse Blueprint using Best Practices and Industry Helper Libraries
+# Module to Deploy a Lakehouse Blueprint using Best Practices and Industry Helper Libraries
 
-This guide uses the following variables. Please all variables here in a terraform.tfvars file (outside the github project) and reference it using `terraform apply -var-file="<location for tfvars file>/terraform.tfvars"`.
+This module uses the following input variables. Please place all variables in a terraform.tfvars file (outside the GitHub project) and reference it using `terraform apply -var-file="<location for tfvars file>/terraform.tfvars"`.
 
+### Input Variables
 - `databricks_account_id`: The ID per Databricks AWS account used for accessing account management APIs. After the AWS E2 account is created, this is available after logging into [https://accounts.cloud.databricks.com](https://accounts.cloud.databricks.com).
 - `databricks_account_username`: E2 user account email address
 - `databricks_account_password` - E2 account password
@@ -19,10 +20,33 @@ This guide uses the following variables. Please all variables here in a terrafor
 - `security_group_id` - security group ID used for VPC subnets
 - `cross_account_arn` - existing cross-account role arn
 
-This guide is provided as-is and you can use this guide as the basis for your custom Terraform module.
+### Output Variables
 
+- `workspace_url` - URL which allows users to log into the created regulated workspace
+- `workspace_id` - Numeric ID mapping to the newly created regulated workspace
 
 
+### Usage
+
+```hcl
+module "aws_customer_managed_vpc" {
+  source = "databricks/aws_customer_managed_vpc/"
+
+  databricks_account_id       = # see description above
+  databricks_account_username = # see description above
+  databricks_account_password = # see description above
+  region                      = # see description above
+  relay_vpce_service          = # see description above
+  workspace_vpce_service      = # see description above
+  vpce_subnet_cidr            = # see description above
+  vpc_id                      = # see description above
+  subnet_ids                  = # see description above
+  security_group_id           = # see description above
+  cross_account_arn           = # see description above
+
+}
+```
+
 ## Provider initialization
 
 Initialize [provider with `mws` alias](https://www.terraform.io/language/providers/configuration#alias-multiple-provider-configurations) to set up account-level resources.
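As a companion to the README above, the terraform.tfvars file it asks for (kept outside the repository) might look like the following sketch; every value here is a hypothetical placeholder, not a working credential or resource ID:

```hcl
# Hypothetical placeholder values; substitute your own account and network identifiers.
databricks_account_id       = "00000000-0000-0000-0000-000000000000"
databricks_account_username = "admin@example.com"
databricks_account_password = "********"
region                      = "us-east-1"
relay_vpce_service          = "com.amazonaws.vpce.us-east-1.vpce-svc-xxxxxxxxxxxxxxxxx"
workspace_vpce_service      = "com.amazonaws.vpce.us-east-1.vpce-svc-yyyyyyyyyyyyyyyyy"
vpce_subnet_cidr            = "10.0.4.0/24"
vpc_id                      = "vpc-0123456789abcdef0"
subnet_ids                  = ["subnet-0123456789abcdef0", "subnet-0fedcba9876543210"]
security_group_id           = "sg-0123456789abcdef0"
cross_account_arn           = "arn:aws:iam::123456789012:role/databricks-cross-account"
```

The file is then referenced with `terraform apply -var-file="<location for tfvars file>/terraform.tfvars"` as described in the README, keeping secrets out of version control (consistent with the `*.tfvars` rule added to .gitignore in this commit).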

Diff for: aws/base_customer_vpc/init.tf renamed to modules/aws_customer_managed_vpc/init.tf

+5

@@ -1,4 +1,9 @@
 terraform {
+  backend "s3" {
+    bucket = "databricks-terraform-blueprints"
+    key    = "aws_customer_managed_vpc.tfstate"
+    region = "us-east-1"
+  }
   required_providers {
     databricks = {
       source = "databricks/databricks"
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.

Diff for: aws/base_customer_vpc/workspace.tf renamed to modules/aws_customer_managed_vpc/workspace.tf

+10

@@ -32,4 +32,14 @@ resource "databricks_mws_workspaces" "this" {
   private_access_settings_id = databricks_mws_private_access_settings.pas.private_access_settings_id
   pricing_tier               = "ENTERPRISE"
   depends_on                 = [databricks_mws_networks.this]
+}
+
+output "workspace_url" {
+  value       = databricks_mws_workspaces.this.workspace_url
+  description = "URL for newly created Databricks workspace"
+}
+
+output "workspace_id" {
+  value       = databricks_mws_workspaces.this.id
+  description = "Workspace numeric ID"
 }
File renamed without changes.
File renamed without changes.

Diff for: modules/aws_fs_lakehouse/data/pii_data.csv

+4

@@ -0,0 +1,4 @@
+Name|Address|Credit_Card_Number|Card_Type
+Robert Aragon|10 South Point Lane|*4295|VISA
+Ashley Borden|233 Langley Drive|*3020|AMEX
+Thomas Conley|9181 Bisbey Court|*8111|MC
File renamed without changes.

Diff for: modules/aws_fs_lakehouse/init.tf

+25

@@ -0,0 +1,25 @@
+terraform {
+  backend "s3" {
+    bucket = "databricks-terraform-blueprints-aws-fs-lakehouse"
+    key    = "aws_regulated_lakehouse.tfstate"
+    region = "us-east-1"
+  }
+  required_providers {
+    databricks = {
+      source = "databricks/databricks"
+    }
+    aws = {
+      source = "hashicorp/aws"
+    }
+  }
+}
+
+provider "aws" {
+  region = local.region
+}
+
+provider "databricks" {
+  host     = var.workspace_url
+  username = var.databricks_account_username
+  password = var.databricks_account_password
+} # Authenticate using preferred method as described in Databricks provider
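The trailing comment in the provider block above points to the Databricks provider's other authentication methods. One alternative, sketched here with hypothetical placeholder values, is to omit the credential arguments from the provider block and let the provider read them from its standard environment variables instead, so no secrets appear in .tf files:

```shell
# Hypothetical placeholder values; the Databricks Terraform provider reads
# these environment variables when host/username/password are omitted from
# the provider block.
export DATABRICKS_HOST="https://dbc-xxxxxxxx-xxxx.cloud.databricks.com"
export DATABRICKS_USERNAME="admin@example.com"
export DATABRICKS_PASSWORD="********"
```

With these exported, `terraform plan` and `terraform apply` authenticate without any credentials in the configuration.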

Diff for: modules/aws_fs_lakehouse/main.tf

+7

@@ -0,0 +1,7 @@
+#module "aws_regulated_workspace" {
+#  source = "../aws_customer_managed_vpc/"
+#  providers = {
+#    databricks = databricks
+#  }
+#  display_name = "aws-regulated-workspace"
+#}

Diff for: aws/fs_lakehouse/vars.tf renamed to modules/aws_fs_lakehouse/vars.tf

+11

@@ -3,6 +3,17 @@ variable "crossaccount_role_name" {
   description = "Role that you've specified on https://accounts.cloud.databricks.com/#aws"
 }
 
+variable "workspace_url" {
+
+}
+variable "databricks_account_username" {
+
+}
+
+variable "databricks_account_password" {
+
+}
+
 locals {
   region = "us-east-1"
 }
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.

Diff for: modules/test_aws_customer_managed_vpc/.terraform.lock.hcl

+30
Generated lock file; diff not rendered.

Diff for: modules/test_aws_customer_managed_vpc/init.tf

+30

@@ -0,0 +1,30 @@
+terraform {
+  backend "s3" {
+    bucket = "databricks-terraform-blueprints"
+    key    = "test_aws_customer_managed_vpc.tfstate"
+    region = "us-east-1"
+  }
+  required_providers {
+    databricks = {
+      source  = "databricks/databricks"
+      version = "0.5.0"
+    }
+    aws = {
+      source  = "hashicorp/aws"
+      version = "3.49.0"
+    }
+  }
+}
+
+provider "aws" {
+  region = var.region
+}
+
+// initialize provider in "MWS" mode for provisioning workspace with AWS PrivateLink
+provider "databricks" {
+  alias    = "mws"
+  host     = "https://accounts.cloud.databricks.com"
+  username = var.databricks_account_username
+  password = var.databricks_account_password
+}
+
Diff for: modules/test_aws_customer_managed_vpc/main.tf

+19

@@ -0,0 +1,19 @@
+module "aws_customer_managed_vpc" {
+  source = "../aws_customer_managed_vpc/"
+
+  databricks_account_id       = var.databricks_account_id
+  databricks_account_username = var.databricks_account_username
+  databricks_account_password = var.databricks_account_password
+  region                      = var.region
+  relay_vpce_service          = var.relay_vpce_service
+  workspace_vpce_service      = var.workspace_vpce_service
+  vpce_subnet_cidr            = var.vpce_subnet_cidr
+  vpc_id                      = var.vpc_id
+  subnet_ids                  = var.subnet_ids
+  security_group_id           = var.security_group_id
+  cross_account_arn           = var.cross_account_arn
+}
+
+output "module_workspace_url" {
+  value = module.aws_customer_managed_vpc.workspace_url
+}
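The test configuration above forwards only the module's `workspace_url` output. The module also exposes `workspace_id` (declared in workspace.tf in this commit), which could be surfaced the same way; a sketch:

```hcl
# Sketch: forward the module's second output alongside the URL, so callers
# of the test configuration can also read the numeric workspace ID.
output "module_workspace_id" {
  value = module.aws_customer_managed_vpc.workspace_id
}
```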

Diff for: modules/test_aws_customer_managed_vpc/vars.tf

+27

@@ -0,0 +1,27 @@
+variable "databricks_account_id" {}
+variable "databricks_account_username" {}
+variable "databricks_account_password" {}
+variable "region" {}
+
+variable "workspace_vpce_service" {}
+variable "relay_vpce_service" {}
+variable "vpce_subnet_cidr" {}
+
+variable "private_dns_enabled" { default = true }
+variable "tags" { default = {} }
+
+variable "cidr_block" {
+  default = "10.4.0.0/16"
+}
+
+variable "vpc_id" {}
+variable "subnet_ids" {
+  type = list(string)
+}
+variable "security_group_id" {}
+
+variable "cross_account_arn" {}
+
+locals {
+  prefix = "private-link-ws"
+}

0 commit comments