
Commit 6166fe2: lint readme
Parent: 33744f1

8 files changed: +23 -16 lines changed

Diff for: README.md (+3 -3)

@@ -12,15 +12,15 @@ This set of terraform templates is designed to allow every industry practitioner

 What's included in this sequence of Terraform modules?

-#### AWS | Azure
+### AWS | Azure

-There are 4 main modules which can be composed together (1-4). There is also a full end-to-end example of a workspace deployment with governance and industry quickstarts included. See `test_aws_full_lakehouse_example` for this version.
+There are 4 main modules which can be composed together (1-4). There is also a full end-to-end example of a workspace deployment with governance and industry quickstarts included. See `test_aws_full_lakehouse_example` for this version.

 1. Creation of a Databricks-compliant VPC in `aws_base` / `azure_spoke_vnet` (AWS | Azure)
 2. Platform security built into the workspace deployment (Private Link, VPC endpoints, and secure connectivity) (AWS | Azure)
 3. Unity Catalog installation (AWS | Azure)
 4. Industry quickstarts with a sample job and pre-installed libraries for time series and common domain models
-5. Full end-to-end example (AWS) composing all of the modules above (see below); this works similarly for the Azure modules.
+5. Full end-to-end example (AWS) composing all of the modules above (see below); this works similarly for the Azure modules.

 ```hcl
 module "aws_base" {

Diff for: examples/test_azure_data_exfiltration_protection/README.md (+7 -5)

@@ -2,26 +2,28 @@
 page_title: "Provisioning Azure Databricks Hub and Spoke Deployment as per Data Exfiltration Protection with Terraform"
 ---

+# Provisioning Azure Databricks Hub and Spoke Deployment as per Data Exfiltration Protection with Terraform
+
 [Reference documentation and blog](https://databricks.com/blog/2020/03/27/data-exfiltration-protection-with-azure-databricks.html)
 This Terraform configuration is an implementation of the above blog post.
 Note: the firewall rules deviate slightly in that outbound traffic from the firewall is allowed to Databricks resources instead of specifying Databricks worker subnets.
 This is to simplify outbound routing in the event that multiple `spoke`s are desired.

-This guide is provided as-is and you can use this guide as the basis for your custom Terraform module.
+This guide is provided as-is and you can use this guide as the basis for your custom Terraform module.

 It uses the following variables in configurations:

-### Required
+## Required

 - `project_name`: (Required) The name of the project associated with the infrastructure to be managed by Terraform
 - `location`: (Required) The location for the resources in this module
 - `databricks_workspace_name`: (Required) The name of the Azure Databricks Workspace to deploy in the spoke vnet
-- `scc_relay_address_prefixes`: (Required) The IP address(es) of the Databricks SCC relay (see https://docs.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/udr#control-plane-nat-and-webapp-ip-addresses)
+- `scc_relay_address_prefixes`: (Required) The IP address(es) of the Databricks SCC relay (see <https://docs.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/udr#control-plane-nat-and-webapp-ip-addresses>)
 - `privatelink_subnet_address_prefixes`: (Required) The address prefix(es) for the PrivateLink subnet
 - `firewall_name`: (Required) The name of the Azure Firewall deployed in your hub Virtual Network
 - `firewall_private_ip`: (Required) The hub firewall's private IP address
 - `webapp_and_infra_routes`: (Required) Map of regional webapp and ext-infra CIDRs.
-  Check https://docs.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/udr#ip-addresses for more info
+  Check <https://docs.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/udr#ip-addresses> for more info
   Ex., for eastus:
   {
     "webapp1" : "40.70.58.221/32",

@@ -30,7 +32,7 @@ It uses the following variables in configurations:
     "ext-infra" : "20.57.106.0/28"
   }

-### Optional
+## Optional

 - `hub_resource_group_name`: (Optional) The name of the existing Resource Group containing the hub Virtual Network
 - `hub_vnet_name`: (Optional) The name of the existing hub Virtual Network
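To make the Required list above concrete, a minimal terraform.tfvars sketch: the webapp and ext-infra CIDRs are the eastus examples from this README, and every other value is an illustrative placeholder.

```hcl
# terraform.tfvars: placeholder values for the Required inputs above
project_name              = "dbx-exfil-demo"
location                  = "eastus"
databricks_workspace_name = "dbx-spoke-workspace"

# Look up the SCC relay prefix for your region in the linked Microsoft docs
scc_relay_address_prefixes          = ["x.x.x.x/32"]
privatelink_subnet_address_prefixes = ["10.1.3.0/27"]

firewall_name       = "hub-azfw"
firewall_private_ip = "10.0.1.4"

webapp_and_infra_routes = {
  "webapp1"   = "40.70.58.221/32"
  # plus any additional regional webapp CIDRs from the linked docs
  "ext-infra" = "20.57.106.0/28"
}
```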

Diff for: gcp/README.md (+4 -3)

@@ -20,14 +20,15 @@ This guide uses the following variables in configurations:
 - `zone` - Zone specified for the GCP Terraform provider
 - `network_id` - UUID created once the Network Configuration is created via the Databricks account console for GCP, using the IP range, secondary ranges, and the Google Compute Network created earlier

-This guide is provided as-is and you can use this guide as the basis for your custom Terraform module. Note that for BYO VPC (a.k.a. customer-managed VPC), there is a 2-step process, as full automation via the traditional `databricks_mws_networks` resource is not supported at this time for GCP.
+This guide is provided as-is and you can use this guide as the basis for your custom Terraform module. Note that for BYO VPC (a.k.a. customer-managed VPC), there is a 2-step process, as full automation via the traditional `databricks_mws_networks` resource is not supported at this time for GCP.

-Workaround:
+Workaround:

-1. Run `apply` without the workspace.tf file to create all infrastructure and a Databricks-compliant VPC with Cloud NAT.
+1. Run `apply` without the workspace.tf file to create all infrastructure and a Databricks-compliant VPC with Cloud NAT.
 2. Run `apply` with workspace.tf to create the workspace once the `network_id` (see description above in variables) is created with the Databricks account console for GCP (Cloud Resources).

 To get started, this code walks you through the following high-level steps:
+
 - Initialize the required providers
 - Configure GCP objects
   - A VPC which satisfies the Databricks GCP networking [requirements](https://docs.gcp.databricks.com/administration-guide/cloud-configurations/gcp/customer-managed-vpc.html#network-requirements-1)
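A hedged sketch of what the step-2 workspace.tf might contain once the `network_id` UUID exists. The resource shape follows the Databricks Terraform provider's `databricks_mws_workspaces`; the provider alias and variable names are assumptions, so verify the exact arguments against the provider docs.

```hcl
# workspace.tf: applied only in step 2, after the Network Configuration
# exists in the Databricks account console and its UUID is known.
resource "databricks_mws_workspaces" "this" {
  provider       = databricks.accounts # account-level provider alias (assumed)
  account_id     = var.databricks_account_id
  workspace_name = "gcp-byovpc-demo" # illustrative
  location       = var.region

  cloud_resource_container {
    gcp {
      project_id = var.google_project
    }
  }

  # UUID copied from the account console (Cloud Resources) after step 1
  network_id = var.network_id
}
```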

Diff for: modules/aws_base/README.md (+1 -1)

@@ -2,6 +2,7 @@
 page_title: "Create an AWS Databricks-compatible VPC using Terraform"
 ---

+# Create an AWS Databricks-compatible VPC using Terraform

 This module uses the following input variables. Please place all variables in a terraform.tfvars file (outside the GitHub project) and reference it using `terraform apply -var-file="<location for tfvars file>/terraform.tfvars"`.

@@ -22,7 +23,6 @@
 - `cross_account_role_arn` - Cross-account role ARN for deploying Databricks workspace
 - `root_bucket` - Name of S3 bucket used as Databricks root bucket

-
 ```hcl
 module "aws_base" {
   source = "databricks/aws_base/"

Diff for: modules/aws_fs_lakehouse/README.md (+1)

@@ -2,6 +2,7 @@
 page_title: "Create Resources in Existing Databricks Workspace and Configure Storage, Jobs, IP Access Lists, and Library Installation"
 ---

+# Create Resources in Existing Databricks Workspace and Configure Storage, Jobs, IP Access Lists, and Library Installation

 This module uses the following input variables. Please place all variables in a terraform.tfvars file (outside the GitHub project) and reference it using `terraform apply -var-file="<location for tfvars file>/terraform.tfvars"`.

Diff for: modules/aws_full_governed_ws/README.md (+2 -1)

@@ -2,10 +2,10 @@
 page_title: "Create Cloud Resources for Unity Catalog, Assign Metastores, and Create Sample Catalog"
 ---

+# Create Cloud Resources for Unity Catalog, Assign Metastores, and Create Sample Catalog

 This module uses the following input variables. Please place all variables in a terraform.tfvars file (outside the GitHub project) and reference it using `terraform apply -var-file="<location for tfvars file>/terraform.tfvars"`.

-
 ## Input Variables

 - `databricks_account_id`: The Databricks account ID used for accessing the account management APIs. After the AWS E2 account is created, this is available after logging into [https://accounts.cloud.databricks.com](https://accounts.cloud.databricks.com).

@@ -18,6 +18,7 @@
 - `databricks_users` - List of all Databricks users to be provisioned as part of the quickstart. This is mostly instructive; the best practice is to use SCIM to sync users to the Databricks workspace, and Terraform can be used for the assignment (assuming the users already exist)
 - `databricks_metastore_admins` - List of all metastore admins to be added to the Databricks group
 - `unity_admin_group` - Name of the group to be used as the admins for the metastore, catalog, schemas, and databases. This should be restricted to a small group of users who administer the UC APIs.
+
 ```hcl
 module "aws_full_governed_ws" {
   source = "databricks/aws_full_governed_ws/"

Diff for: modules/azure_spoke_vnet/README.md (+4 -2)

@@ -2,6 +2,8 @@
 page_title: "Provisioning Azure Spoke VNET as per Data Exfiltration Protection with Terraform"
 ---

+# Provisioning Azure Spoke VNET as per Data Exfiltration Protection with Terraform
+
 [Reference documentation and blog](https://databricks.com/blog/2020/03/27/data-exfiltration-protection-with-azure-databricks.html)

 This guide uses the following variables in configurations:

@@ -15,10 +17,10 @@ This guide uses the following variables in configurations:
 - `spoke_resource_group_name` - (Required) The name of the Resource Group to create
 - `project_name` - (Required) The name of the project associated with the infrastructure to be managed by Terraform
 - `spoke_vnet_address_space` - (Required) The address space for the spoke Virtual Network
-- `scc_relay_address_prefixes` - (Required) The IP address(es) of the Databricks SCC relay (see https://docs.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/udr#control-plane-nat-and-webapp-ip-addresses)
+- `scc_relay_address_prefixes` - (Required) The IP address(es) of the Databricks SCC relay (see <https://docs.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/udr#control-plane-nat-and-webapp-ip-addresses>)
 - `privatelink_subnet_address_prefixes` - (Required) The address prefix(es) for the PrivateLink subnet
 - `webapp_and_infra_routes` - (Required) Map of regional webapp and ext-infra CIDRs.
-  Check https://docs.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/udr#ip-addresses for more info
+  Check <https://docs.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/udr#ip-addresses> for more info
   Ex., for eastus:
   {
     "webapp1" : "40.70.58.221/32",

Diff for: modules/azure_vnet_injected_databricks_workspace/README.md (+1 -1)

@@ -2,7 +2,7 @@
 page_title: "Provisioning Azure Databricks Workspaces with Private Link and Data Exfiltration Protection with Terraform"
 ---

-# How to Deploy a Secure Azure Databricks Workspace, Hardened To Prevent Data Exfiltration.
+# How to Deploy a Secure Azure Databricks Workspace, Hardened To Prevent Data Exfiltration

 This guide uses the following variables in configurations: