docs/analytics-platform-system/polybase-configure-azure-blob-storage.md (17 additions & 17 deletions)
@@ -1,6 +1,6 @@
 ---
-title: "Use PolyBase to access external data in Azure Blob storage"
-description: Explains how to use PolyBase on a Parallel Data Warehouse (APS) to query external data in Azure Blob storage.
+title: "Use PolyBase to access external data in Azure Blob Storage"
+description: Explains how to use PolyBase on a Parallel Data Warehouse (APS) to query external data in Azure Blob Storage.
 author: mzaman1
 ms.prod: sql
 ms.technology: data-warehouse
@@ -10,28 +10,28 @@ ms.author: murshedz
 ms.reviewer: martinle
 ms.custom: seo-dt-2019
 ---
-# Configure PolyBase to access external data in Azure Blob storage
+# Configure PolyBase to access external data in Azure Blob Storage

-The article explains how to use PolyBase on a SQL Server instance to query external data in Azure Blob storage.
+The article explains how to use PolyBase on a SQL Server instance to query external data in Azure Blob Storage.

 > [!NOTE]
-> APS currently only supports standard general purpose v1 locally redundant (LRS) Azure Blob storage.
+> APS currently only supports standard general purpose v1 locally redundant (LRS) Azure Blob Storage.

 ## Prerequisites

-- Azure Blob storage in your subscription.
-- A container created in the Azure Blob storage.
+- Azure Blob Storage in your subscription.
+- A container created in the Azure Blob Storage.

-### Configure Azure Blob storage connectivity
+### Configure Azure Blob Storage connectivity

-First, configure APS to use Azure Blob storage.
+First, configure APS to use Azure Blob Storage.

-1. Run [sp_configure](../relational-databases/system-stored-procedures/sp-configure-transact-sql.md) with 'hadoop connectivity' set to an Azure Blob storage provider. To find the value for providers, see [PolyBase Connectivity Configuration](../database-engine/configure-windows/polybase-connectivity-configuration-transact-sql.md).
+1. Run [sp_configure](../relational-databases/system-stored-procedures/sp-configure-transact-sql.md) with 'hadoop connectivity' set to an Azure Blob Storage provider. To find the value for providers, see [PolyBase Connectivity Configuration](../database-engine/configure-windows/polybase-connectivity-configuration-transact-sql.md).

 ```sql
 -- Values map to various external data sources.
 -- Example: value 7 stands for Hortonworks HDP 2.1 to 2.6 on Linux,
--- 2.1 to 2.3 on Windows Server, and Azure Blob storage
+-- 2.1 to 2.3 on Windows Server, and Azure Blob Storage
@@ -43,15 +43,15 @@ First, configure APS to use Azure Blob storage.

 ## Configure an external table

-To query the data in your Azure Blob storage, you must define an external table to use in Transact-SQL queries. The following steps describe how to configure the external table.
+To query the data in your Azure Blob Storage, you must define an external table to use in Transact-SQL queries. The following steps describe how to configure the external table.

 1. Create a master key on the database. It is required to encrypt the credential secret.

 ```sql
 CREATE MASTER KEY ENCRYPTION BY PASSWORD ='S0me!nfo';
 ```

-1. Create a database scoped credential for Azure Blob storage.
+1. Create a database scoped credential for Azure Blob Storage.

 ```sql
 -- IDENTITY: any string (this is not used for authentication to Azure storage).
@@ -75,7 +75,7 @@ To query the data in your Azure Blob storage, you must define an external table
 1. Create an external file format with [CREATE EXTERNAL FILE FORMAT](../t-sql/statements/create-external-file-format-transact-sql.md).

 ```sql
--- FORMAT TYPE: Type of format in Azure Blob storage (DELIMITEDTEXT, RCFILE, ORC, PARQUET).
+-- FORMAT TYPE: Type of format in Azure Blob Storage (DELIMITEDTEXT, RCFILE, ORC, PARQUET).
 -- In this example, the files are pipe (|) delimited
 CREATE EXTERNAL FILE FORMAT TextFileFormat WITH (
 FORMAT_TYPE = DELIMITEDTEXT,
@@ -118,7 +118,7 @@ The following queries provide example with fictional car sensor data.

 ### Ad hoc queries

-The following ad hoc query joins relational with data in Azure Blob storage. It selects customers who drive faster than 35 mph, joining structured customer data stored in SQL Server with car sensor data stored in Azure Blob storage.
+The following ad hoc query joins relational with data in Azure Blob Storage. It selects customers who drive faster than 35 mph, joining structured customer data stored in SQL Server with car sensor data stored in Azure Blob Storage.
@@ -149,10 +149,10 @@ ON Insured_Customers.CustomerKey = SensorD.CustomerKey

 ### Exporting data

-The following query exports data from APS to Azure Blob storage. It can be used to archive relational data to Azure Blob storage while still be able to query it.
+The following query exports data from APS to Azure Blob Storage. It can be used to archive relational data to Azure Blob Storage while still be able to query it.

 ```sql
--- Export data: Move old data to Azure Blob storage while keeping it query-able via an external table.
+-- Export data: Move old data to Azure Blob Storage while keeping it query-able via an external table.
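For reviewers who want the surrounding context: the hunks above only show the comment and prose lines whose capitalization changed, and most of the T-SQL they belong to is cut off in this excerpt. The following is a hedged, minimal sketch of the end-to-end sequence the article walks through (after the CREATE MASTER KEY statement shown above); the object names, container, storage account, key, and column list are illustrative placeholders, not lines taken from the file being diffed.

```sql
-- Illustrative sketch only; identifiers and secrets below are placeholders.

-- 0. Point 'hadoop connectivity' at an Azure Blob Storage provider
--    (value 7 is the example value cited in the diff above).
EXEC sp_configure 'hadoop connectivity', 7;
RECONFIGURE;

-- 1. Database scoped credential for the storage account
--    (IDENTITY is not used for authentication to Azure storage).
CREATE DATABASE SCOPED CREDENTIAL AzureStorageCredential
WITH IDENTITY = 'user', SECRET = '<azure_storage_account_key>';

-- 2. External data source pointing at the blob container.
CREATE EXTERNAL DATA SOURCE AzureStorage WITH (
    TYPE = HADOOP,
    LOCATION = 'wasbs://<container>@<storage_account>.blob.core.windows.net',
    CREDENTIAL = AzureStorageCredential
);

-- 3. External file format for pipe-delimited text, as in the diffed example.
CREATE EXTERNAL FILE FORMAT TextFileFormat WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = '|')
);

-- 4. External table over the files in the container (columns are assumptions).
CREATE EXTERNAL TABLE dbo.CarSensor_Data (
    CustomerKey INT,
    Speed FLOAT
)
WITH (
    LOCATION = '/Demo/',
    DATA_SOURCE = AzureStorage,
    FILE_FORMAT = TextFileFormat
);

-- 5. Ad hoc query: customers driving faster than 35 mph, joining relational
--    data with the external sensor data (per the article text above).
SELECT ic.CustomerKey, sd.Speed
FROM Insured_Customers AS ic
JOIN dbo.CarSensor_Data AS sd
    ON ic.CustomerKey = sd.CustomerKey
WHERE sd.Speed > 35;

-- 6. Export/archive pattern: CREATE EXTERNAL TABLE AS SELECT writes rows to
--    Azure Blob Storage while keeping them queryable through the new table.
CREATE EXTERNAL TABLE dbo.Insured_Customers_Archive
WITH (
    LOCATION = '/archive/customers/',
    DATA_SOURCE = AzureStorage,
    FILE_FORMAT = TextFileFormat
)
AS SELECT * FROM Insured_Customers;
```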

docs/azure-data-studio/extensions/extension-authoring.md (1 addition & 1 deletion)
@@ -30,7 +30,7 @@ To develop an extension, you need [Node.js](https://nodejs.org/) installed and a
 To create your new extension, you can use the Azure Data Studio extension generator. The Yeoman [extension generator](https://www.npmjs.com/package/generator-azuredatastudio) is a beneficial starting point for extension projects. To start the generator, enter the following command in a command prompt:

 ```console
-npm install -g yo generator-azuredatastudio # Install the generator
 | Azure Arc extension |[Known Issue:](https://github.com/microsoft/azuredatastudio/issues/13319) The "Script to Notebook" button for Arc MIAA & PG deployments does not do field validation before scripting the notebook. This means that if users enter the password wrong in the password confirm inputs then they may end up with a notebook that has the wrong value for the password.| The "Deploy" button works as expected though so users should use that instead. |
-| Object Explorer | Releases of ADS before 1.24.0 have a breaking change in object explorer due to the engine's changes related to [Azure Synapse Analytics serverless SQL pool](https://docs.microsoft.com/azure/synapse-analytics/sql/on-demand-workspace-overview). | To continue utilizing object explorer in Azure Data Studio with Azure Synapse Analytics serverless SQL pool, you need to use Azure Data Studio 1.24.0 or later. |
+| Object Explorer | Releases of ADS before 1.24.0 have a breaking change in object explorer due to the engine's changes related to [Azure Synapse Analytics serverless SQL pool](/azure/synapse-analytics/sql/on-demand-workspace-overview). | To continue utilizing object explorer in Azure Data Studio with Azure Synapse Analytics serverless SQL pool, you need to use Azure Data Studio 1.24.0 or later. |

 You can reference [Azure Data Studio feedback](https://github.com/microsoft/azuredatastudio) for other known issues and to provide feedback to the product team.

docs/big-data-cluster/active-directory-deployment-aks.md (2 additions & 2 deletions)
@@ -13,7 +13,7 @@ ms.technology: big-data-cluster

 # Deploy SQL Server Big Data Clusters in AD mode on Azure Kubernetes Services (AKS)

-SQL Server Big Data Clusters support [Active Directory (AD) deployment mode](deploy-active-directory.md) for **Identity and Access Management (IAM)**. IAM for **Azure Kubernetes Service (AKS)** has been challenging because industry-standard protocols such as OAuth 2.0 and OpenID Connect which is widely supported by Microsoft identity platform is not supported by SQL Server.
+SQL Server Big Data Clusters support [Active Directory (AD) deployment mode](./active-directory-prerequisites.md) for **Identity and Access Management (IAM)**. IAM for **Azure Kubernetes Service (AKS)** has been challenging because industry-standard protocols such as OAuth 2.0 and OpenID Connect which is widely supported by Microsoft identity platform is not supported by SQL Server.

 This article explains how to deploy a big data cluster (BDC) in AD mode while deploying in [Azure Kubernetes Service (AKS)](/azure/aks/intro-kubernetes).

@@ -61,4 +61,4 @@ For a BDC deployment in AD mode, the solution to [integrate on-premises Active D

 ## Next steps

-[Tutorial: Deploy SQL Server Big Data Clusters in AD mode on Azure Kubernetes Services (AKS)](active-directory-deployment-aks-tutorial.md)
+[Tutorial: Deploy SQL Server Big Data Clusters in AD mode on Azure Kubernetes Services (AKS)](active-directory-deployment-aks-tutorial.md)

docs/big-data-cluster/cluster-monitor-ads.md (2 additions & 2 deletions)
@@ -18,7 +18,7 @@ This article explains how to view the status of a big data cluster using Azure D

 ## <a id="datastudio"></a> Use Azure Data Studio

-After downloading the latest **insiders build** of [Azure Data Studio](https://aka.ms/getazuredatastudio), you can view service endpoints and the status of a big data cluster with the SQL Server big data cluster dashboard. Some of the features below are only first available in the insiders build of Azure Data Studio.
+After downloading the latest **insiders build** of [Azure Data Studio](../azure-data-studio/download-azure-data-studio.md), you can view service endpoints and the status of a big data cluster with the SQL Server big data cluster dashboard. Some of the features below are only first available in the insiders build of Azure Data Studio.

 1. First, create a connection to your big data cluster in Azure Data Studio. For more information, see [Connect to a SQL Server big data cluster with Azure Data Studio](connect-to-big-data-cluster.md).

@@ -78,4 +78,4 @@ You can directly click on these links. You will be required to authenticate when

 ## Next steps

-For more information about [!INCLUDE[big-data-clusters-2019](../includes/ssbigdataclusters-ss-nover.md)], see [What are [!INCLUDE[big-data-clusters-2019](../includes/ssbigdataclusters-ver15.md)]?](big-data-cluster-overview.md).
+For more information about [!INCLUDE[big-data-clusters-2019](../includes/ssbigdataclusters-ss-nover.md)], see [What are [!INCLUDE[big-data-clusters-2019](../includes/ssbigdataclusters-ver15.md)]?](big-data-cluster-overview.md).

docs/big-data-cluster/private-deploy.md (7 additions & 7 deletions)
@@ -31,7 +31,7 @@ This section shows you deploy a BDC cluster in Azure Kubernetes Service (AKS) pr

 ## Create a private AKS cluster with advanced networking

-```console
+```bash

 export REGION_NAME=<your Azure region >
 export RESOURCE_GROUP=< your resource group name >
@@ -65,7 +65,7 @@ echo $SUBNET_ID

 To be able to get to next step, you need to provision an AKS cluster with Standard Load Balancer with private cluster feature enabled. Your command will look like as follows:

-```console
+```bash
 az aks create \
 --resource-group $RESOURCE_GROUP \
 --name $AKS_NAME \
@@ -85,21 +85,21 @@ After a successful deployment, you can go to `<MC_yourakscluster>` resource grou

 ## Connect to an AKS cluster

-```console
+```azurecli
 az aks get-credentials -n $AKS_NAME -g $RESOURCE_GROUP
 ```

 ## Build Big Data Cluster (BDC) deployment profile

 After connecting to an AKS cluster, you can start to deploy BDC, and you can prepare the environment variable and initiate a deployment:
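The excerpt ends just before the environment-variable and deployment step that the last context line introduces. As a rough sketch only (the target profile name, credentials, and the private-cluster-specific configuration edits covered by the full article are assumptions, not lines from the file), that step typically looks like the following with azdata:

```bash
# Sketch under assumptions: credentials and the custom profile name are placeholders.
export AZDATA_USERNAME=<bdc admin user>
export AZDATA_PASSWORD=<bdc admin password>
export ACCEPT_EULA=yes

# Start from a built-in AKS profile and create a custom profile to edit.
azdata bdc config init --source aks-dev-test --target private-bdc-aks --force

# (Apply the private-cluster specific edits to private-bdc-aks/control.json
#  and bdc.json here, as the full article describes.)

# Deploy BDC against the AKS cluster that the current kubectl context points to.
azdata bdc create --config-profile private-bdc-aks --accept-eula $ACCEPT_EULA
```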

docs/big-data-cluster/quickstart-big-data-cluster-deploy-aro.md (3 additions & 3 deletions)
@@ -39,7 +39,7 @@ The default big data cluster deployment used here consists of a SQL Master insta

 The script uses Azure CLI to automate the creation of an ARO cluster. Before running the script, you must log in to your Azure account with Azure CLI at least once. Run the following command from a command prompt.

-```terminal
+```azurecli
 az login
 ```

@@ -53,7 +53,7 @@ az login

 1. Run the script using:

-```terminal
+```console
 python deploy-sql-big-data-aro.py
 ```

@@ -77,7 +77,7 @@ If you are testing [!INCLUDE[big-data-clusters-2019](../includes/ssbigdatacluste

 Run the following Azure CLI command to remove the big data cluster and the ARO service in Azure (replace `<resource group name>` with the **Azure resource group** you specified in the deployment script):
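The cleanup command itself is truncated in this excerpt. Assuming the deployment script placed the ARO cluster and the big data cluster in a single resource group (as the `<resource group name>` substitution suggests), deleting that resource group is the usual cleanup path; this is an illustration, not necessarily the exact command in the file:

```azurecli
# Assumption: everything to remove lives in the one resource group named below.
az group delete --name <resource group name> --yes --no-wait
```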