Commit 243b14b

Author: v-makouz
Merge branch 'master' into v-makouz-ae_money
2 parents c8afe14 + a5cdbbe commit 243b14b

920 files changed

Lines changed: 4912 additions & 3127 deletions


.openpublishing.redirection.json

Lines changed: 5 additions & 0 deletions
@@ -59114,6 +59114,11 @@
       "source_path": "docs/ssms/quickstarts/connect-query-sql-server.md",
       "redirect_url": "/sql/ssms/quickstarts/ssms-connect-query-sql-server",
       "redirect_document_id": false
+    },
+    {
+      "source_path": "docs/relational-databases/security/encryption/always-encrypted-enclaves-query-columns-ssms.md",
+      "redirect_url": "/sql/relational-databases/security/encryption/always-encrypted-enclaves-query-columns",
+      "redirect_document_id": true
     }
   ]
 }

docs/ado/guide/data/calling-a-stored-procedure-with-a-command.md

Lines changed: 1 addition & 1 deletion
@@ -128,4 +128,4 @@ End Function
 ```

 ## See Also
-[Knowledge Base article 117500](https://go.microsoft.com/fwlink/?LinkId=117500)
+[Knowledge Base article 117500](https://www.betaarchive.com/wiki/index.php?title=Microsoft_KB_Archive/185125)

docs/analytics-platform-system/polybase-configure-azure-blob-storage.md

Lines changed: 17 additions & 17 deletions
@@ -1,6 +1,6 @@
 ---
-title: "Use PolyBase to access external data in Azure Blob storage"
-description: Explains how to use PolyBase on a Parallel Data Warehouse (APS) to query external data in Azure Blob storage.
+title: "Use PolyBase to access external data in Azure Blob Storage"
+description: Explains how to use PolyBase on a Parallel Data Warehouse (APS) to query external data in Azure Blob Storage.
 author: mzaman1
 ms.prod: sql
 ms.technology: data-warehouse
@@ -10,28 +10,28 @@ ms.author: murshedz
 ms.reviewer: martinle
 ms.custom: seo-dt-2019
 ---
-# Configure PolyBase to access external data in Azure Blob storage
+# Configure PolyBase to access external data in Azure Blob Storage

-The article explains how to use PolyBase on a SQL Server instance to query external data in Azure Blob storage.
+The article explains how to use PolyBase on a SQL Server instance to query external data in Azure Blob Storage.

 > [!NOTE]
-> APS currently only supports standard general purpose v1 locally redundant (LRS) Azure Blob storage.
+> APS currently only supports standard general purpose v1 locally redundant (LRS) Azure Blob Storage.

 ## Prerequisites

-- Azure Blob storage in your subscription.
-- A container created in the Azure Blob storage.
+- Azure Blob Storage in your subscription.
+- A container created in the Azure Blob Storage.

-### Configure Azure Blob storage connectivity
+### Configure Azure Blob Storage connectivity

-First, configure APS to use Azure Blob storage.
+First, configure APS to use Azure Blob Storage.

-1. Run [sp_configure](../relational-databases/system-stored-procedures/sp-configure-transact-sql.md) with 'hadoop connectivity' set to an Azure Blob storage provider. To find the value for providers, see [PolyBase Connectivity Configuration](../database-engine/configure-windows/polybase-connectivity-configuration-transact-sql.md).
+1. Run [sp_configure](../relational-databases/system-stored-procedures/sp-configure-transact-sql.md) with 'hadoop connectivity' set to an Azure Blob Storage provider. To find the value for providers, see [PolyBase Connectivity Configuration](../database-engine/configure-windows/polybase-connectivity-configuration-transact-sql.md).

 ```sql
 -- Values map to various external data sources.
 -- Example: value 7 stands for Hortonworks HDP 2.1 to 2.6 on Linux,
--- 2.1 to 2.3 on Windows Server, and Azure Blob storage
+-- 2.1 to 2.3 on Windows Server, and Azure Blob Storage
 sp_configure @configname = 'hadoop connectivity', @configvalue = 7;
 GO

@@ -43,15 +43,15 @@ First, configure APS to use Azure Blob storage.

 ## Configure an external table

-To query the data in your Azure Blob storage, you must define an external table to use in Transact-SQL queries. The following steps describe how to configure the external table.
+To query the data in your Azure Blob Storage, you must define an external table to use in Transact-SQL queries. The following steps describe how to configure the external table.

 1. Create a master key on the database. It is required to encrypt the credential secret.

 ```sql
 CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'S0me!nfo';
 ```

-1. Create a database scoped credential for Azure Blob storage.
+1. Create a database scoped credential for Azure Blob Storage.

 ```sql
 -- IDENTITY: any string (this is not used for authentication to Azure storage).
@@ -75,7 +75,7 @@ To query the data in your Azure Blob storage, you must define an external table
 1. Create an external file format with [CREATE EXTERNAL FILE FORMAT](../t-sql/statements/create-external-file-format-transact-sql.md).

 ```sql
--- FORMAT TYPE: Type of format in Azure Blob storage (DELIMITEDTEXT, RCFILE, ORC, PARQUET).
+-- FORMAT TYPE: Type of format in Azure Blob Storage (DELIMITEDTEXT, RCFILE, ORC, PARQUET).
 -- In this example, the files are pipe (|) delimited
 CREATE EXTERNAL FILE FORMAT TextFileFormat WITH (
 FORMAT_TYPE = DELIMITEDTEXT,
@@ -118,7 +118,7 @@ The following queries provide example with fictional car sensor data.

 ### Ad hoc queries

-The following ad hoc query joins relational with data in Azure Blob storage. It selects customers who drive faster than 35 mph, joining structured customer data stored in SQL Server with car sensor data stored in Azure Blob storage.
+The following ad hoc query joins relational with data in Azure Blob Storage. It selects customers who drive faster than 35 mph, joining structured customer data stored in SQL Server with car sensor data stored in Azure Blob Storage.

 ```sql
 SELECT DISTINCT Insured_Customers.FirstName,Insured_Customers.LastName,
@@ -149,10 +149,10 @@ ON Insured_Customers.CustomerKey = SensorD.CustomerKey

 ### Exporting data

-The following query exports data from APS to Azure Blob storage. It can be used to archive relational data to Azure Blob storage while still be able to query it.
+The following query exports data from APS to Azure Blob Storage. It can be used to archive relational data to Azure Blob Storage while still be able to query it.

 ```sql
--- Export data: Move old data to Azure Blob storage while keeping it query-able via an external table.
+-- Export data: Move old data to Azure Blob Storage while keeping it query-able via an external table.
 CREATE EXTERNAL TABLE [dbo].[FastCustomers2009]
 WITH (
 LOCATION='/archive/customer/2009',
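
For orientation, the snippets this topic's hunks touch chain together roughly as follows. This is a minimal sketch, not part of the commit: the account name, container name, storage key, and table schema are illustrative placeholders; the wasbs:// location format is the one PolyBase uses for blob storage.

```sql
-- Hypothetical end-to-end setup combining the steps shown in the hunks above.
-- 1. Master key to protect the credential secret.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'S0me!nfo';

-- 2. Database scoped credential; SECRET is the storage account key (placeholder).
CREATE DATABASE SCOPED CREDENTIAL AzureStorageCredential
WITH IDENTITY = 'user', SECRET = '<azure_storage_account_key>';

-- 3. External data source pointing at the blob container (placeholder names).
CREATE EXTERNAL DATA SOURCE AzureStorage
WITH (
    TYPE = HADOOP,
    LOCATION = 'wasbs://mycontainer@myaccount.blob.core.windows.net',
    CREDENTIAL = AzureStorageCredential
);

-- 4. Pipe-delimited text format, as in the topic's example.
CREATE EXTERNAL FILE FORMAT TextFileFormat
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = '|')
);

-- 5. External table over the files; column list is invented for illustration.
CREATE EXTERNAL TABLE dbo.CarSensor_Data (
    SensorKey INT NOT NULL,
    CustomerKey INT NOT NULL,
    Speed FLOAT NOT NULL
)
WITH (
    LOCATION = '/Demo/',
    DATA_SOURCE = AzureStorage,
    FILE_FORMAT = TextFileFormat
);
```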

docs/azure-data-studio/download-azure-data-studio.md

Lines changed: 5 additions & 4 deletions
@@ -8,7 +8,7 @@ author: yualan
 ms.author: alayu
 ms.reviewer: maghan
 ms.custom: seodec18
-ms.date: 12/11/2020
+ms.date: 1/22/2020
 ---

 # Download and install Azure Data Studio
@@ -21,9 +21,9 @@ Azure Data Studio offers a modern editor experience with IntelliSense, code snip

 | Platform | Download | Release date | Version |
 |----------|----------|--------------|---------|
-| Windows | [User Installer (recommended)](https://go.microsoft.com/fwlink/?linkid=2150927)<br>[System Installer](https://go.microsoft.com/fwlink/?linkid=2150928)<br>[.zip](https://go.microsoft.com/fwlink/?linkid=2151312) | December 11, 2020 | 1.25.1 |
-| macOS | [.zip](https://go.microsoft.com/fwlink/?linkid=2151311) | December 11, 2020 | 1.25.1 |
-| Linux | [.deb](https://go.microsoft.com/fwlink/?linkid=2151506)<br>[.rpm](https://go.microsoft.com/fwlink/?linkid=2151407)<br>[.tar.gz](https://go.microsoft.com/fwlink/?linkid=2151508) | December 11, 2020 | 1.25.1 |
+| Windows | [User Installer (recommended)](https://go.microsoft.com/fwlink/?linkid=2150927)<br>[System Installer](https://go.microsoft.com/fwlink/?linkid=2150928)<br>[.zip](https://go.microsoft.com/fwlink/?linkid=2151312) | January 22, 2020 | 1.25.2 |
+| macOS | [.zip](https://go.microsoft.com/fwlink/?linkid=2151311) | January 22, 2020 | 1.25.2 |
+| Linux | [.deb](https://go.microsoft.com/fwlink/?linkid=2151506)<br>[.rpm](https://go.microsoft.com/fwlink/?linkid=2151407)<br>[.tar.gz](https://go.microsoft.com/fwlink/?linkid=2151508) | January 22, 2020 | 1.25.2 |

 **For details about the latest release, see the [release notes](./release-notes-azure-data-studio.md).**

@@ -145,6 +145,7 @@ Azure Data Studio runs on Windows, macOS, and Linux and is supported on the foll
 - macOS 10.14 Mojave
 - macOS 10.13 High Sierra
 - macOS 10.12 Sierra
+- macOS 11.1 Big Sur

 ### Linux

docs/azure-data-studio/extensions/extension-authoring.md

Lines changed: 1 addition & 1 deletion
@@ -30,7 +30,7 @@ To develop an extension, you need [Node.js](https://nodejs.org/) installed and a
 To create your new extension, you can use the Azure Data Studio extension generator. The Yeoman [extension generator](https://www.npmjs.com/package/generator-azuredatastudio) is a beneficial starting point for extension projects. To start the generator, enter the following command in a command prompt:

 ```console
-npm install -g yo generator-azuredatastudio # Install the generator
+npm install -g yo generator-azuredatastudio
 yo azuredatastudio
 ```

docs/azure-data-studio/release-notes-azure-data-studio.md

Lines changed: 10 additions & 2 deletions
@@ -8,13 +8,21 @@ author: yualan
 ms.author: alayu
 ms.reviewer: maghan
 ms.custom: seodec18
-ms.date: 12/9/2020
+ms.date: 1/22/2020
 ---

 # Release notes for Azure Data Studio

 **[Download and install the latest release!](./download-azure-data-studio.md)**

+## December 2020 (hotfix)
+
+January 22, 2020 &nbsp; / &nbsp; version: 1.25.2
+
+| Change | Details |
+| ------ | ------- |
+| Fix bug [#13899](https://github.com/microsoft/azuredatastudio/issues/13899)| Scrolling to the appropriate cross-reference links in Notebooks |
+
 ## December 2020

 December 9, 2020 &nbsp; / &nbsp; version: 1.25.0
@@ -45,7 +53,7 @@ November 12, 2020 &nbsp; / &nbsp; version: 1.24.0
 | New Item | Details | Workaround |
 |----------|---------|------------|
 | Azure Arc extension | [Known Issue:](https://github.com/microsoft/azuredatastudio/issues/13319) The "Script to Notebook" button for Arc MIAA & PG deployments does not do field validation before scripting the notebook. This means that if users enter the password wrong in the password confirm inputs then they may end up with a notebook that has the wrong value for the password.| The "Deploy" button works as expected though so users should use that instead. |
-| Object Explorer | Releases of ADS before 1.24.0 have a breaking change in object explorer due to the engine's changes related to [Azure Synapse Analytics serverless SQL pool](https://docs.microsoft.com/azure/synapse-analytics/sql/on-demand-workspace-overview). | To continue utilizing object explorer in Azure Data Studio with Azure Synapse Analytics serverless SQL pool, you need to use Azure Data Studio 1.24.0 or later. |
+| Object Explorer | Releases of ADS before 1.24.0 have a breaking change in object explorer due to the engine's changes related to [Azure Synapse Analytics serverless SQL pool](/azure/synapse-analytics/sql/on-demand-workspace-overview). | To continue utilizing object explorer in Azure Data Studio with Azure Synapse Analytics serverless SQL pool, you need to use Azure Data Studio 1.24.0 or later. |

 You can reference [Azure Data Studio feedback](https://github.com/microsoft/azuredatastudio) for other known issues and to provide feedback to the product team.

docs/big-data-cluster/active-directory-deployment-aks.md

Lines changed: 2 additions & 2 deletions
@@ -13,7 +13,7 @@ ms.technology: big-data-cluster

 # Deploy SQL Server Big Data Clusters in AD mode on Azure Kubernetes Services (AKS)

-SQL Server Big Data Clusters support [Active Directory (AD) deployment mode](deploy-active-directory.md) for **Identity and Access Management (IAM)**. IAM for **Azure Kubernetes Service (AKS)** has been challenging because industry-standard protocols such as OAuth 2.0 and OpenID Connect which is widely supported by Microsoft identity platform is not supported by SQL Server.
+SQL Server Big Data Clusters support [Active Directory (AD) deployment mode](./active-directory-prerequisites.md) for **Identity and Access Management (IAM)**. IAM for **Azure Kubernetes Service (AKS)** has been challenging because industry-standard protocols such as OAuth 2.0 and OpenID Connect which is widely supported by Microsoft identity platform is not supported by SQL Server.

 This article explains how to deploy a big data cluster (BDC) in AD mode while deploying in [Azure Kubernetes Service (AKS)](/azure/aks/intro-kubernetes).

@@ -61,4 +61,4 @@ For a BDC deployment in AD mode, the solution to [integrate on-premises Active D

 ## Next steps

-[Tutorial: Deploy SQL Server Big Data Clusters in AD mode on Azure Kubernetes Services (AKS)](active-directory-deployment-aks-tutorial.md)
+[Tutorial: Deploy SQL Server Big Data Clusters in AD mode on Azure Kubernetes Services (AKS)](active-directory-deployment-aks-tutorial.md)

docs/big-data-cluster/cluster-monitor-ads.md

Lines changed: 2 additions & 2 deletions
@@ -18,7 +18,7 @@ This article explains how to view the status of a big data cluster using Azure D

 ## <a id="datastudio"></a> Use Azure Data Studio

-After downloading the latest **insiders build** of [Azure Data Studio](https://aka.ms/getazuredatastudio), you can view service endpoints and the status of a big data cluster with the SQL Server big data cluster dashboard. Some of the features below are only first available in the insiders build of Azure Data Studio.
+After downloading the latest **insiders build** of [Azure Data Studio](../azure-data-studio/download-azure-data-studio.md), you can view service endpoints and the status of a big data cluster with the SQL Server big data cluster dashboard. Some of the features below are only first available in the insiders build of Azure Data Studio.

 1. First, create a connection to your big data cluster in Azure Data Studio. For more information, see [Connect to a SQL Server big data cluster with Azure Data Studio](connect-to-big-data-cluster.md).

@@ -78,4 +78,4 @@ You can directly click on these links. You will be required to authenticate when

 ## Next steps

-For more information about [!INCLUDE[big-data-clusters-2019](../includes/ssbigdataclusters-ss-nover.md)], see [What are [!INCLUDE[big-data-clusters-2019](../includes/ssbigdataclusters-ver15.md)]?](big-data-cluster-overview.md).
+For more information about [!INCLUDE[big-data-clusters-2019](../includes/ssbigdataclusters-ss-nover.md)], see [What are [!INCLUDE[big-data-clusters-2019](../includes/ssbigdataclusters-ver15.md)]?](big-data-cluster-overview.md).

docs/big-data-cluster/private-deploy.md

Lines changed: 7 additions & 7 deletions
@@ -31,7 +31,7 @@ This section shows you deploy a BDC cluster in Azure Kubernetes Service (AKS) pr

 ## Create a private AKS cluster with advanced networking

-```console
+```bash

 export REGION_NAME=<your Azure region >
 export RESOURCE_GROUP=< your resource group name >
@@ -65,7 +65,7 @@ echo $SUBNET_ID

 To be able to get to next step, you need to provision an AKS cluster with Standard Load Balancer with private cluster feature enabled. Your command will look like as follows:

-```console
+```bash
 az aks create \
 --resource-group $RESOURCE_GROUP \
 --name $AKS_NAME \
@@ -85,21 +85,21 @@ After a successful deployment, you can go to `<MC_yourakscluster>` resource grou

 ## Connect to an AKS cluster

-```console
+```azurecli
 az aks get-credentials -n $AKS_NAME -g $RESOURCE_GROUP
 ```

 ## Build Big Data Cluster (BDC) deployment profile

 After connecting to an AKS cluster, you can start to deploy BDC, and you can prepare the environment variable and initiate a deployment:

-```console
+```azurecli
 azdata bdc config init --source aks-dev-test --target private-bdc-aks --force
 ```

 Generate and config BDC custom deployment profile:

-```console
+```azurecli
 azdata bdc config replace -c private-bdc-aks/control.json -j "$.spec.docker.imageTag=2019-CU6-ubuntu-16.04"
 azdata bdc config replace -c private-bdc-aks/control.json -j "$.spec.storage.data.className=default"
 azdata bdc config replace -c private-bdc-aks/control.json -j "$.spec.storage.logs.className=default"
@@ -118,13 +118,13 @@ In case you are [deploying a SQL Server Big Data Cluster (SQL-BDC) with high ava

 The following example sets the `ServiceType` as `NodePort`:

-```console
+```azurecli
 azdata bdc config replace -c private-bdc-aks /bdc.json -j "$.spec.resources.master.spec.endpoints[1].serviceType=NodePort"
 ```

 ## Deploy BDC in AKS private cluster

-```console
+```azurecli
 export AZDATA_USERNAME=<your bdcadmin username>
 export AZDATA_PASSWORD=< your bdcadmin password>
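
The hunks end just after the credential exports. Based on the standard azdata workflow, the deployment itself would then typically be launched with the profile generated above; a sketch under that assumption (the final command is not shown in this commit):

```azurecli
azdata bdc create --config-profile private-bdc-aks --accept-eula yes
```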

docs/big-data-cluster/quickstart-big-data-cluster-deploy-aro.md

Lines changed: 3 additions & 3 deletions
@@ -39,7 +39,7 @@ The default big data cluster deployment used here consists of a SQL Master insta

 The script uses Azure CLI to automate the creation of an ARO cluster. Before running the script, you must log in to your Azure account with Azure CLI at least once. Run the following command from a command prompt.

-```terminal
+```azurecli
 az login
 ```

@@ -53,7 +53,7 @@ az login

 1. Run the script using:

-```terminal
+```console
 python deploy-sql-big-data-aro.py
 ```

@@ -77,7 +77,7 @@ If you are testing [!INCLUDE[big-data-clusters-2019](../includes/ssbigdatacluste

 Run the following Azure CLI command to remove the big data cluster and the ARO service in Azure (replace `<resource group name>` with the **Azure resource group** you specified in the deployment script):

-```terminal
+```azurecli
 az group delete -n <resource group name>
 ```
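
After the deployment script completes, a quick way to confirm the cluster is up is to list its pods and check BDC health. This is a hedged sketch, not part of the commit: it assumes the quickstart's default mssql-cluster namespace and that you have already logged in with azdata.

```console
kubectl get pods -n mssql-cluster
azdata bdc status show
```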
