Provides the connectivity protocol and path to the external data source.
| Generic ODBC |`odbc`|`<server_name>[:port]`| Starting with [!INCLUDE[sql-server-2019](../../includes/sssql19-md.md)] - Windows only |
| Bulk Operations |`https`|`<storage_account>.blob.core.windows.net/<container>`| Starting with [!INCLUDE[ssSQL17](../../includes/sssql17-md.md)]|
| Azure Data Lake Storage Gen2 |`abfs[s]`|`abfss://<container>@<storage_account>.dfs.core.windows.net`| Starting with [!INCLUDE[sql-server-2019](../../includes/sssql19-md.md)] CU11+. |
|[!INCLUDE[ssbigdataclusters-ss-nover](../../includes/ssbigdataclusters-ss-nover.md)] data pool |`sqldatapool`|`sqldatapool://controller-svc/default`| Starting with [!INCLUDE[ssbigdataclusters-ss-nover](../../includes/ssbigdataclusters-ss-nover.md)]|
|[!INCLUDE[ssbigdataclusters-ss-nover](../../includes/ssbigdataclusters-ss-nover.md)] storage pool |`sqlhdfs`|`sqlhdfs://controller-svc/default`| Starting with [!INCLUDE[ssbigdataclusters-ss-nover](../../includes/ssbigdataclusters-ss-nover.md)]|
|||||
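As an illustration of the `abfs[s]` prefix from the table above, a data source over Azure Data Lake Storage Gen2 might be sketched as follows. The data source name, storage account, container, and credential are placeholders, not values from this article:

```sql
-- Hypothetical names: replace the container, storage account, and credential
-- with your own resources. Assumes a database-scoped credential
-- (ADLSCredential) was created beforehand.
CREATE EXTERNAL DATA SOURCE MyADLSGen2
WITH (
    LOCATION = 'abfss://mycontainer@mystorageaccount.dfs.core.windows.net',
    CREDENTIAL = ADLSCredential
);
```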
#### Location path:
- `<Namenode>` = the machine name, name service URI, or IP address of the `Namenode` in the Hadoop cluster. PolyBase must resolve any DNS names used by the Hadoop cluster. <!-- For highly available Hadoop configurations, provide the Nameservice ID as the `LOCATION`. -->
- `port` = the port that the external data source is listening on. In Hadoop, the port can be found using the `fs.defaultFS` configuration parameter. The default is 8020.
- `<container>` = the container of the storage account holding the data. Root containers are read-only, so data can't be written back to the container.
- `<storage_account>` = the storage account name of the Azure resource.
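Putting the definitions above together, a Hadoop `LOCATION` combines the protocol, the `Namenode`, and the port. A minimal sketch, in which the data source name and `Namenode` host are hypothetical:

```sql
-- "hdfs-nn.contoso.com" is a placeholder Namenode host;
-- 8020 is the fs.defaultFS default port.
CREATE EXTERNAL DATA SOURCE MyHadoopCluster
WITH (
    TYPE = HADOOP,
    LOCATION = 'hdfs://hdfs-nn.contoso.com:8020'
);
```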
Additional notes and guidance when setting the location:
- The `abfs` or `abfss` APIs are supported when accessing Azure Storage Accounts starting with [!INCLUDE[sql-server-2019](../../includes/sssql19-md.md)] CU11. For more information, see [the Azure Blob Filesystem driver (ABFS)](/azure/storage/blobs/data-lake-storage-abfs-driver).
- The Hierarchical Namespace option for Azure Storage Accounts (V2) using `abfs[s]` is supported via Azure Data Lake Storage Gen2 starting with [!INCLUDE[sql-server-2019](../../includes/sssql19-md.md)] CU11. The Hierarchical Namespace option is otherwise not supported, and this option should remain **disabled**.
- To ensure successful PolyBase queries during a Hadoop `Namenode` fail-over, consider using a virtual IP address for the `Namenode` of the Hadoop cluster. If you don't, execute an [ALTER EXTERNAL DATA SOURCE][alter_eds] command to point to the new location.
- The `sqlhdfs` and `sqldatapool` types are supported for connecting between the master instance and storage pool of a big data cluster. For Cloudera CDH or Hortonworks HDP, use `hdfs`. For more information on using `sqlhdfs` for querying [!INCLUDE[ssbigdataclusters-ss-nover](../../includes/ssbigdataclusters-ss-nover.md)] storage pools, see [Query HDFS in a SQL Server big data cluster](../../big-data-cluster/tutorial-query-hdfs-storage-pool.md).
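For instance, an external data source over the storage pool of a big data cluster could be sketched as follows. The data source name is illustrative; the `LOCATION` value is the one shown in the table above:

```sql
-- Connects the master instance to the big data cluster storage pool.
CREATE EXTERNAL DATA SOURCE SqlStoragePool
WITH (LOCATION = 'sqlhdfs://controller-svc/default');
```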