
Commit 39b54a1

Merge pull request #36140 from rwestMSFT/rw-1229-fix-ssis-fabric
Full article refresh
2 parents 6f4d279 + 524cc45 commit 39b54a1

1 file changed

Lines changed: 62 additions & 49 deletions


docs/integration-services/fabric-integration/integrate-fabric-data-warehouse.md

@@ -3,7 +3,8 @@ title: "Tutorial: Integrating SSIS with Fabric Data Warehouse"
description: Learn how to integrate SSIS with Fabric Data Warehouse
author: chugugrace
ms.author: chugu
-ms.date: 08/23/2024
+ms.reviewer: randolphwest
+ms.date: 12/29/2025
ms.service: sql
ms.subservice: integration-services
ms.topic: tutorial
@@ -13,80 +14,92 @@ ms.custom:
---
# Tutorial: Integrating SSIS with Fabric Data Warehouse

-[!INCLUDE[sqlserver-ssis](../../includes/applies-to-version/sqlserver-ssis.md)]
+[!INCLUDE [sqlserver-ssis](../../includes/applies-to-version/sqlserver-ssis.md)]

-This document focuses on the best practices to use existing SSIS packages to work with Data warehouse in Fabric platform.
+This article describes best practices for using existing SSIS packages with Data Warehouse in the Fabric platform.

## Introduction

-****Microsoft Fabric**** is a comprehensive analytics platform that covers every aspect of an organization's data estate. One of its key experiences is Fabric Data Warehouse, which serves as a simplified SaaS solution for a fully transactional warehouse. It stores data in OneLake using an open format called Delta Parquet, ensuring that data can be accessed by other experiences within Fabric and other client applications that connect using SQL drivers.
+Microsoft Fabric is a comprehensive analytics platform that covers every aspect of an organization's data estate. One of its key experiences is Fabric Data Warehouse, which serves as a simplified SaaS solution for a fully transactional warehouse. It stores data in OneLake using an open format called Delta Parquet, ensuring that data can be accessed by other experiences within Fabric and by other client applications that connect using SQL drivers.

-****Microsoft Fabric****, as an analytics platform, exclusively supports authentication through ****Microsoft Entra ID**** for users and Service Principals (SPNs). This deliberate choice ensures centralized and identity-based security, aligning with modern security practices. So, SQL authentication and other authentication methods aren't supported in Fabric Data Warehouse within the Fabric ecosystem.
+As an analytics platform, Microsoft Fabric exclusively supports authentication through Microsoft Entra ID for users and service principals (SPNs). This deliberate choice ensures centralized, identity-based security, aligning with modern security practices. As a result, SQL authentication and other authentication methods aren't supported in Fabric Data Warehouse.
## Integration with Fabric Data Warehouse
-Microsoft SQL Server Integration Services (SSIS) is a component of the Microsoft SQL Server database that is an ETL solution. SSIS is widely used by enterprise customers to perform ETL on premises by many customers.

-Two key modifications are required in SSIS package to work seamlessly with Fabric Data Warehouse, outlined as follows.
+Microsoft SQL Server Integration Services (SSIS) is an ETL component of Microsoft SQL Server. Many enterprise customers use SSIS to perform on-premises ETL.
+
+To work seamlessly with Fabric Data Warehouse, you need to make two key modifications to your SSIS package.
### Authentication
-If you're using SQL Authentication or Windows Authentication, reconfigure it to utilize Microsoft Entra ID User or Service Principal Name (SPN). Keep in mind that if you're using a User account, multifactor authentication (MFA) must be disabled, as SSIS doesn't support pop-up prompts. It also needs respective drivers as mentioned below:

-****To use [OLEDB connection manager](../connection-manager/ole-db-connection-manager.md)****:
-- Install [OLE DB Driver for SQL Server](../../connect/oledb/features/using-azure-active-directory.md) version that supports Microsoft Entra ID
-- Set Authentication to ****ActiveDirectoryServicePrincipal**** or ****ActiveDirectoryPassword****.
+If you're using SQL Authentication or Windows Authentication, reconfigure it to use a Microsoft Entra ID user or service principal (SPN). If you use a user account, disable multifactor authentication (MFA), because SSIS doesn't support pop-up prompts. You also need the appropriate drivers, as described in the following sections.
+
+To use the [OLEDB connection manager](../connection-manager/ole-db-connection-manager.md):
+
+- Install a version of the [OLE DB Driver for SQL Server](../../connect/oledb/features/using-azure-active-directory.md) that supports Microsoft Entra ID.
+
+- Set Authentication to `ActiveDirectoryServicePrincipal` or `ActiveDirectoryPassword`.
+
- OLEDB only works for [Execute SQL Task](../control-flow/execute-sql-task.md); it doesn't work for [OLE DB Destination](../data-flow/ole-db-destination.md).
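
As an illustrative, hedged sketch (not part of this commit): with a recent OLE DB Driver for SQL Server, a service-principal connection string might look like the following. Every bracketed value is a placeholder, and exact keyword support depends on the installed driver version.

```
Provider=MSOLEDBSQL19;Data Source=<server>.datawarehouse.fabric.microsoft.com;Initial Catalog=<warehouse_name>;Authentication=ActiveDirectoryServicePrincipal;User ID=<application_id>;Password=<client_secret>;Encrypt=Yes;
```

With `ActiveDirectoryServicePrincipal`, the driver reads the application (client) ID from `User ID` and the client secret from `Password`.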
-:::image type="content" border="false" source="media/ole-db-connection-1.png" alt-text="Screenshot of oledb connection manager part 1." lightbox="media/ole-db-connection-1.png":::
-:::image type="content" source="media/ole-db-connection-2.png" alt-text="Screenshot of oledb connection manager part 2." lightbox="media/ole-db-connection-2.png":::
-
-****To use ADO.NET connection manager****:
-- Use Microsoft OLE DB provider for SQL Server for [.NET Framework Data Provider for OLE DB](/dotnet/framework/data/adonet/data-providers).
-- Set Authentication to ****ActiveDirectoryServicePrincipal**** or ****ActiveDirectoryPassword****.
-:::image type="content" source="media/ado-net-connection.png" alt-text="Screenshot of ado connection manager part 1." lightbox="media/ado-net-connection.png":::
-
-### File ingestion
-The ****Fabric Data Warehouse**** recommends utilizing the native T-SQL command 'COPY INTO' for efficient data insertion into the warehouse. So, any DFT operations that currently rely on ****Fast Insert Mode**** or ****BCP IN**** scripts should be replaced with the ****COPY INTO**** statement by utilizing [Execute SQL Task](../control-flow/execute-sql-task.md).
+
+:::image type="content" source="media/ole-db-connection-1.png" alt-text="Screenshot of the OLE DB connection manager, part 1.":::
+
+:::image type="content" source="media/ole-db-connection-2.png" alt-text="Screenshot of the OLE DB connection manager, part 2." lightbox="media/ole-db-connection-2.png":::
+
+To use the ADO.NET connection manager:
+
+- Use the Microsoft OLE DB provider for SQL Server with the [.NET Framework Data Provider for OLE DB](/dotnet/framework/data/adonet/data-providers).
+
+- Set Authentication to `ActiveDirectoryServicePrincipal` or `ActiveDirectoryPassword`.
+
+:::image type="content" source="media/ado-net-connection.png" alt-text="Screenshot of the ADO.NET connection manager." lightbox="media/ado-net-connection.png":::
+
+### File ingestion
+
+You should use the native `COPY INTO` T-SQL command for efficient data insertion into your warehouse in Microsoft Fabric. Replace any data flow task (DFT) operations that currently rely on **Fast Insert Mode** or `BCP IN` scripts with the `COPY INTO` statement by using [Execute SQL Task](../control-flow/execute-sql-task.md).

### SSIS writing data into Data Warehouse in Fabric

-It's a common ETL scenario where data is read from different sources like transactional databases, network file shares, local/network etc., perform transformation steps and write back to a designated DW like a SQL server, synapse dedicated pool or any other SQL compliant data store (like shown below in the diagram).
+In common ETL scenarios, you read data from different sources such as transactional databases and local or network file shares, perform transformation steps, and write the data back to a designated data warehouse such as SQL Server, an Azure Synapse dedicated SQL pool, or any other SQL-compliant data store (as shown in the following diagram).
+
+:::image type="content" source="media/etl-data-warehouse-destination.png" alt-text="Diagram of an ETL flow with a data warehouse as the destination.":::
+
+To make the same SSIS package write to Fabric Data Warehouse, first update the authentication to use Microsoft Entra ID, if it doesn't already. Second, temporarily stage the data in ADLS Gen2. Then pass the path to the `COPY INTO` command in [Execute SQL Task](../control-flow/execute-sql-task.md).
+
+The [Flexible File Destination](../data-flow/flexible-file-destination.md) component enables an SSIS package to write data to [Azure Data Lake Storage Gen2 (ADLS Gen2)](/azure/storage/blobs/data-lake-storage-introduction). Inside the Data Flow task, after loading and transformation, add a [Flexible File Destination](../data-flow/flexible-file-destination.md), in which you can define the destination file name and location in ADLS Gen2.

-:::image type="content" border="false" source="media/etl-data-warehouse-destination.png" alt-text="Diagram of etl data warehouse as destination.":::
+:::image type="content" source="media/flexible-file-1.png" alt-text="Screenshot of the Flexible File Destination, part 1.":::

-In order to make same SSIS package to write to Fabric Data Warehouse, First, update the authentication to Microsoft Entra ID based if not already used. Second, temporarily stage the data in an ADLS Gen2. Then pass the path to COPY INTO command in [Execute SQL Task](../control-flow/execute-sql-task.md).
+:::image type="content" source="media/flexible-file-2.png" alt-text="Screenshot of the Flexible File Destination, part 2.":::

-[Flexible File Destination](../data-flow/flexible-file-destination.md) component enables an SSIS package to write data to [Azure Data Lake Storage Gen2 (ADLS Gen2)](/azure/storage/blobs/data-lake-storage-introduction). Inside Data Flow task, after loading and transformation, add a [Flexible File Destination](../data-flow/flexible-file-destination.md), in which you can define destination file name and location in ADLS Gen2.
+:::image type="content" source="media/flexible-file-3.png" alt-text="Screenshot of the Flexible File Destination, part 3.":::

-:::image type="content" source="media/flexible-file-1.png" alt-text="Screenshot of Flexible file destination part 1." lightbox="media/flexible-file-1.png":::
-:::image type="content" source="media/flexible-file-2.png" alt-text="Screenshot of Flexible file destination part 2." lightbox="media/flexible-file-2.png":::
-:::image type="content" source="media/flexible-file-3.png" alt-text="Screenshot of Flexible file destination part 3." lightbox="media/flexible-file-3.png":::
+You can ingest data landed in Azure Data Lake Storage (ADLS) Gen2 into the warehouse with the `COPY` statement directly via [Execute SQL Task](../control-flow/execute-sql-task.md).

-Data landed in Azure Data Lake Storage (ADLS) Gen2 can be ingested into Warehouse using COPY statement directly via [Execute SQL Task](../control-flow/execute-sql-task.md).
+For example (replace `<table_name>`, `<storage_account>`, `<folder>`, and `<account_key>` with valid values):

-For example:
```sql
-COPY INTO <table_name>
-FROM 'https://<Your_storage_account>.dfs.core.windows.net/<folder>/'
-WITH (
-FILE_TYPE = 'CSV',
-CREDENTIAL=(IDENTITY= 'Storage Account Key', SECRET= '<Your_account_key>'),
-FIELDQUOTE = '"',
-FIELDTERMINATOR=',',
-ROWTERMINATOR='0x0A',
-ENCODING = 'UTF8'
-)
+COPY INTO <table_name>
+FROM 'https://<storage_account>.dfs.core.windows.net/<folder>/'
+WITH (FILE_TYPE = 'CSV',
+    CREDENTIAL = (IDENTITY = 'Storage Account Key',
+        SECRET = '<account_key>'),
+    FIELDQUOTE = '"',
+    FIELDTERMINATOR = ',',
+    ROWTERMINATOR = '0x0A',
+    ENCODING = 'UTF8'
+);
```
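
The example above authenticates with a storage account key. As a hedged alternative sketch (not part of this commit), the `COPY` statement also accepts a shared access signature credential; all bracketed values are placeholders:

```sql
COPY INTO <table_name>
FROM 'https://<storage_account>.dfs.core.windows.net/<folder>/'
WITH (FILE_TYPE = 'CSV',
    -- Authenticate with a SAS token instead of the account key.
    CREDENTIAL = (IDENTITY = 'Shared Access Signature',
        SECRET = '<sas_token>'),
    FIELDQUOTE = '"',
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0A',
    ENCODING = 'UTF8'
);
```

A SAS token can be scoped to a single container and an expiry time, which is often preferable to distributing the full account key in SSIS package configurations.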
-:::image type="content" source="media/execute-sql-task.png" alt-text="Screenshot of Execute sql task." lightbox="media/execute-sql-task.png" :::
+:::image type="content" source="media/execute-sql-task.png" alt-text="Screenshot of Execute SQL Task." lightbox="media/execute-sql-task.png":::

-More detail instructions refer to [Ingest data into your Warehouse using the COPY statement](/fabric/data-warehouse/ingest-data-copy).
+For more detailed instructions, see [Ingest data into your Warehouse using the COPY statement](/fabric/data-warehouse/ingest-data-copy).

-### Known limitations

-Fabric data Warehouse supports a subset of T-SQL data types and not all T-SQL all commands are currently supported. Your packages might be failed due to unsupported features. For details, please check [Data types in Warehouse](/fabric/data-warehouse/data-types?branch=main) and [T-SQL surface area](/fabric/data-warehouse/tsql-surface-area).
+## Limitations

-### References
-[T-SQL surface area - Microsoft Fabric | Microsoft Learn](/fabric/data-warehouse/tsql-surface-area)
+Fabric Data Warehouse supports a subset of T-SQL data types, and not all T-SQL commands are currently supported. Your packages might fail due to unsupported features. For details, check [Data types in Warehouse](/fabric/data-warehouse/data-types) and [T-SQL surface area in Fabric Data Warehouse](/fabric/data-warehouse/tsql-surface-area).
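
As a hedged illustration (not part of this commit), one common workaround is to cast unsupported source types to supported equivalents before staging the data; this sketch assumes a hypothetical `dbo.Orders` source table with `money` and `datetime` columns:

```sql
-- Hypothetical staging query: cast types the warehouse may not support
-- to supported equivalents before exporting the file.
SELECT OrderID,
       CAST(TotalDue AS decimal(19, 4)) AS TotalDue,   -- money     -> decimal(19, 4)
       CAST(OrderDate AS datetime2(6)) AS OrderDate    -- datetime  -> datetime2(6)
FROM dbo.Orders;
```

Check the data types article linked above for the current list of supported types before choosing target precisions.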

-[Options to get data into the Lakehouse - Microsoft Fabric | Microsoft Learn](/fabric/data-engineering/load-data-lakehouse)
+## Related content

-[Ingesting data into the warehouse - Microsoft Fabric | Microsoft Learn](/fabric/data-warehouse/ingest-data)
+- [T-SQL surface area in Fabric Data Warehouse](/fabric/data-warehouse/tsql-surface-area)
+- [Options to get data into the Fabric Lakehouse](/fabric/data-engineering/load-data-lakehouse)
+- [Ingest data into the Warehouse](/fabric/data-warehouse/ingest-data)
