Commit 3f33212: Merge 3d5153f into 106c93a (2 parents: 106c93a + 3d5153f)

2 files changed: 112 additions & 1 deletion

File tree:

- `0_Azure/2_AzureAnalytics/0_Fabric/demos/30_dynamic_pipeline_nbkparameters.md` renamed to `0_Azure/2_AzureAnalytics/0_Fabric/demos/30_DynamicPipeline_nbkparametersADF.md`

Lines changed: 1 addition & 1 deletion

```diff
@@ -5,7 +5,7 @@ Costa Rica
 [![GitHub](https://img.shields.io/badge/--181717?logo=github&logoColor=ffffff)](https://github.com/)
 [brown9804](https://github.com/brown9804)

-Last updated: 2025-03-03
+Last updated: 2025-03-05

 ----------
```
Lines changed: 111 additions & 0 deletions
# Microsoft Fabric: Automating Pipeline Execution with Activator

Costa Rica

[![GitHub](https://img.shields.io/badge/--181717?logo=github&logoColor=ffffff)](https://github.com/)
[brown9804](https://github.com/brown9804)

Last updated: 2025-03-03

----------
> This guide shows how to set up Microsoft Fabric Activator to automate a workflow: Activator detects a file-creation event in a storage system and triggers a second pipeline in response.

<details>
<summary><b>List of References</b> (Click to expand)</summary>

</details>
## Step 1: Set Up the First Pipeline

1. **Create the Pipeline**:
   - In [Microsoft Fabric](https://app.fabric.microsoft.com/), create the first pipeline that performs the required tasks.

     > [!NOTE]
     > This code generates random data with fields such as `id`, `name`, `age`, `email`, and `created_at`, organizes it into a PySpark DataFrame, and saves it to a specified Lakehouse path using the Delta format.

     https://github.com/user-attachments/assets/95206bf3-83a7-42c1-b501-4879df22ef7d

   - Add a `Copy Data` activity as the final step in the pipeline.
2. **Generate the Trigger File**:
   - Configure the `Copy Data` activity to create a trigger file in a specific location, such as `Azure Data Lake Storage (ADLS)` or `OneLake`.
   - Ensure the file name and path are consistent and predictable (e.g., `trigger_file.json` in a specific folder).
3. **Publish and Test**: Publish the pipeline and run it to verify that the trigger file is created successfully.
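Activator only needs the trigger file to appear, so its contents are entirely up to you. A minimal sketch of what the `Copy Data` activity might emit as `trigger_file.json` (every field name here is an assumption, not a required schema):

```python
import json
from datetime import datetime, timezone

def build_trigger_payload(pipeline_name: str, run_id: str) -> str:
    """Build an illustrative trigger-file payload; the schema is arbitrary."""
    payload = {
        "source_pipeline": pipeline_name,              # which pipeline finished
        "run_id": run_id,                              # correlation id for debugging
        "completed_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(payload, indent=2)

print(build_trigger_payload("FirstPipeline", "run-001"))
```

Keeping a stable file name and folder (as the step above recommends) matters more than the payload itself, since the Activator rule will match on name and path.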
https://github.com/user-attachments/assets/798a3b12-c944-459d-9e77-0112b5d82831
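The notebook described in the note above can be sketched roughly as follows. The record-building part is plain Python; the Spark/Delta write (shown in comments) runs inside a Fabric notebook, where a `SparkSession` named `spark` is predefined. The specific names and the Lakehouse path are illustrative assumptions:

```python
import uuid
import random
from datetime import datetime, timedelta

def make_rows(n: int) -> list:
    """Generate n random records with id, name, age, email, and created_at."""
    names = ["Ana", "Luis", "Maria", "Jose", "Carla"]  # sample values, not from the doc
    base = datetime(2025, 3, 1)
    return [
        {
            "id": str(uuid.uuid4()),
            "name": random.choice(names),
            "age": random.randint(18, 80),
            "email": f"user{i}@example.com",
            "created_at": (base - timedelta(days=random.randint(0, 30))).isoformat(),
        }
        for i in range(n)
    ]

rows = make_rows(100)

# Inside a Fabric notebook, the rows become a PySpark DataFrame saved in Delta format:
# df = spark.createDataFrame(rows)
# df.write.format("delta").mode("overwrite").save("Tables/random_data")  # illustrative path
```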
38+
39+
## Configure Activator to Detect the Event
40+
41+
> [!TIP]
42+
> Event options:
43+
44+
https://github.com/user-attachments/assets/022c195f-5af0-4382-8f57-c2efe6728e54
45+
46+
1. **Set Up an Event**:
   - In your workspace, click `+ New Item` and create an `Activator`, then add a new event that monitors the location where the trigger file is created (e.g., ADLS or OneLake).

     <img width="550" alt="image" src="https://github.com/user-attachments/assets/076d6cde-2579-4f17-a426-27f7dbabbfb8" />

   - Choose the appropriate event type, such as `File Created`.
2. **Test Event Detection**:
   - Save the event and test it by manually running the first pipeline to ensure Activator detects the file creation.
   - Check the **Event Details** screen in Activator to confirm the event is logged.
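As a sanity check on what the **Event Details** screen will show: when the source is ADLS, file-creation events follow the Azure Event Grid `Microsoft.Storage.BlobCreated` schema, roughly as below (the account, container, and values are placeholders). The small helper shows how the file name can be read from the event subject:

```python
# Representative Event Grid payload for a blob/file creation (values are placeholders).
sample_event = {
    "subject": "/blobServices/default/containers/triggers/blobs/trigger_file.json",
    "eventType": "Microsoft.Storage.BlobCreated",
    "eventTime": "2025-03-03T12:00:00Z",
    "data": {
        "api": "PutBlob",
        "contentType": "application/json",
        "url": "https://yourstorageaccount.blob.core.windows.net/triggers/trigger_file.json",
    },
}

def file_name_from_subject(subject: str) -> str:
    """Extract the blob name from an Event Grid subject path."""
    return subject.rsplit("/blobs/", 1)[-1]

print(file_name_from_subject(sample_event["subject"]))  # trigger_file.json
```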
56+
57+
---

## Step 3: Define the Rule in Activator

1. **Create a New Rule**:
   - In Activator, create a rule that responds to the event you just configured.
   - Set the condition to match the event details (e.g., file name, path, or metadata).
2. **Set the Action**:
   - Configure the rule to trigger the second pipeline.
   - Specify the pipeline name and pass any required parameters.
3. **Save and Activate**:
   - Save the rule and activate it.
   - Ensure the rule is enabled and ready to respond to the event.
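Activator rules are configured in the UI rather than in code, but the logic the steps above encode can be sketched as a predicate (the condition) plus an action. All names below, including `SecondPipeline` and the event fields, are illustrative:

```python
def rule_matches(event: dict) -> bool:
    """Condition: a FileCreated event for the expected trigger file and folder."""
    return (
        event.get("eventType") == "FileCreated"
        and event.get("fileName") == "trigger_file.json"
        and event.get("folderPath", "").endswith("/triggers")  # assumed folder layout
    )

def on_event(event: dict, run_pipeline) -> bool:
    """Action: trigger the second pipeline, passing parameters from the event."""
    if rule_matches(event):
        run_pipeline("SecondPipeline", parameters={"sourcePath": event["folderPath"]})
        return True
    return False

# Example: only the matching FileCreated event fires the action.
fired = []
run = lambda name, parameters: fired.append((name, parameters))
on_event({"eventType": "FileCreated", "fileName": "trigger_file.json",
          "folderPath": "adls://container/triggers"}, run)
on_event({"eventType": "FileDeleted", "fileName": "trigger_file.json",
          "folderPath": "adls://container/triggers"}, run)
print(fired)
```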
---

## Step 4: Set Up the Second Pipeline

1. **Create the Pipeline**:
   - In Microsoft Fabric, create the second pipeline that performs the next set of tasks.
   - Ensure it is configured to accept external triggers.
2. **Publish the Pipeline**:
   - Publish the second pipeline and confirm it is ready to be triggered.
---

## Step 5: Test the Entire Workflow

1. **Run the First Pipeline**:
   - Execute the first pipeline and verify that the trigger file is created.
2. **Monitor Activator**:
   - Check the **Event Details** and **Rule Activation Details** screens in Activator to confirm the event is detected and the rule fires.
3. **Verify the Second Pipeline**:
   - Confirm that the second pipeline is triggered and runs successfully.
---

## Step 6: Troubleshooting (If Needed)

If the second pipeline does not trigger:

1. Double-check the rule configuration in Activator.
2. Verify that the second pipeline is set to accept external triggers.
3. Review the logs in Activator for any errors or warnings.
---

<div align="center">
  <h3 style="color: #4CAF50;">Total Visitors</h3>
  <img src="https://profile-counter.glitch.me/brown9804/count.svg" alt="Visitor Count" style="border: 2px solid #4CAF50; border-radius: 5px; padding: 5px;"/>
</div>
