From 77623ec0882dc36bf1d69e38453b1ad6fdab3079 Mon Sep 17 00:00:00 2001 From: Timna Brown <24630902+brown9804@users.noreply.github.com> Date: Wed, 5 Mar 2025 10:54:31 -0600 Subject: [PATCH 1/9] renamed --- ...ne_nbkparameters.md => 30_DynamicPipeline_nbkparametersADF.md} | 0 1 file changed, 0 insertions(+), 0 deletions(-) rename 0_Azure/2_AzureAnalytics/0_Fabric/demos/{30_dynamic_pipeline_nbkparameters.md => 30_DynamicPipeline_nbkparametersADF.md} (100%) diff --git a/0_Azure/2_AzureAnalytics/0_Fabric/demos/30_dynamic_pipeline_nbkparameters.md b/0_Azure/2_AzureAnalytics/0_Fabric/demos/30_DynamicPipeline_nbkparametersADF.md similarity index 100% rename from 0_Azure/2_AzureAnalytics/0_Fabric/demos/30_dynamic_pipeline_nbkparameters.md rename to 0_Azure/2_AzureAnalytics/0_Fabric/demos/30_DynamicPipeline_nbkparametersADF.md From c074548ea8758e6a59c7e57403ae07dc76491d26 Mon Sep 17 00:00:00 2001 From: "github-actions[bot]" Date: Wed, 5 Mar 2025 16:55:02 +0000 Subject: [PATCH 2/9] Update last modified date in Markdown files --- .../0_Fabric/demos/30_DynamicPipeline_nbkparametersADF.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/0_Azure/2_AzureAnalytics/0_Fabric/demos/30_DynamicPipeline_nbkparametersADF.md b/0_Azure/2_AzureAnalytics/0_Fabric/demos/30_DynamicPipeline_nbkparametersADF.md index 979a977d7..46318cee2 100644 --- a/0_Azure/2_AzureAnalytics/0_Fabric/demos/30_DynamicPipeline_nbkparametersADF.md +++ b/0_Azure/2_AzureAnalytics/0_Fabric/demos/30_DynamicPipeline_nbkparametersADF.md @@ -5,7 +5,7 @@ Costa Rica [![GitHub](https://img.shields.io/badge/--181717?logo=github&logoColor=ffffff)](https://github.com/) [brown9804](https://github.com/brown9804) -Last updated: 2025-03-03 +Last updated: 2025-03-05 ---------- From 3d5153ff4e0a65dd019c89e8f7d1874f6d34585e Mon Sep 17 00:00:00 2001 From: Timna Brown <24630902+brown9804@users.noreply.github.com> Date: Wed, 5 Mar 2025 12:01:12 -0600 Subject: [PATCH 3/9] in progress --- 
.../31_FabricActivatorRulePipeline/README.md | 111 ++++++++++++++++++ 1 file changed, 111 insertions(+) create mode 100644 0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md diff --git a/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md b/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md new file mode 100644 index 000000000..b78ed99b3 --- /dev/null +++ b/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md @@ -0,0 +1,111 @@ +# Microsoft Fabric: Automating Pipeline Execution with Activator + +Costa Rica + +[![GitHub](https://img.shields.io/badge/--181717?logo=github&logoColor=ffffff)](https://github.com/) +[brown9804](https://github.com/brown9804) + +Last updated: 2025-03-03 + +---------- + +> This process shows how to set up Microsoft Fabric Activator to automate workflows by detecting file creation events in a storage system and triggering another pipeline to run. + +
+List of References (Click to expand) + +
+ + +## Set Up the First Pipeline + +1. **Create the Pipeline**: + - In [Microsoft Fabric](https://app.fabric.microsoft.com/), create the first pipeline that performs the required tasks. + +> [!NOTE] +> This code generates random data with fields such as id, name, age, email, and created_at, organizes it into a PySpark DataFrame, and saves it to a specified Lakehouse path using the Delta format. + +https://github.com/user-attachments/assets/95206bf3-83a7-42c1-b501-4879df22ef7d + + - Add a `Copy Data` activity as the final step in the pipeline. + +2. **Generate the Trigger File**: + - Configure the `Copy Data` activity to create a trigger file in a specific location, such as `Azure Data Lake Storage (ADLS)` or `OneLake`. + - Ensure the file name and path are consistent and predictable (e.g., `trigger_file.json` in a specific folder). +3. **Publish and Test**: Publish the pipeline and test it to ensure the trigger file is created successfully. + +https://github.com/user-attachments/assets/798a3b12-c944-459d-9e77-0112b5d82831 + +## Configure Activator to Detect the Event + +> [!TIP] +> Event options: + +https://github.com/user-attachments/assets/022c195f-5af0-4382-8f57-c2efe6728e54 + +1. **Set Up an Event**: + - Create a new event to monitor the location where the trigger file is created (e.g., ADLS or OneLake). In your workspace, click on `+ New Item` and create an `Activator`. + + image + + - Choose the appropriate event type, such as `File Created`. + +2. **Test Event Detection**: + - Save the event and test it by manually running the first pipeline to ensure Activator detects the file creation. + - Check the **Event Details** screen in Activator to confirm the event is logged. + +--- + +### **Step 3: Define the Rule in Activator** +1. **Create a New Rule**: + - In Activator, create a rule that responds to the event you just configured. + - Set the condition to match the event details (e.g., file name, path, or metadata). + +2. 
**Set the Action**: + - Configure the rule to trigger the second pipeline. + - Specify the pipeline name and pass any required parameters. + +3. **Save and Activate**: + - Save the rule and activate it. + - Ensure the rule is enabled and ready to respond to the event. + +--- + +### **Step 4: Set Up the Second Pipeline** +1. **Create the Pipeline**: + - In Microsoft Fabric, create the second pipeline that performs the next set of tasks. + - Ensure it is configured to accept external triggers. + +2. **Publish the Pipeline**: + - Publish the second pipeline and ensure it is ready to be triggered. + +--- + +### **Step 5: Test the Entire Workflow** +1. **Run the First Pipeline**: + - Execute the first pipeline and verify that the trigger file is created. + +2. **Monitor Activator**: + - Check the **Event Details** and **Rule Activation Details** in Activator to ensure the event is detected and the rule is activated. + +3. **Verify the Second Pipeline**: + - Confirm that the second pipeline is triggered and runs successfully. + +--- + +### **Step 6: Troubleshooting (If Needed)** +- If the second pipeline does not trigger: + 1. Double-check the rule configuration in Activator. + 2. Verify that the second pipeline is set to accept external triggers. + 3. Review the logs in Activator for any errors or warnings. + +--- + +Would you like me to assist with any specific step, such as configuring the event or rule in Activator? + + +
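The README above tells the `Copy Data` activity to drop a consistent, predictable marker file (e.g., `trigger_file.json`) that Activator watches for. As a minimal sketch of what that marker amounts to — the folder name, payload fields, and `first_pipeline` label here are illustrative, not part of the demo; in the actual pipeline the file lands in ADLS/OneLake, not on local disk:

```python
# Sketch: the kind of small JSON marker file the Copy Data activity produces.
# Folder, file name, and payload fields are hypothetical; what matters for
# Activator is that the path and name stay consistent so the event rule matches.
import json
from datetime import datetime, timezone
from pathlib import Path


def write_trigger_file(folder: str, name: str = "trigger_file.json") -> Path:
    """Write a small JSON marker whose creation event Activator can detect."""
    target = Path(folder)
    target.mkdir(parents=True, exist_ok=True)
    payload = {
        "pipeline": "first_pipeline",  # hypothetical name of the producing pipeline
        "completed_at": datetime.now(timezone.utc).isoformat(),
    }
    path = target / name
    path.write_text(json.dumps(payload, indent=2))
    return path


if __name__ == "__main__":
    print(write_trigger_file("triggers"))
```

Keeping a timestamp in the payload is optional, but it makes it easy to confirm in the Activator **Event Details** screen that the event you see corresponds to the latest pipeline run.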
+<!-- Visitor badge: "Total Visitors" / "Visitor Count" (image markup lost in extraction) -->
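The rule action described above ("trigger the second pipeline, pass any required parameters") is configured entirely in the Activator UI. Purely for illustration of what that action amounts to, the sketch below builds the on-demand job request shape from the Fabric REST API's job scheduler; the endpoint shape is an assumption to verify against current Fabric REST docs, and the GUIDs, auth, and `triggerFile` parameter are placeholders — Activator handles all of this for you:

```python
# Sketch (assumption): how "run the second pipeline" maps onto Fabric's
# on-demand item-job endpoint. IDs are placeholders and no token/auth handling
# is shown; this only builds the request, it does not send it.
from typing import Dict

FABRIC_API = "https://api.fabric.microsoft.com/v1"


def on_demand_run_request(workspace_id: str, pipeline_item_id: str,
                          parameters: Dict[str, str]) -> Dict:
    """Build the POST request that would start one pipeline job instance."""
    return {
        "method": "POST",
        "url": (f"{FABRIC_API}/workspaces/{workspace_id}"
                f"/items/{pipeline_item_id}/jobs/instances?jobType=Pipeline"),
        "json": {"executionData": {"parameters": parameters}},
    }


# Placeholder GUIDs -- in practice these identify the workspace and the
# second pipeline item created in the steps above.
req = on_demand_run_request("<workspace-guid>", "<pipeline-guid>",
                            {"triggerFile": "trigger_file.json"})
print(req["url"])
```

Seeing the request laid out this way clarifies why the second pipeline must be published and able to accept external triggers before the rule can start it.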
+ From 621012b69f15969571d6dcee9e0aae7affa152bb Mon Sep 17 00:00:00 2001 From: "github-actions[bot]" Date: Wed, 5 Mar 2025 18:01:30 +0000 Subject: [PATCH 4/9] Update last modified date in Markdown files --- .../0_Fabric/demos/31_FabricActivatorRulePipeline/README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md b/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md index b78ed99b3..e06d1bc64 100644 --- a/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md +++ b/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md @@ -5,7 +5,7 @@ Costa Rica [![GitHub](https://img.shields.io/badge/--181717?logo=github&logoColor=ffffff)](https://github.com/) [brown9804](https://github.com/brown9804) -Last updated: 2025-03-03 +Last updated: 2025-03-05 ---------- From 237319c0da66eae1b56bacb066072e3bece0ee52 Mon Sep 17 00:00:00 2001 From: Timna Brown <24630902+brown9804@users.noreply.github.com> Date: Wed, 5 Mar 2025 14:11:47 -0600 Subject: [PATCH 5/9] in progress --- .../31_FabricActivatorRulePipeline/README.md | 49 ++++++++++++------- 1 file changed, 31 insertions(+), 18 deletions(-) diff --git a/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md b/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md index e06d1bc64..5392bc236 100644 --- a/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md +++ b/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md @@ -5,7 +5,7 @@ Costa Rica [![GitHub](https://img.shields.io/badge/--181717?logo=github&logoColor=ffffff)](https://github.com/) [brown9804](https://github.com/brown9804) -Last updated: 2025-03-05 +Last updated: 2025-03-03 ---------- @@ -41,27 +41,44 @@ https://github.com/user-attachments/assets/798a3b12-c944-459d-9e77-0112b5d82831 > [!TIP] > 
Event options: -https://github.com/user-attachments/assets/022c195f-5af0-4382-8f57-c2efe6728e54 +https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d 1. **Set Up an Event**: - - Create a new event to monitor the location where the trigger file is created (e.g., ADLS or OneLake). In your workspace, click on `+ New Item` and create an `Activator`. + - Create a new event to monitor the location where the trigger file is created (e.g., ADLS or OneLake). Click on `Real-Time`: - image + image - Choose the appropriate event type, such as `File Created`. + image + + image + + - Add a source: + + image + + image + +https://github.com/user-attachments/assets/43a9654b-e8d0-44da-80b9-9f528483fa3b + 2. **Test Event Detection**: - Save the event and test it by manually running the first pipeline to ensure Activator detects the file creation. - Check the **Event Details** screen in Activator to confirm the event is logged. ---- +https://github.com/user-attachments/assets/6b21194c-54b4-49de-9294-1bf78b1e5acd -### **Step 3: Define the Rule in Activator** -1. **Create a New Rule**: - - In Activator, create a rule that responds to the event you just configured. - - Set the condition to match the event details (e.g., file name, path, or metadata). +## Define the Rule in Activator + +1. **Setup the Activator**: + +https://github.com/user-attachments/assets/7c88e080-d5aa-4920-acd6-94c2e4ae0568 -2. **Set the Action**: + +2. **Create a New Rule**: + - In `Activator`, create a rule that responds to the event you just configured. + - Set the condition to match the event details (e.g., file name, path, or metadata). +3. **Set the Action**: - Configure the rule to trigger the second pipeline. - Specify the pipeline name and pass any required parameters. @@ -69,7 +86,8 @@ https://github.com/user-attachments/assets/022c195f-5af0-4382-8f57-c2efe6728e54 - Save the rule and activate it. - Ensure the rule is enabled and ready to respond to the event. 
---- + + ### **Step 4: Set Up the Second Pipeline** 1. **Create the Pipeline**: @@ -93,15 +111,10 @@ https://github.com/user-attachments/assets/022c195f-5af0-4382-8f57-c2efe6728e54 --- -### **Step 6: Troubleshooting (If Needed)** +## Troubleshooting (If Needed) - If the second pipeline does not trigger: 1. Double-check the rule configuration in Activator. - 2. Verify that the second pipeline is set to accept external triggers. - 3. Review the logs in Activator for any errors or warnings. - ---- - -Would you like me to assist with any specific step, such as configuring the event or rule in Activator? + 2. Review the logs in Activator for any errors or warnings.
From 71bfd34ae89570bb1ce9a9ba8ca4839e626cfd2e Mon Sep 17 00:00:00 2001 From: "github-actions[bot]" Date: Wed, 5 Mar 2025 20:12:03 +0000 Subject: [PATCH 6/9] Update last modified date in Markdown files --- .../0_Fabric/demos/31_FabricActivatorRulePipeline/README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md b/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md index 5392bc236..662d2bd3f 100644 --- a/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md +++ b/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md @@ -5,7 +5,7 @@ Costa Rica [![GitHub](https://img.shields.io/badge/--181717?logo=github&logoColor=ffffff)](https://github.com/) [brown9804](https://github.com/brown9804) -Last updated: 2025-03-03 +Last updated: 2025-03-05 ---------- From 15bef2cea6ec149107809cf36b71ef871aac176b Mon Sep 17 00:00:00 2001 From: Timna Brown <24630902+brown9804@users.noreply.github.com> Date: Wed, 5 Mar 2025 14:16:31 -0600 Subject: [PATCH 7/9] gen random data files script --- .../31_FabricActivatorRulePipeline/GeneratesRandomData.ipynb | 1 + 1 file changed, 1 insertion(+) create mode 100644 0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/GeneratesRandomData.ipynb diff --git a/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/GeneratesRandomData.ipynb b/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/GeneratesRandomData.ipynb new file mode 100644 index 000000000..ef4d12893 --- /dev/null +++ b/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/GeneratesRandomData.ipynb @@ -0,0 +1 @@ +{"cells":[{"cell_type":"code","source":["# Generates Dummy json file in Files/\n","\n","# Import necessary libraries\n","from pyspark.sql import SparkSession\n","from pyspark.sql.types import *\n","import random\n","from datetime 
import datetime, timedelta\n","\n","# Initialize Spark session (if not already initialized)\n","spark = SparkSession.builder.appName(\"GenerateRandomData\").getOrCreate()\n","\n","# Function to generate random data\n","def generate_random_data(num_entries):\n"," data = []\n"," for i in range(1, num_entries + 1):\n"," name = f\"User{i}\"\n"," entry = {\n"," \"id\": i,\n"," \"name\": name,\n"," \"age\": random.randint(18, 65),\n"," \"email\": f\"{name.lower()}@example.com\",\n"," \"created_at\": (datetime.now() - timedelta(days=random.randint(0, 365))).strftime(\"%Y-%m-%d %H:%M:%S\")\n"," }\n"," data.append(entry)\n"," return data\n","\n","# Generate 10 random entries\n","random_data = generate_random_data(10)\n","\n","# Define schema for the DataFrame\n","schema = StructType([\n"," StructField(\"id\", IntegerType(), True),\n"," StructField(\"name\", StringType(), True),\n"," StructField(\"age\", IntegerType(), True),\n"," StructField(\"email\", StringType(), True),\n"," StructField(\"created_at\", StringType(), True)\n","])\n","\n","# Create a DataFrame from the random data\n","df_random_data = spark.createDataFrame(random_data, schema=schema)\n","\n","# Write the DataFrame to the Lakehouse in the specified path\n","output_path = \"abfss://{WORKSPACE-NAME}@onelake.dfs.fabric.microsoft.com/raw_Bronze.Lakehouse/Files/random_data\" # Replace {WORKSPACE-NAME}\n","df_random_data.write.format(\"delta\").mode(\"overwrite\").save(output_path)\n","\n","print(f\"Random data has been saved to the Lakehouse at '{output_path}'.\")"],"outputs":[],"execution_count":null,"metadata":{"microsoft":{"language":"python","language_group":"synapse_pyspark"}},"id":"8d820f25-3c2e-45b3-8a08-af78f0d45e1d"}],"metadata":{"kernel_info":{"name":"synapse_pyspark"},"kernelspec":{"name":"synapse_pyspark","language":"Python","display_name":"Synapse 
PySpark"},"language_info":{"name":"python"},"microsoft":{"language":"python","language_group":"synapse_pyspark","ms_spell_check":{"ms_spell_check_language":"en"}},"nteract":{"version":"nteract-front-end@1.0.0"},"spark_compute":{"compute_id":"/trident/default","session_options":{"conf":{"spark.synapse.nbs.session.timeout":"1200000"}}},"dependencies":{}},"nbformat":4,"nbformat_minor":5} \ No newline at end of file From 0d85a96662c067db6436f908f1bdb72e804c7f29 Mon Sep 17 00:00:00 2001 From: Timna Brown <24630902+brown9804@users.noreply.github.com> Date: Wed, 5 Mar 2025 14:18:41 -0600 Subject: [PATCH 8/9] quick note --- .../0_Fabric/demos/31_FabricActivatorRulePipeline/README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md b/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md index 662d2bd3f..e7ea42415 100644 --- a/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md +++ b/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md @@ -23,7 +23,7 @@ Last updated: 2025-03-05 - In [Microsoft Fabric](https://app.fabric.microsoft.com/), create the first pipeline that performs the required tasks. > [!NOTE] -> This code generates random data with fields such as id, name, age, email, and created_at, organizes it into a PySpark DataFrame, and saves it to a specified Lakehouse path using the Delta format. +> This code generates random data with fields such as id, name, age, email, and created_at, organizes it into a PySpark DataFrame, and saves it to a specified Lakehouse path using the Delta format. 
Click here to see the [example script](./GeneratesRandomData.ipynb) https://github.com/user-attachments/assets/95206bf3-83a7-42c1-b501-4879df22ef7d From 47e0491258f774905a0bd33d3ea76652715479ee Mon Sep 17 00:00:00 2001 From: Timna Brown <24630902+brown9804@users.noreply.github.com> Date: Wed, 5 Mar 2025 16:00:59 -0600 Subject: [PATCH 9/9] visual guidance in place --- .../31_FabricActivatorRulePipeline/README.md | 54 +++++++++---------- 1 file changed, 27 insertions(+), 27 deletions(-) diff --git a/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md b/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md index e7ea42415..c3dc87546 100644 --- a/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md +++ b/0_Azure/2_AzureAnalytics/0_Fabric/demos/31_FabricActivatorRulePipeline/README.md @@ -12,7 +12,14 @@ Last updated: 2025-03-05 > This process shows how to set up Microsoft Fabric Activator to automate workflows by detecting file creation events in a storage system and triggering another pipeline to run.
-List of References (Click to expand) +List of Content (Click to expand) + + - [Set Up the First Pipeline](#set-up-the-first-pipeline) + - [Configure Activator to Detect the Event](#configure-activator-to-detect-the-event) + - [Set Up the Second Pipeline](#set-up-the-second-pipeline) + - [Define the Rule in Activator](#define-the-rule-in-activator) + - [Test the Entire Workflow](#test-the-entire-workflow) + - [Troubleshooting If Needed](#troubleshooting-if-needed)
@@ -34,7 +41,7 @@ https://github.com/user-attachments/assets/95206bf3-83a7-42c1-b501-4879df22ef7d - Ensure the file name and path are consistent and predictable (e.g., `trigger_file.json` in a specific folder). 3. **Publish and Test**: Publish the pipeline and test it to ensure the trigger file is created successfully. -https://github.com/user-attachments/assets/798a3b12-c944-459d-9e77-0112b5d82831 + https://github.com/user-attachments/assets/798a3b12-c944-459d-9e77-0112b5d82831 ## Configure Activator to Detect the Event @@ -60,20 +67,28 @@ https://github.com/user-attachments/assets/282fae9b-e1c6-490d-bd23-9ed9bdf6105d image -https://github.com/user-attachments/assets/43a9654b-e8d0-44da-80b9-9f528483fa3b + https://github.com/user-attachments/assets/43a9654b-e8d0-44da-80b9-9f528483fa3b 2. **Test Event Detection**: - Save the event and test it by manually running the first pipeline to ensure Activator detects the file creation. - Check the **Event Details** screen in Activator to confirm the event is logged. -https://github.com/user-attachments/assets/6b21194c-54b4-49de-9294-1bf78b1e5acd + https://github.com/user-attachments/assets/6b21194c-54b4-49de-9294-1bf78b1e5acd + +## Set Up the Second Pipeline + +1. **Create the Pipeline**: + - In Microsoft Fabric, create the second pipeline that performs the next set of tasks. + - Ensure it is configured to accept external triggers. +2. **Publish the Pipeline**: Publish the second pipeline and ensure it is ready to be triggered. + + https://github.com/user-attachments/assets/5b630579-a0ec-4d5b-b973-d9b4fdd8254c ## Define the Rule in Activator 1. **Setup the Activator**: -https://github.com/user-attachments/assets/7c88e080-d5aa-4920-acd6-94c2e4ae0568 - + https://github.com/user-attachments/assets/7c88e080-d5aa-4920-acd6-94c2e4ae0568 2. **Create a New Rule**: - In `Activator`, create a rule that responds to the event you just configured. 
@@ -81,35 +96,20 @@ https://github.com/user-attachments/assets/7c88e080-d5aa-4920-acd6-94c2e4ae0568 3. **Set the Action**: - Configure the rule to trigger the second pipeline. - Specify the pipeline name and pass any required parameters. - 3. **Save and Activate**: - Save the rule and activate it. - Ensure the rule is enabled and ready to respond to the event. + https://github.com/user-attachments/assets/5f139eeb-bab0-4d43-9f22-bbe44503ed75 +## Test the Entire Workflow +1. **Run the First Pipeline**: Execute the first pipeline and verify that the trigger file is created. +2. **Monitor Activator**: Check the `Event Details` and `Rule Activation Details` in Activator to ensure the event is detected and the rule is activated. +3. **Verify the Second Pipeline**: Confirm that the second pipeline is triggered and runs successfully. -### **Step 4: Set Up the Second Pipeline** -1. **Create the Pipeline**: - - In Microsoft Fabric, create the second pipeline that performs the next set of tasks. - - Ensure it is configured to accept external triggers. - -2. **Publish the Pipeline**: - - Publish the second pipeline and ensure it is ready to be triggered. - ---- - -### **Step 5: Test the Entire Workflow** -1. **Run the First Pipeline**: - - Execute the first pipeline and verify that the trigger file is created. - -2. **Monitor Activator**: - - Check the **Event Details** and **Rule Activation Details** in Activator to ensure the event is detected and the rule is activated. - -3. **Verify the Second Pipeline**: - - Confirm that the second pipeline is triggered and runs successfully. + https://github.com/user-attachments/assets/0a1dab70-2317-4636-b0be-aa0cb301b496 ---- ## Troubleshooting (If Needed) - If the second pipeline does not trigger: