Pipeline.Marketing

SharePoint List → Lakehouse to SQL → Staging → DataMart → SharePoint List

1. Data Connection (Ingest)

  • Use Dataflow Gen2 in Microsoft Fabric to create a dataflow.
  • Use SharePoint Online List as the data source for the sites/Marketing site.
  • Authenticate with the necessary credentials and choose the lists to extract.
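The Dataflow Gen2 connector handles the extraction natively, but conceptually it queries the SharePoint REST API for each list's items. A minimal Python sketch of that mechanism, assuming a hypothetical site URL and list name (the real site and lists come from the connector configuration):

```python
def build_list_items_url(site_url: str, list_title: str) -> str:
    """Build the SharePoint REST endpoint for a list's items."""
    return f"{site_url}/_api/web/lists/getbytitle('{list_title}')/items"


def extract_items(response_json: dict) -> list:
    """Pull the item rows out of a SharePoint verbose-OData response,
    which wraps results as {"d": {"results": [...]}}."""
    return response_json.get("d", {}).get("results", [])
```

For example, `build_list_items_url("https://contoso.sharepoint.com/sites/Marketing", "Campaigns")` yields the items endpoint for a hypothetical Campaigns list; the authenticated GET request itself is what the connector performs on your behalf.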

2. Data Transformation (Process)

  • Apply Power Query transformations to clean and shape the data.
  • Ensure key identifiers are retained to maintain relationships between lists.
  • Standardise formats, such as dates and numeric values.
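The transformations themselves live in Power Query, but the shaping rules above can be sketched in Python. This assumes hypothetical column names (`Id` as the key identifier, `Created` as a UK-format date, `Budget` as a numeric string with thousands separators):

```python
from datetime import datetime


def standardise_row(row: dict) -> dict:
    """Standardise one list row: keep the key identifier untouched,
    convert dates to ISO 8601, and parse numeric strings to floats."""
    out = {"Id": row["Id"]}  # retain the key so relationships between lists survive
    # Dates arrive as dd/mm/yyyy; store them as ISO 8601 (yyyy-mm-dd).
    out["Created"] = datetime.strptime(row["Created"], "%d/%m/%Y").date().isoformat()
    # Strip thousands separators before converting to a numeric type.
    out["Budget"] = float(str(row["Budget"]).replace(",", ""))
    return out
```

A row like `{"Id": 7, "Created": "05/03/2024", "Budget": "1,250.50"}` comes out with an ISO date and a float, while `Id` is preserved for joins downstream.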

3. Load to Lakehouse (Store)

  • Configure the destination as Lakehouse in Microsoft Fabric.
  • Map the transformed SharePoint List data to structured tables in the Lakehouse Database.
  • Choose between incremental refresh or full overwrite based on update frequency.
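Fabric applies the refresh mode through the destination settings, but the semantics of the two options are worth making concrete. A small sketch, modelling tables as dicts keyed by an assumed `Id` column:

```python
def full_overwrite(existing: dict, incoming: list) -> dict:
    """Full overwrite: the destination table is replaced entirely
    by the incoming batch."""
    return {row["Id"]: row for row in incoming}


def incremental_refresh(existing: dict, incoming: list) -> dict:
    """Incremental refresh (upsert): incoming rows update or add by key,
    rows not present in the batch are kept as-is."""
    merged = dict(existing)
    for row in incoming:
        merged[row["Id"]] = row
    return merged
```

Full overwrite suits small, slowly changing lists; incremental refresh avoids re-landing unchanged history when lists are large or updated frequently.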

4. Automate & Monitor

  • Set a scheduled refresh for automated updates.
    • Use the Daily/Weekly flag to set the cadence.
  • Enable logging and monitoring in Fabric to track ingestion failures or anomalies.
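The scheduled refresh is configured in Fabric itself; the Daily/Weekly flag logic above amounts to something like this sketch (the flag values are taken from the note above, the function name is illustrative):

```python
from datetime import datetime, timedelta


def next_refresh(last_run: datetime, flag: str) -> datetime:
    """Compute the next scheduled refresh time from the Daily/Weekly flag."""
    if flag == "Daily":
        return last_run + timedelta(days=1)
    if flag == "Weekly":
        return last_run + timedelta(weeks=1)
    raise ValueError(f"Unknown refresh flag: {flag}")
```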
Lakehouse to SQL: For_Each_SharePoint_List_To_STG

  • Automated SharePoint Data Processing – Extracts and loads SharePoint list data dynamically.
  • Multi-Environment Execution – Handles both Dev and Prod using separate ForEach loops.
  • Staging & Silver Layer Processing – Moves data through staging and silver layers for better transformation control.
  • Dynamic Processing – Uses metadata-driven execution, iterating over SharePoint sites retrieved from myBMT.
  • Sequential Execution – Ensures data is processed in order, preventing concurrency issues.
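The metadata-driven, sequential ForEach pattern above can be sketched as plain Python. In the real pipeline the site metadata comes from myBMT and the inner step is a copy activity into staging; here a dict and a stub function stand in for both:

```python
def process_list(env: str, site: str, list_name: str) -> str:
    """Stand-in for the copy activity that lands one list into staging."""
    return f"[{env}] {site}/{list_name} -> STG"


def run_for_each(metadata: dict) -> list:
    """Iterate environments, then sites, then lists, strictly in order,
    mirroring the separate Dev/Prod ForEach loops in the pipeline."""
    results = []
    for env, sites in metadata.items():      # one ForEach loop per environment
        for site, lists in sites.items():    # sites come from the metadata source
            for list_name in lists:          # sequential: no concurrency issues
                results.append(process_list(env, site, list_name))
    return results
```

Sequential iteration is the key design choice here: it trades throughput for a guarantee that no two copies of the same target table run at once.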
Staging: bmt_dwh_stg_uk/hbs (Schema = hbs)
DataMart: Customer_Marketing
