Purpose:
Transfer SharePoint List data through Azure Blob Storage into the Data Warehouse
Requires:
- Lakehouse Database
- Premium Workspace (with Fabric)
- Fabric Data Pipeline
Prerequisites:
- SharePoint list data in the Fabric Lakehouse; see the DataSource.SharepointList knowhow article (bmt-dwh-uks-app-wp.azurewebsites.net)
- List names correctly configured in myBMT
Process Steps:
1. Navigate to the Premium Workspace
2. [New>]
3. More Options
4. Data Pipeline

5. Create two pipeline parameters (example values are shown below):
- DataSource: the name of the Lakehouse where the data is stored.
- FolderName: the name of the directory in the bronze storage container where the parquet files will be saved.
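
Example values, for illustration only (these names are assumptions; the actual values depend on your environment and the names configured in myBMT):

   DataSource = LH_SharePoint        (Lakehouse containing the imported list data)
   FolderName = sharepoint_lists     (target directory in the bronze container)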
6. Activities Tab
7. Lookup Activity

Click on the Lookup activity and go to Settings. If one is not already set up, create a new connection to the myBMT MySQL database. Select the Query option and enter the following expression as the query:
@concat('SELECT src.field_236 AS name, DataSource.field_260 AS schema_name
FROM app_entity_26 AS src
JOIN app_entity_26 AS DataSource ON DataSource.id = src.parent_id
WHERE src.field_275 = "true"
AND DataSource.field_236 = "', pipeline().parameters.DataSource, '"')
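
With "First row only" unchecked, the Lookup activity returns one row per matching list. A hypothetical output (the list and schema names here are placeholders, not real myBMT values) looks like:

   {
     "count": 2,
     "value": [
       { "name": "ProjectRegister", "schema_name": "dbo" },
       { "name": "RiskLog", "schema_name": "dbo" }
     ]
   }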

8. For Each Activity

Navigate to the For Each activity's settings and enter the following expression in the Items field to loop through the output of the Lookup activity (named Lookup_mybmt in this pipeline):
@activity('Lookup_mybmt').output.value
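
Inside the loop, each row from the Lookup output is exposed through item(). Given the column aliases in the query above, the current list name and schema name can be referenced as:

   @item().name
   @item().schema_name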
9. Copy Data Activities

Inside the For Each activity, place four Copy Data activities so that data is copied into both the storage container and the staging database. Configure each Copy Data activity's source and sink to their respective paths. Finally, add a Delete Data activity that removes the data from the import directory once it has been processed.
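
As a sketch, a Copy Data activity writing parquet files to the bronze container could build its sink file path from the pipeline parameter and the current item. The path layout below is an assumption and should be adjusted to match your container structure:

   @concat(pipeline().parameters.FolderName, '/', item().name, '.parquet')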