Data Operating Capabilities:

A Data Warehouse & Data Mart Operating Model does not create quality and value on its own; it delivers value only where process, people, and technology intersect around the data model. Building the processes and organisational structures that deliver compelling data products to users is the realisation of what the Data Operations Model aims to provide.

Process

  • Data Engineering will…
Action | Objective | Quality Measures
Data Flow Implementation | Implement data flows to seamlessly connect operational systems with analytics and business intelligence (BI) systems. | Ensure data integrity throughout the flow. Validate data consistency and accuracy at each stage.
Source-to-Target Mapping Documentation | Document clear source-to-target mappings for transparency and traceability. | Ensure mappings are comprehensive and up-to-date. Verify mappings against actual data transformation processes.
Data Flow Re-engineering | Re-engineer manual data flows for scalability and repeatability. | Assess scalability potential. Test repeatability under various scenarios.
ETL Script Optimisation | Write efficient ETL (Extract, Transform, Load) scripts and code for optimal performance (see the example after this table). | Conduct performance testing on ETL processes. Optimise scripts for resource efficiency.
Reusable Business Intelligence Reports | Develop business intelligence reports that are reusable and adaptable. | Test report generation under different conditions. Ensure reports meet stakeholder requirements effectively.
Accessible Data for Analysis | Build accessible datasets to facilitate easy analysis. | Validate data accessibility across relevant platforms. Ensure data security and compliance with access controls.
AI Analytics Readiness | Prepare data infrastructure and pipelines to support AI-driven analytics and machine learning models. | Assess compatibility with AI frameworks and libraries. Ensure data quality and format suitability for AI model training and inference.
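
To make these actions concrete, the following Python sketch illustrates a minimal extract, transform, load flow with built-in quality checks. It is illustrative only: the file names, column mappings, and validation rules are hypothetical placeholders rather than BMT's actual pipeline code.

  # Minimal ETL sketch (illustrative only): extract from an operational source,
  # apply transformations, validate quality, and load to an analytics target.
  # Source and target names and the quality rules are hypothetical.
  import pandas as pd

  def extract(source_csv: str) -> pd.DataFrame:
      """Extract raw records from an operational export (hypothetical CSV drop)."""
      return pd.read_csv(source_csv)

  def transform(raw: pd.DataFrame) -> pd.DataFrame:
      """Apply the source-to-target mapping: rename columns, cast types."""
      mapped = raw.rename(columns={"cust_id": "customer_key", "amt": "sales_amount"})
      mapped["sales_amount"] = pd.to_numeric(mapped["sales_amount"], errors="coerce")
      mapped["order_date"] = pd.to_datetime(mapped["order_date"], errors="coerce")
      return mapped

  def validate(df: pd.DataFrame) -> None:
      """Quality measures: check completeness and consistency before loading."""
      if df["customer_key"].isna().any():
          raise ValueError("Data quality check failed: missing customer keys")
      if (df["sales_amount"] < 0).any():
          raise ValueError("Data quality check failed: negative sales amounts")

  def load(df: pd.DataFrame, target_parquet: str) -> None:
      """Load the curated dataset to an analytics-friendly columnar format."""
      df.to_parquet(target_parquet, index=False)

  if __name__ == "__main__":
      data = transform(extract("orders_export.csv"))  # hypothetical file name
      validate(data)
      load(data, "curated_orders.parquet")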

People

Data-literate colleagues are key to delivering value to the business. Building data literacy requires common definitions and a shared understanding of the competencies needed to work together in treating data as an enterprise asset. A Competency Framework provides a model to guide literacy efforts and involves all colleagues working with data (Knowledge Workers and Data Consumers). The following skills form the basis of the data competency framework; the level to which colleagues should demonstrate these skills depends on their role and the corresponding skill level requirements (i.e. awareness, working, practitioner, expert).

Technology Tracks:

BMT uses advanced technologies to streamline data management, from ingestion to analytics. Key tools include:

  • Microsoft Fabric: Azure-based platform for scalable, secure data storage and management.
  • Common Data Model: Dimensional modelling that structures data for efficient analysis and reporting.
  • Power Platform: Suite of BI tools, dashboards, and models for data access and visualisation.

Each technology platform is configured for optimal performance and accessibility, enabling seamless integration with Power BI and other data tools.

Microsoft Fabric

BMT has chosen Microsoft Fabric as the foundation for its data warehouse and data mart environments. This integrated set of data and analytical services, based on the Azure/Synapse Cloud, provides a robust platform for managing BMT’s data assets efficiently.

Key components of Microsoft Fabric include:

  • Azure Cloud-Based Containers: BMT utilises Azure cloud-based containers to create and manage its data, providing scalability and flexibility in storage and computing resources.
  • Data Pipelines: Designing pipelines to copy data into the Lakehouse, a central repository for structured and unstructured data, ensures seamless data ingestion and availability for analysis.
  • Job Management: Scheduled and triggered job definitions enable BMT to submit batch or streaming jobs for data processing, ensuring timely updates and insights.
  • Notebooks: Leveraging notebooks allows BMT to write code for data ingestion, preparation, and transformation, facilitating flexible and customisable data workflows (a brief sketch follows this list).
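
As a sketch of how these components fit together, the notebook-style PySpark example below reads raw files landed in the Lakehouse and saves them as a table ready for downstream modelling. The paths and table names are hypothetical, and the exact ingestion pattern will vary by workload.

  # Illustrative Fabric-style notebook cell (PySpark): ingest raw files and
  # persist them as a Lakehouse table. Paths and table names are hypothetical.
  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  # A Spark session is normally provided by the notebook runtime;
  # getOrCreate() keeps the sketch self-contained.
  spark = SparkSession.builder.getOrCreate()

  # Read raw CSV files landed in the Lakehouse Files area (hypothetical path).
  raw_orders = (
      spark.read
      .option("header", "true")
      .csv("Files/landing/orders/*.csv")
  )

  # Light preparation: standardise column names and types, stamp the load time.
  prepared = (
      raw_orders
      .withColumnRenamed("cust_id", "customer_key")
      .withColumn("order_date", F.to_date("order_date"))
      .withColumn("load_timestamp", F.current_timestamp())
  )

  # Persist as a managed Delta table in the Lakehouse for downstream modelling.
  prepared.write.mode("overwrite").format("delta").saveAsTable("staging_orders")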

Common Data Model

At the core of BMT’s data warehouse architecture is the Common Data Model, which serves as the foundation for representing the organisation’s core business processes and common form designs. The Common Data Model supports dimensional modelling, where data is structured into measurement facts and descriptive dimensions, enabling efficient querying and analysis.

Dimensional models, instantiated as star schemas in relational databases or as cubes in OLAP engines, provide a structured framework for organising and accessing data, facilitating reporting and analytics processes.
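
The Spark SQL sketch below shows how such a star schema is typically queried: measurement facts are aggregated across descriptive dimensions. The fact and dimension names are hypothetical examples, not BMT's actual model.

  # Illustrative star-schema query (Spark SQL): aggregate a measurement fact
  # over descriptive dimensions. Table and column names are hypothetical.
  from pyspark.sql import SparkSession

  spark = SparkSession.builder.getOrCreate()

  monthly_sales = spark.sql("""
      SELECT d.calendar_year,
             d.calendar_month,
             c.customer_segment,
             SUM(f.sales_amount) AS total_sales,
             COUNT(*)            AS order_count
      FROM   fact_sales   f                                        -- measurement facts
      JOIN   dim_date     d ON f.date_key = d.date_key             -- descriptive dimension
      JOIN   dim_customer c ON f.customer_key = c.customer_key     -- descriptive dimension
      GROUP  BY d.calendar_year, d.calendar_month, c.customer_segment
      ORDER  BY d.calendar_year, d.calendar_month
  """)
  monthly_sales.show()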

Power Platform

BMT leverages the Power Platform to construct a broad range of business intelligence (BI) applications, empowering users to access, analyse, and visualise data effectively. This includes:

  • Standardised Reports: Creating standardised reports to provide consistent and actionable insights across the organisation.
  • Parameterised Queries: Constructing parameterised queries to enable users to customise data retrieval based on specific criteria or filters (see the sketch after this list).
  • Dashboards and Scorecards: Developing dashboards and scorecards to monitor key performance indicators (KPIs) and track organisational goals and objectives.
  • Analytic Models and Data Mining: Building analytic models and data mining applications to uncover hidden patterns and trends in the data, facilitating data-driven decision-making.
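
As an example of the parameterised-query pattern referenced above, the following Python sketch uses bound parameters so that user-supplied criteria filter the data safely. The connection string, table, and columns are hypothetical placeholders; within Power BI itself the equivalent is usually achieved with query parameters.

  # Illustrative parameterised query (Python + pyodbc): retrieve data filtered
  # by user-supplied criteria without string concatenation. The connection
  # string, table, and columns are hypothetical placeholders.
  import pyodbc

  def fetch_sales_by_region(conn_str: str, region: str, min_amount: float):
      """Return rows for one region above a minimum amount, via bound parameters."""
      query = (
          "SELECT order_id, customer_key, sales_amount "
          "FROM fact_sales "
          "WHERE region = ? AND sales_amount >= ?"
      )
      conn = pyodbc.connect(conn_str)
      try:
          cursor = conn.cursor()
          cursor.execute(query, (region, min_amount))  # values are bound, not concatenated
          return cursor.fetchall()
      finally:
          conn.close()

  # Example usage (placeholder connection string):
  # rows = fetch_sales_by_region("DSN=warehouse_sql_endpoint", "EMEA", 1000.0)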

Integrating the data warehouse or data marts with Power BI involves several key steps:

  • Establishing Connections: Configuring data connectors to extract data from the data warehouse or data marts into Power BI, ensuring seamless data integration and accessibility.
  • Optimising Data Models: Streamlining data models for performance by removing unnecessary data, optimising relationships, and defining appropriate data types and formatting, which enhances the efficiency of Power BI reports and dashboards.
  • Utilising API Connectors: Configuring API connectors in Power BI to authenticate and securely connect to external APIs, enabling data retrieval and integration from diverse data sources.
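
To illustrate the API-connector pattern at the pipeline level, the sketch below authenticates to an external REST endpoint, retrieves JSON records, and stages them for the warehouse. The endpoint URL, token handling, and output location are hypothetical; within Power BI the built-in web connector performs the equivalent role.

  # Illustrative API ingestion sketch: authenticate to an external API, retrieve
  # JSON records, and land them for downstream modelling. The URL, token source,
  # and output path are hypothetical placeholders.
  import os
  import requests
  import pandas as pd

  def pull_exchange_rates(api_url: str, token: str) -> pd.DataFrame:
      """Call a secured REST endpoint and normalise the JSON payload to a table."""
      response = requests.get(
          api_url,
          headers={"Authorization": f"Bearer {token}"},  # token obtained from a secrets store
          timeout=30,
      )
      response.raise_for_status()
      return pd.json_normalize(response.json())

  if __name__ == "__main__":
      token = os.environ["API_TOKEN"]  # hypothetical environment variable
      rates = pull_exchange_rates("https://api.example.com/v1/rates", token)
      rates.to_parquet("landing_exchange_rates.parquet", index=False)  # staged for the warehouse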

By following these principles and leveraging Microsoft Fabric, the Common Data Model, and the Power Platform, BMT ensures a robust and integrated approach to data management, analytics, and reporting, driving informed decision-making and business success.
