Project 03 · SSIS + Power Automate + SQL Server + Power BI

Automated KPI Reporting Pipeline

An end-to-end pipeline replacing 12 hours of manual weekly Excel reporting. SSIS packages extract and transform data from 4 source systems nightly; a Power Automate flow, triggered when the Power BI refresh completes, delivers role-personalised PDF reports by 7 AM every day.

SSIS · SQL Server · T-SQL · Power Automate · Power BI Service · Microsoft Forms · SharePoint
  • 12 hrs saved weekly
  • 0 manual errors
  • 4 source systems
  • 100% on-time delivery

Business Problem

Finance and operations analysts spent 12+ hours every Monday manually pulling data from 4 disconnected systems (CRM, Billing, Claims, HR), joining it in Excel, formatting reports, and emailing individually to 22 stakeholders. The process was error-prone, often delayed by 1–2 days, and produced inconsistent figures due to differing cut-off times.

Leadership needed consistent, automated daily KPI reports by 7 AM, personalised by role, with zero manual effort.

Solution Overview

  • Designed 4 SSIS packages (one per source system) that extract data nightly at 02:00 and load it into a centralised SQL Server staging database
  • Built T-SQL stored procedures for data transformation, KPI calculations, and built-in data quality assertions (row count checks, NULL thresholds, range validations)
  • Created a Power BI semantic model (star schema) with scheduled refresh at 06:00 daily
  • Built a Power Automate flow: triggered on successful Power BI refresh → exports PDF per role → sends personalised email to CEO, VPs, Managers, and Analysts
  • Built a Microsoft Forms portal for ad-hoc report requests; submissions auto-log to a SharePoint list and create a Teams task assigned to the report owner
  • Documented full pipeline architecture and wrote runbook for handover
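
The row-count and NULL checks are shown in the code samples below; the range validations are not, so here is a minimal T-SQL sketch of one. Table and column names (stg.Billing_Today, amount) and the bounds are illustrative assumptions, not the production schema.

```sql
-- Hedged sketch of a range-validation assertion (hypothetical names and bounds).
-- Raising an error fails the pipeline step instead of loading bad data silently.
DECLARE @out_of_range INT;

SELECT @out_of_range = COUNT(*)
FROM   stg.Billing_Today
WHERE  amount < 0              -- negative payments are never valid
   OR  amount > 1000000;       -- above the largest plausible single payment

IF @out_of_range > 0
    THROW 50001, 'Range validation failed: amount outside expected bounds.', 1;
```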

Pipeline Architecture

🗄️ 4 Source Systems (CRM / Billing / Claims / HR)
  → ⚙️ SSIS Packages (nightly extract at 02:00)
  → 🏛️ SQL Server DW (star schema + QA)
  → 📊 Power BI Refresh (semantic model, 06:00)
  → Power Automate (PDF → email by 07:00)

Process Walkthrough

1. SSIS Package Design

Each package uses OLE DB source → Derived Column transforms → Data Quality Script task → OLE DB destination. Packages log row counts and error rows to a pipeline_log table.
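
The pipeline_log table itself is not shown in the source; this is a plausible minimal shape for it, with hypothetical column names chosen to match the run-log sample at the end of this page.

```sql
-- Hypothetical sketch of the pipeline_log table the SSIS packages write to.
CREATE TABLE dbo.pipeline_log (
    log_id      INT IDENTITY(1,1) PRIMARY KEY,
    run_date    DATE        NOT NULL,
    source_name SYSNAME     NOT NULL,              -- e.g. 'Claims', 'Billing'
    rows_loaded INT         NOT NULL,
    error_rows  INT         NOT NULL DEFAULT 0,
    qa_status   VARCHAR(10) NOT NULL,              -- 'PASS' / 'WARN' / 'FAIL'
    duration_ms INT         NULL,
    logged_at   DATETIME2   NOT NULL DEFAULT SYSUTCDATETIME()
);
```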

2. T-SQL Stored Procedures

usp_Transform_KPIs runs after all 4 packages complete; it applies business logic, calculates 15 KPIs, and flags anomalies where today's totals deviate more than 20% from the 30-day average.

3. Power BI Semantic Model

DirectQuery to SQL Server star schema (Fact_KPI + 4 dimension tables). 18 DAX measures. Row-level security: each manager sees only their team's data. Refresh scheduled 06:00.
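
A minimal T-SQL sketch of the star schema the model queries, with only one of the four dimensions shown. The Fact_KPI columns follow the stored procedure below; the dimension columns (dept_name, manager_upn for row-level security) are illustrative assumptions.

```sql
-- Hedged sketch: Fact_KPI plus one dimension (dimension columns are assumed).
CREATE TABLE dbo.Dim_Department (
    dept_id     VARCHAR(10)  PRIMARY KEY,   -- 'OPS', 'FIN', ...
    dept_name   VARCHAR(50)  NOT NULL,
    manager_upn VARCHAR(100) NOT NULL       -- hypothetical: keys row-level security
);

CREATE TABLE dbo.Fact_KPI (
    report_date    DATE          NOT NULL,
    kpi_name       VARCHAR(50)   NOT NULL,
    kpi_value      DECIMAL(18,2) NOT NULL,
    dept_id        VARCHAR(10)   NOT NULL
        REFERENCES dbo.Dim_Department (dept_id),
    prev_day_value DECIMAL(18,2) NULL,
    variance_pct   DECIMAL(9,2)  NULL,
    anomaly_flag   BIT           NOT NULL DEFAULT 0,
    PRIMARY KEY (report_date, kpi_name, dept_id)
);
```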

4. Power Automate Distribution Flow

Trigger: Power BI refresh complete. Actions: loop through recipient list (stored in SharePoint), export role-specific report page as PDF, send personalised email with attachment.

5. Ad-hoc Request Portal

Microsoft Forms collects report title, owner, refresh frequency, and priority. Power Automate creates a SharePoint list item and posts a Teams message to the BI channel.
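
In the same pseudo-logic style as the distribution flow below, the intake flow might look like this; field and list names are assumptions based on the description above.

```
TRIGGER: When a new Microsoft Forms response is submitted
    Form: Ad-hoc Report Request

ACTION 1: Get response details
    // Fields: ReportTitle, Owner, RefreshFrequency, Priority

ACTION 2: Create item in SharePoint list 'AdHocRequests'
    Title     : response[ReportTitle]
    Owner     : response[Owner]
    Frequency : response[RefreshFrequency]
    Priority  : response[Priority]

ACTION 3: Post message to Teams BI channel
    "New report request: @{response[ReportTitle]} (priority: @{response[Priority]})"
```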

Before & After: Efficiency Metrics

(Four comparison charts appear here on the live page:)

  • Weekly Manual Hours – Before vs After
  • Report Delivery Time (Hours After Business Day Start)
  • Data Quality Errors Caught by Pipeline (Weekly)
  • Stakeholder Satisfaction Score (Survey, /10)

Code Samples

-- KPI transformation stored procedure
CREATE PROCEDURE usp_Transform_KPIs
    @run_date DATE = NULL
AS
BEGIN
    SET NOCOUNT ON;
    SET @run_date = ISNULL(@run_date, CAST(GETDATE() AS DATE));

    -- Delete today's KPI rows so re-runs are idempotent
    DELETE FROM dbo.Fact_KPI
    WHERE  report_date = @run_date;

    -- Insert calculated KPIs
    INSERT INTO dbo.Fact_KPI
           (report_date, kpi_name, kpi_value, dept_id, prev_day_value, variance_pct)
    SELECT
        @run_date,
        kpi.kpi_name,
        kpi.kpi_value,
        kpi.dept_id,
        prev.kpi_value                   AS prev_day_value,
        ROUND(
          (kpi.kpi_value - prev.kpi_value)
          / NULLIF(prev.kpi_value, 0) * 100, 2
        )                                AS variance_pct
    FROM  (SELECT  /* Claims KPI */
               'Claims Received' kpi_name,
               COUNT(*)         kpi_value,
               'OPS'            dept_id
           FROM stg.Claims_Today
           UNION ALL
           SELECT  /* Revenue KPI */
               'Premium Collected',
               SUM(amount), 'FIN'
           FROM stg.Billing_Today
           WHERE status = 'PAID'
          ) kpi
    LEFT JOIN dbo.Fact_KPI prev
           ON prev.kpi_name   = kpi.kpi_name
          AND prev.dept_id    = kpi.dept_id
          AND prev.report_date = DATEADD(day,-1,@run_date);

    -- QA: flag anomalies deviating > 20% from the 30-day average
    UPDATE f
    SET    f.anomaly_flag = 1
    FROM   dbo.Fact_KPI f
    CROSS APPLY (
        SELECT AVG(h.kpi_value) AS avg_30d
        FROM   dbo.Fact_KPI h
        WHERE  h.kpi_name    = f.kpi_name
          AND  h.dept_id     = f.dept_id
          AND  h.report_date >= DATEADD(day, -30, @run_date)
          AND  h.report_date <  @run_date
    ) hist
    WHERE  f.report_date = @run_date
      AND  ABS(f.kpi_value - hist.avg_30d)
           / NULLIF(hist.avg_30d, 0) > 0.20;
END;
// SSIS Script Task – Row Count & NULL QA Check
// Runs after each source extract
public void Main()
{
    string table = Dts.Variables["User::StagingTable"].Value.ToString();
    // Connection string supplied via a package variable
    string connStr = Dts.Variables["User::StagingConnString"].Value.ToString();
    int rowCount, nullCount, expectedMin;

    using (var conn = new SqlConnection(connStr))
    {
        conn.Open();
        // Check row count
        rowCount = (int)new SqlCommand(
            $"SELECT COUNT(*) FROM stg.{table}", conn)
            .ExecuteScalar();

        expectedMin = (int)Dts.Variables["User::ExpectedMinRows"].Value;

        if (rowCount < expectedMin)
        {
            Dts.Events.FireError(0, "QA Check",
                $"Row count {rowCount} below minimum {expectedMin}",
                "", 0);
            Dts.TaskResult = (int)ScriptResults.Failure;
            return;
        }

        // Check NULL rate on key column
        nullCount = (int)new SqlCommand(
            $"SELECT COUNT(*) FROM stg.{table} WHERE key_id IS NULL",
            conn).ExecuteScalar();

        if ((nullCount * 1.0 / rowCount) > 0.01) // >1% NULLs
        {
            Dts.Events.FireWarning(0, "QA Check",
                $"NULL rate {(double)nullCount / rowCount:P1} exceeds 1% threshold",
                "", 0);
        }
        }
    }

    // Log to pipeline_log table
    LogPipelineRun(table, rowCount, nullCount);
    Dts.TaskResult = (int)ScriptResults.Success;
}
// Power Automate Flow Pseudo-logic
// Trigger: "When a Power BI dataset refresh completes"
//   Group: KPI_Reports_Workspace
//   Dataset: Daily_Operations_Model

TRIGGER: Power BI Dataset Refresh Succeeded

ACTION 1: Get items from SharePoint list 'ReportRecipients'
  // Columns: Name, Email, Role, ReportPageName

ACTION 2: Apply to each item in ReportRecipients

    ACTION 2a: Export Power BI report as PDF
        WorkspaceId : "KPI_Reports_Workspace"
        ReportId    : "Daily_Ops_Report"
        Pages       : item[ReportPageName]   // role-specific page
        FileFormat  : PDF

    ACTION 2b: Send an email (Outlook)
        To      : item[Email]
        Subject : "📊 Daily KPI Report – @{formatDateTime(utcNow(),'dd MMM yyyy')}"
        Body    : "Hi @{item[Name]}, please find your daily KPI report attached."
        Attach  : PDF from Action 2a

ACTION 3: Condition: any row with anomaly_flag = 1 in dbo.Fact_KPI?
    IF YES: Post message to Teams channel "⚠️ Data anomaly detected in today's KPIs"
    IF NO:  Terminate successfully

Pipeline Run Log Sample (Last 7 Days)

Run Date    | Source        | Rows Loaded | Errors | QA Status | Duration | Report Sent
------------|---------------|-------------|--------|-----------|----------|------------
30 Mar 2026 | All 4 sources | 284,211     | 0      | PASS      | 3m 42s   | 07:04 AM
29 Mar 2026 | All 4 sources | 281,944     | 0      | PASS      | 3m 38s   | 07:02 AM
28 Mar 2026 | All 4 sources | 279,600     | 0      | PASS      | 3m 51s   | 07:05 AM
27 Mar 2026 | All 4 sources | 283,100     | 2      | WARN      | 4m 12s   | 07:08 AM
26 Mar 2026 | All 4 sources | 280,540     | 0      | PASS      | 3m 44s   | 07:03 AM