
    Beyond the Flat Table: Building an Enterprise-Grade Financial Model in Power BI

By Editor Times Featured | January 11, 2026 | 12 Mins Read


We’ve all been there: You open Power BI, drag a messy Excel sheet into the canvas, and start dropping charts until something looks “right.” It’s easy, it’s intuitive, and honestly, that’s why Power BI is one of my favourite tools for data visualisation.

But as the world of data shifts toward end-to-end solutions like Microsoft Fabric, “just making it work” isn’t enough anymore. Large organisations need models that are performant, secure, and scalable.

I’ve decided to challenge myself by taking the PL-300: Microsoft Data Analyst Associate exam. But instead of just grinding through practice tests or memorising definitions, I’m going into “Practical Mode.” If I’m going to get certified, I want to prove I can solve the problems real businesses actually face.

The Project: The Enterprise-Grade Financial Suite

For my first project, I’m tackling the Executive Financial Health Suite.

Why finance? Because in the enterprise world, it’s the ultimate test of your Data Modeling and DAX skills. Most “generic” tutorials use a single, flat table. But in a real company, data is fragmented. You have “Actuals” (what happened) sitting in one place and “Budgets” (the goal) sitting in another, often at different levels of detail.

In this project, I’m going to document how I:

• Deconstruct a “Flat Table” into a clean, professional Star Schema.
• Handle the “Grain” Problem (comparing daily sales vs. monthly budgets).
• Master DAX for high-stakes metrics like Year-to-Date (YTD) and Variance %.

I’m sharing my journey in public so that if you’re also preparing for the PL-300, you can follow along, build these solutions with me, and understand the why behind the architecture, not just the how.
For this project, we’re using the Microsoft Financial Sample. It’s the perfect “blank canvas” because it comes as a flat, “messy” table that we have to re-engineer professionally.

How to get it: In Power BI Desktop, go to Home > Sample Dataset > Load Sample Data. Select the financials table.

Let’s get our hands dirty in Power Query.

Phase 1: Data Transformation (Power Query)

Before touching DAX or visuals, I slowed myself down and spent real time in Power Query. This is the part I used to rush through. Now I treat it as the foundation of everything that follows.
If the data model is shaky, no amount of clever DAX will save you.

Step 1: Data Profiling (a quick reality check)

Once I loaded the Microsoft Financial Sample dataset, the first thing I did was turn on column profiling:

• Column quality
• Column distribution
• Column profile

When I turn on Column quality, distribution, and profile, I’m not trying to be thorough for the sake of it. I’m scanning for model-breaking issues before they turn into DAX headaches.

Column profiling immediately tells you:

• Where nulls are hiding
• Which columns are pretending to be dimensions
• Which fields look numeric but behave like text

1. Nulls & Data Type Mismatches

I think we’re good. Empty values are 0% throughout, valid values are 100%, and errors are 0%. Data types all look right, too. Probably because we’re using the sample financials dataset, there shouldn’t be any issues.
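If nulls or type mismatches had shown up, Power Query can fix each in a single step. Here is a hypothetical sketch of that pattern (the sample data doesn’t need it), using the Discounts column purely as an example:

```m
let
    Source = financials,
    // Hypothetical fix: replace nulls in a numeric column with 0
    NoNulls = Table.ReplaceValue(Source, null, 0, Replacer.ReplaceValue, {"Discounts"}),
    // Force an explicit numeric type so the column can't silently behave like text
    Typed = Table.TransformColumnTypes(NoNulls, {{"Discounts", Currency.Type}})
in
    Typed
```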

2. Cardinality: What Needs to Be a Dimension

Cardinality is simply how many unique values a column has. Power BI surfaces this directly in Column distribution, and once you start paying attention to it, modeling decisions get much easier.

Here’s my rule of thumb:

• Low cardinality (values repeat a lot) → likely a dimension
• High cardinality (values are mostly unique) → fact-level detail

When I turn on Column distribution, I’m asking two questions:

• How many distinct values does this column have?
• Do those values repeat enough to be useful for filtering or grouping?

If a column looks categorical but has thousands of distinct values, that’s a red flag.
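If you’d rather see cardinality for every column at once instead of eyeballing the distribution bars, M’s built-in Table.Profile can do the scan in a blank query. A minimal sketch, assuming the main query is named financials:

```m
let
    // Table.Profile returns one row per column, including a DistinctCount field
    Profile = Table.Profile(financials),
    // Keep just the column name and its cardinality, lowest first
    Cardinality = Table.Sort(
        Table.SelectColumns(Profile, {"Column", "DistinctCount"}),
        {{"DistinctCount", Order.Ascending}}
    )
in
    Cardinality
```

Low counts at the top are your dimension candidates; the big numbers at the bottom belong in the fact table.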

Once I turned on Column distribution, the dataset started sorting itself for me.

Some columns immediately showed low cardinality; they repeated often and behaved like true categories:

• Segment
• Country
• Product
• Discount Band
• Manufacturing Price
• Sale Price
• Date attributes (Year, Month Number, Month Name)

These columns had relatively few distinct values and clear repetition across rows. That’s a strong signal: they want to be used for grouping, slicing, and relationships. In other words, they naturally belong on the dimension side of a star schema.

Then there were the columns at the other end of the spectrum.

Measures like:

• Units Sold
• Gross Sales
• Discounts
• Sales
• COGS
• Profit

…showed very high cardinality. Many values were unique or nearly unique per row, with wide numeric ranges. That’s exactly what I expect from fact-level metrics: they’re meant to be aggregated, not filtered on.

That insight directly informed my next step: using Reference in Power Query to spin off Dim_Product and Dim_Geography, instead of guessing or forcing the structure.

Step 2: Spinning Off Dimensions with Reference (Not Duplicate)

This is the point where I stopped treating the dataset as a report-ready table and started treating it as a model-in-progress.
In Power Query, it’s tempting to right-click a table and hit Duplicate. I used to do that all the time. It works, but it quietly creates problems you only feel later.

Instead, I used Reference.
Why Reference instead of Duplicate, you may ask?

When you create a referenced table:

• It inherits all upstream transformations
• It stays logically tied to the source
• Any fix in the fact table automatically flows downstream

From a real-world perspective, it’s just… safer.

Here’s how I created Dim_Product & Dim_Geography.

Starting from the main financial table:

• I right-clicked the query and selected Reference
• Renamed the new query to Dim_Product
• Kept only product-related columns (Product, Segment, Discount Band)
• Removed duplicates
• Ensured clean data types and naming

What I ended up with was a small, stable table with low cardinality, perfect for slicing and grouping.
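For reference, the steps above condense to just a few lines of M. This is a sketch of roughly what the generated Dim_Product query looks like; note that the Reference step appears in the code as a plain call to the source query:

```m
let
    // "Reference" shows up in M as a direct reference to the source query
    Source = financials,
    // Keep only the product-related columns...
    ProductCols = Table.SelectColumns(Source, {"Product", "Segment", "Discount Band"}),
    // ...and collapse to one row per unique combination
    Dim_Product = Table.Distinct(ProductCols)
in
    Dim_Product
```

The same pattern with {"Country"} alone produces Dim_Geography.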

I repeated the same approach for geography:

• Reference the fact table
• Keep the Country column
• Remove duplicates
• Clean text values

P.S. In this dataset, geography is represented only at the country level. Rather than forcing a region or city hierarchy that doesn’t exist, I modeled Country as a lean, single-column dimension.

Step 3: Create a Dynamic Date Table

Here’s where I see a lot of Power BI models quietly fail PL-300 standards.

• I didn’t import a static calendar.
• I didn’t manually generate dates.
• I built a dynamic date table in Power Query based on the data itself.

Why this matters:

• It ensures no missing dates
• It automatically adjusts when new data arrives
• It aligns perfectly with Microsoft’s modeling best practices

To create a dynamic date table, just click Load -> Blank Query -> Advanced Editor and paste this code in.

Below is the exact M code I used:

let
    Source = financials,
    MinDate = Date.From(List.Min(Source[Date])),
    MaxDate = Date.From(List.Max(Source[Date])),
    DateList = List.Dates(
        MinDate,
        Duration.Days(MaxDate - MinDate) + 1,
        #duration(1, 0, 0, 0)
    ),
    DateTable = Table.FromList(DateList, Splitter.SplitByNothing(), {"Date"}),
    AddYear = Table.AddColumn(DateTable, "Year", each Date.Year([Date]), Int64.Type),
    AddMonthNum = Table.AddColumn(AddYear, "Month Number", each Date.Month([Date]), Int64.Type),
    AddMonthName = Table.AddColumn(AddMonthNum, "Month Name", each Date.MonthName([Date]), type text),
    AddQuarter = Table.AddColumn(AddMonthName, "Quarter", each "Q" & Number.ToText(Date.QuarterOfYear([Date])), type text),
    AddDay = Table.AddColumn(AddQuarter, "Day", each Date.Day([Date]), Int64.Type)
in
    AddDay

This calendar:

• Covers every date in the dataset
• Scales automatically
• Is ready for time intelligence the moment it hits the model

Once loaded, I marked it as a Date Table in the Model view; that part is non-negotiable for PL-300.
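Worth noting: the same calendar can also be built as a DAX calculated table with CALENDARAUTO, which many PL-300 prep materials cover. A rough equivalent of the Power Query calendar above (the column names here are mine):

```dax
Dim_Date =
ADDCOLUMNS (
    CALENDARAUTO (),  -- spans all dates present anywhere in the model
    "Year", YEAR ( [Date] ),
    "Month Number", MONTH ( [Date] ),
    "Month Name", FORMAT ( [Date], "MMMM" ),
    "Quarter", "Q" & QUARTER ( [Date] ),
    "Day", DAY ( [Date] )
)
```

I stuck with the Power Query version because it keeps all the transformation logic in one place, but either approach satisfies the “dynamic, gap-free” requirement.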

By the end of Phase 1, I had:

• A clean fact table
• Proper dimension tables created via Reference
• A dynamic, gap-free date table
• Transformations I could actually explain to another analyst

Nothing flashy yet, but this is the phase that makes everything after it easier, faster, and more reliable.

In the next phase, I’ll move into data modeling and relationships, where this structure really starts paying dividends.

Phase 2: Data Modeling (From Tables to a Star Schema)

This is the phase where Power BI starts behaving like a semantic model.

By the time I switched to the Model view, I already had:

• A clean fact table
• Lean dimensions created via Reference
• A dynamic, gap-free date table

Now the goal was simple: connect everything cleanly and deliberately.

Step 1: Establishing the Star Schema

I aimed for a classic star schema:

• One central fact table (financial metrics)
• Surrounding dimension tables (Dim_Date, Dim_Product, Dim_Geography)

Every relationship needed to answer three questions:

• Which table is the “one” side?
• Which table is the “many” side?
• Does this relationship make sense at the grain of the data?

You may notice that I didn’t introduce surrogate keys for the fact or dimension tables. In this dataset, the natural keys (Country, Product, and Date) are stable, low-cardinality, and unambiguous. For this model, adding artificial IDs would increase complexity without improving clarity or performance.

Here’s how the overall model looks:

Step 2: Relationship Direction (Single, on Purpose)

All relationships were set to:

• Many-to-one
• Single direction, flowing from dimension → fact

For PL-300 and real-world models alike, single-direction filters are the default unless there’s a strong reason to deviate.

Step 3: The Date Table as the Anchor

The dynamic date table I created earlier now became the backbone of the model.

I:

• Related Dim_Date[Date] to the fact table’s date column
• Marked Dim_Date as the official Date Table
• Hid the raw date column in the fact table

This does three important things:

• Enables time intelligence
• Prevents accidental use of the wrong date field
• Forces consistency across measures

From here on out, every time-based calculation flows through this table; no exceptions.

Step 4: Hiding What Users Don’t Need

This is a small step with an outsized impact. PL-300 explicitly tests the idea that the model shouldn’t just be correct; it should be usable.

I hid:

• Foreign keys (Date, Product, Country). If a column exists only to create relationships, it doesn’t need to appear in the Fields pane.
• Raw numeric columns that should only be used in measures. After creating my DAX measures (e.g. Total Sales, Total Profit), I can go ahead and hide the raw numeric columns (like Units Sold, Gross Sales, Discounts, Sales, COGS, Profit) in my fact table. This nudges users toward correct and consistent aggregations.
• Duplicate date attributes in the fact table (Year, Month, Month Number). These already exist in the date table.

Step 5: Validating the Model (Before Writing DAX)

Before touching any measures, I did a quick sanity check:

• Does slicing by Country behave correctly?
• Do Product and Segment filter as expected?
• Do dates aggregate cleanly by Year and Month?

If something breaks here, it’s a modeling issue, not a DAX issue.

To test this, I created a quick visual checking the Sum of Profit by Year. Here’s how it turned out:

So far so good! Now we can move on to creating our DAX measures.

Phase 3: DAX Measures & Variance Analysis (Where the Model Starts to Shine)

This is the phase where the work I’d done in Power Query and the model really started paying off. Honestly, it’s the first time in a while that writing DAX didn’t feel like fighting the table. The star schema made everything… predictable.

Step 1: Base Measures (the foundation of sanity)

I resisted my old instinct to drag raw columns into visuals. Instead, I created explicit measures for everything I cared about:

Total Sales :=
SUM ( financials[Sales] )

Total Profit :=
SUM ( financials[Profit] )

Total Units Sold :=
SUM ( financials[Units Sold] )

Total COGS :=
SUM ( financials[COGS] )

Step 2: Time intelligence without surprises

Because I already had a complete, properly marked date table, things like year-to-date or prior-year comparisons were easy.

Sales Year-to-Date

Sales YTD :=
TOTALYTD (
    [Total Sales],
    'Dim_Date'[Date]
)

Sales Prior Year

Sales PY :=
CALCULATE (
    [Total Sales],
    SAMEPERIODLASTYEAR ( 'Dim_Date'[Date] )
)

Step 3: Variance Measures (turning numbers into insight)

Once I had Actual vs Prior Period, I could calculate variance with almost no extra effort:

Sales YoY Variance :=
[Total Sales] - [Sales PY]

Sales YoY % :=
DIVIDE (
    [Sales YoY Variance],
    [Sales PY]
)

Same approach for month-over-month:

Sales PM :=
CALCULATE (
    [Total Sales],
    DATEADD ( 'Dim_Date'[Date], -1, MONTH )
)

Sales MoM Variance :=
[Total Sales] - [Sales PM]

Sales MoM % :=
DIVIDE (
    [Sales MoM Variance],
    [Sales PM]
)

Step 4: Why this actually feels “easy”

Here’s the honest part: writing DAX didn’t feel like the hardest thing. The hard part was everything that came before:

• Cleaning the data
• Profiling columns
• Spinning out dimensions with Reference
• Building a solid date table

By the time I got here, the DAX was just adding value instead of patching holes.

Good DAX isn’t clever; it’s predictable, trustworthy, and easy to explain.

    Conclusion

The magic wasn’t in any single DAX formula; it was in how the model came together. By profiling the data early, understanding cardinality, and spinning off dimensions with Reference, I built a structure that just works. A dynamic date table and clean relationships meant time intelligence measures and variance calculations ran effortlessly.

Hiding unnecessary fields and grouping measures thoughtfully made the model approachable, even for someone else exploring it for the first time. By the time I wrote the DAX for Actual vs. Prior Period or Month-over-Month variance, everything felt predictable and trustworthy.

If you want to see the full semantic model in action, including all the tables, relationships, and measures, you can download it here and explore how it ties together. There’s no better way to learn than seeing a working model in Power BI and experimenting with it yourself.

Want to connect? Feel free to say hi on any of the platforms below:

    Medium

    LinkedIn

    Twitter

    YouTube



