Should I use a star schema or a denormalized table for a sales dashboard?
#1
I’m trying to decide between a star schema and a denormalized flat table for our new sales dashboard, and I keep going in circles. My team wants the speed of a single table for all their metrics and dimensions, but I’m worried about maintaining data integrity and handling future changes. Has anyone else faced this kind of modeling trade-off for a reporting layer?
#2
We've wrestled with this too. We started with a denormalized flat table for our sales dashboard. It looked fast for the initial dashboard, but as we added campaigns, regions, and changes of time grain, the numbers began to drift and our ETL grew brittle. We eventually moved to a slim core of fact tables and conformed dimensions, plus a couple of materialized views for the most-used reports. It slowed the maintenance cycle but steadied the numbers.
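To make that concrete, here's a minimal sketch of the "slim facts + conformed dimensions + materialized view" layout, using SQLite purely for illustration. All table and column names (`dim_product`, `fact_sales`, `mv_monthly_category`) are invented, and since SQLite has no materialized views, the rollup is just a table the ETL job would rebuild:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Conformed dimension: shared by every fact table that mentions a product.
cur.execute("""
    CREATE TABLE dim_product (
        product_id   INTEGER PRIMARY KEY,
        product_name TEXT,
        category     TEXT
    )""")

# Slim fact table: foreign keys plus additive measures only.
cur.execute("""
    CREATE TABLE fact_sales (
        sale_date  TEXT,
        product_id INTEGER REFERENCES dim_product(product_id),
        region     TEXT,
        revenue    REAL
    )""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [("2024-01-05", 1, "EU", 100.0),
                 ("2024-01-06", 2, "EU", 250.0),
                 ("2024-02-01", 1, "US", 75.0)])

# "Materialized view" for the most-used report: a table rebuilt by ETL,
# so dashboard tiles never re-run the join themselves.
cur.execute("""
    CREATE TABLE mv_monthly_category AS
    SELECT substr(sale_date, 1, 7) AS month,
           p.category,
           SUM(f.revenue) AS revenue
    FROM fact_sales f
    JOIN dim_product p USING (product_id)
    GROUP BY month, p.category
""")

print(cur.execute(
    "SELECT month, category, revenue FROM mv_monthly_category ORDER BY month"
).fetchall())
# → [('2024-01', 'Hardware', 350.0), ('2024-02', 'Hardware', 75.0)]
```

The dashboard reads only `mv_monthly_category`, while reconciliation and ad-hoc questions go back to the facts and dimensions, which is what kept our numbers steady.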
#3
Another angle is to keep one table but lock down governance hard and build automated checks. We saw performance bleed when people drilled into year-over-year comparisons across many products; the solution was to precompute aggregations and serve the dashboard from those, while still treating the single table as the source of truth. It felt like a patch rather than a fix, and it required careful TTLs on the caches.
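A minimal sketch of that pattern, again using SQLite for illustration; `wide_sales` and `agg_yearly` are hypothetical names. The point is the reconciliation check: the precomputed rollup must still sum to the single source of truth, or the cached aggregate is stale:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# The one wide, governed table (hypothetical columns).
cur.execute("""
    CREATE TABLE wide_sales (
        sale_year INTEGER, product TEXT, region TEXT, revenue REAL
    )""")
cur.executemany("INSERT INTO wide_sales VALUES (?, ?, ?, ?)",
                [(2023, "Widget", "EU", 100.0),
                 (2023, "Gadget", "US", 200.0),
                 (2024, "Widget", "EU", 150.0)])

# Precompute the year-over-year rollup the dashboard drills into,
# instead of scanning the wide table once per tile.
cur.execute("""
    CREATE TABLE agg_yearly AS
    SELECT sale_year, SUM(revenue) AS revenue
    FROM wide_sales
    GROUP BY sale_year
""")

# Automated reconciliation check run after every rebuild: the rollup
# and the base table must agree on the grand total.
(base_total,) = cur.execute("SELECT SUM(revenue) FROM wide_sales").fetchone()
(agg_total,) = cur.execute("SELECT SUM(revenue) FROM agg_yearly").fetchone()
assert base_total == agg_total, "stale aggregate: rebuild agg_yearly"
```

In practice this check ran on a schedule alongside the TTL expiry, so a stale rollup failed loudly instead of quietly drifting.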
#4
I tried to argue for a pure approach, but honestly the problem isn't always the model; it's the queries and the BI tool. Whether each dashboard tile issues its own small select or everything funnels through one big join can change performance a lot. Test with real users and watch where the bottlenecks actually are before swapping models completely.
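One cheap way to start that measurement is to time each tile's query individually before blaming the model. A minimal sketch, where the schema and the tile queries are placeholders for whatever your dashboard actually runs:

```python
import sqlite3
import time
from contextlib import contextmanager

@contextmanager
def timed(label, log):
    """Record how long the wrapped tile query took, in seconds."""
    start = time.perf_counter()
    yield
    log.append((label, time.perf_counter() - start))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EU", 100.0), ("US", 200.0)])

log = []
# One entry per dashboard tile: compare "small select per tile"
# against routing everything through a single big join.
with timed("tile:total_revenue", log):
    conn.execute("SELECT SUM(revenue) FROM sales").fetchone()
with timed("tile:by_region", log):
    conn.execute("SELECT region, SUM(revenue) FROM sales "
                 "GROUP BY region").fetchall()

for label, seconds in log:
    print(f"{label}: {seconds * 1000:.2f} ms")
```

Run the same harness against both candidate models with production-sized data and real tile queries; the per-tile numbers tell you whether the model is actually the bottleneck.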
#5
Quick question: is real-time freshness the priority, or are stable, reconciled numbers more important to your stakeholders?