Last post Aug 30, 2018 06:45 AM by Nan Yu
Aug 29, 2018 08:07 PM|KeepLearning|LINK
As of now in SQL Server, we have data in a series of normalized tables connected to a C# web app (a Human Resources / Recruiting system, currently medium-sized) that performs create, read, update, and delete operations.
In the future (in about two years), we are planning to turn this database into a data warehouse. However, in data warehousing, normalization is not strictly needed.
My senior colleague (who is not a programmer, but works more on business intelligence) suggests that we denormalize all the tables in the same database (following Kimball's data warehouse approach), connect that to the C# web app, and, if the tests succeed, move the data into Azure.
So my main question: is this a good approach, or is there a better way — for example, extracting the data from the normalized tables and loading it into a new database with denormalized tables?
Please excuse me; I work mostly in software development, usually with an ORM, and not much in business intelligence.
Thank you all.
Aug 30, 2018 06:45 AM|Nan Yu|LINK
Hi KeepLearning ,
Denormalization is one way to improve query performance in certain situations. We usually only denormalize data that will not change frequently, because any time you denormalize, you have to write triggers or some other process to make sure the data stays in sync, which slows down inserts, updates, and deletes, especially if you need to process large batches of data. So we avoid denormalizing unless we hit a performance problem, and performance testing is always needed during the conversion process.
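To make the sync cost concrete, here is a minimal sketch of the kind of trigger that keeping a duplicated column consistent requires. It uses SQLite via Python so it runs anywhere; the `Department`/`EmployeeDenorm` tables and column names are hypothetical, not from the poster's schema, and the same idea would be written in T-SQL on SQL Server:

```python
import sqlite3

# Hypothetical mini-schema: a normalized Department table plus a
# denormalized EmployeeDenorm table that duplicates the department name.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Department (DeptId INTEGER PRIMARY KEY, Name TEXT);
CREATE TABLE EmployeeDenorm (
    EmpId    INTEGER PRIMARY KEY,
    EmpName  TEXT,
    DeptId   INTEGER,
    DeptName TEXT  -- duplicated column; must be kept in sync
);
-- Trigger that propagates department renames into the duplicated column.
-- This is the extra write cost denormalization imposes on every update.
CREATE TRIGGER trg_dept_rename AFTER UPDATE OF Name ON Department
BEGIN
    UPDATE EmployeeDenorm SET DeptName = NEW.Name WHERE DeptId = NEW.DeptId;
END;
""")
conn.execute("INSERT INTO Department VALUES (1, 'Recruiting')")
conn.execute("INSERT INTO EmployeeDenorm VALUES (10, 'Alice', 1, 'Recruiting')")

# A single rename now fans out into extra writes on the denormalized table.
conn.execute("UPDATE Department SET Name = 'Talent' WHERE DeptId = 1")
print(conn.execute(
    "SELECT DeptName FROM EmployeeDenorm WHERE EmpId = 10").fetchone()[0])
# -> Talent
```

The point is that every duplicated column needs a trigger (or equivalent process) like this, and each one adds write overhead to the OLTP workload.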
But for data warehouses, the key efficiency needs are querying and data retrieval. The data is generally loaded automatically on a schedule rather than one record at a time by a user, so denormalization is considered the recommended approach in this scenario.
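This is essentially the alternative the question mentions: keep the OLTP schema normalized and run a scheduled ETL step that loads a denormalized copy into a separate reporting database. A minimal sketch, again using SQLite through Python with hypothetical `Candidate`/`Application` tables standing in for the recruiting schema:

```python
import sqlite3

# Source (OLTP) database: stays normalized, serves the C# web app.
oltp = sqlite3.connect(":memory:")
oltp.executescript("""
CREATE TABLE Candidate (CandId INTEGER PRIMARY KEY, Name TEXT);
CREATE TABLE Application (
    AppId INTEGER PRIMARY KEY, CandId INTEGER,
    Position TEXT, AppliedOn TEXT
);
INSERT INTO Candidate VALUES (1, 'Alice'), (2, 'Bob');
INSERT INTO Application VALUES
    (100, 1, 'Developer', '2018-08-01'),
    (101, 2, 'Analyst',   '2018-08-15');
""")

# Target (warehouse) database: denormalized, optimized for reads.
dw = sqlite3.connect(":memory:")
dw.execute("""
CREATE TABLE FactApplication (
    AppId         INTEGER PRIMARY KEY,
    CandidateName TEXT,   -- denormalized: joined in at load time
    Position      TEXT,
    AppliedOn     TEXT
)""")

# Scheduled ETL step: pay for the join once at load time,
# instead of on every report query.
rows = oltp.execute("""
    SELECT a.AppId, c.Name, a.Position, a.AppliedOn
    FROM Application a JOIN Candidate c ON c.CandId = a.CandId
""").fetchall()
dw.executemany("INSERT INTO FactApplication VALUES (?, ?, ?, ?)", rows)
dw.commit()

for row in dw.execute(
        "SELECT CandidateName, Position FROM FactApplication ORDER BY AppId"):
    print(row)
# -> ('Alice', 'Developer')
#    ('Bob', 'Analyst')
```

Because the warehouse copy is rebuilt or incrementally loaded on a schedule, no triggers are needed on the OLTP side, which is why this separation is the usual practice.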