Last post Jan 05, 2011 06:52 AM by Deleo
Dec 29, 2010 11:32 AM|yani|LINK
What is the approach? Calculation: BL vs. DB?
Using ASP.NET 3.5 / SQL Server 2005.
What is the right approach? What should be calculated in the database and what should be implemented in the business layer?
If I calculate in the DB I will have fewer round trips and use fewer resources on the hosting server side, but it is less flexible on the programming side..
When should I use object-oriented programming and when should I implement the calculation in the database, when what I am looking for first of all is performance?
Dec 29, 2010 12:38 PM|N_EvilScott|LINK
I think the most common approach is: if it's really simple stuff like summing or computing the product of things, then you can run it against the DB. I personally just put it in my LINQ queries and use the .Sum() method etc..
If you require more advanced calculations based on MANY factors, then it's better to do it in code. Don't be too afraid of connections and round trips to the database as far as this goes. SQL Server is built to be a BEAST that can handle thousands or
TENS of thousands of simultaneous connections :P
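A minimal sketch of the .Sum() approach mentioned above, using an in-memory stand-in (the prices and quantities are illustrative); against a real LINQ to SQL table, the same .Sum() call would be translated into a single `SELECT SUM(...)` in T-SQL, so only one value crosses the wire:

```csharp
using System;
using System.Linq;

class SumDemo
{
    static void Main()
    {
        // In-memory stand-in for a line-items table.
        var lineItems = new[]
        {
            new { Price = 10.0m, Quantity = 2 },
            new { Price = 2.5m,  Quantity = 4 },
        };

        // Simple aggregation: fine to push to the DB, or do inline like this.
        decimal total = lineItems.Sum(l => l.Price * l.Quantity);
        Console.WriteLine(total); // 30.0
    }
}
```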
Dec 29, 2010 01:17 PM|yani|LINK
What if you have to pull a large DB recordset (from a few tables, say in two round trips) to do that "deep calculation" in the BL, versus just sending an array/string once to a stored procedure? Would you still go for the BL?
Dec 30, 2010 11:06 AM|doyleits|LINK
While SQL Server is capable of handling lots of connections and reads/writes, your application will suffer as the datasets grow. For example, an order tracking system that has only a few orders performs great, even if you pull all orders and then,
for each order, pull the count and sum of its line items. However, the application will slow over time because the cost of retrieving all that data grows with the data itself. Sure, if your company can afford to fix this by throwing beefy hardware at the problem, great, but that
is not ideal or cost-efficient.
I just revamped a client's architecture that was strictly OO, where each object filled itself - large datasets would sometimes take minutes to populate, with each object retrieving its own data from the database. We modified the application to assemble the
horizontal details within the database, so the objects are filled in a single call. Databases are simply designed to do this more efficiently than the business layer, so the result was snappy performance even under heavy data loads.
If you can get away with calculating as much as possible in the database, I recommend doing that. You can still return the details that were used in the calculation; you hand both to the BL and hopefully reduce the application workload.
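The single-call shape described above can be sketched in memory with a grouped join (the table and column names are illustrative; the DB equivalent would be one `JOIN ... GROUP BY` query or a stored procedure, instead of one query per order - the N+1 pattern):

```csharp
using System;
using System.Linq;

class OrderSummaryDemo
{
    static void Main()
    {
        var orders = new[] { new { Id = 1 }, new { Id = 2 } };
        var lines = new[]
        {
            new { OrderId = 1, Amount = 5.0m },
            new { OrderId = 1, Amount = 7.5m },
            new { OrderId = 2, Amount = 3.0m },
        };

        // One pass that returns each order together with its line count
        // and total, rather than re-querying per order.
        var summaries = orders
            .GroupJoin(lines, o => o.Id, l => l.OrderId,
                       (o, ls) => new { o.Id, Count = ls.Count(), Total = ls.Sum(l => l.Amount) })
            .ToList();

        foreach (var s in summaries)
            Console.WriteLine("{0}: {1} lines, {2}", s.Id, s.Count, s.Total);
    }
}
```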
Dec 30, 2010 12:49 PM|N_EvilScott|LINK
I agree exactly with what was said! But I also have to say... I haven't seen anyone work with DataSets (XSD files) directly in several years. Since the release of LINQ to SQL and LINQ to Entities I rarely see businesses using DataSets, and even when they
have an old program using them, they tend to move away from them by upgrading certain layers of the program.
Also keep in mind there is a large discrepancy between the time it takes to Fill a DataSet and the time it takes to return a table using LINQ. What takes minutes with a DataSet can take only seconds in your code-behind using LINQ.
Jan 01, 2011 02:48 PM|yani|LINK
But LINQ to SQL… you end up with so much code there, everything is static, and it still makes numerous dynamic SQL round trips (or stored procedure calls), so in the end... why not have a tool that generates strongly typed DB access in the data layer? (Well, that's what I do..)
Jan 01, 2011 03:16 PM|N_EvilScott|LINK
You think LINQ is a lot of code? :P It's just static classes; try opening up a simple XSD file one time and look at all that stuff.
Bottom line is, use what works for you.
Jan 03, 2011 10:30 AM|Deleo|LINK
My thoughts on SQL functions vs. OO:
I think you should use .NET code to call the DB for business logic objects, and use SQL functions internally in the database to perhaps complement the .NET code... if that makes any sense..
.NET pseudo code: var res = SqlConnector.GetMeInsurances(insurance => insurance.Cost < 1000 && insurance.Weight > 1.00);
Now, to be able to return insurances the way the business layer needs them, you might need SQL functions to check various values, create views, etc...
The business layer or connection layer expects a nice table filled with relevant data that can be mapped to BOs (business objects), and how that table is created might be up to the SQL server, with internal functions and stored procedures.
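The pseudo code above could be fleshed out roughly like this; `SqlConnector`, `GetMeInsurances`, and the `Insurance` type are all hypothetical names from the sketch, and the in-memory list stands in for a table or view (with a real `IQueryable` source, the predicate would be translated into SQL rather than run in memory):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical business object; in practice this would be an entity
// mapped to a table or view on the SQL server.
class Insurance
{
    public decimal Cost { get; set; }
    public double Weight { get; set; }
}

static class SqlConnector
{
    // In-memory stand-in for the "nice table filled with relevant data".
    static readonly List<Insurance> _table = new List<Insurance>
    {
        new Insurance { Cost = 500m,  Weight = 2.0 },
        new Insurance { Cost = 1500m, Weight = 2.0 },
    };

    // The BL passes a predicate; the connection layer decides how to
    // satisfy it (here: LINQ over memory; in reality: SQL).
    public static List<Insurance> GetMeInsurances(Func<Insurance, bool> filter)
    {
        return _table.Where(filter).ToList();
    }
}

class Demo
{
    static void Main()
    {
        var res = SqlConnector.GetMeInsurances(i => i.Cost < 1000m && i.Weight > 1.00);
        Console.WriteLine(res.Count); // 1
    }
}
```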
LINQ has its overhead and should be carefully monitored in a scenario that is speed reliant. That goes for both LINQ to SQL and plain vanilla LINQ. There have been several tests concluding that enumerating over a very large set of data is faster with a for statement
than with LINQ operators such as .Where(), etc...
The same goes for foreach iteration, which carries overhead when it casts the enumerated object for convenience purposes :) Nothing beats the real McCoy: the for loop :p
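A quick way to check this kind of claim yourself is a Stopwatch micro-benchmark; actual timings depend heavily on hardware, runtime, and JIT warm-up, so treat the numbers as indicative only:

```csharp
using System;
using System.Diagnostics;
using System.Linq;

class IterationDemo
{
    static void Main()
    {
        var data = Enumerable.Range(0, 1000000).ToArray();

        // Plain for loop over the array.
        var sw = Stopwatch.StartNew();
        long forSum = 0;
        for (int i = 0; i < data.Length; i++) forSum += data[i];
        sw.Stop();
        Console.WriteLine("for:  {0} ms", sw.ElapsedMilliseconds);

        // Same aggregation through LINQ's enumerator-based pipeline.
        sw.Restart();
        long linqSum = data.Sum(x => (long)x);
        sw.Stop();
        Console.WriteLine("LINQ: {0} ms", sw.ElapsedMilliseconds);

        // Both produce the same result; only the overhead differs.
        Console.WriteLine(forSum == linqSum); // True
    }
}
```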
Jan 04, 2011 05:20 AM|Saravanan M|LINK
The CLR integration feature is the best approach for this scenario: it lets database application developers and architects write user-defined procedures, functions, and triggers, as well as define new types and aggregates, in .NET.
Jan 04, 2011 08:25 AM|Deleo|LINK
Wow, thanks. I didn't know that existed.
Now I have something to try out. Again, thanks :)
Jan 04, 2011 10:56 AM|doyleits|LINK
While SQL CLR integration is a nice concept, I wouldn't say it is the best approach for the scenario described. You have to decide where the most efficient processing will be performed... within the SQL Server engine, application code, etc.
CLR integration works well for custom code/logic that does not change frequently, as the deployment is rather involved. Automating that deployment is not very simple. Business logic that may change frequently probably shouldn't be in a SQL CLR assembly.
Jan 04, 2011 03:12 PM|yani|LINK
Correct me if I am wrong, but for better performance you would: pull all the data from the DB, then calculate in a bigger BL, and then present just a small part of the pulled data?? And just for the comfort of the BL?? Well…
I don't know yet how this (~ cool SQL CLR integration) does regarding performance, but it's worth checking out.
Jan 05, 2011 06:52 AM|Deleo|LINK
I read up on CLR integration, and I must say I am amazed by the .NET team. CLR integration allows you to create SQL types and methods in .NET and have them hosted inside SQL Server with very little overhead. There is some overhead, though, so
CLR integration should not be used for simple Transact-SQL data fetching; but in many other cases CLR integration will outperform procedural T-SQL.
You can mix CLR in two ways: either call into SQL directly and do most of the logic in the .NET environment, or call .NET code from your SQL Server directly inside your SELECT statements. Where CLR integration really shines is with string manipulation,
or data manipulation in general. There was an example on MSDN where they created a SQL type with CLR integration that had a method which ran a RegEx on input parameters, and they called that method inside an INSERT statement in SQL to validate the data. In
the article they said that by using CLR integration instead of SQL procedure calls, the performance increased significantly.
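A sketch of that RegEx idea as a CLR scalar function (names and patterns here are illustrative, not the MSDN article's code); the assembly must be compiled against `Microsoft.SqlServer.Server` and deployed into the database with `CREATE ASSEMBLY` / `CREATE FUNCTION` before it can be called from T-SQL:

```csharp
using System.Data.SqlTypes;
using System.Text.RegularExpressions;
using Microsoft.SqlServer.Server;

public static class RegexFunctions
{
    // After deployment, callable from T-SQL, e.g. inside an INSERT's
    // validation or a CHECK constraint:
    //   SELECT dbo.RegexIsMatch(N'foo@bar.com', N'^\S+@\S+\.\S+$');
    [SqlFunction(IsDeterministic = true, IsPrecise = true)]
    public static SqlBoolean RegexIsMatch(SqlString input, SqlString pattern)
    {
        if (input.IsNull || pattern.IsNull)
            return SqlBoolean.Null;
        return Regex.IsMatch(input.Value, pattern.Value);
    }
}
```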
Another good thing about CLR integration is that your code runs under the protection of the garbage collector, and SQL Server can easily monitor its resources and handle errors.
I am going to try out CLR integration in a new project and check whether it truly is as good as advertised. If it is, I will use it whenever I have to do some work on data inside SQL.