Jan 25, 2012 08:37 PM
One suggestion I have is based on the total amount of data that could be used for any or all of the 20 reports. If it is not too much overhead to bring back all possible data used by the reports (again, this is under the premise that the data is not absolutely huge), cache it (if it is static enough) and then use LINQ to Objects to query the data.
So let's say all the data is 1000 rows and multiple reports need to be run off of it. Do you want 20 If statements, 20 different stored procedures, dynamic SQL, etc.? Using LINQ you could pull back all the needed data once and then query and filter it in .NET to generate each report. I use this method often, as it can drastically cut down on the number of custom stored procs or dynamically built queries needed whenever pulling back all the data is a viable option.
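A minimal sketch of that approach, assuming a hypothetical Order type and a single data-access call (the cache field here stands in for HttpRuntime.Cache or similar):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class Order
{
    public int Id { get; set; }
    public string Region { get; set; }
    public decimal Total { get; set; }
    public DateTime Placed { get; set; }
}

public static class ReportService
{
    // Stand-in for a real cache; populated once, reused by every report.
    private static List<Order> _allOrders;

    public static List<Order> GetAllOrders()
    {
        if (_allOrders == null)
        {
            // One query (or stored proc) brings back everything the reports need.
            _allOrders = LoadFromDatabase();
        }
        return _allOrders;
    }

    // Report 1: totals by region -- no extra stored proc required.
    public static IEnumerable<IGrouping<string, Order>> SalesByRegion()
    {
        return GetAllOrders().GroupBy(o => o.Region);
    }

    // Report 2: large orders this month -- just another in-memory query.
    public static IEnumerable<Order> LargeOrdersThisMonth(decimal threshold)
    {
        var start = new DateTime(DateTime.Today.Year, DateTime.Today.Month, 1);
        return GetAllOrders().Where(o => o.Total >= threshold && o.Placed >= start);
    }

    private static List<Order> LoadFromDatabase()
    {
        // Placeholder for the single data-access call.
        return new List<Order>();
    }
}
```

Each additional report is just another LINQ to Objects query against the cached list, rather than another stored procedure.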
However, if the total of all data is millions of records and trying to bring it back would exhaust memory in IIS, you will need another solution (or a hybrid one; I would still use LINQ to Objects to filter for small variations within larger sets of data that can serve a variety of needs). In that case you may need to look into building expression trees to construct dynamic LINQ queries. Have a look at the following for more information:
Dynamic LINQ (Part 1: Using the LINQ Dynamic Query Library):
Dynamic Query Sample:
How to: Use Expression Trees to Build Dynamic Queries (C# and Visual Basic):
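To give a flavor of the expression-tree route those links cover, here is a minimal sketch that builds a Where predicate at runtime from a property name and value, as they might arrive from a report-filter UI (the Order type and method names are hypothetical):

```csharp
using System;
using System.Linq;
using System.Linq.Expressions;

public static class DynamicFilter
{
    // Builds o => o.<propertyName> == value and applies it to the query,
    // so one method can service many report filters.
    public static IQueryable<T> WhereEquals<T>(
        IQueryable<T> source, string propertyName, object value)
    {
        var parameter = Expression.Parameter(typeof(T), "o");
        var property = Expression.Property(parameter, propertyName);
        var constant = Expression.Constant(value, property.Type);
        var body = Expression.Equal(property, constant);
        var lambda = Expression.Lambda<Func<T, bool>>(body, parameter);
        return source.Where(lambda);
    }
}

// Usage: DynamicFilter.WhereEquals(orders, "Region", "West")
```

Because this produces an IQueryable, the filter can be translated by the provider (e.g. LINQ to SQL or Entity Framework) and executed on the database rather than in memory.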
Hope this helps!