Last post Jan 25, 2012 09:37 PM by atconway
Jan 25, 2012 12:42 PM|geezer|LINK
I have an ASP.NET page that builds a report. The user can filter the report's output by choosing different options from multiple dropdown lists and checkboxes; they can choose one of these options, some of them, or all of them - there are 20+ in total.
I want to know the best way to code this. Do I create 20+ IF statements, each block containing a different SqlCommand based on the combination the user chose? I know a CASE statement would perhaps be better, but I just wanted to know if there
was a better way...
Any help or advice would be appreciated.
Jan 25, 2012 12:47 PM|Ashutosh Shukla|LINK
It's up to you which way you handle this problem. From my point of view, use a Select Case statement,
because it branches according to the condition.
Jan 25, 2012 12:56 PM|oblomov86|LINK
Why don't you take the variant with a single 'WHERE' clause? Use nullable parameters and let the query ignore any filter the user left unset:
where (@is_cool is null or is_cool = @is_cool)
and (@is_fast is null or is_fast = @is_fast)
and (@type_id is null or type_id = @type_id)
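On the ASP.NET side that pattern means one SqlCommand covering every filter combination, with each unset filter passed as NULL. A minimal sketch (the connection, table, and column names here are illustrative, not from the original post):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

public static class ReportQuery
{
    // One parameterised command replaces 20+ IF blocks: a null C# value
    // becomes DBNull, and the WHERE clause then skips that filter entirely.
    public static SqlCommand BuildReportCommand(SqlConnection conn,
                                                int? typeId, bool? isCool)
    {
        var cmd = new SqlCommand(
            @"SELECT * FROM Reports
              WHERE (@type_id IS NULL OR type_id = @type_id)
                AND (@is_cool IS NULL OR is_cool = @is_cool)", conn);

        cmd.Parameters.Add("@type_id", SqlDbType.Int).Value =
            typeId.HasValue ? (object)typeId.Value : DBNull.Value;
        cmd.Parameters.Add("@is_cool", SqlDbType.Bit).Value =
            isCool.HasValue ? (object)isCool.Value : DBNull.Value;

        return cmd;
    }
}
```

Each dropdown or checkbox maps to one nullable parameter, so adding a 21st filter means one new parameter and one new line in the WHERE clause, not a new branch.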
Jan 25, 2012 01:48 PM|geezer|LINK
oblomov86 - That's what I am doing at the moment, i.e. many different SqlCommands with parameterised variables inside many IF statements. I just wondered if there was any other way to simplify the whole procedure. I think I'll go down the Select Case road...
Thanks for your help
Jan 25, 2012 09:37 PM|atconway|LINK
One suggestion I have depends on the total amount of data that could be used across any or all of the 20+ report variations. If it is not too much overhead to bring back all of the possible data (again, this is under the premise that the data set is not huge),
pull it back in a single query, cache it (if it is static enough), and then use LINQ to Objects to query the data.
So let's say all the data is 1,000 rows and multiple reports need to be run off it. Do you want 20 IF statements, 20 different stored procedures, dynamic SQL, etc.? Using LINQ you could pull back all the needed data once and then query it in .NET to generate and filter
the report. I use this method often, as it can drastically cut down on the number of custom stored procedures or dynamically built queries needed when pulling back all the data is feasible.
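A minimal sketch of that idea (the row type, filter names, and data source are hypothetical): pull everything back once, then chain a Where clause only for each filter the user actually set.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical report row; in practice this would come from one
// "select everything" query, cached in the application cache.
public class ReportRow
{
    public int TypeId { get; set; }
    public bool IsCool { get; set; }
    public bool IsFast { get; set; }
}

public static class ReportFilter
{
    // Nullable parameters: null means "the user left this filter unset".
    public static List<ReportRow> Filter(IEnumerable<ReportRow> allRows,
                                         int? typeId, bool? isCool, bool? isFast)
    {
        IEnumerable<ReportRow> query = allRows;

        // Add a Where clause only for the filters that were chosen,
        // instead of one IF block per combination of filters.
        if (typeId.HasValue) query = query.Where(r => r.TypeId == typeId.Value);
        if (isCool.HasValue) query = query.Where(r => r.IsCool == isCool.Value);
        if (isFast.HasValue) query = query.Where(r => r.IsFast == isFast.Value);

        return query.ToList();
    }
}
```

This way 20+ filters become 20+ one-line if statements over a single cached data set, rather than a branch per combination of filters.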
However, if the total of all the data is millions of records and trying to bring it back would exhaust the memory in IIS, you will need another solution (or a hybrid one - I would still use LINQ to Objects to filter for small variations within larger sets of data
that can serve a variety of needs). In this case you may need to look into building
Expression Trees to construct dynamic LINQ queries. Have a look at the following for more information:
Dynamic LINQ (Part 1: Using the LINQ Dynamic Query Library):
Dynamic Query Sample:
How to: Use Expression Trees to Build Dynamic Queries (C# and Visual Basic):
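The expression-tree approach can be sketched roughly like this (the type and property names are illustrative): a predicate is composed at runtime for whichever filters were set, and against an IQueryable (LINQ to SQL / LINQ to Entities) it is translated into the generated SQL, so the filtering happens in the database rather than in IIS memory.

```csharp
using System;
using System.Linq;
using System.Linq.Expressions;

// Hypothetical row type for illustration.
public class Row
{
    public int TypeId { get; set; }
}

public static class DynamicFilter
{
    // Builds a predicate like: row => row.<propertyName> == value, at runtime,
    // so one helper can cover any filterable column by name.
    public static Expression<Func<T, bool>> PropertyEquals<T>(string propertyName,
                                                              object value)
    {
        var row = Expression.Parameter(typeof(T), "row");
        var prop = Expression.Property(row, propertyName);
        var body = Expression.Equal(prop, Expression.Constant(value, prop.Type));
        return Expression.Lambda<Func<T, bool>>(body, row);
    }
}
```

A caller would then chain these per chosen filter, e.g. `query = query.Where(DynamicFilter.PropertyEquals<Row>("TypeId", 3));`, keeping the 20+ combinations down to one loop over the selected filters.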
Hope this helps!