Last post Sep 20, 2010 12:26 PM by frez
Sep 18, 2010 06:13 AM
We have a scenario where we need to store large tables of data in Session in an ASP.NET page. At a high level, we have to store the following data:
1> A table (table 1) with 1000 rows and 8 to 10 columns.
2> A table (table 2) with 500 rows and 8 to 10 columns (the user can add data from table 1 to table 2, so as the user keeps adding from table 1 we remove the data there and put it in table 2).
3> Another table (table 3) with 1000-1200 rows, each having 3 columns.
We are on a web farm, so we need to store it in either StateServer or an Oracle DB; please advise which one is better. What would be the performance implications of storing this much data on the server?
One more thing: at any point in time a maximum of 70-80 people will be accessing our website, so will this cause any performance degradation?
Thanks in anticipation
Sep 18, 2010 06:53 AM
I would avoid using session for this. Presumably, when a user has finished altering the contents of these datasets, you persist that to a database? Why not just make a call to the database to apply each change as it is made?
Sep 18, 2010 07:10 AM
Thanks for the reply,
The problem with storing to the database is that the user can move data from table 1 to table 2, or remove data added to table 2 (that data then goes back to table 1); he can also associate more data with the rows in table 2, and that data goes into table 3.
Only once he is done with all these changes and clicks Save do we store it in the final tables in the database. At any point he can leave the website or click Cancel, so we can't store the data in the DB on the fly; we either need to maintain a temporary table for every final table and store the data in those temp tables, or store it in Session. Please advise which would be better.
Thanks in advance
Sep 18, 2010 07:16 AM
Storing it on the client until the user commits their changes is a much better approach. From what you say so far, you don't need to store the whole of table 1 or 2 - just update the UI and log the changes. Would that work?
Sep 18, 2010 09:22 AM
Until now we were storing it in the UI only, but with this amount of data the browser takes too much memory and hangs, so we are planning to move this storage to the server.
Sep 20, 2010 07:10 AM
You are storing a very large amount of information in session and you need to look for alternatives. For a better understanding, here is some math.
Your tables are really big, i.e. 1000 + 500 + 1000 = 2500 rows. If the average row size is 10KB then the whole table size will be 2500 * 10 = 25,000KB, i.e. 25MB. For 70-80 users it will be 75 * 25 = 1875MB, which is around 2GB of data in your sessions.
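The arithmetic above can be sketched as a quick back-of-envelope check. The 10KB average row size and 75 concurrent users are the assumptions from this post, not measured values:

```python
# Back-of-envelope check of the session-size math above. The 10KB
# average row size and 75 users are assumptions taken straight from
# the post, and 1MB = 1000KB is used the same way the post uses it.

ROWS = 1000 + 500 + 1000        # tables 1, 2 and 3 combined
ROW_SIZE_KB = 10                # assumed average serialized row size
USERS = 75                      # mid-point of the 70-80 user estimate

per_user_mb = ROWS * ROW_SIZE_KB / 1000   # 25MB of session data per user
total_mb = per_user_mb * USERS            # 1875MB across all sessions

print(f"{per_user_mb:.0f} MB per user, {total_mb:.0f} MB total")
```

The real cost also depends on serialization overhead: StateServer and SQL-backed session both serialize the whole object graph on every request, so the wire and CPU cost is paid per postback, not once.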
Why don't you directly perform updates on the database? If that is not possible, then at least use lightweight entities to store your data in session instead of using a DataTable.
I also recommend using data paging to reduce the bytes transferred and improve performance; for details see:-
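The thread is about .NET's DataTable, but the "lightweight entity" idea is language-neutral: a small fixed-shape object per row carries far less overhead than a generic, schema-carrying structure. A rough Python sketch of the comparison (the Row fields are made up for illustration):

```python
import sys
from dataclasses import dataclass

# A slotted, fixed-shape entity holding just the fields one row needs.
# This stands in for a "lightweight entity"; a per-row dict stands in
# for a generic, schema-carrying row such as a DataRow.

@dataclass
class Row:
    __slots__ = ("id", "name", "qty")   # no per-instance __dict__
    id: int
    name: str
    qty: int

slim = Row(1, "widget", 3)                      # lightweight entity
fat = {"id": 1, "name": "widget", "qty": 3}     # generic per-row mapping

# The slotted object is measurably smaller per row than the dict;
# the exact numbers vary by Python version.
print(sys.getsizeof(slim), sys.getsizeof(fat))
```

The same trade-off holds in .NET: a plain class with three typed fields serializes to far fewer bytes than a DataTable row that drags its schema along.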
Sep 20, 2010 12:15 PM
Why give so much pain to session??? First, if you are using a web farm my advice is to use the Microsoft Web Farm Framework; to get started, look into this.
You will not find anyone here who will support storing tables in session. If you really need to do so, try SQL caching [not an alternative, but it helps performance].
To keep the data operations flowing smoothly, use LINQ or Entity Framework. Believe me, they are fast and easy to work with.
Hope this helps, and again my advice is not to use session for storing table data.
Sep 20, 2010 12:26 PM
Your application is not going to scale if you store large amounts of data in session state. Try storing in session state just a unique key to the data held in a database, and have the old data cleaned up from time to time to cater for users abandoning their sessions.
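That key-in-session pattern can be sketched as follows. This is an illustrative Python/SQLite sketch, not the poster's implementation: the draft_rows table and its columns are made up, and the real app would use its Oracle DB behind the same idea:

```python
import sqlite3
import time
import uuid

# Sketch of the suggestion above: the session holds only a draft key;
# the working rows live in a database table, and stale drafts left by
# users who abandoned their session are purged periodically.

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE draft_rows (
    draft_key TEXT, col_a TEXT, col_b TEXT, updated_at REAL)""")

def new_draft_key():
    # This short opaque key is all that session state needs to hold.
    return uuid.uuid4().hex

def add_row(key, a, b):
    db.execute("INSERT INTO draft_rows VALUES (?, ?, ?, ?)",
               (key, a, b, time.time()))

def purge_stale(max_age_seconds):
    # Periodic cleanup for users who walked away without Save/Cancel.
    cutoff = time.time() - max_age_seconds
    db.execute("DELETE FROM draft_rows WHERE updated_at < ?", (cutoff,))

key = new_draft_key()
add_row(key, "x", "y")
purge_stale(3600)          # drop drafts untouched for over an hour
count = db.execute("SELECT COUNT(*) FROM draft_rows WHERE draft_key = ?",
                   (key,)).fetchone()[0]
print(count)               # the fresh draft row survives the purge
```

On Save the draft rows are moved into the final tables in one transaction; on Cancel (or timeout) they are simply deleted, which is exactly the temp-table approach discussed earlier in the thread, with cleanup handled centrally.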