Last post Dec 31, 2014 02:11 PM by Sameer Shah
Dec 08, 2014 10:56 AM|shoab shah|LINK
Our company is about to develop a product (a web application).
Scenario / Requirement:
The application is going to be used by multiple concurrent users (approx. 50 to 70). The application's basic feature is search across different entities (name, place, organization, etc.). Further, these entities are interlinked.
Issue / Challenge:
To get better performance and fewer DB interactions, we plan to store these entity objects in server RAM (in memory) and update them periodically. We are not able to finalize which approach to follow to achieve this. Some of the options we are considering:
1) ASP.NET data caching
2) An in-memory database, e.g. http://www.mongodb.com/what-is-mongodb
3) Something else.
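The periodic-refresh idea described above can be sketched roughly as follows. This is a minimal illustration of the pattern only, not ASP.NET code; the loader function, entity fields, and refresh interval are all made-up assumptions:

```python
import threading

class EntityCache:
    """Holds entity objects in memory and refreshes them periodically,
    so request handling never has to hit the database directly."""

    def __init__(self, loader, refresh_seconds=300):
        self._loader = loader              # callable that fetches entities from the DB
        self._refresh_seconds = refresh_seconds
        self._lock = threading.Lock()
        self._entities = loader()          # initial load at startup
        self._schedule_refresh()

    def _schedule_refresh(self):
        timer = threading.Timer(self._refresh_seconds, self._refresh)
        timer.daemon = True
        timer.start()

    def _refresh(self):
        fresh = self._loader()             # the DB is touched only here
        with self._lock:
            self._entities = fresh
        self._schedule_refresh()

    def all(self):
        with self._lock:
            return list(self._entities)

# Hypothetical loader standing in for a real database query.
def load_entities_from_db():
    return [{"name": "Alice", "place": "Pune"},
            {"name": "Bob", "place": "Delhi"}]

cache = EntityCache(load_entities_from_db, refresh_seconds=300)
print(len(cache.all()))  # 2
```

With ASP.NET data caching (option 1), the framework's cache with an absolute expiration would play the role of the timer here.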
Kindly provide us your inputs on this.
Any thoughts / comments are highly appreciated.
Thanks in advance
Dec 09, 2014 06:35 AM|shetesagar87|LINK
In-Memory OLTP (memory-optimized tables) is a new feature introduced in SQL Server 2014, which can increase the performance of the application by storing tables in memory.
Please refer to the link below.
You can also try a disconnected dataset (ADO.NET DataSet) in ASP.NET.
Dec 31, 2014 02:11 PM|Sameer Shah|LINK
Since you will need to search the data, even in-memory data will degrade your application's performance if it is not indexed.
You can use a caching solution alongside the database that lets you index data and later search on those indexed attributes. In-memory indexes and data will give you fast searching and retrieval.
NCache (http://www.alachisoft.com/ncache/) is one possible solution. NCache allows you to define indexes on class attributes and search them using a SQL-like query language (OQL, Object Query Language). It also allows you to search in-memory data on multiple indexed attributes to filter out the required results. You can use write-behind caching and database dependencies to keep your cache in sync with the database.
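The indexed-attribute search described above can be illustrated with plain dictionary-based indexes. This is a language-neutral sketch (in Python) of the general technique, not NCache or OQL itself, and the entity fields are invented for the example:

```python
from collections import defaultdict

class IndexedStore:
    """In-memory store with hash indexes on chosen attributes,
    so searches intersect small index sets instead of scanning everything."""

    def __init__(self, indexed_attrs):
        self._indexed_attrs = indexed_attrs
        self._indexes = {attr: defaultdict(set) for attr in indexed_attrs}
        self._items = {}
        self._next_id = 0

    def add(self, item):
        item_id = self._next_id
        self._next_id += 1
        self._items[item_id] = item
        for attr in self._indexed_attrs:
            self._indexes[attr][item[attr]].add(item_id)
        return item_id

    def search(self, **criteria):
        """Intersect the index entries for each attribute=value pair,
        similar in spirit to a 'WHERE a = x AND b = y' query."""
        ids = None
        for attr, value in criteria.items():
            matches = self._indexes[attr].get(value, set())
            ids = matches.copy() if ids is None else ids & matches
        return [self._items[i] for i in sorted(ids or set())]

store = IndexedStore(["name", "place"])
store.add({"name": "Alice", "place": "Pune", "org": "Acme"})
store.add({"name": "Alice", "place": "Delhi", "org": "Globex"})
store.add({"name": "Bob", "place": "Pune", "org": "Acme"})
print(store.search(name="Alice", place="Pune"))
```

A product like NCache maintains such indexes for you and exposes the query side through OQL rather than method calls.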