Last post Apr 11, 2018 02:52 AM by AngelinaJolie
Apr 09, 2018 02:09 PM|crazy coding|LINK
We have a long-running DB query that populates a temporary table (we are not supposed to change this behavior), which yields 6 to 10 million records, around 4 to 6 GB of data.
I need to use a .NET 4.5 Web API, hosted on IIS, to fetch data from the SQL database. When a request comes from the client to the API, the query runs for at least 5 minutes, depending on the amount of data in the joined tables, and populates the temp table. The API then has to read the data from the temp table and send it to the client.
How can we achieve this without blocking the client, without losing the DB temp table, and without blocking IIS?
Will I still have the temp table if I go with async calls?
Would it be a good choice to load all the data into server memory from the temp table on the first call, and then let the client request the data page by page on subsequent calls, so that the whole data-creation logic can be skipped?
Apr 09, 2018 02:24 PM|PatriceSc|LINK
Do you need to rebuild this temp table every time, or can you keep it unchanged for some time? Do you really need to send back that much data? Is this a kind of data synchronization? (Maybe it could be improved by sending back only the rows that actually changed.)
I'm afraid it will be hard to improve much if you basically can't change the current design anyway.
Apr 09, 2018 03:23 PM|mgebhard|LINK
Async does not work the way you think: an async request still takes as long as its longest step.
It sounds like you want to start a process and be notified when it is complete. That requires the request to return immediately while the 5-minute process runs to completion on the server. The process will need to persist the data somewhere, because the temp table is torn down when the connection closes. When it finishes, it should set a flag so the client knows processing is complete. The client checks that status flag at some interval, and once the process has completed it fetches the persisted data.
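The start-then-poll pattern above could be sketched roughly like this in a Web API controller. Everything here is hypothetical: `RunLongQueryIntoStagingTable` stands in for the 5-minute query persisting into a permanent staging table keyed by job id, and `QueueBackgroundWorkItem` requires .NET 4.5.2+ (on plain 4.5 you would need a dedicated service or `Task.Run` with its IIS-recycling caveats).

```csharp
using System;
using System.Collections.Concurrent;
using System.Web.Hosting;
using System.Web.Http;

public class ReportController : ApiController
{
    // In-memory completion flags; a real app would persist these in the DB
    // so they survive app-pool recycles.
    private static readonly ConcurrentDictionary<Guid, bool> Jobs =
        new ConcurrentDictionary<Guid, bool>();

    // POST api/report -> returns immediately with a job id.
    public IHttpActionResult Post()
    {
        var jobId = Guid.NewGuid();
        Jobs[jobId] = false;

        HostingEnvironment.QueueBackgroundWorkItem(ct =>
        {
            RunLongQueryIntoStagingTable(jobId); // hypothetical: the 5-minute query
            Jobs[jobId] = true;                  // set the completion flag
        });

        return Ok(jobId);
    }

    // GET api/report/{jobId} -> client polls this for the completion flag,
    // then fetches the persisted data (e.g. page by page) once done = true.
    public IHttpActionResult Get(Guid jobId)
    {
        bool done;
        if (!Jobs.TryGetValue(jobId, out done))
            return NotFound();
        return Ok(new { jobId, done });
    }

    private static void RunLongQueryIntoStagingTable(Guid jobId)
    {
        // Placeholder for the long-running query that writes its results
        // into a permanent table tagged with jobId.
    }
}
```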
Apr 09, 2018 05:17 PM|DA924|LINK
Most likely by using MS SQL Server Service Broker in an async manner. I worked on a Web application that did reporting on demand, where lots of data was fetched from the database to produce the reports. A single user running the reports could bring the Web server to its knees. That was circumvented by using Service Broker.
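For reference, a minimal Service Broker setup looks roughly like the sketch below; the object names (`ReportRequest`, `ReportQueue`, etc.) are made up for illustration, and a real deployment would also need an activation procedure or external reader to process queued messages.

```sql
-- One-time setup: message type, contract, queue, and service.
CREATE MESSAGE TYPE ReportRequest VALIDATION = WELL_FORMED_XML;
CREATE CONTRACT ReportContract (ReportRequest SENT BY INITIATOR);
CREATE QUEUE ReportQueue;
CREATE SERVICE ReportService ON QUEUE ReportQueue (ReportContract);

-- Enqueue a report request asynchronously; the caller returns immediately
-- while a queue reader performs the long-running work.
DECLARE @h UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @h
    FROM SERVICE ReportService
    TO SERVICE 'ReportService'
    ON CONTRACT ReportContract
    WITH ENCRYPTION = OFF;
SEND ON CONVERSATION @h
    MESSAGE TYPE ReportRequest (N'<report id="1" />');
```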
Apr 10, 2018 06:54 AM|crazy coding|LINK
It's a report; the temp data should not be cleared until all of the data has been retrieved from the API server.
Apr 11, 2018 02:52 AM|AngelinaJolie|LINK
around 4 to 6 GB data.
Sir, are you sure that's all (about 6 GB) you need? Maybe the database could be sorted and classified before you start querying the data; that is, split into different tables according to some condition (date, or something else), and better yet, combined with partitioning and temporal tables.
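As a rough illustration of the temporal-table suggestion (available from SQL Server 2016), the table and column names below are hypothetical; system versioning lets the report query the data "as of" a point in time instead of rebuilding a temp table on every request.

```sql
-- A system-versioned (temporal) table: SQL Server maintains the history
-- table automatically as rows change.
CREATE TABLE dbo.ReportData
(
    Id        INT IDENTITY PRIMARY KEY,
    Payload   NVARCHAR(MAX),
    ValidFrom DATETIME2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo   DATETIME2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.ReportDataHistory));

-- Query the table as it looked at a given moment:
SELECT *
FROM dbo.ReportData
FOR SYSTEM_TIME AS OF '2018-04-01T00:00:00';
```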
With regards, Angelina Jolie