Last post Apr 26, 2018 07:43 PM by bruce (sqlwork.com)
Apr 26, 2018 03:56 PM|nideeshm
I have a feature to update the database with multiple records in one go. I have added a file upload feature where a CSV file containing multiple records can be uploaded. In the code-behind I loop through each record, do some validation, and either insert the record or throw an exception. It all works fine, except that if there are 100 records in the file it takes almost 3 minutes to complete. I have used Task for performing the operation and Task.WhenAll to wait for completion.
var tasks = new List<Task>();
foreach (var item in BulkModel.Records)
{
    tasks.Add(Task.Run(() => ValidateAndInsert(item))); // per-record validate + insert
}
Task allTasks = Task.WhenAll(tasks);
I am thinking: can I give the response in an asynchronous manner, so that as soon as I have the result for one record I send it back to the browser, instead of making the user wait for the whole response to complete? Just like Facebook, where feed items start to appear one by one. I am using MVC 5 with .NET Framework 4.5.2.
I searched Google for Reactive Extensions (RxJS), PushStreamContent, yield in an action method, and streaming with CopyToAsync.
public Task Get()
{
    HttpContext.Response.ContentType = "text/event-stream";
    MemoryStream sourceStream = new MemoryStream(Encoding.UTF8.GetBytes("asdjhajsdk hasjdfh afj asjfhasd fjasdfj asjdfk asfjhasdjf hasdjf hasd"));
    return sourceStream.CopyToAsync(HttpContext.Response.OutputStream);
}
I could not make the code work. I do not wish to use SignalR for one method alone.
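For completeness, here is a minimal sketch of what a working server-sent-events endpoint could look like in Web API using PushStreamContent, which you mentioned trying. This is an assumption-laden sketch, not a drop-in fix: `ImportController`, the sample records, and `ProcessRecordAsync` are hypothetical stand-ins for your per-record validate-and-insert step.

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
using System.Web.Http;

public class ImportController : ApiController
{
    // Formats one server-sent event frame: "data: ...\n\n".
    public static string FormatSseEvent(string data)
    {
        return "data: " + data + "\n\n";
    }

    public HttpResponseMessage Get()
    {
        var response = new HttpResponseMessage(HttpStatusCode.OK)
        {
            // PushStreamContent writes to the response stream as results
            // become available, instead of buffering the whole response.
            Content = new PushStreamContent(async (stream, content, context) =>
            {
                foreach (var record in new[] { "row1", "row2", "row3" }) // hypothetical records
                {
                    var result = await ProcessRecordAsync(record); // hypothetical per-record work
                    var bytes = Encoding.UTF8.GetBytes(FormatSseEvent(result));
                    await stream.WriteAsync(bytes, 0, bytes.Length);
                    await stream.FlushAsync(); // push this event to the browser now
                }
                stream.Close(); // tells the client the stream is finished
            })
        };
        response.Content.Headers.ContentType = new MediaTypeHeaderValue("text/event-stream");
        return response;
    }

    private Task<string> ProcessRecordAsync(string record)
    {
        return Task.FromResult(record + ": OK"); // placeholder for validate + insert
    }
}
```

On the browser side an `EventSource` pointed at this URL would fire one `message` event per record as each flush arrives.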
Apr 26, 2018 04:48 PM|mgebhard
I generally bulk load the entire file into a basic temporary (staging) table rather than inserting single records, which seems a bit unorthodox. Then validate and move the data using T-SQL. I imagine that will speed up the process.
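A rough sketch of that approach, assuming a simple unquoted CSV: parse the file into a `DataTable`, push it into a staging table with `SqlBulkCopy` in one round trip, then let a stored procedure do the set-based validation and move. The table name `dbo.StagingRecords` and procedure `dbo.usp_ValidateAndMoveStagingRecords` are hypothetical names for illustration.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

public static class CsvStaging
{
    // Parses simple comma-separated text (no quoting/escaping) into a
    // DataTable; the first line is treated as the header row.
    public static DataTable ParseCsv(string csvText)
    {
        var table = new DataTable();
        var lines = csvText.Split(new[] { '\n' }, StringSplitOptions.RemoveEmptyEntries);
        foreach (var header in lines[0].Split(','))
            table.Columns.Add(header.Trim());
        for (int i = 1; i < lines.Length; i++)
            table.Rows.Add(lines[i].Split(','));
        return table;
    }

    // One bulk load into staging, then one stored-procedure call to
    // validate and move rows with set-based T-SQL (names are hypothetical).
    public static void Load(string connectionString, DataTable rows)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "dbo.StagingRecords" })
                bulk.WriteToServer(rows);
            using (var cmd = new SqlCommand("dbo.usp_ValidateAndMoveStagingRecords", conn)
                   { CommandType = CommandType.StoredProcedure })
                cmd.ExecuteNonQuery();
        }
    }
}
```

The key win is that the per-row network round trips disappear: 100 rows become one `WriteToServer` call plus one procedure call.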
I am thinking: can I give the response in an asynchronous manner, so that as soon as I have the result for one record I send it back to the browser?
This is certainly possible. Simply start a background process and return immediately to the client. Then poll (or long poll) until the process completes. I commonly use a table to hold the process state: the load process writes to the table and the client reads the status from the table.
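A minimal sketch of that start-then-poll pattern. In production the status would live in the database table described above so it survives restarts and works across servers; a static dictionary keeps this sketch self-contained. `ImportJobs` and its members are hypothetical names.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public static class ImportJobs
{
    // Stand-in for the process-state table the post describes.
    private static readonly ConcurrentDictionary<Guid, string> Status =
        new ConcurrentDictionary<Guid, string>();

    // Kicks off the long-running load in the background and returns
    // immediately with a job id the browser can poll.
    public static Guid Start(Action loadWork)
    {
        var jobId = Guid.NewGuid();
        Status[jobId] = "Running";
        Task.Run(() =>
        {
            try { loadWork(); Status[jobId] = "Completed"; }
            catch (Exception ex) { Status[jobId] = "Failed: " + ex.Message; }
        });
        return jobId;
    }

    // The poll (or long-poll) endpoint reads this and returns it as JSON.
    public static string GetStatus(Guid jobId)
    {
        string state;
        return Status.TryGetValue(jobId, out state) ? state : "Unknown";
    }
}
```

An MVC action would call `Start` and return the job id; a second action would call `GetStatus`, and the page would hit it with `setInterval` until it sees "Completed" or "Failed".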
Apr 26, 2018 07:43 PM|bruce (sqlwork.com)
3 minutes for a hundred records is way too slow; 1 to 3 seconds is a more reasonable goal (I usually shoot for a minimum of 10k to 100k rows/minute). You should really fix the underlying performance problem rather than bandage over it. Really, a 100-row CSV file should load into SQL Server in something like 100-1000 ms.
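One common cause of that kind of slowness is one round trip (and often one implicit transaction) per row. Besides the `SqlBulkCopy` approach above, a hedged sketch of another fix is to collapse all the rows into a single parameterized multi-row INSERT; `dbo.Records` and its `Name` column are hypothetical.

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Text;

public static class BatchInsert
{
    // Builds one parameterized INSERT covering every row, so 100 rows
    // cost one round trip instead of 100 (table/column names hypothetical).
    public static SqlCommand BuildInsert(IList<string> names)
    {
        var sql = new StringBuilder("INSERT INTO dbo.Records (Name) VALUES ");
        var cmd = new SqlCommand();
        for (int i = 0; i < names.Count; i++)
        {
            if (i > 0) sql.Append(", ");
            sql.Append("(@p").Append(i).Append(")");
            cmd.Parameters.AddWithValue("@p" + i, names[i]);
        }
        cmd.CommandText = sql.ToString();
        return cmd;
    }
}
```

The command would then be executed once inside a single transaction; note SQL Server caps a statement at 1000 row value expressions, so very large files would be chunked.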