Last post Jan 31, 2017 01:10 PM by RichardY
Jan 29, 2017 07:34 AM|KeeEdwins|LINK
Please, I have a big CSV file (500,000 rows) that I want to extract into a database. When I try the code below I get a System.OutOfMemoryException.
Can anyone help me with a more efficient way of doing this? Sample code would be much appreciated.
protected void UploadDAT()
{
    try
    {
        fileuploadExcel.SaveAs(@ExcelFolder + fileuploadExcel.FileName);
        string path = @ExcelFolder + fileuploadExcel.FileName;

        using (SqlConnection con = new SqlConnection(strconnection))
        using (StreamReader file = new StreamReader(path))
        {
            con.Open();
            string line;
            while ((line = file.ReadLine()) != null)
            {
                string[] fields = line.Split(',');
                SqlCommand cmd = new SqlCommand("INSERT INTO Temp(Payor, Amount, Account, Serial) VALUES (@Payor, @Amount, @Account, @Serial)", con);
                decimal amount = decimal.Parse(fields[1].ToString()) * decimal.Parse("0.01");
                cmd.Parameters.AddWithValue("@Payor", Security.RemoveSpecialCharacters(fields[0].ToString()).Trim());
                cmd.Parameters.AddWithValue("@Amount", amount);
                cmd.Parameters.AddWithValue("@Account", Security.RemoveSpecialCharacters(fields[2].ToString()).Trim());
                cmd.Parameters.AddWithValue("@Serial", Security.RemoveSpecialCharacters(fields[3].ToString()).Trim().Trim());
                cmd.ExecuteNonQuery();
            }
        }

        LblMessage.Text = "File successfully uploaded";
        idmessage.Visible = true;
    }
    catch (Exception ex)
    {
        LblError.Text = ex.Message;
        iderror.Visible = true;
    }
}
Jan 29, 2017 01:49 PM|RichardY|LINK
The simplest advice I can give is: don't do it this way.
Here is an article that details (with example code) 3 acceptable ways to copy large CSV files to SQL Server:
My preference is always to do it outside of the ASP.NET application, i.e. either from the database (Option 1 above) or using SSIS (Option 3 above).
If you must do it through the ASP.NET UI, then try the SqlBulkCopy class as explained in Option 2 above. You might still face timeout issues caused by a long-running process.
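Something along these lines for Option 2 (just a sketch, not production code - the table and column names come from your snippet, while the connection string, file path, batch size and CSV layout are assumptions you will need to adjust):

using System.Data;
using System.Data.SqlClient;
using System.IO;

public static class CsvLoader
{
    // Streams the CSV and pushes it to the Temp table in 10,000-row batches.
    public static void BulkLoad(string path, string strconnection)
    {
        DataTable table = new DataTable();
        table.Columns.Add("Payor", typeof(string));
        table.Columns.Add("Amount", typeof(decimal));
        table.Columns.Add("Account", typeof(string));
        table.Columns.Add("Serial", typeof(string));

        using (SqlConnection con = new SqlConnection(strconnection))
        using (SqlBulkCopy bulk = new SqlBulkCopy(con))
        using (StreamReader file = new StreamReader(path))
        {
            bulk.DestinationTableName = "Temp";
            bulk.BatchSize = 10000;
            bulk.ColumnMappings.Add("Payor", "Payor");
            bulk.ColumnMappings.Add("Amount", "Amount");
            bulk.ColumnMappings.Add("Account", "Account");
            bulk.ColumnMappings.Add("Serial", "Serial");

            con.Open();
            string line;
            while ((line = file.ReadLine()) != null)
            {
                string[] fields = line.Split(',');
                // 0.01m is a decimal literal; no need to parse the string "0.01".
                table.Rows.Add(fields[0].Trim(),
                               decimal.Parse(fields[1]) * 0.01m,
                               fields[2].Trim(),
                               fields[3].Trim());

                if (table.Rows.Count == 10000)   // flush a full batch and reuse the table
                {
                    bulk.WriteToServer(table);
                    table.Clear();
                }
            }

            if (table.Rows.Count > 0)            // flush the final partial batch
                bulk.WriteToServer(table);
        }
    }
}

Because you only ever hold one batch in memory and never build 500,000 command objects, memory usage stays flat no matter how big the file is.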
Jan 30, 2017 12:21 AM|PaulTheSmith|LINK
RichardY gives good advice.
Some points about your coding.
Put some time into understanding data types. Not everything is a string! But strings are already strings.
C# has literal syntax for decimal constants (the m suffix), so don't parse strings. Just write 0.01m directly in your program instead of decimal.Parse("0.01").
The datatype of the variable 'fields' is array of string. This means that when you index the array, the element you get is a string. That is, fields[0] is already a string. So don't append .ToString() to it; it makes your code hard to read. I doubt that anything inefficient is happening. My guess is that the compiler just ignores .ToString() when applied to a string.
The Trim() method removes whitespace. .Trim().Trim() is pointless.
None of these things will have anything to do with your OutOfMemoryException. They are just bad coding.
To fix your OutOfMemory problem you should follow the suggestions that RichardY makes, in particular the "don't do it" advice. This is a case of reinventing the wheel when the people who created SQL Server have already solved the problem for you (very efficiently; you are not going to be able to 'out-code' them).
(I can't help but throw in another 2c. If you decide that you really must do it the difficult and inefficient way, then at least look into transactions: every 1,000 rows or so, commit the transaction and start a new one.)
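Roughly like this (a sketch only, assuming the usual System.Data / System.Data.SqlClient using directives; I am reusing one command and guessing at the parameter types and sizes, so adjust them to your actual Temp table - strconnection, path and the field order come from your snippet):

using (SqlConnection con = new SqlConnection(strconnection))
using (StreamReader file = new StreamReader(path))
using (SqlCommand cmd = new SqlCommand(
    "INSERT INTO Temp(Payor, Amount, Account, Serial) VALUES (@Payor, @Amount, @Account, @Serial)", con))
{
    con.Open();
    // Typed parameters created once; only the values change per row (sizes are guesses).
    cmd.Parameters.Add("@Payor", SqlDbType.VarChar, 100);
    cmd.Parameters.Add("@Amount", SqlDbType.Decimal);
    cmd.Parameters.Add("@Account", SqlDbType.VarChar, 50);
    cmd.Parameters.Add("@Serial", SqlDbType.VarChar, 50);

    SqlTransaction tran = con.BeginTransaction();
    cmd.Transaction = tran;
    int count = 0;
    string line;
    while ((line = file.ReadLine()) != null)
    {
        string[] fields = line.Split(',');
        cmd.Parameters["@Payor"].Value = fields[0].Trim();
        cmd.Parameters["@Amount"].Value = decimal.Parse(fields[1]) * 0.01m;  // decimal literal
        cmd.Parameters["@Account"].Value = fields[2].Trim();
        cmd.Parameters["@Serial"].Value = fields[3].Trim();
        cmd.ExecuteNonQuery();

        if (++count % 1000 == 0)     // commit every 1,000 rows and start a new transaction
        {
            tran.Commit();
            tran = con.BeginTransaction();
            cmd.Transaction = tran;
        }
    }
    tran.Commit();                   // commit whatever is left over
}

It will still be far slower than SqlBulkCopy, but at least it keeps the command count at one and the transaction log from ballooning.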
Jan 31, 2017 10:17 AM|Eric Du|LINK
As far as I know, this problem usually occurs when a process loads a huge amount of data into memory (for example into a memory stream) and the system does not have enough memory to hold it. Try clearing the temp folder:
start -> run -> %temp%
Here is someone who met the same problem as you; please refer to this solution:
Workaround for "System.OutOfMemoryException Exception of type 'System.OutOfMemoryException' was thrown." exception:
Jan 31, 2017 01:10 PM|RichardY|LINK
I did not answer the OP's question because the approach is wrong. However, I believe the OutOfMemory exception occurs because the OP creates, but never disposes, a new command object for every line in the CSV file.
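If you do keep the row-by-row insert, at the very least put the command in a using block so each one is disposed before the next iteration. A sketch, reusing the names from the OP's snippet:

while ((line = file.ReadLine()) != null)
{
    string[] fields = line.Split(',');
    // The using block disposes the command at the end of every iteration,
    // instead of leaving 500,000 of them around for the garbage collector.
    using (SqlCommand cmd = new SqlCommand(
        "INSERT INTO Temp(Payor, Amount, Account, Serial) VALUES (@Payor, @Amount, @Account, @Serial)", con))
    {
        cmd.Parameters.AddWithValue("@Payor", fields[0].Trim());
        cmd.Parameters.AddWithValue("@Amount", decimal.Parse(fields[1]) * 0.01m);
        cmd.Parameters.AddWithValue("@Account", fields[2].Trim());
        cmd.Parameters.AddWithValue("@Serial", fields[3].Trim());
        cmd.ExecuteNonQuery();
    }
}

Better still, use SqlBulkCopy as suggested above.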