Last post Sep 11, 2018 02:47 PM by limno
Sep 10, 2018 07:14 PM|aazizasp|LINK
I have a simple web application that I use to upload data from an Excel sheet to a SQL table. This part works great; I use a SqlBulkCopy object to write to the server. My problems started when users uploaded more Excel files and some of the rows were
already inserted in the database by previous uploads. How can I handle this?
Sep 11, 2018 06:59 AM|Ackerly Xu|LINK
You could use the DISTINCT keyword in SQL Server to help you.
Below is my SQL. I have an original table named diffdata that contains repeated rows.
Please remember to back up your data before you delete anything.
select * into myBackUp from diffdata -- back up the original data; if you find an error, you can restore from this copy
begin tran -- begin transaction
declare @table table ( -- declare a table variable to store the unique rows of the original table
id int primary key identity(1,1),
column1 int,
column2 int,
column3 int
)
insert into @table (column1, column2, column3)
select distinct column1, column2, column3 from diffdata -- insert unique rows into the variable
select * from diffdata
select * from @table
truncate table diffdata -- delete the original data
insert into diffdata select column1, column2, column3 from @table -- insert the unique rows back into the original table
commit tran -- commit transaction
You could also do it in C#, using LINQ's Distinct method on the rows before the bulk copy.
Sep 11, 2018 09:59 AM|wmec|LINK
There should be a proper table constraint to avoid any duplications or key conflicts in the table.
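For example (a sketch only; the diffdata table and its column names are carried over from the earlier post, and the constraint name is made up), a unique constraint can be added like this:

```sql
-- Let the database itself reject duplicate rows.
-- Any existing duplicates must be removed first, or this ALTER will fail.
ALTER TABLE diffdata
ADD CONSTRAINT UQ_diffdata_key UNIQUE (column1, column2);
```

With the constraint in place, a duplicate insert raises a constraint-violation error instead of silently creating a repeated row.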
Sep 11, 2018 12:59 PM|aazizasp|LINK
I implemented your suggestion by adding a unique constraint on two columns, but while testing I found that if there is even one duplicate, SqlBulkCopy fails and none of the data is inserted into the table. How do I make it insert everything except the duplicates?
Sep 11, 2018 02:47 PM|limno|LINK
Use a staging table, and merge the staging table into the target table afterwards with T-SQL.
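A minimal sketch of that pattern, with names assumed from the earlier posts (diffdata as the target, a hypothetical diffdata_staging table with the same columns, and the unique key on column1 and column2): point SqlBulkCopy at diffdata_staging instead of diffdata, then run something like:

```sql
-- Insert only the staging rows whose key does not already exist in the target.
MERGE diffdata AS target
USING (
    -- Collapse the staging data to one row per unique key, so the same file
    -- cannot insert the same key twice; MIN(column3) is an arbitrary tie-breaker.
    SELECT column1, column2, MIN(column3) AS column3
    FROM diffdata_staging
    GROUP BY column1, column2
) AS source
ON  target.column1 = source.column1
AND target.column2 = source.column2
WHEN NOT MATCHED BY TARGET THEN
    INSERT (column1, column2, column3)
    VALUES (source.column1, source.column2, source.column3);

TRUNCATE TABLE diffdata_staging; -- clear the staging table for the next upload
```

Because duplicates are filtered out during the merge rather than at bulk-copy time, the unique constraint on diffdata is never violated and the non-duplicate rows all load.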