Last post Jun 13, 2018 05:47 AM by Deepak Panchal
Jun 10, 2018 12:25 PM|kourosh23|LINK
I have a very large table with more than 100 million records. The data is currently divided into 50 different tables, each identified by a different number (or id).
To query these tables I build dynamic T-SQL, which has its own problems. To work around this I am using table partitioning at the moment,
but table partitioning performs too poorly for my needs.
Which solution do you suggest?
Jun 11, 2018 01:38 AM|Deepak Panchal|LINK
Generally speaking, you can consider the points below to improve query performance.
(1) Use indexes.
(2) Fire the query on a limited segment of the data.
(3) Select only the fields you need; do not select any unnecessary data.
(4) If your query involves tables that are not necessary, remove them.
(5) Try to avoid outer joins in your query.
(6) Avoid using calculated fields in joins or in the WHERE clause.
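To illustrate point (6): wrapping an indexed column in a function prevents the optimizer from using an index seek. Assuming a hypothetical Orders table with an indexed OrderDate column (not from your schema, just for illustration), both queries below return the same rows, but only the second is sargable:

```sql
-- Non-sargable: the function applied to the column forces a full scan
SELECT OrderId FROM Orders WHERE YEAR(OrderDate) = 2018

-- Sargable: the bare column on the left lets the index seek work
SELECT OrderId FROM Orders
WHERE OrderDate >= '20180101' AND OrderDate < '20190101'
```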
Further, the link below may give you some additional information.
Dynamically Query a 100 Million Row Table-Efficiently
Jun 11, 2018 04:01 AM|kourosh23|LINK
Thank you for your reply!
Suppose I have 30 tables like this:
In each table I have 10 million records. I also have other table groups named the same way, called table_B_XXX and table_C_XXX, and there are relations among these tables.
I want to run a query (a stored procedure) on tables 4, 10 and 13:
create PROCEDURE [Documents].[GetElements]
    @TableNumbers nvarchar(max)
AS
BEGIN
    -- Parse the semicolon-separated list of table numbers, zero-padded to 3 digits
    declare @TableN TABLE (TableNumber nvarchar(3))
    insert into @TableN
    select replace(Str(Data, 3), ' ', '0') from Utility.SPLIT(@TableNumbers, ';')
    where Data <> ''

    -- Build one SELECT per requested table number
    declare @TSqls1 table (tsql nvarchar(max))
    insert into @TSqls1
    select 'select t1.ElementId, t2.Rpath, t3.Position from Table_A_' + TableNumber + ' t1
    inner join Table_B_' + TableNumber + ' t2 on t1.id = t2.TableAId
    inner join Table_C_' + TableNumber + ' t3 on t2.id = t3.TableBId '
    from @TableN

    -- Glue the SELECTs together with UNION ALL and execute the result
    declare @tsql nvarchar(max) = 'select * from (' + Stuff((SELECT ' union all ' + tsql FROM @TSqls1 FOR XML PATH(''), TYPE).value('(./text())[1]', 'VARCHAR(MAX)'), 1, 11, '') + ') t'
    execute sp_executesql @tsql
END
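For reference, a call to this procedure (assuming @TableNumbers is the semicolon-separated list consumed by Utility.SPLIT in the body) would look like:

```sql
EXECUTE [Documents].[GetElements] @TableNumbers = N'4;10;13'
```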
Now the problems are:
1) when we build the query with dynamic T-SQL, we cannot call a stored procedure from inside another stored procedure;
2) we also cannot store the result of sp1 in a table to use in sp2;
3) we want to avoid global temp tables.
To solve the problem we are using table partitioning at the moment, but unfortunately the performance is far too poor.
Please let me know if you need further explanation of the problem.
Jun 13, 2018 05:47 AM|Deepak Panchal|LINK
We are not aware of the fields in your tables or how much data you are trying to fetch,
so it is difficult for us to be more specific.
If your table contains any date fields, try fetching data for a specific date range, and try firing multiple smaller queries instead of one big one; that may improve your performance a little.
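As a sketch of that suggestion, assuming a hypothetical CreatedDate column on the Table_A_XXX tables (your actual date field may differ), the join from the earlier procedure could be fired one month at a time instead of over the whole table:

```sql
-- Fetch one month's slice instead of the entire table
SELECT t1.ElementId, t2.Rpath, t3.Position
FROM Table_A_004 t1
INNER JOIN Table_B_004 t2 ON t1.id = t2.TableAId
INNER JOIN Table_C_004 t3 ON t2.id = t3.TableBId
WHERE t1.CreatedDate >= '20180501' AND t1.CreatedDate < '20180601'
```

Repeating this per month keeps each individual query's working set small, at the cost of more round trips.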