Last post Feb 10, 2012 02:34 PM by suntower
Jan 26, 2012 02:48 PM|suntower|LINK
I am an ASP novice. My background is in MySQL and PHP. Unfortunately, I am tasked with figuring out why an ASP 'classic' web page is running -very- slowly. The actual server, running IIS6 and Windows Server 2003, is 3,000 miles away; I'm connecting to it via Remote Desktop.
The web page executes a fairly simple SQL 2008 query of sales history---it selects a dozen or so named columns from a transaction table with inner joins to 6 columns from 2 'header' tables. I include it below.
When I execute the query via SQL Management Studio for sales of the previous day, it returns 1,700 rows in < 1 sec.
When I run the web page---using that same query, it takes over 30 seconds.
If I modify the query to return the rows we -really- want (60 days---14,000 rows), the page often times out with a nasty IIS error.
The connection string on the web page is:
Provider=SQLNCLI;Server=localhost;Persist Security Info=False;User ID=test;Initial Catalog="ourcatalog";Data Source=KSI01S1
Although the environment is SQL 2008, the database was imported from SQL2k and apparently is in some sort of 'compatibility' mode---so that it can still be 'read' by SQL 2k? I don't know if that makes a difference or not.
My uneducated guess is that the ASP page is somehow connecting by some sort of 'remote' connection---which I don't understand, not being knowledgeable about IIS or ASP, but clearly the ASP connection is not the same as a local user.
Can someone give me some basic pointers on how to troubleshoot this?
SELECT SODETAILS.PRODUCTID AS SODETAILS_PRODUCTID,
SODETAILS.DESCRIPTION AS SODETAILS_DESCRIPTION,
ORDERQUANTITY, SHIPQUANTITY, SODETAILS.UNITPRICE AS SODETAILS_UNITPRICE,
EXTENDEDPRICE, SODETAILS.SELLUOFM AS SODETAILS_SELLUOFM,
SODETAILS.FREIGHTPRICE AS SODETAILS_FREIGHTPRICE,
SODETAILS.ORDERTYPE AS SODETAILS_ORDERTYPE,
SOHEADERS.TRANSACTIONID AS SOHEADERS_TRANSACTIONID, USERID,
SHIPDATE, COSTCENTERID, SOHEADERS.CONTACTID AS SOHEADERS_CONTACTID,
FORMNUMBER, SPECSTYLE, QuickReleaseCategory,
FOB, SODETAILS.ContactID AS SODETAILS_ContactID, onlineordertype,
SOHEADERS.CUSTOMERID AS SOHEADERS_CUSTOMERID
FROM (SODETAILS INNER JOIN SOHEADERS ON SODETAILS.TRANSACTIONID = SOHEADERS.TRANSACTIONID)
INNER JOIN PRODUCTS ON SODETAILS.PRODUCTID = PRODUCTS.PRODUCTID
WHERE(SOHeaders.CustomerID LIKE('WMUSA%')) and SOHEADERS.SHIPDATE >= 73000
Jan 27, 2012 04:45 AM|vytautas.ziurlis|LINK
By the looks of it, the problem is in the ASP code, not SQL. The best way to solve this is to add some logging on that page. Normally you would split the code into logical parts (i.e. data retrieval from the SQL server, data rendering to the browser, etc.) and find out which part is taking the longest to execute. That will give you a very clear idea of where the problem is.
As I said earlier, it's very unlikely that this is caused by SQL. Compatibility mode has almost no effect on performance. Furthermore, you are connecting to the local SQL server (Server=localhost in the connection string).
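A rough way to get those timings in classic ASP is VBScript's Timer function, which returns seconds since midnight. A minimal sketch, assuming the page's recordset and SQL string are named Recordset and strSQL (hypothetical names; substitute whatever the page actually uses):

```vbscript
Dim tStart, tAfterOpen, tAfterLoop
tStart = Timer()

Recordset.Open strSQL, Connection      ' existing data-retrieval step
tAfterOpen = Timer()

' ... existing rendering loop goes here ...
tAfterLoop = Timer()

Response.Write "Open took " & (tAfterOpen - tStart) & "s, "
Response.Write "loop took " & (tAfterLoop - tAfterOpen) & "s<br>"
```

Writing the two numbers straight into the page is crude but enough to tell whether the query or the rendering loop is the bottleneck.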
Jan 27, 2012 09:08 PM|decker dong - msft|LINK
I think you can try turning your SELECT statement into a stored procedure, and then use ADO.NET's SqlCommand to call it and get the result. Don't forget to set SqlCommand's timeout to a large number. A stored procedure should also be faster than a raw nested SQL statement, because it doesn't have to be compiled on every execution.
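Since the page is classic ASP, the equivalent of SqlCommand is the ADODB.Command object. A sketch of calling a stored procedure with a raised timeout, assuming a procedure named usp_SalesHistory exists (hypothetical name) and Connection is the page's open ADODB.Connection:

```vbscript
Dim cmd, rs
Set cmd = Server.CreateObject("ADODB.Command")
Set cmd.ActiveConnection = Connection
cmd.CommandType = 4                       ' adCmdStoredProc
cmd.CommandText = "usp_SalesHistory"      ' hypothetical procedure name
cmd.CommandTimeout = 300                  ' seconds; the default is 30

' 200 = adVarChar, 1 = adParamInput; pattern and size are illustrative
cmd.Parameters.Append cmd.CreateParameter("@CustomerID", 200, 1, 20, "WMUSA%")

Set rs = cmd.Execute()
```

Parameterizing the customer pattern and ship date this way also avoids rebuilding the SQL string on every request.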
Jan 30, 2012 03:04 PM|suntower|LINK
OK, I did some timings. The recordset opens in < 1 second. But then there is a 'loop' which uses MoveNext to go through each record. And -that- is the slow spot. The code is below. Given this, is there a way to 'optimise' MoveNext for the above query?
Sorry if this is a total noob question. This is fairly alien to me. I am used to MySQL where one generally retrieves the recordset in one go into an Array. I'm assuming there is a cursor involved, right? Is there a way to optimise -that-?
Do While Not Recordset.EOF
    PRODUCTID.Value = Recordset.Fields("PRODUCTID")
    DESCRIPTION1.Value = Recordset.Fields("DESCRIPTION1")
    COSTCENTERID.Value = Recordset.Fields("COSTCENTERID")
    SODETAILS_ORDERTYPE.Value = Recordset.Fields("SODETAILS_ORDERTYPE")
    SOHEADERS_CUSTOMERID.Value = Recordset.Fields("SOHEADERS_CUSTOMERID")
    ORDERID.Value = Recordset.Fields("ORDERID")
    SHIPQUANTITY.Value = Recordset.Fields("SHIPQUANTITY")
    UNITPRICE.Value = Recordset.Fields("UNITPRICE")
    SELLUOFM.Value = Recordset.Fields("SELLUOFM")
    SODETAILS_FREIGHTPRICE.Value = Recordset.Fields("SODETAILS_FREIGHTPRICE")
    EXTENDEDPRICE.Value = Recordset.Fields("EXTENDEDPRICE")
    INVOICE.Value = Recordset.Fields("INVOICE")
    SHIPDATE.Value = Recordset.Fields("SHIPDATE")
    TotalSum_EXTENDEDPRICE.Value = Recordset.Fields("TotalSum_EXTENDEDPRICE")
    Report_CurrentDateTime.Value = Recordset.Fields("Report_CurrentDateTime")
    Report_CurrentPage.Value = Recordset.Fields("Report_CurrentPage")
    Recordset.MoveNext
Loop
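For what it's worth, ADO does have a fetch-everything-into-an-array call like the one I'm used to in MySQL: Recordset.GetRows returns the whole resultset as a 2-D variant array in one go, which usually beats per-row Fields lookups. A sketch (column indexes assumed to follow the order of the SELECT list):

```vbscript
If Not Recordset.EOF Then
    Dim rows, i
    rows = Recordset.GetRows()     ' rows(column, row) variant array
    Recordset.Close                ' everything is now in memory

    For i = 0 To UBound(rows, 2)
        ' column 0 = PRODUCTID, column 1 = DESCRIPTION, ... per the SELECT list
        Response.Write rows(0, i) & " " & rows(1, i) & "<br>"
    Next
End If
```

Whether this fits depends on how much the template classes insist on touching the recordset directly.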
Jan 30, 2012 04:24 PM|suntower|LINK
I just checked the code further (some classes that wrap the RecordSet) and by default it opens with a read-only, forward-only cursor type.
Does that help narrow it down? (or at least exclude some possibilities?)
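One knob that might be worth trying, if the wrapper classes expose the raw ADODB.Recordset, is the cursor location: a client-side cursor pulls the whole resultset across in one batch (similar to fetching into an array in MySQL) instead of round-tripping per fetch. A hedged sketch, with the ADO constants spelled out in case adovbs.inc isn't included:

```vbscript
Const adUseClient = 3
Const adOpenForwardOnly = 0
Const adLockReadOnly = 1

Recordset.CursorLocation = adUseClient    ' fetch all rows to the client up front
' Note: with a client cursor, ADO silently upgrades forward-only to static
Recordset.Open strSQL, Connection, adOpenForwardOnly, adLockReadOnly
```

The trade-off is memory: 14,000 rows land in the worker process at once, which may or may not be acceptable here.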
Jan 30, 2012 07:39 PM|decker dong - msft|LINK
I don't think it's MoveNext that has caused the problem, because forward-only reading should be the most efficient way to read out the data contents.
All in all, I think you can try to use ADO.NET instead of pure ADO.
Jan 31, 2012 03:03 PM|suntower|LINK
Unfortunately, the site was built on an ASP 'template' so I'm not allowed to change anything as fundamental as going to .NET.
But there -must- be something in how ADO communicates with SQL Server that is causing the slowness.... and surely there is some way to troubleshoot this, right?
Feb 01, 2012 04:53 PM|Cathy Mi - MSFT|LINK
Have you looked at the various recommendations here:
Optimizing the Performance of Database Access in IIS
Feb 10, 2012 02:34 PM|suntower|LINK
I read through that article and the only one that seems to apply is: Set the ADO CacheSize property to a larger number than the default (1). I did that and it helped -somewhat-. The query went from 30 seconds down to 23 seconds. But that's
still 22 seconds longer than running from the desktop.
I think I -am- using a native connection string (I posted above). And I am not permitted to convert this to a stored proc. Also, the query will change slightly every time the user runs the report.
Any other ideas? Driving us nuts. Sorry for the 'noob' tone, but I'm no ASP programmer/SQL Server expert. This is a legacy app that will -never- be significantly updated, so the owners don't want to touch it beyond the bare minimum.
Cache results from data sources that are stable, or that vary predictably. You can cache either the recordset returned by a query to the data source, or cache the pure HTML output that was created using the results of the query.
For instance, if you are using ADO to populate a listbox that will contain the cities in which you have offices, the first caller to ADO can insert the ADO query results into Application scope. Subsequent requests for that listbox information could be fulfilled
from the Application object, instead of an expensive call, through ADO, to a data source.
If you wish to cache the resultant recordset directly, you should use a client-side cursor, and disassociate the recordset from the
Command object by setting the ADO ActiveConnection property to Nothing.
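That disconnected-recordset pattern looks roughly like the following sketch (table name and Application key are hypothetical, for illustration only):

```vbscript
Const adUseClient = 3
Dim rs
Set rs = Server.CreateObject("ADODB.Recordset")
rs.CursorLocation = adUseClient
rs.Open "SELECT City FROM Offices", Connection   ' hypothetical query
Set rs.ActiveConnection = Nothing                ' disconnect from the connection

Application.Lock
Set Application("OfficeCities") = rs             ' cache for later requests
Application.Unlock
```

Later requests can read Application("OfficeCities") without touching the database at all.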
In general, avoid putting ADO connections in session state, because ODBC (version 3.0 and later) automatically does connection pooling for you, and OLE DB provides session pooling.
Use native OLE DB connection strings as much as possible. Native OLE DB connection strings are generally faster than most ODBC DSNs. In addition, OLE DB connection strings are more flexible because your application will be able to make use of any OLE DB provider.
If using a data source that supports them, such as Microsoft SQL Server, use stored procedures whenever possible. A query executed from a stored procedure is faster than a query passed through a SQL query string.
To optimize performance, avoid using the ADO record addition and deletion methods, such as AddNew and Delete. If your application adds and deletes records intensively, your application will perform better if it uses direct SQL statements,
such as INSERT.
Set the ADO CacheSize property to a larger number than the default (1). By forcing ADO to retrieve multiple records in one transaction with the data source, you will eliminate a portion of the overhead involved in that transaction, and your
application may become more scalable. Generally, you are most likely to see benefits if you set
CacheSize to equal to either the number of records expected, or 100, whichever is less.
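In code, that CacheSize recommendation is a one-line change before the recordset is opened (variable names assumed to match the page's, as above):

```vbscript
Recordset.CacheSize = 100          ' fetch up to 100 rows per round trip instead of 1
Recordset.Open strSQL, Connection
```

With roughly 1,700–14,000 rows expected, 100 is the value the article's "number of records or 100, whichever is less" rule lands on.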
Use the ADO 2.0 adExecuteNoRecords flag when executing commands that don't return data rows, or that return rows that you don't need to access or save. This feature, introduced in ADO 2.0, was created to reduce the amount of overhead incurred by ADO, and thus increase performance and scalability.
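Used from classic ASP, that flag is passed as the options argument of Execute; a sketch against a hypothetical table:

```vbscript
Const adExecuteNoRecords = 128                 ' don't construct a recordset
Connection.Execute "DELETE FROM TempResults", , adExecuteNoRecords   ' hypothetical table
```

It only helps for statements like INSERT, UPDATE, and DELETE, not for the SELECT that this report page runs.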
Disable temporary stored procedures, if your data source supports them.