By Bill Cunnien - 5/8/2008
I am not getting it. Earlier this spring I discussed in these forums the utilization of UDFs in filling a BO. The conclusion was to not use them since they have a great deal of overhead involved. So, instead, I went with some raw SQL to achieve the same results. I was able to get one of my more complex queries down to about 3 seconds. That was wonderful.
Now, I am simply taking the same script that runs in the SQL Query Analyzer at 3 seconds and placing it into a stored procedure with three parameters. These parameters are declared at the beginning of the script in the query analyzer, too. I run the stored procedure and it takes many minutes to run (last run: 10 min 04 sec). I really don't get this. It is the exact same script. Is there really that much of a difference in quality between raw SQL script and the same SQL script placed into a stored procedure? What am I doing wrong?
I have attached the script that I am running in the Query Analyzer and the stored procedure. I know that these are not pretty...I am still working through the details of this query. Any help, tips, criticism is welcome!
Thanks, Bill
|
By Greg McGuffey - 5/8/2008
Bill,
I'm not quite sure what you are doing to get the time differences. Is the sproc taking three seconds, but when you just run the script that is within the sproc it takes longer? I seem to recall that Query Analyzer just uses one of the older db access technologies (ODBC or OLE DB, something like that) to run the sproc/script and SQL Server Management Studio uses ADO.NET, so there should be no time difference between running it there and within your app (unless your app is doing something else to eat up time, of course).
If so, it could be explained by the sproc being "compiled" within SQL Server (the query plan is already determined and saved), whereas a script needs to determine the query plan first, then execute the query. You could improve the speed of the script by providing hints about how the query should be executed, but it would be easier to just call the sproc directly from the BO.
If you are using a script in the app, try just calling the sproc and see if that makes a difference.
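If plan caching turns out to be part of the puzzle, one quick experiment (a sketch only; the procedure name, parameter, and table are made up for illustration) is to create the sproc WITH RECOMPILE, which tells SQL Server to throw away any cached plan and build a fresh one for each call:

```sql
-- Hypothetical procedure, for illustration only.
-- WITH RECOMPILE prevents plan reuse: every execution gets a query plan
-- built for the actual argument values passed in on that call.
CREATE PROCEDURE spx_ExampleLookup
    @itemcode varchar(30)
WITH RECOMPILE
AS
BEGIN
    SELECT Items.Code
    FROM Items
    WHERE Items.Code = @itemcode;
END
```

If the sproc is fast WITH RECOMPILE but slow without it, the cached plan itself is the suspect rather than the script.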
|
By Bill Cunnien - 5/8/2008
The raw script executes in 3 sec. The stored procedure, using the exact same script, takes 10 minutes. Here is my code to fill the BO:
SqlParameter mItemCode = new SqlParameter("@itemcode", MatCodeTE.Text);
ADUserBO mADUser = new ADUserBO();
SqlParameter mDiv = new SqlParameter("@div", mADUser.LocationIndex);
SqlParameter mInvDate = new SqlParameter("@invdate", InvDateDE.DateTime.ToShortDateString());
rawMaterialValuationBO1.FillByStoredProcedure("spx_GetRunningInventory_Material", mItemCode, mDiv, mInvDate);
This times out in the application because the stored procedure is taking way too long to run. I have gone through the execution plan of the raw script in the Query Analyzer and updated all statistics where it wanted it. I even edited some tables to add indexes where I thought it may help. Nothing is helping that sproc to run quicker. To me, it should run in 1 or 2 seconds since there is caching and compiling and such going on. It is a server-based process...it should be lightning fast. I have never had this happen before. *scratches head*
|
By Bill Cunnien - 5/8/2008
Interesting note: the execution plan for the raw SQL script is way different than the execution plan for the stored procedure! Investigating now.
|
By Bill Cunnien - 5/8/2008
I took the parameters out of the stored procedure and hardcoded them into the sproc just like the raw script. BAM!! 2 sec. to run the sproc!!!! Why would processing parameters cause the sproc to execute 300x slower? I am going to add them back one at a time and see which one is causing the slowdown.
|
By Bill Cunnien - 5/8/2008
If I remove the first parameter (varchar(30)), then the stored procedure works under 4 sec every time. If I reintroduce the parameter, then it goes right back to the 10 minute mark. The third parameter is a varchar(10). I do not think the type is the problem here. Still investigating.
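Hardcoded values being fast while a parameter is slow is the classic symptom of what is usually called "parameter sniffing": SQL Server builds the plan around the parameter value seen on the first call, and that plan can be terrible for other values (especially with LIKE and a wildcard). One common workaround (a sketch; only the @itemcode parameter and Items.Code come from this thread, the rest is assumed) is to copy the parameter into a local variable, which the optimizer cannot sniff:

```sql
-- Hypothetical procedure illustrating the local-variable workaround.
CREATE PROCEDURE spx_SniffingWorkaround
    @itemcode varchar(30)
AS
BEGIN
    -- The optimizer cannot see a local variable's runtime value, so it
    -- builds a generic plan from column statistics instead of a plan
    -- tuned to whatever value happened to be passed on the first call.
    DECLARE @code varchar(30);
    SET @code = @itemcode;

    SELECT Items.Code
    FROM Items
    WHERE Items.Code LIKE @code;
END
```

This forces a one-size-fits-all plan, which is often "pretty good" for every value rather than great for one and awful for the rest.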
|
By Greg McGuffey - 5/8/2008
OK, that is different. I understand why you are baffled.
I'd see what replacing the LIKE in your where clause with an equals does, leaving @itemcode as a varchar(30). I.e.
WHERE
-- Items.Code LIKE @itemcode (original code)
Items.Code = @itemcode -- Try this new code
AND Items.Class = 1
AND Items.DefaultDiv = @div
AND Items.inactive = 0
Now, you might need the LIKE, but at least this might help see where the problem lies.
I'm assuming that Items.Code is indexed. Also, is there just one code in the field? I.e. it isn't a list of codes or anything weird like that is it?
|
By Bill Cunnien - 5/8/2008
"I'd see what replacing the LIKE in your where clause with an equals does . . . I'm assuming that Items.Code is indexed. . . . Also, is there just one code in the field?"
I'll try the sproc without the LIKE...I suppose an IF block may work better. The code column is indexed. Only one code would be passed if the user wanted the list limited. Thanks for your attention on this, Greg. Much appreciated. Bill
|
By Bill Cunnien - 5/8/2008
The stored procedure does not like the LIKE. Who woulda thunk it?!?!?! I have removed the LIKE and have followed another approach:
IF @itemcode = ''
BEGIN
-- run the script without the Items.Code filter
END
ELSE
BEGIN
-- run the script with Items.Code = 'MyCode'
END
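Fleshed out a little (a sketch; the procedure name and the Items columns come from earlier in this thread, the exact SELECT list and table shape are assumed), the approach looks like this:

```sql
-- Each branch contains its own SELECT, so SQL Server compiles a
-- separate plan for the filtered and unfiltered cases instead of one
-- compromise plan that has to cover both.
CREATE PROCEDURE spx_GetRunningInventory_Material
    @itemcode varchar(30),
    @div int,
    @invdate varchar(10)
AS
BEGIN
    IF @itemcode = '' OR @itemcode IS NULL
    BEGIN
        -- No item filter: plan optimized for scanning the whole class.
        SELECT Items.Code
        FROM Items
        WHERE Items.Class = 1
          AND Items.DefaultDiv = @div
          AND Items.inactive = 0;
    END
    ELSE
    BEGIN
        -- Equality filter: plan can seek on the Items.Code index.
        SELECT Items.Code
        FROM Items
        WHERE Items.Code = @itemcode
          AND Items.Class = 1
          AND Items.DefaultDiv = @div
          AND Items.inactive = 0;
    END
END
```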
Thanks, again, Greg!! Bill
|
By Greg McGuffey - 5/8/2008
I'm not quite sure why LIKE isn't...er...liked by SQL Server, but I do know enough to include that in my list of things to check out when a query is slow. 
Glad you got it working (faster).
|
By Peter Jones - 5/8/2008
Hi Bill,
A few random comments that may help:
1) Be careful with data types in a WHERE clause. I see you have what look like date parameters defined as varchar. If the database column is a real date and you are comparing it with a varchar, then SQL Server will not use any index you may have on that column.
2) Big time differences like this will invariably mean that one way is using indexes and the other is doing full table scans. The Profiler will show this up.
3) I notice you have:
@itemcode varchar(30),
IF @itemcode = '' OR @itemcode IS NULL
BEGIN
SET @itemcode = '%%'
END
WHERE Items.Code LIKE @itemcode
I think a more efficient approach would be to sort out your parameter, give it a default, and only pass in data if you have specific selection criteria. Then you could have:
@itemcode varchar(30) = NULL,
WHERE ((@itemcode IS NULL) OR (Items.Code = @itemcode))
Cheers, Peter
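Peter's snippets, assembled into one complete procedure (a sketch; the procedure name and SELECT list are hypothetical, the parameter pattern is his):

```sql
-- Optional-parameter pattern: a NULL default means "no filter", and the
-- OR in the WHERE clause makes the predicate a no-op when NULL is passed.
CREATE PROCEDURE spx_OptionalItemFilter
    @itemcode varchar(30) = NULL
AS
BEGIN
    SELECT Items.Code
    FROM Items
    WHERE ((@itemcode IS NULL) OR (Items.Code = @itemcode))
      AND Items.Class = 1
      AND Items.inactive = 0;
END
```

One caveat worth knowing: this catch-all OR pattern can itself produce a compromise plan on some queries, which is why the IF/ELSE split earlier in this thread is the other common way to handle an optional filter.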
|
By Trent L. Taylor - 5/9/2008
All of Peter's comments were excellent...I thought I would toss in a few more things as well:
- You can tell a query which index to use with a WITH(INDEX(...)) table hint. Sometimes SQL needs a little help...we ran into this the other day. It would look like this:
SELECT * FROM Customers WITH(INDEX(IX_MyIndex))
- The framework is not going to change anything in regards to execution speed and performance...so if you get it down to 1 second executing the sproc in SQL Server Management Studio, this will not change on the framework side unless you have some type of connection issue or something else in the mix.
- DateTime columns are awful about slowing down queries when in the WHERE clause...one way to get around this is to store dates in a BigInt data type column, storing the DateTimes as Ticks. We then create a custom property on the BO that wraps this as a DateTime, so that while using the BOs inside of your app you interact with a DateTime...but it is stored as Ticks on the SQL Server side...and this will drastically improve performance...by like a ton when you are testing with <, >, or BETWEENs.
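The ticks idea, sketched on the SQL side (table and column names are made up; .NET's DateTime.Ticks is a 64-bit count of 100-nanosecond intervals, so it maps directly onto a SQL Server bigint):

```sql
-- Store the date as a bigint populated from DateTime.Ticks in the app.
CREATE TABLE Trans (
    TransID        int IDENTITY PRIMARY KEY,
    TransDateTicks bigint NOT NULL
);
CREATE INDEX IX_Trans_DateTicks ON Trans (TransDateTicks);

-- @startTicks/@endTicks are computed in the app (DateTime.Ticks) and
-- passed in; the range test is then a plain integer comparison that can
-- seek straight into the index.
DECLARE @startTicks bigint, @endTicks bigint;
SELECT TransID
FROM Trans
WHERE TransDateTicks >= @startTicks
  AND TransDateTicks < @endTicks;
```

The trade-off is that the stored value is opaque in ad-hoc queries, which is why Trent wraps it in a BO property that converts back to a DateTime for the application.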
|
By Peter Jones - 5/9/2008
Hi Guys,
While I've never tried converting a date to a bigint, I thought I would just let you have my specific experience in that area. Most of our reports are date-range based and use transactional data as their source. For this reason we have a clustered index on the 'date created' column in the transaction files. I've just connected to one of our sites where the main transaction file is 11+ million rows. I opened a query window in SQL Server and entered the following (date randomly selected):
Select HIDDateTime
From dbo.tblHIDHides
Where HIDDateTime Between Convert(DateTime, '2006-05-04', 102) And Convert(DateTime, '2006-05-05', 102)
So, no stored procedure, no caching from previous queries. The result: 6501 rows returned in < 1 second. The database is on a low-end Xeon server with just 2Gb of memory, Windows 2003 Standard, that wasn't busy when I did the above test. Cheers, Peter
|
By Trent L. Taylor - 5/11/2008
Thanks for the info...it's always good to hear how other developers solve their issues. In our case, we have some extremely complex queries that take place across 8+ tables and get extremely nested while calculating something called "pending." This basically determines how much a patient owes and how much an insurance owes...but it has to take into account all of their tran history, insurance plans (primary, secondary, tertiary, etc.), deductibles, write-offs, bad debt, and about 50 other things (not kidding on the 50). We tried using dates, indexes, and even tried a number of conversion routines...and once we turned this into ticks with an index versus dates with an index (that was the only change), the query went from 4 1/2 minutes for a single patient with 6000 trans (don't ask me why they have 6000 trans for one patient...we just crunch the numbers) to 30 ms...so we started doing a little digging and learned that BETWEENs and ORs are bad words with SQL Server and dates when dealing with any type of complex query. This proved true again just the other day...I had a query running in 4 seconds (way too slow) once we started testing on a large database...changed the dates to ticks...1 ms...crazy.
One other thing on this too: we have to be able to have extremely fast queries run on MS SQL Express with single-core processors, as we have a lot of users with 2 GB plus databases that will still use MS SQL Express on existing equipment in the field...we call this zero impact. It may not be, but we have to get as close to that as possible for existing users. Then there are the much larger sites that will have a more complex server setup and a full version of SQL Server...so we have to work in a lot of different environments...so absolute optimization is the only way we can do this.
|
By Peter Jones - 5/11/2008
Hi Trent,
Thanks for the extra info. Hopefully we never have to get down to that level, but it is very interesting that changing the data type made such a humongous improvement. Cheers, Peter
|