Efficiency in data processing with BOs


Author
Message
Keith Chisarik
StrataFrame VIP (2.4K reputation)
Group: StrataFrame Users
Posts: 939, Visits: 40K
Excellent, thanks a lot.

Keith Chisarik
Trent Taylor
StrataFrame Developer (14K reputation)
Group: StrataFrame Developers
Posts: 6.6K, Visits: 7K
It is really pretty simple.  This property is on the BO and by default is set to 1.  Simply change it to a higher value and the BO will begin updating on additional threads.  This can be a very effective tool when updating that many records at once.  I know it has probably been a while, but we covered this a little in class.  We didn't go into great detail, though, so it is one of those items that doesn't matter...until you need it :)

What will happen is that records will begin to save on multiple threads, which helps prevent an update bottleneck when updating large datasets.  This is especially important when updating over a VPN or a slow connection.  The worse the bandwidth, the more performance gain you will see...but this will still dramatically improve the update time when dealing with that many records.
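A minimal sketch of what this looks like in practice (the property name comes from this thread; the BO type and the fill method below are placeholders, not actual framework samples):

    // Illustrative only: CustomersBO and FillAllRecords() are placeholder names.
    var bo = new CustomersBO();
    bo.FillAllRecords();

    // ...modify a large number of rows in memory...

    // Default is 1; a higher value lets the save run on additional threads.
    bo.DataLayerAccessThreads = 4;
    bo.Save();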

FWIW...and a little sneak peek...we are setting up a knowledge base that will continually get new articles, samples, and videos.  There will also be how-tos, FAQs, etc.  This will make it SO much easier for us to put out new samples, help information, and the like versus requiring a new build, new help docs, and trying to squeeze more samples into the install.  It will be a more fluid environment, and this would be a good topic that may end up there.

Keith Chisarik
StrataFrame VIP (2.4K reputation)
Group: StrataFrame Users
Posts: 939, Visits: 40K
Nope.

DataLayerAccessThreads sounds VERY interesting. I searched the documentation for it and didn't find much. Can you point me to where I can learn more?

Thanks!

Keith Chisarik
Trent Taylor
StrataFrame Developer (14K reputation)
Group: StrataFrame Developers
Posts: 6.6K, Visits: 7K
Good deal.  One other thing: you know that you can update on more than one thread at a time, right?  On a business object there is a property called DataLayerAccessThreads.  By default this is set to 1.  In a situation like this, however, you may want to increase it to as high as 7.  If I recall, there are diminishing returns after 7...but you can test values between 2 and 7 to see where you get the best performance.
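If it helps, a rough way to test those values (illustrative only; CustomersBO and BuildPendingRows are placeholders for whatever stages the rows you intend to save):

    using System;
    using System.Diagnostics;

    for (int threads = 2; threads <= 7; threads++)
    {
        var bo = new CustomersBO();
        BuildPendingRows(bo);                 // placeholder: stage the rows to be saved

        bo.DataLayerAccessThreads = threads;  // default is 1

        var sw = Stopwatch.StartNew();
        bo.Save();
        sw.Stop();

        Console.WriteLine("{0} threads: {1} ms", threads, sw.ElapsedMilliseconds);
    }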
Keith Chisarik
StrataFrame VIP (2.4K reputation)
Group: StrataFrame Users
Posts: 939, Visits: 40K
NewRow() was the fix; it is lightning fast now, as I would expect. Total processing time for the 87k records is under 20 seconds, and that is with round trips between threads to update a progress bar in the UI layer.

Now to just get it across the wire :)

I must have a control somewhere bound to the BO; I haven't found it yet, but it must be there.

From the NewRow() documentation: "Adds a new data row to the current data table and sets the CurrentRow to the new row. (This method is recommended for programmatically adding new rows. To add new rows through a user interface, use Add(), as the editing state of the business object will be set and the Navigated event will be raised when the row is added.)"
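For anyone who lands here later, the working pattern looks roughly like this (a sketch only; the field names and the progress handling are placeholders, not the actual project code):

    // NewRow() in a tight loop; the UI is only touched occasionally, and then
    // only by marshalling back to the UI thread.
    for (int i = 0; i < source.Count; i++)
    {
        bo.NewRow();                    // sets CurrentRow to the new row
        bo.Field1 = source[i].Field1;   // placeholder field assignments
        bo.Field2 = source[i].Field2;

        if (i % 1000 == 0)
        {
            int done = i;
            progressBar.Invoke((MethodInvoker)delegate { progressBar.Value = done; });
        }
    }
    bo.Save();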

Keith Chisarik
Trent Taylor
StrataFrame Developer (14K reputation)
Group: StrataFrame Developers
Posts: 6.6K, Visits: 7K
Keith,

I guess without getting my hands on it I cannot give you any more suggestions here.  For what it is worth, we ran a test today dealing with 400,000 records (medical transactions) across a combination of tables that may have as many as 50 fields.  We got the calculations and response down to under 1 second.  I know this doesn't help you with your problem, but we have all been laughing around here lately because everything we are doing was literally impossible with VFP...because we tried!  Even DB2 is a better database than VFP and can perform at a much higher level simply because it is a server, supports sprocs, doesn't stream the entire file over the network, etc.

The Add alone is not going to slow the world down.  To prove this, create a quick sample project that uses the SF sample database.  Then, using the Customers BO, add 80,000+ records dynamically (without committing them back to the server) to see what type of response you get.  I am willing to bet that it will come in FAR under 109 seconds.  Also, if you do not need to update the UI, use the NewRow method instead, as it will not attempt to update any bound controls...which can improve performance as well.  At this point the database is irrelevant, since you are dealing with the BO alone.
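A rough sketch of that isolated test (CustomersBO follows the SF sample database mentioned above, but the column assignments are only placeholders):

    // Measure Add()/NewRow() alone -- no Save(), so the database never comes into play.
    var bo = new CustomersBO();
    var sw = System.Diagnostics.Stopwatch.StartNew();

    for (int i = 0; i < 80000; i++)
    {
        bo.NewRow();                   // swap in bo.Add() to compare the UI-aware path
        bo.FirstName = "Test";         // placeholder columns
        bo.LastName = i.ToString();
    }

    sw.Stop();
    Console.WriteLine("Added 80,000 rows in {0} ms", sw.ElapsedMilliseconds);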

Keith Chisarik
StrataFrame VIP (2.4K reputation)
Group: StrataFrame Users
Posts: 939, Visits: 40K
No filter, sort, or default values; nothing you mentioned.

It is a very simple BO with a single fill-all method.

The 109 is seconds, and it covers only a very small portion of the operation, for illustration purposes.

I could do 87k adds to a cursor or table in VFP in the blink of an eye, so I must be doing something wrong here.

Could the fact that I am using DB2 matter in some way? I wouldn't think so, since I am just doing the add in memory, but at this point I am grasping. I am going to build a sample project with the same table structure/BO properties using SQL Server and see what happens.

Keith Chisarik
Trent Taylor
StrataFrame Developer (14K reputation)
Group: StrataFrame Developers
Posts: 6.6K, Visits: 7K
Is the 109 the amount of time or the number of calls?  We use code profilers too, and they are great tools.  But in this example, there is a lot that can be happening once BO.Add() is called.  What do you have as a filter?  What do you have in the SetDefaultValues event?  What do you have as a Sort?  Something else in your code is causing the slowdown.  The fact that you are adding 87k records before a save is somewhat disconcerting, especially considering that all of it will have to be sent across the wire.  I would send things across in smaller batches (wrap them in a transaction if you need to).  But this is really a side issue.  Look in the areas I mentioned and see what you find.
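A sketch of the batching idea (illustrative only; the BO type, the CopyFields helper, and the batch size are placeholders, and the transaction wrapping is left to whatever mechanism your data layer provides):

    const int batchSize = 5000;

    for (int offset = 0; offset < sourceRows.Count; offset += batchSize)
    {
        var bo = new ImportBO();                 // placeholder BO type
        int end = Math.Min(offset + batchSize, sourceRows.Count);

        for (int i = offset; i < end; i++)
        {
            bo.NewRow();
            CopyFields(sourceRows[i], bo);       // placeholder copy helper
        }

        // Each batch crosses the wire on its own; wrap in a transaction if needed.
        bo.Save();
    }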
Keith Chisarik
StrataFrame VIP (2.4K reputation)
Group: StrataFrame Users
Posts: 939, Visits: 40K
I am still struggling with speed issues and need some help.

In breaking down the problem, I know I will probably have to write my own insert sprocs on the DB2 side; for now I am ignoring that.

I am having speed issues before I ever save across the wire, just working with the BO. After much trial and error, it seems the call to the BO's Add() method is my first bottleneck. I invested in a code profiler and it clearly shows that to be the issue (see attachment), and I can confirm it: replace the Add() line with just about any other call and things are fast.

I was under the assumption that working with the BO was essentially the same as working with the underlying data table, so what am I doing wrong that makes bo.Add() in a loop take forever? Adding 87k records to the BO takes about 15 minutes, before BO.Save() is ever called. The BO has only 6 fields: 4 strings, an auto-incrementing integer PK, and an integer row version field; PrimaryKeyIsAutoIncrement is true and row versioning is set to optimistic row version.
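The loop itself is essentially the pattern below (reconstructed here for illustration; the field names are placeholders):

    // The profiler flags the Add() line; the field assignments are not the issue.
    foreach (DataRow src in sourceTable.Rows)
    {
        bo.Add();                              // <-- bottleneck per the profiler
        bo.Field1 = src["FIELD1"].ToString();  // four string fields, plus PK and row version
        bo.Field2 = src["FIELD2"].ToString();
        // ...
    }
    // BO.Save() has not even been reached at this point.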



Any thoughts?

Keith Chisarik
Attachments
slow.jpg (131 views, 26.00 KB)
Trent Taylor
StrataFrame Developer (14K reputation)
Group: StrataFrame Developers
Posts: 6.6K, Visits: 7K
"From your response I get the impression that you believe CLR is the way to go with SQL Server. Is this so and, if it is, what made you guys move away from standard SQL stored procs?"

We still use standard TSQL stored procs for INSERTs, UPDATEs, and DELETEs (our CRUD settings).  But for anything past that, or anything that requires customization or custom queries on the server, we use CLR.  We haven't changed any of our standards or logic.  CLR stored procedures are much easier to work with and have more flexibility (especially for .NET developers).  You get all of the benefits of TSQL as it relates to server-side performance (you do take a slight hit using CLR over TSQL, but it is not really noticeable) but get to write all of your code in C# or VB.NET.  Also, the DDT will deploy CLR assemblies for you, which just adds to the ease of implementation.
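A bare-bones CLR procedure, for reference, looks something like this (generic SQL Server CLR boilerplate; the procedure and table names are made up for illustration and are not DDT- or StrataFrame-specific):

    using System.Data.SqlClient;
    using System.Data.SqlTypes;
    using Microsoft.SqlServer.Server;

    public partial class StoredProcedures
    {
        // Runs a query on the context connection and streams the results back
        // to the caller -- all written in C# instead of TSQL.
        [SqlProcedure]
        public static void GetRecentTransactions(SqlInt32 days)
        {
            using (var conn = new SqlConnection("context connection=true"))
            {
                conn.Open();
                var cmd = new SqlCommand(
                    "SELECT * FROM Transactions " +
                    "WHERE TranDate >= DATEADD(day, -@days, GETDATE())", conn);
                cmd.Parameters.AddWithValue("@days", days.Value);

                // Sends the result set directly to the client.
                SqlContext.Pipe.ExecuteAndSend(cmd);
            }
        }
    }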
