seek() slowness


Keith Chisarik
StrataFrame VIP (1.6K reputation)
Group: StrataFrame Users
Posts: 939, Visits: 40K
and yes, that is the column I set to be the new PK



Me.RentitemsBO1.CurrentDataTable.PrimaryKey = New DataColumn() {Me.RentitemsBO1.CurrentDataTable.Columns("rent_RPMrecnum")}
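For what it's worth, once a PrimaryKey is assigned like that, DataRowCollection.Find() should do a keyed lookup against the DataTable's internal index instead of scanning row by row. A minimal sketch using the table and column from the line above (the key value 12345 is made up):

```vbnet
' Sketch: with a PrimaryKey assigned, Rows.Find() uses the DataTable's
' internal index for the lookup rather than walking every row.
Dim dt As DataTable = Me.RentitemsBO1.CurrentDataTable
dt.PrimaryKey = New DataColumn() {dt.Columns("rent_RPMrecnum")}

Dim existing As DataRow = dt.Rows.Find(12345) ' 12345 is a made-up key value
If existing Is Nothing Then
    ' no match: this record would need to be inserted
Else
    ' match found: compare fields to see if an update is needed
End If
```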

Keith Chisarik
Trent Taylor
StrataFrame Developer (10K reputation)
Group: StrataFrame Developers
Posts: 6.6K, Visits: 6.9K
Trent: /points and laughs

Not at all!!! ;)

and yes, that is the column I set to be the new PK

The reason we have never done this is that there are better ways to work through large data sets. This is, I believe, what Greg was referring to in his post regarding working "set based." You are correct that VFP could handle a million-plus records very easily...but SQL Server can go MUCH bigger. You are used to working in a "connected" data environment, which was FoxPro, and the reason for all of the corrupted memo fields :) With SQL Server and business objects, you are working in a disconnected environment, so you want to keep as little data on the client side as possible.

When we need to work with very large data sets (for example, prescription contraindications, which can involve a million-plus records), we create an SQL project, deploy it to SQL Server, and then execute it through a stored procedure. The DDT will automatically create the stored procedures required to execute it for you. What is nice about this is that all of the work is done on the server, like a T-SQL stored procedure, but you are writing ADO.NET or whatever you need. You can then send your ADO.NET results back to the BO. This is the best approach when dealing with large data sets...or create a T-SQL stored procedure, but that is not nearly as fun :)
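Outside of the DDT-generated plumbing, that round trip looks roughly like this in plain ADO.NET. This is a hedged sketch, not StrataFrame API: the procedure name "dbo.GetContraindications", the "@drugId" parameter, and connectionString are all placeholders.

```vbnet
Imports System.Data
Imports System.Data.SqlClient

' Sketch: run the heavy work on the server through a stored procedure,
' then pull the results back as a DataTable to hand to the BO.
Using conn As New SqlConnection(connectionString)
    Using cmd As New SqlCommand("dbo.GetContraindications", conn)
        cmd.CommandType = CommandType.StoredProcedure
        cmd.Parameters.AddWithValue("@drugId", 42) ' made-up parameter
        Dim results As New DataTable()
        Using da As New SqlDataAdapter(cmd)
            da.Fill(results) ' Fill opens and closes the connection itself
        End Using
        ' results can now be copied back into the business object
    End Using
End Using
```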


Greg McGuffey
Strategic Support Team Member (3.4K reputation)
Group: Forum Members
Posts: 2K, Visits: 6.6K
By "set based" I'm referring to the way SQL does things most efficiently. It works best on sets of records, not on individual records (or rows). Looping through the BO (or, in SQL Server, using cursors) is row based. Row-based manipulation provides lots of power as to what can get done, but it is generally not as fast as using SQL to manipulate a whole set at once. (That is, not as fast for SQL Server to get the work done; languages like .NET are more efficient at row-based manipulation. Which is fastest overall would probably depend on all sorts of variables, like efficiency of code, hardware, connectivity, etc.)



Set-based manipulation is great if you have something you want to do to the data based on some criterion. E.g., let's say you have a database containing a table of movie rentals. You want to mark all rentals more than a year old as needing maintenance (like polishing the DVDs). It is more efficient to use an UPDATE statement than to loop through the rows:



UPDATE rentals                                     -- table with rentals
SET MaintenanceStatus = 1                          -- let's say this means it needs maintenance
WHERE PurchaseDate <= DATEADD(year, -1, GETDATE()) -- matches dates a year old or older




Since this sort of thing only needs to be done infrequently, and the user doesn't actually need to see the data to get the job done, a SQL statement is the way to go. You could create a sproc (for max speed, as it would be precompiled in SQL Server) or you could just create a method in a BO.
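The "method in a BO" option could be sketched in plain ADO.NET like this, issuing the set-based UPDATE from above as one command (connectionString is a placeholder for your real connection string):

```vbnet
Imports System.Data.SqlClient

' Sketch: run the set-based UPDATE from .NET with ExecuteNonQuery
' instead of looping through rows on the client.
Using conn As New SqlConnection(connectionString)
    conn.Open()
    Using cmd As New SqlCommand( _
        "UPDATE rentals SET MaintenanceStatus = 1 " & _
        "WHERE PurchaseDate <= DATEADD(year, -1, GETDATE())", conn)
        ' ExecuteNonQuery returns the number of rows the server touched
        Dim rowsFlagged As Integer = cmd.ExecuteNonQuery()
    End Using
End Using
```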



However, there are times when this won't work too well. If what you need to do is complicated, or not data related, or the data is in another database, or the user needs to see all the data to decide what to do (so the data would be on the client anyway), then row-level processing is needed. It turns out that it isn't actually needed very often, though.



I hope this helps. I come from an MS Access background, not a VFP one, so I had to learn a lot about SQL Server as I had it doing lots of work to speed things up. Sounds like VFP was speedy (I always heard that, but I wasn't allowed to use it when I started doing db development...long story). The whole disconnected data thing in .NET is different and I'm still getting used to it. It may be that this world view is now inaccurate, or less accurate, as it might be more efficient to use a loop in .NET than in SQL Server. I'm watching this post, hoping to learn a lot also Hehe

Greg McGuffey
Strategic Support Team Member (3.4K reputation)
Group: Forum Members
Posts: 2K, Visits: 6.6K
Man, if I'd just waited a few minutes, I would have seen what Trent posted....

Seems I'm not as much of a dinosaur as I thought w00t
Ben Hayat
Advanced StrataFrame User (548 reputation)
Group: Forum Members
Posts: 374, Visits: 1.2K
The reason we have never done this is that there are better ways to work through large data sets. This is, I believe, what Greg was referring to in his post regarding working "set based." You are correct that VFP could handle a million-plus records very easily...but SQL Server can go MUCH bigger. You are used to working in a "connected" data environment, which was FoxPro, and the reason for all of the corrupted memo fields :) With SQL Server and business objects, you are working in a disconnected environment, so you want to keep as little data on the client side as possible.

When we need to work with very large data sets (for example, prescription contraindications, which can involve a million-plus records), we create an SQL project, deploy it to SQL Server, and then execute it through a stored procedure. The DDT will automatically create the stored procedures required to execute it for you. What is nice about this is that all of the work is done on the server, like a T-SQL stored procedure, but you are writing ADO.NET or whatever you need. You can then send your ADO.NET results back to the BO. This is the best approach when dealing with large data sets...or create a T-SQL stored procedure, but that is not nearly as fun




Hi Trent; this was a very valuable point. Could you please make a note to bring this up in the class and cover it in more detail?



Much appreciated!

..ßen
Keith Chisarik
StrataFrame VIP (1.6K reputation)
Group: StrataFrame Users
Posts: 939, Visits: 40K
Part of the problem is that the volume isn't exclusively in SQL Server; I have to read a table of roughly a quarter million records from a Dataflex table via ODBC (it seems all I do these days is read old data, but that's another story).



I am trying to create a sync method that will mirror the table in SQL Server for me to program against. So for each of the 250,000 records I need to either add a record or check the fields to see if they have been updated. It is the initial "look for this record" step that is taking the time. I think what you are saying here is that I want to create a SQL proc to search for the record, not rely on the BO Seek() method, since ADO.NET wasn't built to do that...do I have it right?



Thanks, everyone, for your input and time. For the time being I have abandoned doing the whole table at once and gone to only updating the records the user calls for at any given time. I will eventually need to be able to "refresh" the entire SQL table against the live Dataflex data source for reporting, though.
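One set-based way to attack that full refresh, rather than one Seek() per source row: bulk-copy the Dataflex rows (already read via ODBC into a DataTable) into a SQL Server staging table, then let a single server-side pass do the compare. A hedged sketch; "dbo.rentals_stage", dataflexTable, and connectionString are all placeholders:

```vbnet
Imports System.Data
Imports System.Data.SqlClient

' Sketch: push the 250,000 source rows to a staging table in one shot,
' then do the add-or-update comparison set-based on the server.
Using conn As New SqlConnection(connectionString)
    conn.Open()
    Using bulk As New SqlBulkCopy(conn)
        bulk.DestinationTableName = "dbo.rentals_stage" ' made-up name
        bulk.WriteToServer(dataflexTable) ' DataTable filled via ODBC
    End Using
End Using
' A stored procedure can then INSERT the staged rows missing from the
' mirror table and UPDATE the ones whose fields have changed, as two
' set-based statements instead of 250,000 individual lookups.
```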

Keith Chisarik
Ben Hayat
Advanced StrataFrame User (548 reputation)
Group: Forum Members
Posts: 374, Visits: 1.2K
Dataflex table




Good old DATAFLEX :)

Are they still around? I used it from 1983 to 1985. Its proprietary data file format (back then) was very solid.

Are they using Btrieve now? Or Pervasive SQL?

..ßen
Keith Chisarik
StrataFrame VIP (1.6K reputation)
Group: StrataFrame Users
Posts: 939, Visits: 40K
I must be reading the old-school stuff: .dat files, a file for each index...things like that. I have no clue what they are doing these days; I downloaded Visual DataFlex just for the database tool. I was handed an old application that is no longer supported and has no documentation to work with.

Keith Chisarik
Keith Chisarik
StrataFrame VIP (1.6K reputation)
Group: StrataFrame Users
Posts: 939, Visits: 40K
PS - in 1983 I was 10 :)

Keith Chisarik
Trent Taylor
StrataFrame Developer (10K reputation)
Group: StrataFrame Developers
Posts: 6.6K, Visits: 6.9K
do I have it right?

Basically. I was not aware that this was the type of database you were working with. Could you use scalar methods on the database to give you the info you need, rather than using Seek() on the ADO.NET DataTable? Just a thought....

PS - in 1983 I was 10

ROFL....not that I have any room to talk....I was 7 :D
