What do you all consider the most efficient way to process large amounts of data?
1) Load the BO with all data upfront and then "filter", modify data, move on
2) Load the BO upfront and then "seek" (can you even seek for more than one record?)
3) Inside a loop, load the BO with only the small subset you want to work with (many calls to the database), modify the data, move on
4) Don't use a BO at all; use ADO or LINQ to SQL instead (see the sketch after this list)
5) Other
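To be concrete about option 4, here's roughly the kind of thing I mean: a single set-based UPDATE pushed through plain ADO.NET. This is just a sketch; the connection string and the PriceList/UnitPrice/CategoryId schema are made up.

using System;
using System.Data.SqlClient;

class PriceUpdater
{
    static void Main()
    {
        // Made-up connection string; point it at your own server/database.
        const string connStr = "Server=.;Database=Sales;Integrated Security=true";

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();

            // One set-based UPDATE: the server adjusts every matching row
            // in a single statement, no row-by-row loop on the client.
            const string sql =
                "UPDATE PriceList SET UnitPrice = UnitPrice * @factor " +
                "WHERE CategoryId = @category";

            using (var cmd = new SqlCommand(sql, conn))
            {
                cmd.Parameters.AddWithValue("@factor", 1.05m);
                cmd.Parameters.AddWithValue("@category", 42);
                int rows = cmd.ExecuteNonQuery();
                Console.WriteLine("{0} rows updated", rows);
            }
        }
    }
}

Whether that actually beats filtering an already-loaded BO is exactly the part I'm unsure about.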
I have to adjust pricing across a very large number of records: for each combination of criteria I need to update or insert 1-6 rows, out of a total of about 200,000 records.
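For the update-or-insert part, what I'm picturing is pushing a MERGE through ADO.NET, one round trip per criteria combination. Again just a sketch with a made-up ProductId/RegionId schema, and MERGE assumes SQL Server 2008 or later.

using System.Data.SqlClient;

class PriceUpsert
{
    // Update the price if the (ProductId, RegionId) combination already
    // exists, insert a new row otherwise - one round trip per combination.
    static void UpsertPrice(SqlConnection conn, int productId, int regionId, decimal price)
    {
        const string sql = @"
            MERGE PriceList AS target
            USING (SELECT @productId AS ProductId, @regionId AS RegionId) AS src
               ON target.ProductId = src.ProductId AND target.RegionId = src.RegionId
            WHEN MATCHED THEN
                UPDATE SET UnitPrice = @price
            WHEN NOT MATCHED THEN
                INSERT (ProductId, RegionId, UnitPrice)
                VALUES (@productId, @regionId, @price);";

        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@productId", productId);
            cmd.Parameters.AddWithValue("@regionId", regionId);
            cmd.Parameters.AddWithValue("@price", price);
            cmd.ExecuteNonQuery();
        }
    }
}

With only 1-6 rows per combination that's far fewer round trips than row-by-row processing, but I don't know how it compares to loading everything into the BO upfront.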
Just curious about your opinions or past experience with this. I find myself worrying about the most efficient way of doing things, since I've never been able to make the .NET/SQL stuff as fast as VFP.
Keith Chisarik