By Howard Bennett - 4/30/2010
I'm converting a DBC to SQL using DDT... first time, so I have lots of questions!
What do you guys do with default values? Do you enter them in the DDT and have them stored in the SQL database - or - is it best to leave that blank?
Also, what's the best choice with regard to default values for primary keys and foreign keys (-1 or 0)?
Lastly, what's the best way to get the data from VFP to SQL? Is there something in DDT that does that - or - am I on my own to write that code?
I figured some of you have already done this and would be able to help with some "gotchas" and things to avoid. Any help is greatly appreciated!
HB
|
By Charles R Hankey - 4/30/2010
Since I've done a lot of VFP -> SQL conversions and bring a VFP background to SF, I'll take a poke at this.
First, converting actual data from VFP to SQL is not a DDT function. DDT is primarily for keeping a schema in sync - think Stonefield for SQL.
I used a product called Full Convert Enterprise from Spectral Core ( http://www.spectralcore.com/order.php ) with great success. You might also try SQL 2008 SSIS with the VFPOLEDB driver, but I haven't actually done that. If your structures are pretty simple - and especially if they do not use GUID keys - you might even try the improved Upsizer Rick Schummer and Doug wrote on VFPX.
As far as default values, that is best done in the business object (there is an event handler there). The BOMapper will let you handle nulls when creating the strongly typed properties. For example, take VFP dates that port over as null (you need to change all your empty dates to null before porting to SQL, and allow for those nulls in the SQL schema). In the BOMapper you say you want to replace null values in datetime fields with #1/1/1800#, and this will cause SF controls to see that as an "empty" date.
In the DDT, for integer keys just make sure the PKs are marked as Primary Keys and autoincrementing is enabled. The framework will handle everything else, including using temporary -1 etc. keys in parents and children before the save goes to the backend and then populating them with the correct server-generated incremental keys (it is very cool).
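For the SQL side, the deployed schema ends up looking something like the sketch below (table and column names here are made up for illustration; the actual DDL is generated by the DDT, not written by hand):

```sql
-- Hypothetical parent/child tables with autoincrementing integer PKs,
-- the pattern described above.
CREATE TABLE Customers (
    cust_pk   INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    cust_name VARCHAR(50) NOT NULL
);

CREATE TABLE Orders (
    order_pk   INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    cust_fk    INT NOT NULL REFERENCES Customers (cust_pk),
    order_date DATETIME NULL
);
```

Client side, new parent and child rows carry temporary keys (-1, -2, ...); on save the server assigns the real IDENTITY values and the framework cascades them into the children's foreign keys.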
Lots of us here speak Fox and its various frameworks ( I'm VFE, Edhy is VPM and I think we have Maxframe and even MM guys as well ) so just ask whenever you get stuck.
I can help quite a bit with Full Convert if you go that route and perhaps can give you some tips on how to massage your VFP data before feeding it to a converter into SQL to get easiest results.
|
By Charles R Hankey - 4/30/2010
One other thing to add in case it applies to you :
In VFP/VFE I used all GUID keys so I could generate keys client side. In SF/SQL there are a lot of good reasons to use integer keys. The framework is very friendly for that and I recommend it. I've worked out the conversion issues of the data going from VFP to SQL in changing key types so if that applies to you let me know and I might be able to help.
|
By Howard Bennett - 4/30/2010
Thanks for your response - very helpful!
I do have Full Convert and it seemed to work well - but - I was afraid of all the NULL values... there seemed to be a lot of people who warned against using NULLs. I seem to recall that some of the tutorials from SF mentioned not allowing NULLs - maybe I'm dreaming that.
But let's say that I've decided to go ahead with using DDT for the structure, and I'm going to write code to take the data from VFP and copy it to the SQL tables (because I've made a lot of progress toward that goal). If I don't allow NULLs - am I okay - or - do I need to allow for NULL values?
Also, if I do put default values in the actual table structure - is that okay? That is, do I have to do that again in the BOMapper?
Thanks again!
HB
|
By Edhy Rijo - 4/30/2010
Hi Howard,
Adding to the good posting from Charles, let me help you clear up some issues.
You asked: "If I don't allow NULLs - am I okay - or - do I need to allow for NULL values?"
The data in SF is viewed through the SF business object's properties (much like a VFP cursor or view). In the SF BOMapper you have a facility to tell those properties how to handle null values so they will not generate an error while you use the BO's properties in your forms, reports, etc. In other words, you can tell the BO property that when a database null value is found it should be replaced with a default value that you specify, as in Charles's example above with the datetime field.
You also asked: "If I do put default values in the actual table structure - is that okay? Do I have to do that again in the BOMapper?"
Yes, that is OK, but if these tables will not be used outside of SF, it is better to have all those settings in the business object itself, so when you use the BO from a WinForm or a Web application it will enforce the rules you defined at the BO level. On a side note, be aware that SF supports reading/writing to VFP tables, so if this is a commercial application that will be installed in several places, you could just build the VFP import process into the SF application by creating VFP business objects and doing all your logic in SF. I have not done that, but I will use that approach in one of my projects for a VFP application I am re-creating in SF.
|
By Charles R Hankey - 4/30/2010
Full Convert moves the data for you from VFP to SQL - that's what it's for - so there is no writing code to move the data. All you have to do is replace blank dates with nulls before moving.
You pretty much have to allow nulls for the empty dates, unless you want to replace them with {01/01/1800}. But you don't really need to worry about the nulls. The MicroFour folks deal with huge data tables and have a lot of performance issues they tweak constantly, so there may be some advantage to avoiding nulls altogether, but for most first-time VFP to SF/SQL conversions I'd bet the nulls are not going to be an issue. If they are, you can always massage the converted data on the SQL side and then change the structure with DDT to close down future nulls.
If your data already has nulls in it, you don't really need to fill them with anything. Just make sure your DDT structure allows for nulls.
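If you do later decide to close down nulls on a column, the cleanup amounts to backfilling the converted rows and then tightening the column. A minimal sketch, with a hypothetical table and column (and normally you would make the structure change itself through the DDT rather than by hand):

```sql
-- Backfill nulls with the "empty date" sentinel the BO maps to an empty control...
UPDATE Policies
SET    expire_date = '18000101'
WHERE  expire_date IS NULL;

-- ...then the column can safely be made non-nullable.
ALTER TABLE Policies
ALTER COLUMN expire_date DATETIME NOT NULL;
```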
Basically the easiest path is to first import your VFP structure into DDT. In DDT make any changes you want to fieldnames, types etc. You can delete the change history in DDT since you are not going to be updating an existing SQL database.
Then write that DDT schema to your SQL database, so now you have empty SQL structures that look the way you want them for your SF/SQL app.
Now use FullConvert to move the data from VFP tables to SQL tables.
From that point on, only change the SQL structure through the DDT.
If you are going to set up any default values on the backend (autoincrement integer PKs, for example), do it through the DDT. I'd recommend at first using backend defaults only for those PKs. I think the rest (a date that defaults to today(), etc.) is better handled in the BOs.
Just make sure you save your Full Convert project so you can export to SQL, play with it, and then export production data for real when you know you have the conversion the way you want it.
Default values :
Two things - the BOMapper handles what to do with null values, not inserting defaults into a new record. It is an issue of translating how the BO will get data from the back end. So if a date is null, what do I send to the control? (#1/1/1800#) The control knows how to look at that and show an empty date in the control.
The "default" values of the BO are something like:
In a new record, the State will default to "NY", the EnteredDate will default to Now(), etc.
That is handled in the EventHandlers region of the BO :
''' <summary>
''' Sets the default values for a new row
''' </summary>
Private Sub PropCoverageBO_SetDefaultValues()
    Me.State = "NY"
    Me.EnteredDate = Now()
End Sub
The strongly typed field properties live in the BO partial class that you don't manually edit:
''' <summary>
''' This field was imported.
''' </summary>
<BusinessFieldDisplayInEditor(), _
 Description("This field was imported."), _
 DesignerSerializationVisibility(DesignerSerializationVisibility.Hidden)> _
Public Property [EXPIREDATE]() As System.DateTime
    Get
        Dim loValue As Object
        loValue = Me.CurrentRow.Item("EXPIREDATE")
        If loValue Is DBNull.Value Then
            Return #1/1/1800#
        Else
            Return CType(loValue, System.DateTime)
        End If
    End Get
    Set(ByVal value As System.DateTime)
        If value <> #1/1/1800# Then
            Me.CurrentRow.Item("EXPIREDATE") = value
        Else
            Me.CurrentRow.Item("EXPIREDATE") = DBNull.Value
        End If
    End Set
End Property
Notice the null on the backend becomes a "date" the control knows to treat as empty, then if you leave it empty, it sends a null to the back end.
|
By Russell Scott Brown - 4/30/2010
Thanks Charles. Full Convert is definitely something I am going to need in a few months when I get to converting all my VFP tables. This is going to save me huge amounts of time and headaches.
|
By Howard Bennett - 4/30/2010
Thanks to both of you! I think I've got enough to make a determination as to which way to go...
I'm sure I'll be back SOON!
HB
|
By Michel Levy - 4/30/2010
I'm an old fox developer, and here's the way I migrate VFP to SQL (in addition to Charles, who is also an old fox).
First of all, migrate the structure. I use the VFP9 Upsizing Wizard for that, because my DBFs are always referenced in a DBC and RI is implemented with its triggers. I had corrected a few bugs in its sources in order to manage varchar(max) and varbinary(max), but it seems that Rick Schummer's version is now bug-free. The Upsizing Wizard lets you add timestamp fields (useful for good concurrency management) and creates referential integrity in a declarative way (constraints, not triggers). Then I use DDT on the new structure on SQL Server. I've found that way simpler than importing the VFP structure from within DDT, but that is probably because I have used the VFP Upsizing Wizard for many years (and DDT for only 2).
Once your structure is OK in SQL, migrate the data. Never use the VFP Upsizing Wizard for that! Never! Before the data transfer, check and check again the quality of your data; never forget that VFP is a lazy language with fuzzy storage (that's why VFP developers may not be lazy or fuzzy - and I love VFP, but I know its bugs). For example, in VFP a field N(5,2) should hold nothing larger than 99.99. Try it, and you will see that you can store whatever bigger value you want, and the fox will not raise any error. But when you try to transfer that row to a SQL table, you'll get a splendid crash. (I still remember the first time I got it: more than a week of debugging to find it, then writing a tiny prg to detect it in all DBFs.)
With my customers I use two ways to transfer data. Some prefer to push data from VFP into SQL using remote views in Fox. Others use the import wizard in SSMS (which in fact uses SSIS). I've tried using DBFs as a linked server in SQL; it runs without problems, but is slower than SSIS.
Don't worry about nulls, but check VFP-specific syntax in default values and validation rules in the DBC.
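Michel's detection prg runs on the VFP side, but the same out-of-range check can be sketched on the SQL side if you stage into a deliberately wide column first. The table and column names below are hypothetical; the target range shown is for a SQL NUMERIC(5,2) column, which tops out at 999.99:

```sql
-- Hypothetical staging table, wider than the real target column so the
-- dirty VFP values load without error.
CREATE TABLE Staging_Invoices (
    inv_no INT NOT NULL,
    amount NUMERIC(18, 2) NULL
);

-- Rows that would crash the transfer into a NUMERIC(5,2) target column:
SELECT inv_no, amount
FROM   Staging_Invoices
WHERE  amount NOT BETWEEN -999.99 AND 999.99;
```

Fix or exclude the rows the query returns, then move the clean data into the real table.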
And enjoy SQL and StrataFrame !
|
By Howard Bennett - 5/3/2010
Thanks for the advice!
I've got most of the structure set up in DDT and will be finishing that up this week - then on to moving the data. I'll be coding all that, as there are some significant structure changes and I won't be able to just move the data with a third-party product. Should be fun!
It's good to know there are people who've done this and that I can come back here for help!
HB
|
By Charles R Hankey - 5/3/2010
Even with significant structure changes, at some point you have to move the VFP data into SQL tables. My approach, since I was most familiar with manipulating data in VFP, was to do my changes there, creating prgs that work on a copy of my VFP data, doing ALTER TABLE commands to add fields, copy data, transform data, allow nulls, generate integer keys, etc.
Then I pointed Full Convert at that and it moved it to SQL without a hitch.
And if I found something I didn't like I could tweak the prg, tweak the Full convert project, then blow away the SQL database, recreate it from DDT and rerun the process.
Suppose I could have done it with remote views but that turned out to be a lot more work.
|
By Howard Bennett - 5/4/2010
Now - that's a good idea (restructuring tables in VFP - then using Full Convert to move.) Hadn't thought of that.
Thanks again!!!
HB
|
By Michel Levy - 5/4/2010
Charles, playing with the trial version of Full Convert, I'm unable to automatically convert VFP autoinc fields to SQL int identity, nor to convert PKs. Is it a limitation of the trial version? Do you really convert all referential integrity with this tool?
|
By Charles R Hankey - 5/5/2010
All my VFP stuff used GUID pks and I was originally drawn to Full Convert because unlike the "upsizer" it would let me map my GUIDs to UIDs. I never used autoinc integers in VFP databases.
Most of my VFP -> SQL conversion work was done for other people's apps, as I started developing VFP apps exclusively against SQL in about 2002.
I usually found that RI ( especially straight VFP RI without Steve Sawyer's brilliant mod ) wasn't something I wanted to even think about converting so I just moved the table data and set up the RI triggers where appropriate in SQL itself after the data was moved.
On the Table Settings screen I would check only Recreate Data (having already created the structure I wanted with DDT) and then check Skip Indexes and Skip Constraints.
I found it was better to move the data from VFP to SQL without indexes or RI or defaults - i.e. data only, no metadata - and then set things up in SQL the way I wanted. I don't trust any "automatic" conversion of that kind of stuff, as I always assumed (perhaps incorrectly) that the VFP implementation of some of it may have been made to look like SQL when it really was different (RI being a good example).
Full Convert's primary utility for me was in mapping the fields, doing the actual data transfer, and storing those settings in projects so they were reproducible and tweakable. I found the VFPOLEDB driver flaky, or I would have done it from the SQL side with DDT / SSIS.
|
By Michel Levy - 5/5/2010
Thank you for that explanation, Charles. The new Upsizing Wizard allows us to create clustered indexes on PKs and migrate VFP RI triggers to SQL constraints. Of course, it's not a magic tool; I never trust it, and I always finish the conversion in SSMS and DDT. Since VFP6 I have used RI with no failure or problem. Moving apps to VFP8, I used autoinc PKs. We have different experiences with Fox and SQL, so we can learn from each other.
|