Melvin Xuereb
Group: Forum Members
Posts: 29, Visits: 128
Hi, I'm looking for software to use in our company as a database source control/package creator/deployment tool. With the StrataFrame database deployment tool, is it possible for every developer to make his required changes himself on his local machine, deploy them to a central database, and then use the StrataFrame DDT to create a single package with all the developers' changes? I'm currently running a trial of the DDT, and if you could provide me with steps on how to do the above it would be appreciated.
Ivan George Borges
Group: StrataFrame MVPs
Posts: 1.9K, Visits: 21K
Hi Melvin.
Welcome to the community.
This is basically how we work every day. There is no reason to create a "merge" if all developers have access to the database which stores the DDT structures. I generally work off of a SQL Server instance on my local machine, but the DDT structures are on a central database that we all use. So if I make structure changes in the DDT, everyone would see them. However, I deploy, while coding, to my local machine which only affects me. This is the best approach.
If you are referring to storing DDT structures locally and then merging the DDT structures into a central StrataFrame database, the DDT was never intended to do this, as it creates a sync nightmare that never works. We have tried this approach in the past, with less complicated scenarios on other products, and it ultimately becomes a tangled mess that requires one fix after another for so many different scenarios.
Hope it helps you.
Cheers.
Melvin Xuereb
Group: Forum Members
Posts: 29, Visits: 128
Thanks Ivan... seems like this is exactly what we are looking for. Thanks again.
Ivan George Borges
Group: StrataFrame MVPs
Posts: 1.9K, Visits: 21K
You are welcome, Melvin.
William Fields
Group: StrataFrame Users
Posts: 80, Visits: 239
> So if I make structure changes in the DDT, everyone would see them. However, I deploy, while coding, to my local machine which only affects me.
So, you make your changes to a central DDT database/profile, then create a Deployment Data Package and apply that package to your local dev database?
I'm trying to wrap my head around the fact that you've essentially committed your schema changes to the central DDT database prior to development and testing of those changes. What if you're working on some complicated series of stored procedures, or other changes? How does that work?
Do you use source code control with your DDT profiles? Would you mind describing how that works?
FYI - I've just installed StrataFrame, have never used it before, and am working with a colleague on setting up our multi-developer environment with TFS. I work remotely over a VPN WAN connection and our 3 other developers work on a LAN together. I'd really appreciate a white paper along the lines of "best practices on setting up DDT and TFS source code control in a multi-developer environment."
Coming from a VFP environment, we used the StoneField Database Toolkit (SDT) and our workflow was thus:
1. Check out the DBC "schema" and SDT meta data from TFS source control.
2. Work on any changes on our local "dev" copy of the database. While working on those changes, no one else is able to check out or modify the DBC.
3. Check in the DBC and SDT meta data once our local modifications have been completed, developer tested, and are ready for deployment to the other developers.
4. Tell everyone to "get latest version" from TFS source control, and have them run the SDT "Database Update" process on their local databases.
This is very natural to me and I'm having a hard time re-thinking this process in a StrataFrame DDT context. How do you have revision history to refer to in TFS source control? For us, we output a text document which is a text equivalent of everything in the database at that time and check that file into TFS SCC along with the DBC and SDT meta data.
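To illustrate the kind of dump I mean, here is a minimal sketch of a schema-to-text script, translated to a SQL Server context in Python. The pyodbc library, the connection string, and the output file name are all my illustrative assumptions; this is not part of SDT or the DDT:

```python
# A minimal sketch of a schema-to-text dump for source-control history.
# pyodbc, the connection string, and the output file name are
# illustrative assumptions; this is not part of SDT or the DDT.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=Data1_Dev;Trusted_Connection=yes;"
)

with pyodbc.connect(CONN_STR) as conn:
    cur = conn.cursor()
    with open("schema_dump.txt", "w", encoding="utf-8") as out:
        # Tables and columns from the standard catalog views.
        cur.execute(
            "SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, IS_NULLABLE "
            "FROM INFORMATION_SCHEMA.COLUMNS "
            "ORDER BY TABLE_NAME, ORDINAL_POSITION"
        )
        for table, column, dtype, nullable in cur.fetchall():
            out.write(f"{table}.{column}: {dtype} NULL={nullable}\n")
        # Source text of views, sprocs, and functions.
        cur.execute(
            "SELECT OBJECT_NAME(object_id), definition "
            "FROM sys.sql_modules ORDER BY 1"
        )
        for name, definition in cur.fetchall():
            out.write(f"\n-- {name}\n{definition}\n")
```

Checking a file like that in alongside the schema gives a diffable revision history without depending on the database tool itself.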
Any help you could give would be appreciated.
Thanks again.
Edhy Rijo
Group: StrataFrame Users
Posts: 2.4K, Visits: 23K
Hi William, I am a single developer and of course do not have those multi-developer issues, but I can tell you a couple of things about the DDT, which I use with all my projects (and I was a big fan of SDT):
- With the DDT the repository is in an MS-SQL database, so you cannot use TFS or any other source control for it.
- DDT will keep a record of previous changes so it will automatically update any version of the database to the current one.
- When deploying with the DDT you can choose which database or databases to update. So in your case you may have Data1_Production, Data1_Dev, etc. You would update the Dev database, which may be local on your computer or remote on the main server, and when you are done with your testing, just run the same package against the production database.
Probably others here will jump in with more multi-developer experience. I know that Greg McGuffey uses scripts instead of the DDT for some of his projects.
Edhy Rijo
Greg McGuffey
Group: Forum Members
Posts: 2K, Visits: 6.6K
William,
Ultimately, database changes of almost any kind require some level of coordination between the team members. If two developers are working on the same table/view/sproc etc., then there are likely going to be issues, no matter how you do it. So, assuming there are clear areas of responsibility, so developers aren't stepping on each other, then we are talking scale: how much is the db changing, by how many developers at the same time, and how far-reaching is the change?
If the developers are each working in fairly defined areas, then it is OK to save the changes in a central area, because even if the other developers end up deploying others' changes, it doesn't impact them. In this case, using a common repository is a big plus because you find out about issues immediately, rather than having two developers making changes in the same area, delaying the merge, then realizing that both their work is messed up. Now you've got the tangled mess Ivan was referring to.
However, as the changes get more extensive and could affect more areas (like updating a common auditing sproc), this doesn't work as well. If this is an infrequent level of change, then you just communicate the needed change to the team, the other members stop making changes to the db for a bit, one developer handles the tricky stuff, tests it, and then indicates they should all update their local development dbs.
If you have several changes like this, or often can't control when others are making changes, then you might need to move to scripts. They can be checked into source control and you have a bit more control over what others are seeing (especially if you use a distributed source control system like git or mercurial). However, managing the update scripts and creation scripts can be a big challenge (and one of the reasons the DDT is so nice). I've used a tool called DbGhost. It is very nice and manages creating update and/or creation scripts from source SQL scripts. It has some limitations, like handling renamed columns (which the DDT does very nicely), and you have to create specific scripts based on the state of the target db (the DDT just updates any db to the current version). And this could still easily lead to a big mess if developers are walking over each other. Lots of coordination is needed here, no matter the tool. Databases are more brittle than other forms of source code (where merging works better in many cases).
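To make the script approach a bit more concrete, here is a minimal sketch of that kind of version-driven update runner, in Python with pyodbc. The SchemaVersion table, file naming, and connection string are all illustrative assumptions, not something DbGhost or the DDT provides:

```python
# A minimal sketch of a version-driven update runner: numbered .sql
# scripts in source control, applied in order against a version table.
# pyodbc, the table name, and the file layout are illustrative
# assumptions, not anything DbGhost or the DDT ships with.
import pyodbc
from pathlib import Path

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=Data1_Dev;Trusted_Connection=yes;"
)

conn = pyodbc.connect(CONN_STR, autocommit=True)
cur = conn.cursor()

# Create the version table on first run, then read the current version.
cur.execute(
    "IF OBJECT_ID('dbo.SchemaVersion') IS NULL "
    "CREATE TABLE dbo.SchemaVersion (Version INT NOT NULL)"
)
cur.execute("SELECT MAX(Version) FROM dbo.SchemaVersion")
current = cur.fetchone()[0] or 0

# Apply every script newer than the target db's version. Scripts are
# named 0001_add_customer.sql, 0002_....sql, so a lexical sort works.
# (Real scripts containing GO separators would need batch splitting.)
for script in sorted(Path("migrations").glob("*.sql")):
    version = int(script.name.split("_")[0])
    if version <= current:
        continue
    cur.execute(script.read_text(encoding="utf-8"))
    cur.execute("INSERT INTO dbo.SchemaVersion (Version) VALUES (?)", version)
    print(f"applied {script.name}")
```

The point is that the runner has to know the state of the target db before it can decide what to apply, which is exactly the bookkeeping the DDT does for you.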
In extreme cases, you may need a DBA (or several) to control changes to the database, if there are hundreds of developers working on a db.
In most cases the DDT is a great tool that really, really, really eases the deployment of db changes. As with any environment where multiple developers are working, though, coordination is required.
William Fields
Group: StrataFrame Users
Posts: 80, Visits: 239
Thanks for the feedback. What do you think of the following workflow?
- The pkg file and create script(s) are placed under source control. The pkg file is the mechanism for deploying updates to the other developers, and the script is an easy way to compare what happened between versions (see the sketch below).
- If the pkg/script is checked out, you cannot make changes.
- Before you start making changes, you check out the pkg file and "deploy" it to your local dev database. You also check out the script file.
- After making changes, you update the pkg and script files and check them in.
- When deploying to a central test server, or when building a new packaged release, we just use the latest pkg file.
Question: how big can the pkg file get? (Our VFP app has 60 tables and about 150 views.)
This for the most part follows our current VFP database change workflow (which has worked well for many years). Does it sound like a feasible way to work with the DDT and SQL Server? Am I completely off base, or am I not asking the right questions?
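For the compare step, I'm picturing something as simple as a unified diff of two checked-in versions of the create script. A minimal Python sketch, with made-up file names:

```python
# A minimal sketch of the "what changed between versions" comparison:
# diff two checked-in create scripts with Python's standard difflib.
# The file names are made up for illustration.
import difflib
from pathlib import Path

old = Path("create_script_v1.sql").read_text(encoding="utf-8").splitlines()
new = Path("create_script_v2.sql").read_text(encoding="utf-8").splitlines()

for line in difflib.unified_diff(old, new, "v1", "v2", lineterm=""):
    print(line)
```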
Greg McGuffey
Group: Forum Members
Posts: 2K, Visits: 6.6K
This seems like a good solution to control db changes, by simply allowing only one developer at a time to make changes. I assume you'll use SSMS to generate the script for the database after you've updated it via the DDT.
The size of a structure-only package is nice and small. I created a package from a database with 125 tables, 212 sprocs, 163 views and 20 functions. The package was 260K. I assume it would grow as it has to keep track of all the changes, but it is not a huge file at all.
So, it appears that you are on the right track, and while you'll have to track the script separately so you can compare versions, you're going to love deploying the db with the DDT.
William Fields
Group: StrataFrame Users
Posts: 80, Visits: 239
OK, we're starting to work out the workflow as I described (in the post dated 12/6/2010 4:15:10 PM). But I've run into something unexpected...
The DDT stores all its data in the StrataFrame database, which we've all installed locally on our developer workstations. How do we keep the DDT profile database in sync across developer workstations? I was envisioning the package file as the "master" dataset that could be opened in the DDT... but I now understand that the package file is only for deployment use. Correct? Is there a way to take a package file and update a profile with it?
Suggestions?
Random thoughts/questions:
Will we need to have a central StrataFrame database that houses the DDT database so all developers use the same profile? Is there a way to "export" a profile that can be opened/used in a local StrataFrame database by other developers? We'd rather be able to work "offline" when we're mobile and not have to maintain a connection to a central StrataFrame repository.
Thanks, and Happy New Year!
UPDATE:
I've found out how to import a package file into a profile, but instead of updating the database in the profile, it creates a new database with "_1" appended to the name. Also, the object name history is empty.
Is there a way to import/apply a package file against a profile and have it update the matching database as well as bring in the object name history?