By Jiri Brazda - 2/13/2011
Hi,
we are developing an application for multiple customers. The application has a common core, but for some customers we are preparing special modules that have their own tables. We need a single database for each customer, and we also want the option of not deploying the tables for modules a particular customer doesn't use. The idea was to have one DDT profile for the common core, a special DDT profile for each special customer module, and then deploy the core plus only the selected DDT profiles for each customer. The problem is that when we create a new DDT profile, it doesn't allow us to connect to an existing database - there is only an option to import the database structure, but this is not a solution for us.
Can we manage it somehow?
Thank You
Jiri
|
By Dustin Taylor - 2/14/2011
The DDT doesn't stay persistently connected to a deployed database (only to the StrataFrame database where the DDT itself stores its data). The DDT wasn't really designed to do what you are describing, but there are a couple of ways to get to the desired functionality:
1) You can create multiple profiles that define tables as you described, then deploy the separate packages to the same target database. The main issue here is that all of your DDT profiles will need to contain your stored procedures, UDFs, and views, since the standard DDT deployment starts by removing those items. This would become a maintenance headache, as you would suddenly have multiple places to update whenever you want to change or add a SPROC.
2) You can create a single DDT profile and have a post deployment script drop the unnecessary tables. In this case the script would obviously need access to some type of licensing or customer information when it ran to be able to determine which tables to drop. This may or may not be an option depending on what information is available to the script at deployment time.
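As a minimal sketch of what option 2 could look like, here is a hedged T-SQL example. The `LicensedModules` table, the module name, and the table names are all hypothetical placeholders for whatever licensing information is actually available at deployment time:

```sql
-- Hypothetical post-deployment script: drop the tables belonging to
-- modules this customer has not licensed. "LicensedModules" and the
-- module/table names below are illustrative assumptions, not part of
-- the DDT or any real schema.
IF NOT EXISTS (SELECT 1 FROM dbo.LicensedModules
               WHERE ModuleName = 'Payroll')
BEGIN
    IF OBJECT_ID('dbo.PayrollEntries', 'U') IS NOT NULL
        DROP TABLE dbo.PayrollEntries;
    IF OBJECT_ID('dbo.PayrollRates', 'U') IS NOT NULL
        DROP TABLE dbo.PayrollRates;
END
```

The `OBJECT_ID(..., 'U')` guards keep the script re-runnable, which matters if the same package is ever deployed more than once against the same database.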
To be honest, my own preferred way of handling this type of situation would be either to keep the custom customer tables in a separate database (to avoid any unintended interaction), or to include all tables in all deployments and only use the "custom" tables as needed. Of those two, I would prefer the former (a second, customer-specific database). A second database with its associated connections would, in my opinion, give you the most flexibility without complicating your deployment or introducing unintended issues by making your core database dynamic.
The extra overhead of a second database is fairly negligible. 1) You would need to deploy a second database using the DDT, which is the same amount of work as deploying a second package to the same database. 2) You would need to create a second connection string for the new database, which is trivial once you have already created the main connection string. And 3) you would need to set the database key on all of your custom business objects to point to the custom database, which is also pretty straightforward.
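For steps 2 and 3, the wiring amounts to registering a second data source under its own key and pointing the custom business objects at that key. The C# sketch below is from memory and hedged accordingly: `SqlDataSourceItem`, the `DataSources` collection, and the `DataSourceKey` property are my recollection of the StrataFrame API, so verify the exact names against your version before relying on them:

```csharp
// Hedged sketch of application-startup data source registration.
// The type and member names here are recalled, not verified.
using MicroFour.StrataFrame.Data;

// Default key ("") -> the common core database.
DataLayer.DataSources.Add(
    new SqlDataSourceItem("", coreConnectionString));

// A named key -> the customer-specific database.
DataLayer.DataSources.Add(
    new SqlDataSourceItem("CUSTOM", customConnectionString));

// Each custom business object is then pointed at the named key
// (usually set once in the business object designer):
// myCustomBO.DataSourceKey = "CUSTOM";
```

The point of the sketch is only that the split is a one-time configuration: the core objects keep the default key, and only the module-specific objects carry the custom key.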
FYI, we deploy multiple databases on almost all of our installs. We have a "MicroFour Global" database that contains common information (e.g., Country, State, Zip). After setting this up once, it has served us well and has been easy to maintain as we have moved along.
|
By Jiri Brazda - 2/15/2011
Hi Dustin,
thank you for your answer; unfortunately, none of your approaches is applicable to our project. Nevertheless, thank you again for your effort.
Best regards
Jiri Brazda
|