Deploying data within a Clone – solved

Hello all,

We had the requirement to include version and build information with a clone. A few conversations and some tinkering later, we now have a technique that even allows us to include large datasets with a clone. For example, this makes it easy to include data patches in a release as well.

If this sounds interesting to you, go ahead and download the demo file from GitHub:



Thanks so much for contributing your demo file. This is a problem that we would all love a solution for.

But I don’t see how this will work. The value in Set Next Serial Value is migrated along with the data. So although it is not removed by the cloning process, the migration process will replace it with the value from the old file.

To test it, I tried your sample file. I couldn’t make it work. The build information was not from the Clone but from the Data Source. See the image below.

I really hope I have misunderstood how to use your file, or that I missed a new trick. Please let me know.



Hi Todd,

I did not test the latest version with Otto, I just saved a clone locally and opened it, assuming (haha!) it would also work in a migration. I can reproduce your error.

I will now check with the command line tool, maybe I get a hint there. It could be that larger values are ignored or produce an (uncaught) error and are therefore lost in the migration process.

Also, when running a build with OttoDeploy, the clone sometimes misses the correct next value.


btw: when downloading a build, the browser tries to open an odd URL:

where it should be:

It has been doing this for a while; the update this week did not fix it.

Hi Nils,

Saving data in Set Next Serial Value is something we did for 20 years. It does survive cloning, so back in the old days, when we managed upgrades through import file script steps, it was a good way to save build info.

However, once we switched to the DataMigrationTool, this method stopped working, because the DMT correctly migrates the next serial value. This of course makes sense and is exactly what you want in most cases: you want your next serial values to come from the old file so your primary IDs don’t get out of sync.

So although next serial values survive cloning, they do not survive the DataMigrationTool’s data migration.

I wish it did


Hi Nils,

There are a few places you can download a build. Could you tell us which one isn’t working for you?



Ok, so back to version 1, which uses SQL to alter the field’s default value. This is preserved through migration: ft-buildinfo/ at a9143003bdbaf8a297dbbd0ef8b5e97f38fa0531 · fmgarage/ft-buildinfo · GitHub
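For anyone curious what that statement roughly looks like: FileMaker’s SQL dialect supports changing a field’s auto-enter default via ALTER TABLE, though it has to be run through a plugin, since the native ExecuteSQL() function only handles SELECT. This is only a sketch of the idea; the table and field names below are placeholders, not the ones used in ft-buildinfo:

```sql
-- Store build info in the schema itself: a field's auto-enter default is
-- part of the table definition, so it survives a DMT data migration.
-- "BuildInfo" and "build_default" are illustrative names only.
ALTER TABLE "BuildInfo"
  ALTER "build_default" SET DEFAULT 'v1.4.0 | 2024-03-12';
```

A record created in the migrated file then picks up the clone’s build string through the auto-enter default.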

This is where the download fails; it happens with all our servers:

@nils Thanks for the report! This will be fixed in the next release of OttoDeploy

Hi Todd,

v2 is now also working: ft-buildinfo/ at ef7492ea733221bce6d60e01321a8a9a8fe3d9fc · fmgarage/ft-buildinfo · GitHub


Ok so this is clever.

Updating the default value of the field via SQL is not something I have seen before. We had just been using SQL to change a field name. But you could only store a little bit of data that way. This is a nice technique.

Using a plugin isn’t too bad when you are in developer mode.

Thanks for sharing


Interesting topic! :heart_eyes:

OK, so using your approach, it all comes down to one simple thing: there is no native way to modify something that would appear in the file’s DDR. And that seems correct; the only script step I can think of that changes something in Manage Database is Set Next Serial Value, and you already tried that.

A different approach :bulb:

Let me offer a different approach I am using: I have a script called deploy version which is always run after a migration. This script will first get the version info from the mother dev file using the FileMaker Data API, write it to the relevant table, and then perform a script by name, Deploy v{{version}}. When developing, we already set what the next version will be, for example v2.1.3, and any scripted changes that need to happen after the migration are written in that script.

Then, after the migration finishes, we run deploy version, which fetches the latest version from the dev server and runs that version’s deployment script. This of course relies on some conventions that we adhere to.
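For illustration, here is a rough Python sketch of the first half of that idea: fetching the latest version record from the dev file over the FileMaker Data API and deriving the name of the deployment script to run. The actual implementation is a FileMaker script, not Python; the server, database, layout, and field names here are made-up placeholders, and the requests are only built, never sent:

```python
import json
import urllib.request

def session_request(server, database, basic_credentials):
    """Build the Data API login request (POST .../sessions, Basic auth)."""
    url = f"https://{server}/fmi/data/v1/databases/{database}/sessions"
    req = urllib.request.Request(url, data=b"{}", method="POST")
    req.add_header("Content-Type", "application/json")
    req.add_header("Authorization", f"Basic {basic_credentials}")
    return req

def find_version_request(server, database, layout, token):
    """Build a _find request for the newest version record.
    "Version" as layout and field name is an assumption for this sketch."""
    url = (f"https://{server}/fmi/data/v1/databases/{database}"
           f"/layouts/{layout}/_find")
    body = json.dumps({
        "query": [{"Version": "*"}],
        "sort": [{"fieldName": "Version", "sortOrder": "descend"}],
        "limit": "1",
    }).encode()
    req = urllib.request.Request(url, data=body, method="POST")
    req.add_header("Content-Type", "application/json")
    req.add_header("Authorization", f"Bearer {token}")
    return req

def deploy_script_name(version):
    """Naming convention from the post: one script per version."""
    return f"Deploy v{version}"
```

With the version string in hand, the FileMaker side would simply Perform Script by name with `deploy_script_name(version)`, e.g. "Deploy v2.1.3".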

Ah, the other cool part is that we have a dedicated status on our PMS that defines which tasks are delivered with that version. When this runs, it also grabs all those tickets, moves them to Client sign-off, creates a nice HTML list of the tasks with links, and embeds it in the relevant field of that version. This allows us to present a nice web viewer changelog within our FileMaker file of what tasks were delivered, including links that take our users to the relevant task on our PMS.

The final touch is that it also sends an e-mail to the client, informing them what tasks were delivered in that version.

A crazy idea :rocket:

Here is a crazy idea for @toddgeist: if Otto supported FM Upgrade Tool and the ability to run a script before the migration, you could then (in theory) do the following:

  • Run a script on the original file that would export the version info as an XML file that the FM Upgrade Tool could use for patching. You could, for example, update a layout where you keep all this info in a named text object.
  • Use the FM Upgrade Tool to patch the destination file before migration
  • Proceed with the deployment as usual

hmm… Interesting idea to use the update tool.

I think making a patch for updating a custom function is the easiest thing to do. So maybe we could update a “VersionInfo” custom function as part of a “build”.

Lots of interesting ideas in here. Really good stuff.

Wait till you see what we have coming down the road in this area. It’s gonna make you smile :slight_smile: