Get an overview of the things changed and fixed in each version of Umbraco Deploy.
In this section we have summarised the changes to Umbraco Deploy and Deploy Contrib released in each version. Each version is presented with a link to the Deploy issue tracker showing a list of issues resolved in the release. We also link to the individual issues from the release details.
If there are any breaking changes or other issues to be aware of when upgrading they are also noted here.
If you are upgrading to a new major version you can find the details about the breaking changes in the version specific updates article.
This section contains the release notes for Umbraco Deploy 13 including all changes for this version.
Update documentation links in management dashboard to include major version in the URL
Add ValidateDependenciesOnImport setting to management dashboard
All items from 13.3.0-rc1
Add option to include all schema in a workspace export
Add option to export from transfer queue
Combine migrated grid editor with default Block Grid block configuration (set RowMinSpan and RowMaxSpan to 1 and EditorSize to medium)
Import on startup
Added ValidateDependenciesOnImport setting to disable dependency validation on import
Fixed issue where content was not saved when transferring variant content with a release date
Support flexible environments on Umbraco Cloud (remove requirement for environment types to be Development, Staging or Live)
Support migrating legacy Grid editor settings (config/styles) using JSON object pre-values
Fix tree export of members by member type
Ensure release date of invariant content is correctly transferred #233
Parse nested JSON property values from artifact values #234
All items from 13.2.0-rc1
Fixed "Could not create Udi node: {Id} and entity type {EntityType}" exception when exporting tree nodes without children
Fixed deploy signatures not getting cleared when members are deleted #230
Allow members to be deployed when selected as items in a multi-node tree picker #231
Apply ExcludedEntityTypes configuration to disk operations #232
Add migrators to support legacy Grid layout to Block Grid migration
Add [DisableRequestSizeLimit] attribute to UploadForImport endpoint to remove application request size limit (server/infrastructure restrictions may still apply)
Determine missing routePath based on $routeParams to fix Export, Queue for transfer, and Partial Restore actions on items with children in a list view
Set trashed state when processing content #223
Improve exception message when parent cannot be found when getting artifact #216
All items from 13.1.0-rc1
Fixed issue with transfer of date values within Nested Content, Block List, or Block Grid properties #209
Fixed issue where templates could incorrectly cause schema mismatch errors when running in production mode #187
Fixed issue where importing invalid variant property data would cause a not supported variation exception #8
Fixed deserialization issue causing problems with the compare content feature #212
Add configuration to allow control over suspension and resumption of Examine and document cache events during Deploy operation #211
Add IArtifactTypeResolver to allow custom type resolving when deserializing to IArtifact (used by Deploy Contrib, see PR #60)
Add base migrators for importing legacy artifacts (used by Deploy Contrib)
Restored ability to overwrite content properties with empty values
Improved error UX and messaging to only show technical detail option if available
Require a valid license when importing
Only validate ApiKey and ApiSecret when workspaces are configured
Explicitly register API controller endpoints
Removed the no-longer supported "live edit" feature (Deploy on Cloud only).
Fixed issue where the removal of a master template couldn't be deployed #201
Ensured configuration for behavior following a "path too long" exception is respected for handling image cropper values #200
Added configuration setting to allow hiding of version details on the settings dashboard.
User experience and message improvements on content flow exception #202
Fixed issue with transfer of empty values to overwrite non-empty ones.
Added configurable option to avoid overwriting of dictionary items with empty values #191
For more details see the page on Deploy's settings.
Fixed regression issue with transfer of date values.
Fixed issue with transfer of content using language variants #193
Fixes the display of the selected schedule date on queue for transfer.
Fixes parsing property values within Nested Content and Block List that were previously saved by the Contrib value connectors.
Fixed incorrectly including media files in export when 'Content files' wasn't selected.
Add maximum file size validation to import file upload.
All items from the 13.0.0 RCs and latest 12.2 features.
Align with latest CMS RC.
Add weight -100 to Deploy package migration (ensuring Deploy database tables are available before other package migrations are executed).
Add fluent API for Deploy webhooks.
Added optional deployment of webhooks as part of schema updates.
Added Deploy specific webhook events.
Exclude automatic relation type aliases (used for reference tracking) from deployment.
Fix cyclic dependency between configuring DeploySettings and ConnectionStrings.
Compatibility with Umbraco 13:
See full details of breaking changes under the version specific upgrade guide.
Update Richtext value connector to handle references in blocks.
Migrate nested/recursive legacy pre-value property values #71
All items from 13.2.0-rc1
Skip empty pre-values #63
Add Matryoshka Group Separator artifact migrator #67
Add migrators to support DocTypeGridEditor to Block Grid migration #66
All items from 10.2.0-rc1
Add legacy migrators and type resolver to allow importing from Umbraco 7 in #61
Compatibility with Umbraco 13
You can find the release notes for versions out of support in the Legacy documentation on GitHub and on the Umbraco Deploy Package page.
Umbraco Deploy is a commercial product. You will need a valid license to use the product.
A license for Umbraco Deploy is included when hosting on Umbraco Cloud.
Licenses are sold per domain and will also work on all subdomains. With every license, you will also be able to configure two development/testing domains.
Let's say that you have a license configured for your domain, mysite.com, and you've configured two development domains, devdomain.com and devdomain2.com.
The license will cover the following domains:
localhost
*.mysite.com
www.mysite.com
mysite.com.local
devdomain.com
www.devdomain.com
devdomain2.com
www.devdomain2.com
You can have only 1 license per Umbraco installation.
There are a few differences as to what the licenses cover:
A single license covers one Umbraco solution. It includes all domains hosted by the solution, all production environments (if load-balancing), and all non-production environments.
To clarify the above:
You only need one license when you have a solution covering multiple domains - for example, www.mysite.com and www.mysite.dk - load balanced in production over multiple servers running from the same database, managed from the same backoffice instance, and with any number of non-production environments (staging, QA, etc.)
You need two licenses if you have a web presence that consists of two separate websites hosted on different domains or sub-domains - for example, www.mysite.com and shop.mysite.com - with each of these managed as a separate Umbraco installation using their own database and backoffice in production.
The license for Umbraco Deploy comes with a recurring yearly fee. Learn more about this and pricing on Umbraco.com.
You can look at the pricing, plans, features, and purchase the license on the Umbraco Deploy On-premises page.
Once you've configured your license with the correct domains, you are ready to install the license on your Umbraco installation.
For Umbraco Deploy On-Premise 12 and above, this will be a key provided to you when taking out your subscription to the product. It should be added to your configuration at the key Umbraco:Licenses:Umbraco.Deploy.OnPrem.
For example, in appsettings.json:
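A minimal snippet, using a placeholder for the license key value provided with your subscription:

```json
{
  "Umbraco": {
    "Licenses": {
      "Umbraco.Deploy.OnPrem": "YOUR-LICENSE-KEY"
    }
  }
}
```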
You might run into issues when using a period in the product name when using environment variables. Use an underscore in the product name instead, to avoid problems.
Umbraco Cloud projects use a license file placed in the /umbraco/Licenses folder that is provided automatically when your project is created.
On start-up and on a schedule, an Umbraco installation running Deploy On-premises will call out to a validation service, passing the configured license key to determine its validity. The response triggers a notification that Umbraco Deploy handles, amending the available functionality as appropriate for a valid, invalid or expired license.
You can view the status of the Umbraco Deploy On-premise license in the backoffice. This is available via the Settings section, listed along with any other products using the same licensing service:
UmbracoApplicationUrl
The website domain used for validating the license is determined from your Umbraco instance. To ensure the correct one is used, you can configure the UmbracoApplicationUrl.
If you are running on a single domain for both your frontend and backend environments, it's not necessary to configure a UmbracoApplicationUrl.
If you have different domains for your frontend and backend, then it's advised that you configure an UmbracoApplicationUrl set to your backoffice URL. This helps the licensing engine know which URL should be used for validation checks. Without this configuration setting, the licensing engine will try to work out the domain to validate from the HTTP request object. This can lead to errors when switching between domains.
An UmbracoApplicationUrl can be configured in your appSettings.json file like so:
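A sketch, assuming the backoffice is served from backoffice.mysite.com (the setting lives under the CMS WebRouting section):

```json
{
  "Umbraco": {
    "CMS": {
      "WebRouting": {
        "UmbracoApplicationUrl": "https://backoffice.mysite.com/"
      }
    }
  }
}
```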
See the Fixed Application URL documentation for more details about this setting.
Some Umbraco installations will have a highly locked down production environment, with firewall rules that prevent outgoing HTTP requests. This will interfere with the normal process of license validation.
On start-up, and periodically whilst Umbraco is running, the license component used by Umbraco Deploy will make an HTTP POST request to https://license-validation.umbraco.com/api/ValidateLicense.
If it's possible to do so, the firewall rules should be adjusted to allow this request.
If such a change is not feasible, there is another approach you can use.
You will need a server, or serverless function, that can reach the online license validation service. It needs to run on a daily schedule, making the validation request and relaying the response on to the restricted Umbraco environment.
To set this up, firstly ensure you have a reference to Umbraco.Licenses version 13.1 or higher. This will be the case if you are running Umbraco Deploy 13.1 or higher. If you are on an earlier version, you can add a direct package reference for Umbraco.Licenses.
Then configure a random string as an authorization key in configuration. This is used as protection to ensure only valid requests are handled. You can also disable the regular license checks, as there is no point in them running if they will be blocked:
Your internet-enabled server should make a request of the following form to the online license validation service:
The response should be relayed exactly via an HTTP request to your restricted Umbraco environment:
A header with a key of X-AUTH-KEY and value of the authorization key you have configured should be provided.
This will trigger the same processes that occur when the normal scheduled validation completes, ensuring your product is considered licensed.
Steps and examples on how Umbraco Deploy can be integrated into an automated build and deployment pipeline
Once Umbraco Deploy has been installed and the schema data has been generated, a CI/CD build server needs to be set up.
The build server will extract the changes that have been pushed to the repository into your production website, which has been connected with Umbraco Deploy.
This is something that can be done in many different ways depending on where your website is hosted and your setup.
Umbraco Deploy does not require the use of any particular build or deployment tools, so we expect you to be able to continue using the tool or tools of your choice. Any that support .NET website deployments and the running of PowerShell scripts, such as Azure DevOps or GitHub Actions, would be appropriate.
Above and beyond the normal steps of a build pipeline for a .NET web application - tasks like NuGet restore, solution build, running of tests etc. - Umbraco Deploy requires three additional steps.
The license file needs to be deployed into the target environment's umbraco/Licenses folder.
The .uda schema files that are written to disk and included in source control need to be made available in the build artifact that is deployed to the target environment.
Once the build is complete, the extraction of the updated schema in the target environment needs to be triggered.
The first two steps will be implemented in a similar way. There will need to be a step added to the pipeline that runs after the main build of the website, to copy the license file and data files into the published build output, such that they are included in the build artifacts that are deployed to the target environment.
The third step needs to run last in the pipeline, once the built web application is deployed to the target environment. Umbraco Deploy provides a Powershell script that can be used for the purpose of triggering the extraction of the schema information and update of the target Umbraco installation.
Without a CI/CD pipeline step in place to trigger the extraction, following a deployment, the process would need to be carried out manually. This can be done by logging into the backoffice, navigating to Settings > Deploy and triggering the operation via the dashboard.
Behind the scenes, what happens here is a marker file being written to disk - in the /umbraco/Deploy/ folder and with a name of deploy. It's by monitoring this directory for a file with this name that Umbraco Deploy knows to trigger the extraction.
Umbraco Deploy also provides an HTTPS endpoint that can be called by an authenticated request. This will write the marker file, which will trigger the extraction.
Umbraco Deploy On-Premises also ships with a PowerShell script that, when executed, calls the endpoint, writes the file and triggers the extraction.
So while it may be possible to have the CI/CD step write the file or call the endpoint directly, the PowerShell script is the method we'd recommend, as long as your build tool supports running PowerShell scripts. It has some necessary error checking and retry logic built in.
Details the setup of a CI/CD pipeline using GitHub Actions.
Details the setup of a CI/CD pipeline using Azure DevOps.
The troubleshooting section for Umbraco Deploy
In this troubleshooting section, you can find help to resolve issues that you might run into when using Umbraco Deploy.
If you are unable to find the issue you are having, then please reach out to our friendly support team at contact@umbraco.com.
When transferring or restoring content between two Umbraco Deploy environments, you might run into Schema mismatch errors. For more information on how to resolve schema mismatch issues, see the Schema Mismatches article.
Umbraco Deploy maintains a cached set of signatures that represent each schema and content item. They are used when transferring or restoring content between environments to aid performance.
If, having resolved schema mismatches, you still see reports of errors, it might be that the signatures are out of date. In other words, Deploy is using a cached representation of an item that no longer matches the actual item stored in Umbraco.
This should not happen in normal use but can occur after upgrades. If you find yourself in this situation, you can clear the cached signatures in both the upstream and downstream environments. You do this via the Clear Cached Signatures operation available on the Settings > Deploy dashboard:
When transferring or restoring content between environments, Deploy needs to ensure that all related items are updated together. It also checks that any schema dependencies an item has also exist in the target environment. When a large amount of content is selected for transfer or restore, this process of determining all the dependent items can take some time.
In some cases, a hard limit imposed by cloud hosting platforms such as Azure (used by Umbraco Cloud) can be reached.
If you find the process slow or reporting a platform timeout, there are a few options you can take.
If restoring, you can choose to pull down a smaller set of content via the partial restore feature. With this, you select an item in the remote environment to restore and can choose to include its child items. Any items related to the selected ones, for example via content or media pickers, will also be restored.
In addition to transferring content via the backoffice, it is possible to move both content and schema between environments via Deploy's import/export feature. With this, a selection of Umbraco data can be exported from one environment to a .zip file. That file can then be imported into another environment.
As this process requires less inter-environment communication, it's possible to transfer much larger amounts of content without running into the hard platform limits.
Read more about the import/export feature here.
Firstly, you can review and update the timeout settings available with Deploy. Increasing these from the default values may help, but won't necessarily resolve all issues. This is because, as noted, some timeouts are fixed values set by the hosting environment.
There are two places where Deploy operations can be batched. This allows breaking up of a single, long process into multiple, smaller ones. By doing this it's possible to complete each smaller operation within the platform imposed timeout.
If transferring items from a downstream environment to an upstream one, it's possible to configure a batch size. With this in place, transfers will be batched into separate operations, allowing each single operation to complete before any hosting environment-enforced timeout.
This will take effect only for transfers to upstream environments and when multiple items are selected in the backoffice. An example is the selection of a single media folder containing many files.
A package is an ordered structure containing all the items selected for a Deploy operation, plus all the determined dependencies and relations. The processing of this package in the target environment can also be batched via a configuration setting.
When set, if the number of items determined for the package exceeds the batch size, the processing will be chunked into batches.
For transfer or restore operations, it's worth ensuring Deploy's cached signatures are fully populated in both the upstream and downstream environments. This can be done via the Set Cached Signatures operation available on the Settings > Deploy dashboard:
The process may take a few minutes to complete if you have a lot of content or media in your installation. Information is written to the log indicating the signatures calculated for each entity type.
Now the checks Deploy has to do to figure out the items and dependencies to process will complete much more quickly.
Deploy will do comparisons between the entities in different environments to determine if they match and decide whether to include them in the operation. By default, for media files, a check is made on a portion of the initial bytes of the file.
If a lot of files need to be checked, this can be slow, and a faster option is available that uses the file metadata. The only downside of changing this option is a marginally increased chance of Deploy considering a media file hasn't changed when it has. This would omit it from the deployment.
This option can be set in configuration.
When a Deploy operation completes, cache refresher notifications are fired. These are used to update Umbraco's cache and search index.
In production these should always be enabled, to ensure these additional data stores are kept up to date.
If attempting a one-off, large transfer operation, before a site is live, you could disable these via a configuration setting. That would omit the firing and handling of these notifications and remove their performance overhead. Following which you would need to ensure to rebuild the cache and search index manually via the backoffice Settings dashboards.
As well as transferring entities between environments Deploy will also include the relations between them. As of 10.1.2 and 11.0.1, two relation types used for usage tracking are omitted by default. These do not need to be transferred as they are recreated by the CMS as part of the save operation on the entity.
If using an earlier version, or to make further adjustments, modify the settings for relation types in configuration.
When restoring between different media systems, exceptions can occur due to file paths. For example, they can happen between a local file system and a remote system based on blob storage. What is accepted on one system may be rejected on another because the file path is too long. Normally this will only happen for files with particularly long names.
If you are happy to continue without throwing exceptions in these instances, you can modify the configuration. If this is done, such files will be skipped; although the media item will exist, there will be no associated file.
When Umbraco schema items are created, a representation of them is saved to disk as a .uda file in the /data/revision/ folder. The representation is known as an artifact. It will be refreshed on further updates to represent the current state of the schema item.
Following an upgrade, it's possible the contents of the file will no longer match what would be generated by the current version. For example, if a property has been added to an artifact, this will be absent in the file. It would be added if the file was recreated following the upgrade.
This can lead to situations where Deploy continues to process a file it considers changed, even though the item represented is up-to-date. This in turn means slow updates of Umbraco schema, as Deploy is processing more files than it needs to do.
To resolve this situation, following an upgrade, it is good practice to re-save the .uda files in the "left-most" environment. This will usually be the local one, or if not using that, the Development environment. You can do this via the Export Schema To Data Files operation available on the Settings > Deploy dashboard:
The updated files should be committed to source control and deployed to upstream environments.
Documentation on how to work with Umbraco Deploy.
Umbraco Deploy is a deployment tool that helps you with the process of transferring code and data between multiple environments. Deploy can be configured for many different setups and is great for both small setups as well as large and more complex infrastructures.
With Umbraco Deploy you get to use the Umbraco Cloud Deployment technology outside of Umbraco Cloud to ease deployment between multiple Umbraco environments. This is done by connecting external hosted Umbraco projects with a local instance of your Umbraco website.
In the Umbraco Deploy documentation, you can read all about how to set up and work with Umbraco Deploy.
You can find articles about how to set up Umbraco Deploy on a new or an existing website, and articles about the deployment workflow.
How does Umbraco Deploy work and how to get started using Umbraco Deploy
In this article you can learn more about what it takes to get started using Umbraco Deploy. You can also get a high-level overview of how the product works.
Umbraco Deploy works by serializing non-content Umbraco items (called “Schema” items) to disk. These serialized files are located in the /umbraco/Deploy/Revision folder at the root of your website.
These items are entities like Document Types, Media Types, Data Types, etc, and these files must be committed to source control (for example Git). Umbraco Deploy works by “extracting” this serialized information back into your Umbraco installation. This is done by deployment triggers when a deployment is sent to a target environment.
For example, when working locally you might create a new Document Type. This will automatically create a new on-disk file in the umbraco/Deploy/Revision folder which is the serialized version of the new Document Type. You would then commit this file to your repository and push this change to your hosted source control (for example GitHub).
When you want this deployed to your next environment, you would trigger your CI/CD process (for example Azure DevOps or GitHub Actions). This will push the changes to your environment. Once the build deployment completes successfully, a Deployment Trigger would be executed as an HTTPS request to your target environment. All changes found in the umbraco/Deploy/Revision folder will then be extracted into the Umbraco target environment.
There are three main steps you need to go through in order to start using Umbraco Deploy on your website.
Set up a repository and then install a new Umbraco project inside it.
Umbraco Deploy can be installed via NuGet.
A CI/CD build server needs to be set up to run when you want changes to be deployed to the next upstream environment.
This documentation platform covers only the major versions of Umbraco Deploy for Umbraco 9 and above. If you are using an older version of Umbraco Deploy, you will need to look elsewhere.
The documentation for Umbraco 7 and 8 lives on .
Steps and examples on how to setup a build and deployment pipeline for Umbraco Deploy using Azure DevOps.
In this section, we provide a full example of how Umbraco Deploy running on Umbraco 9 and above can be utilized. This can be used as a part of a build and deployment pipeline using Azure DevOps. You can use this directly or adapt it for your needs.
We have defined a single stage build and deployment pipeline, configured in YAML format. While not as visually intuitive as a drag-and-drop task list, it provides the advantage of source control management.
We then have a number of variables defined that are used in the build configuration below. By using variables, we can modify the script for use on other web applications. Some values are set in the script, and some via Azure DevOps variables or secrets.
Most tasks in the pipeline are standard steps that will be used in any .NET web application release, such as the first steps:
#1 Install of the NuGet tooling,
#2 Restore of NuGet dependencies,
#3 And the project build.
Additional steps can be added as required, for example for running automated tests.
The Umbraco Deploy license file and the schema data files will automatically be included within the build output.
The deployment part of the pipeline stage consists of two steps.
Firstly a web deployment (#4), takes the packaged build artifact and deploys it, in this case, to an Azure Web App slot.
The final step (#5) is Umbraco Deploy specific - to call a function defined in the PowerShell script and trigger the extraction. Replace ApiSecret with ApiKey if you're using the deprecated API key setting instead.
The Microsoft docs contain useful information, if you are unsure of how to set secrets for your pipeline:
Umbraco Deploy is also the engine that runs behind the scenes on . Here it takes care of all the deployment processes of code, schema and content on projects.
In this article, we will cover the steps in order for you to install and configure Umbraco Deploy on a new or existing website.
Ensure to first read and follow the setup guides for either new or existing projects below:
After the Umbraco files have been committed add the following lines to the .gitignore so that they will not be picked up by Git when we are deploying.
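A sketch of such an addition, assuming the default /umbraco/Deploy location (check the exact lines recommended for your setup): it ignores Deploy's temporary working files while keeping the serialized Revision folder tracked.

```gitignore
# Umbraco Deploy: ignore temporary marker/log files, keep the Revision folder tracked
/umbraco/Deploy/*
!/umbraco/Deploy/Revision/
```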
The deploy-specific update here will ensure that temporary files generated by Deploy during its operations will not be included in source control.
Make sure that the updates to the .gitignore file are also committed.
When Umbraco has been installed in a repository, we can install Umbraco Deploy in the project.
To install Umbraco Deploy, run dotnet add package Umbraco.Deploy.OnPrem from the command line or Install-Package Umbraco.Deploy.OnPrem from the package manager console in Visual Studio.
To be able to use Umbraco Forms with Umbraco Deploy, you need to install the Umbraco.Deploy.Forms package as well.
In order to deploy content based on certain rich core and community property editors - including Multi URL Picker and Block List/Grid Editor - there is one further NuGet package to install: Umbraco.Deploy.Contrib.
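For reference, both optional packages can be added from the command line in the same way:

```shell
dotnet add package Umbraco.Deploy.Forms
dotnet add package Umbraco.Deploy.Contrib
```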
With Umbraco Deploy installed, to use it in the project you will need to create and add configuration for an API key/secret.
For improved security, it is recommended to set the ApiSecret (instead of the ApiKey) setting to a cryptographically random value of 64 bytes. Using Base64 encoding to get the string representation will result in a value of 88 characters. For versions prior to Deploy 12, or when not using the API secret setting, the recommendation is to set the ApiKey to a randomly generated string of 64 characters.
Once the API secret has been added, it is now time to configure the environments, also in the appsettings.json file.
An example configuration with a single upstream environment file will look like this:
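A sketch of what this could look like, assuming a Development and a Live workspace; the exact section and property names may vary slightly between Deploy versions, so verify against your installation:

```json
{
  "Umbraco": {
    "Deploy": {
      "Settings": {
        "ApiSecret": "<your Base64-encoded 64-byte secret>"
      },
      "Project": {
        "CurrentWorkspaceName": "Development",
        "Workspaces": [
          {
            "Id": "00000000-0000-0000-0000-000000000001",
            "Name": "Development",
            "Type": "development",
            "Url": "https://dev.mysite.com/"
          },
          {
            "Id": "00000000-0000-0000-0000-000000000002",
            "Name": "Live",
            "Type": "live",
            "Url": "https://www.mysite.com/"
          }
        ]
      }
    }
  }
}
```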
The setting under Project:CurrentWorkspaceName should match the Name provided in the list of Workspaces that matches the current environment. Using this, Umbraco Deploy will indicate the correct current environment on the "Workspaces" dashboard.
In Umbraco Deploy 9, this value was set using the configuration key Debug:EnvironmentName. Although included under a "Debug" section, this setting is required for installations of Umbraco Deploy on-premises (i.e. other than on Umbraco Cloud), which is why it was moved to the "Project" section in Umbraco Deploy 10.
Expected values for Type are "development", "staging" or "live". These settings are required, though strictly it is only necessary to use the specific value of "live" for the live environment, so other values can be used if you have more than these three environments.
You will need to generate a unique GUID for each environment. This can be done in Visual Studio:
Open "Tools".
Select "Create GUID".
Use the Registry Format.
Copy the GUID into the id value.
Generate a "New GUID" for each environment you will be adding to your setup.
Or by running the following PowerShell command:
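```powershell
# Outputs a newly generated GUID
[guid]::NewGuid()
```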
The URL configured for each environment should be the root URL for the website and needs to be accessible by the other environments over HTTPS.
Once the configuration has been set up with the correct information we can now go ahead and make sure that the source control is including our files in the /umbraco/Deploy folder of our Umbraco project.
This can be done by going to the /umbraco/Deploy/Revision folder of the project, creating a test .uda file, and then checking, either in your Git GUI or on the command line, whether the test file is being tracked.
Once we can see that the file has been created and is being tracked by Git, we can go ahead and delete the test file.
Now that Umbraco Deploy has been installed on the project, we can go ahead and commit the files to the repository.
Do not push the files up yet as a CI/CD build server will first need to be set up and connected to our repository.
Before moving on to setting up the build server, make sure that your license is included in your project.
For Umbraco Deploy On-Premise, this will be a key provided to you when taking out your subscription to the product. It should be added to your configuration at the key Umbraco:Licenses:Umbraco.Deploy.OnPrem.
For example, in appsettings.json:
You might run into issues when using a period in the product name when using environment variables. Use an underscore in the product name instead, to avoid problems.
Umbraco Cloud projects use a license file placed in the /umbraco/Licenses folder that is provided when your project is created.
Read more about the Umbraco Deploy licensing model.
How to import and export content and schema between Umbraco environments and projects
The import and export feature of Umbraco Deploy allows you to transfer content and schema between Umbraco environments. Exports are made from one environment to a .zip file. That file is then imported into another environment to update the Umbraco data there.
Umbraco Deploy provides two primary workflows for managing different types of Umbraco data:
Umbraco schema (such as document types and data types) are transferred as .uda files serialized to disk. They are deployed to refresh the schema information in a destination environment along with code and template updates.
Umbraco content (such as content and media) are transferred by editors using backoffice operations.
We recommend using these approaches for day-to-day editorial and developer activities.
Import and export is intended more for larger transfer options, project upgrades, or one-off tasks when setting up new environments.
As import and export is a two-step process, it doesn't require inter-environment communication. This allows us to process much larger batches of information without running into hard limits imposed by Cloud hosting platforms.
We are also able to provide hooks to allow for migrations of artifacts (such as data types) and property data when importing. This should allow you to migrate your Umbraco data from one Umbraco major version to a newer one.
To export content and schema, you can select either a specific item of content, or a whole tree or workspace.
When exporting, you can choose to include associated media files. Bear in mind that including media files for a large site can lead to a big zip file. So even with this option, you might want to consider a different method for transferring large amounts of media. For example using direct transfer between Cloud storage accounts or File Transfer Protocol (FTP).
If your account has access to the Settings section, you can also choose to include the schema information and related files as well.
When doing a workspace export, you also have the option to include all schema in the export (introduced in Deploy 13.3), regardless of whether it is in use by any content.
Umbraco Deploy will then serialize all the selected items to individual files, archive them into a zip file and make that available for download. You can download the file using the Download button.
After the download, you should also delete the archive file from the server. You can do this immediately via the Delete button available in the dialog.
If you miss doing this, you can also clean up archive files from the Umbraco Deploy dashboard in the Settings section.
The exported archive files are saved to the Umbraco temp folder in the Deploy\Export sub-directory. This is a temporary (non-persistent) location, local to the backoffice server and therefore shouldn't be used for long-term storage of exports. You can also only download the file from the export dialog in the backoffice.
Having previously exported content and schema to a zip file, you can import this into a new environment.
You can upload the file via the browser.
Similar to when exporting, you can choose to import everything from the archive file, or only content, schema or files.
Deploy does not touch the default maximum upload size, but you can configure this yourself by following the CMS documentation. On Umbraco Cloud, the upload size limit is 500 MB.
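As a sketch, assuming the CMS runtime setting for the maximum request length (specified in kilobytes; hosting-level limits such as Kestrel or IIS may also need to be raised separately):

```json
{
  "Umbraco": {
    "CMS": {
      "Runtime": {
        "MaxRequestLength": 512000
      }
    }
  }
}
```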
We validate the file before importing. Schema items that content depends on must either be in the upload itself or already exist on the target environment with the same details. If there are any issues that mean the import cannot proceed, it will be reported. You may also be given warnings for review. You can choose to ignore these and proceed if they aren't relevant to the action you are carrying out.
The import then proceeds, processing all the items provided in the zip file.
Once complete or on close of the dialog, the imported file will be deleted from the server. If this is missed, perhaps via a closed browser, you can also delete archive files from the Umbraco Deploy dashboard in the Settings section.
It is possible to migrate schema and content whilst importing. For example, to change Data Type using Nested Content to Block List and ensure content data is imported to the correct Block Editor format.
Deploy contains base classes and implementations to handle common migrations that need to be registered in code, as explained in Import with migrations.
The import and export feature is not available in Deploy 2 for Umbraco 7. However, we have released a package that allows creating an export. This needs to be done in code and requires additional legacy migrators to be able to import into a newer version. This is explained in Migrating from Umbraco 7.
Underlying the functionality of import/export with Deploy is the import/export service, defined by the IArtifactImportExportService.
You may need to make use of this service directly if building something custom with the feature. For example, you might want to import from or export to some custom storage.
The service interface defines two methods:
ExportArtifactsAsync - takes a collection of artifacts and a storage provider defined by the IArtifactExportProvider interface. The artifacts are serialized and exported to storage.
IArtifactExportProvider defines methods for creating streams for writing serialized artifacts or files handled by Deploy (media, templates, stylesheets etc.).
ImportArtifactsAsync - takes a storage provider containing an import defined by the IArtifactImportProvider interface. The artifacts from storage are imported into Umbraco.
IArtifactImportProvider defines methods for creating streams for reading serialized artifacts or files handled by Deploy (media, templates, stylesheets etc.).
Implementations for IArtifactExportProvider and IArtifactImportProvider are provided for:
A physical directory.
An Umbraco file system.
A zip file.
These are all accessible for use via extension methods available on IArtifactImportExportService found in the Umbraco.Deploy.Infrastructure.Extensions namespace.
The following example shows this service in use, importing and exporting from a zip file on startup:
How to import content and schema while migrating them into newer alternatives
As well as importing the content and schema directly, we also provide support for modifying the items as part of the process.
For example, you may have taken an export from an Umbraco 8 site, and are looking to import it into a newer major version. In this situation, most content and schema will carry over without issue. However, you may have some items that are no longer compatible. Usually this is due to a property editor - either a built-in Umbraco one or one provided by a package. These may no longer be available in the new version.
Often, though, there is a similar replacement. Using Deploy's import feature, we can transform the exported content for the obsolete property editor into the format used by the new one during import. The migration to a content set compatible with the new version can then be completed.
For example, we can migrate from a Nested Content property in Umbraco 8 to a Block List in Umbraco 13.
We provide the necessary migration hooks for this to happen, divided into two types - artifact migrators and property migrators.
Artifact migrators work by transforming the serialized artifact of any imported artifact, via two interfaces:
IArtifactMigrator - where the migration occurs at the artifact property level
IArtifactJsonMigrator - where the migration occurs at the lower level of transforming the serialized JSON itself.
Implementations to handle common migrations of Data Types from obsoleted property editors are available:
ReplaceMediaPickerDataTypeArtifactMigrator - migrates a Data Type from using the legacy Media Picker to the current version of this property editor
ReplaceNestedContentDataTypeArtifactMigrator - migrates a Data Type based on the obsolete Nested Content property editor to the Block List
ReplaceGridDataTypeArtifactMigrator - migrates a Data Type based on the legacy Grid layout into the Block Grid
ReplaceUnknownEditorDataTypeArtifactMigrator - replaces any unknown editor alias with a label
We've also made available base implementations that you can use to build your own migrations. You may need to handle the transfer of information between other obsolete and replacement property editors that you have in your Umbraco application.
ArtifactMigratorBase<TArtifact> - migrates the artifact of the specified type
DataTypeArtifactMigratorBase - migrates Data Type artifacts
ReplaceDataTypeArtifactMigratorBase - migrates a Data Type from one property editor to another
ArtifactJsonMigratorBase<TArtifact> - migrates the JSON of the specified artifact type
ReplaceGridDataTypeArtifactMigratorBase - migrates a Data Type based on the legacy Grid layout into the Block Grid
Property migrators work to transform the content property data itself. They are used in the Deploy content connectors (documents, media and members) when the property editor is changed during an import:
Again we have an interface:
IPropertyTypeMigrator
Implementations for common migrations:
MediaPickerPropertyTypeMigrator
NestedContentPropertyTypeMigrator
GridPropertyTypeMigrator
And a base type to help you build your own migrations:
PropertyTypeMigratorBase
GridPropertyTypeMigratorBase
Property editor changes are determined by comparing the PropertyEditorAliases dictionary stored in the content artifact to the current Content Type/Data Type configuration. The dictionary contains editor aliases for each content property.
Migrators will run if you have registered them, so you can enable only the ones needed for your solution.
You can do this via a composer, as in the following example. Here we register two of the migrators shipped with Umbraco Deploy:
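A minimal sketch of such a composer, here registering the Nested Content migrators. The collection-builder extension method names (DeployArtifactMigrators, DeployPropertyTypeMigrators) and required using directives are assumptions and may differ between Deploy versions:

```csharp
using Umbraco.Cms.Core.Composing;
using Umbraco.Cms.Core.DependencyInjection;
// Plus using directives for the Deploy migrator classes and collection extensions,
// which vary by Deploy version (assumption).

public class MigratorsComposer : IComposer
{
    public void Compose(IUmbracoBuilder builder)
    {
        // Migrates Nested Content Data Types to the Block List during import.
        builder.DeployArtifactMigrators()
            .Append<ReplaceNestedContentDataTypeArtifactMigrator>();

        // Migrates Nested Content property data into the Block List format.
        builder.DeployPropertyTypeMigrators()
            .Append<NestedContentPropertyTypeMigrator>();
    }
}
```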
When an import is started, the following happens:
Artifact signatures are read from the import provider (using IArtifactImportProvider.GetArtifactSignatures()).
The artifact signatures are sorted based on dependencies with Ordering enabled (ensuring dependent items are processed in the correct order, like parent items before children and data types before document types).
For each artifact signature:
Check whether the entity type is allowed to be imported.
Publish an ArtifactImportingNotification (cancelling will skip importing the artifact).
Publish a ValidateArtifactImportNotification:
Deploy adds a default handler (ValidateArtifactImportDependenciesNotificationHandler) to validate whether all dependencies are either in the import or already present in the current environment. It emits warnings for missing content artifacts, missing or different artifact checksums and errors for missing schema artifacts.
The import fails on validation errors or 'soft' fails on warnings if the WarningsAsErrors import option is set.
Create a Deploy scope and context (containing the 'owner' user for auditing purposes and cultures to import, in case the user doesn't have edit permissions for all languages).
For each artifact signature:
Create a (readonly) Stream for the artifact (using IArtifactImportProvider.CreateArtifactReadStream(Udi)).
Deserialize the artifact into a generic JSON object (JToken).
Parse the __version and __type properties and resolve the artifact type (using IArtifactTypeResolver).
Migrate the JSON object (using IArtifactJsonMigrator).
Deserialize the JSON object into the artifact type.
Migrate the artifact (using IArtifactMigrator).
Initialize artifact processing (using IServiceConnector.ProcessInit(...)) and track deploy state with next passes.
For each next process pass (starting at the lowest initial next pass):
Process artifact (using IServiceConnector.Process(...)).
During processing: service connectors for content, media and members migrate property type values if a property editor alias has changed (using IPropertyTypeMigrator).
When no next pass is required (the deploy state returns -1 as next pass):
Publish an ArtifactImportedNotification.
Report the import process (using IProgress.Report(...)).
The Deploy scope is completed, causing all scoped notifications to be published to handlers implementing IDistributedCacheNotificationHandler and completing the import.
Umbraco Deploy ships with migrators to handle the conversion of core property editors as they have changed, been removed or replaced between versions.
The Grid editor introduced in Umbraco 7 can still be used in version 13 but has been removed from Umbraco 14. Its functionality is replaced with the Block Grid editor.
With Deploy migrators we have support for migrating Data Type configuration and property data between these property editors.
You can configure the default migration with the following composer:
The project you are importing into needs to know about any custom legacy Grid editor configurations to migrate to the Block Grid editor correctly. Make sure to either copy the grid.editors.config.js and package.manifest (containing grid editors) files or override the GetGridEditors() method of the artifact migrator to provide this.
These implementations make use of the following conventions to migrate the data:
ReplaceGridDataTypeArtifactMigrator:
Grid layouts are migrated to an existing or new element type with an alias based on the layout name, prefixed with gridLayout_ (this can be customized by overriding MigrateGridTemplate());
Row configurations are migrated to an existing or new element type with an alias based on the row name, prefixed with gridRow_ (this can be customized by overriding MigrateGridLayout());
Similarly, grid editors are migrated to an existing or new element type with an alias based on the editor alias, prefixed with gridEditor_ (this can be customized by overriding MigrateGridEditor()). The available editors are retrieved from the grid.editors.config.js files (can be overridden in GetGridEditors()). Each migrated grid editor will have the following property types added to the element type:
The media grid editor is migrated to multiple properties: the value property contains the selected media item (using Media Picker v3), altText the alternate text (using a Textbox) and caption the caption (also using a Textbox);
The remaining grid editors create a single value property that uses the following editors:
rte - the default 'Rich Text Editor', falling back to the first Umbraco.TinyMCE editor.
headline - the default 'Textstring', falling back to the first Umbraco.TextBox editor.
macro and embed grid editors are converted into rich text editors.
quote or any other - uses the default 'Textarea', falling back to the first Umbraco.TextArea editor.
The block label is also updated for the built-in grid editors, ensuring a nice preview is available (the WYSIWYG style previews are incompatible between these editors, so the custom views are not migrated);
Grid settings config and styles are migrated to a new element type with a random alias, prefixed with gridSettings_ (this can be customized by overriding MigrateGridSettings()). This is because the migration only has context about the Data Type configuration (not the actual Data Type) and multiple Data Types can potentially use the same configuration (for config and styles), so there's no predictable way to create a unique alias. The migrated settings element type will have the property types added for the config and styles:
Each config setting is migrated to a property with an alias based on the key, prefixed with setting_ and added below a 'Settings' property group;
Similarly, each style is migrated to a property with an alias based on the key, prefixed with style_ and added below a 'Styles' property group;
The following property editors are used for these properties based on the config/style view:
radiobuttonlist - a new 'Radio Button List' Data Type that uses the pre-values;
multivalues - a new 'Checkbox List' Data Type that uses the pre-values;
textstring - the default 'Textstring', falling back to the first Umbraco.TextBox editor.
mediapicker and imagepicker - the default 'Media Picker' (v3, single image), falling back to the first Umbraco.MediaPicker3 editor.
boolean - the default 'Checkbox', falling back to the first Umbraco.TrueFalse editor.
number - the default 'Numeric', falling back to the first Umbraco.Integer editor.
treepicker, treesource, textarea or any other - the default 'Textarea', falling back to the first Umbraco.TextArea editor.
GridPropertyTypeMigrator:
Gets the grid layout and row configuration element types based on the alias prefix/name convention used by the Data Type artifact migrator;
The grid editor values are migrated to the respective properties:
The media grid editor converts the value to a media item with crops (based on the UDI or media path), including the focal point (although this needs to be enabled on the Data Type), alternate text and caption;
All other values are converted to a text value or otherwise to a JSON string;
If a row or cell contains settings config or styles and the corresponding block has a settings element type configured, the settings config and styles are migrated to their respective properties in a similar way, based on the property editor alias:
Umbraco.MediaPicker3 - removes url(' from the beginning and ') from the end of the value (commonly used as a modifier and added to the stored value), before trying to get the media item by a path.
All other values are returned as-is.
Given the flexibility of the grid editor and Block Grid you may want to take further control over the migration. You can do that by creating your own migrator classes, that make use of our provided base classes. You would then register your own migrators instead of the ones shipped with Umbraco Deploy in your composer.
The base classes provide the following functionality. Methods you should look to override to amend the default behavior have been noted above.
ReplaceGridDataTypeArtifactMigratorBase - replaces the Umbraco.Grid Data Type editor alias with Umbraco.BlockGrid and migrates the configuration:
The number of columns is copied over.
Grid layouts, row configurations and grid editors are migrated to blocks:
If multiple grid layouts are configured or if at least one contains multiple sections or isn't the full width, each grid layout will be migrated to a 'layout block' (an element type without properties).
If multiple row configurations are configured or if at least one contains areas that don't allow all grid editors or has a maximum amount of items set, each row configuration is migrated to a block (this is also always done when there are multiple grid layouts, as each layout can configure allowed row configurations).
All grid editors are migrated to blocks (allowing a single grid editor to be migrated to multiple blocks to support DocTypeGridEditor, as that allows selecting different element types).
The settings config and styles are migrated to a single element type (even though each setting can define whether it's supported for rows and/or cells) and used on the blocks that are allowed.
Block groups are added for Layout and Content and used on the corresponding block types.
GridPropertyTypeMigratorBase - migrates the property data from the GridValue into the BlockValue (using the Umbraco.BlockGrid layout):
The related Data Type is retrieved to get the configured blocks.
All grid control values are first migrated into their content blocks.
Settings config and styles for 'grid cells' are stored on the area within a row, but areas in the Block Grid can't have settings, so this is migrated into the first migrated grid control content block instead.
If a layout block can be found for the row configuration name, all grid controls are wrapped into that block.
Similarly, if a layout block can be found for the grid layout name, all items are wrapped into that block.
The JSON serialized BlockValue is returned.
Ensure you are running the latest version of Umbraco.Deploy.Contrib compatible with your Umbraco major version.
In your new project, register the following migrators to add support for the import from Doc Type Grid Editor grids:
The migrators add the following behavior:
ReplaceDocTypeGridEditorDataTypeArtifactMigrator extends ReplaceGridDataTypeArtifactMigrator and ensures any DocTypeGridEditor is migrated to blocks using the allowed element types. If the element types aren't found, the default implementation will migrate to new element types.
DocTypeGridEditorPropertyTypeMigrator extends GridPropertyTypeMigrator and ensures the Doc Type Grid Editor values are mapped one-to-one to the block item data.
The artifact migrator adds the default DocTypeGridEditor configuration (with alias docType). This can be disabled by setting the AddDefaultDocTypeGridEditor property to false in a custom/inherited class. Similar to the base migrator, any custom DocTypeGridEditor configurations must be available to migrate to the Block Grid editor correctly.
We provide a migrator for this package in Umbraco.Deploy.Contrib.
This adds support for migrating Matryoshka Group Separators into native property groups. It removes the Matryoshka Data Types during import and migrates the document, media and member types. Native property groups are also changed into tabs, similarly to how they were displayed with Matryoshka installed.
To use, you register the migrators:
As described above, the Nested Content to Block List migration will occur when you register the corresponding migrator with your application.
To help write your own migrations, we share the source code of an example that ships with Umbraco Deploy. This migration converts Nested Content to Block List.
First we have the artifact migrator that handles the conversion of the configuration stored with a datatype:
And secondly we have the property migrator that handles restructuring the content property data:
Steps and examples on how to setup a build and deployment pipeline for Umbraco Deploy using GitHub Actions.
In this example we will show how you can set up a CI/CD build server using GitHub Actions in Azure Web Apps.
We will not cover how to set up the site itself, as this is beyond the scope of this documentation.
The following steps will take you through setting up a build server in Azure Web Apps. Go to the Azure portal and find the empty website that we have set up and want to connect to.
Go to the Deployment Center.
In the Deployment Center we can set up the CI/CD build server. In this example we are going to set up our build server using GitHub Actions. It is possible to set up the build server however you want, as long as it supports executing PowerShell scripts.
Go to the Settings tab.
Choose which source and build provider to use.
In this case we want to choose GitHub.
Choose the Organization under which you created the GitHub repository.
Choose the repository that was set up earlier in this guide.
Select the branch that we want the build server to build from.
We can see which runtime stack and version we are running; in this example it is .NET, version 6.0.
Once the information has been added we can go ahead and preview the YAML file that will be used for the build server:
Save the workflow.
The website and the GitHub repository are now connected.
If we go back to the GitHub repository, we can see that a new folder called Workflows has been created:
Inside the folder, we find that the YAML file has been created with the default settings from the Azure Portal. The file will need to be configured so it fits into your set up.
Pull down the new file and folder, so you can work with the YAML file on your local machine.
Configure it to work with our Umbraco Deploy installation.
When it has been configured, it will look something like this:
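A simplified sketch of such a workflow, not the exact generated file; the web app name, the publish profile secret, and the TriggerDeploy.ps1 parameter names are assumptions to adapt to your own setup:

```yaml
name: Build and deploy Umbraco site

on:
  push:
    branches: [ main ]

jobs:
  build-and-deploy:
    runs-on: windows-latest
    steps:
      - uses: actions/checkout@v4

      - name: Set up .NET
        uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '6.0.x'

      - name: Publish
        run: dotnet publish --configuration Release --output ./publish

      - name: Deploy to Azure Web App
        uses: azure/webapps-deploy@v3
        with:
          app-name: my-umbraco-site                                    # assumption
          publish-profile: ${{ secrets.AZURE_WEBAPP_PUBLISH_PROFILE }} # assumption
          package: ./publish

      # Trigger the Umbraco Deploy schema extraction on the target environment.
      # The script parameters shown here are assumptions; check the script itself.
      - name: Trigger Umbraco Deploy extraction
        shell: pwsh
        run: >
          ./TriggerDeploy.ps1
          -Action TriggerWithStatus
          -ApiSecret ${{ secrets.DEPLOYAPISECRET }}
          -TargetUrl https://my-umbraco-site.azurewebsites.net/
```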
This is only an example of how you can set up the CI/CD pipeline for Umbraco Deploy. It is possible to set it up in a way that works for you and your preferred workflow.
We also need to add the License file and TriggerDeploy.ps1
file in an item group in the csproj
file:
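A minimal sketch of such an item group is shown below. The license file name and location are assumptions; adjust them to match where your Umbraco Deploy license and the TriggerDeploy.ps1 script are stored in your project.

```xml
<ItemGroup>
  <Content Include="umbraco\Licenses\umbracoDeploy.lic">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
  <Content Include="TriggerDeploy.ps1">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
</ItemGroup>
```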
As well as enabling Unattended install in the appsettings.json file so Umbraco installs automatically:
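These are the standard CMS unattended install settings; the credentials shown are placeholders and should be replaced with your own.

```json
{
  "Umbraco": {
    "CMS": {
      "Unattended": {
        "InstallUnattended": true,
        "UnattendedUserName": "Administrator",
        "UnattendedUserEmail": "admin@example.com",
        "UnattendedUserPassword": "SuperSecretPassword123!"
      }
    }
  }
}
```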
Before the build can work, we will need to set up our generated API key to work with the build server in GitHub Actions.
Open your GitHub repository.
Navigate to Settings.
Go to the Secrets tab.
Select "New repository secret".
Call the new secret "DEPLOYAPISECRET".
Add the API secret from the appsettings.json
file (replace ApiSecret
with ApiKey
in the script and use the corresponding value if you're using the deprecated API key setting instead).
Save the secret.
We can now go ahead and commit the configured YAML file and push up all the files to the repository.
Go to GitHub where you will now be able to see that the CI/CD build has started running:
The build server will go through the steps in the YAML file, and once it is done, the deployment will have gone through successfully:
You can now start creating content on the local machine. Once you create something like a Document Type, the changes will get picked up in Git.
When you're done making changes, commit them and deploy them to GitHub. The build server will run and extract the changes into the website in Azure.
This will only deploy the schema data from your local site to your website.
You will need to transfer content and media from the backoffice on your local project using the queue for transfer feature.
A description of the proper workflow when working with Umbraco Deploy
Umbraco Deploy uses a deployment model that relies on Git and Umbraco Deploy core technology to move your changes from one environment to another. Umbraco Deploy uses a classic "left to right" deployment model, meaning that changes are first made in the Development or local environment and then deployed to the production environment.
If your project contains a Staging environment, deployments will be made from Development to Staging and then from Staging to Live.
Umbraco Deploy uses a two-part deployment approach where we keep meta data (Document types, templates, etc) and content (Content nodes and Media) as separate parts of a deployment. In order to be able to distinguish between the two types of deployments we use the term transfer for content and media deployments and the term deploy for meta data deployments.
In summary:
Meta data such as Document Types, Templates, Forms, Views and config files are stored in a repository and are deployed between environments. This can be achieved using a CI/CD deployment pipeline with something like GitHub Actions or Azure DevOps.
Content and Media items are not stored in the repository. These need to be transferred directly from the Umbraco backoffice using the "Queue for Transfer" option. Once a content editor has all the items needed for a transfer they will use the Deployment Dashboard in the Content section to transfer the items in the queue.
In order to be able to transfer content and media, the source environment and the target environment need to have the same setup - meaning they need to be completely in sync and have the same file structure. To achieve this you need to deploy your meta data changes to the target environment.
Moving your content and media between your environments is done through the Umbraco backoffice. You can transfer content from one environment to another, e.g. from local to your development environment. You also have the option to restore content and media to your local or development environment from your production or staging environment.
Transferring and restoring content and media works the same way whether you are working locally or transferring between two environments.
Another approach for transferring content and schema between environments is to use import and export. In one environment, you can export selected content, a tree, or the whole workspace to a .zip file. There are options to include related media files, schema and code files such as templates and stylesheets.
That .zip file can then be uploaded into a new environment, where it will be validated and then processed to update Umbraco.
As part of the import process, we provide hooks to allow for migrations of the imported artifacts (like data types) and property data. This should allow you to migrate your Umbraco data from one Umbraco major version to a newer one.
We recommend using the content and media backoffice transfer options for day-to-day editorial activities. Import and export is intended more for larger transfer operations, project upgrades, or one-off tasks when setting up new environments.
Read more about the import and export feature.
In Umbraco Deploy we have included a Deploy Dashboard in the Settings section of the Umbraco backoffice to make it easier to run operations like schema deployment from data files and extract schema to data files.
When running the extract schema to data files
operation, Umbraco Deploy will create a deploy-export marker file in the data folder of your project. This is used to generate UDA files based on the schema in your database.
Running the schema deployment from data files
operation will initiate an extraction on the environment.
The extraction will end in one of two possible outcomes:
deploy-complete
: The extraction succeeded and your environment is in good shape!
deploy-failed
: The extraction failed - open the deploy-failed file to see the error message.
It is also possible to see which version of Umbraco Deploy you are running, when the last operation was started and the status of the deployment operation.
How to export content and schema from Umbraco 7 and import them into a newer version
The import and export features are available in versions of Umbraco Deploy supporting Umbraco 8 and above. They have not been ported back to Umbraco 7, so you cannot trigger an export from the backoffice or use the service there.
However, you can still use this feature to help migrate from Umbraco 7 to a supported major version. This requires adding additional logic to your Umbraco 7 project to create an export ZIP archive similar to newer versions.
You can generate an export archive in the same format as the import/export feature. This is done by adding code to your Umbraco 7 project (that already has Deploy and Deploy Contrib installed, see below). This archive can be imported into a newer Umbraco version by configuring the legacy import migrators. You can also apply additional migrators to update obsolete data types and property data into newer equivalents.
This is possible via code, by temporarily applying a composer to an Umbraco 7 project to generate the export archive on start-up:
To import this archive into a newer Umbraco project, you need to install either of these packages:
UmbracoDeploy.Contrib
4.3 for Umbraco 8
Umbraco.Deploy.Contrib
for Umbraco 10.2, 12.1, 13.1 or later
Then you need to configure the legacy artifact type resolver and migrators.
Artifact type resolvers allow resolving changes in the type that's stored in the __type
JSON property of the artifact. This is in case it moved to a different assembly or namespace (or got renamed) in a newer version. The legacy migrators handle the following changes:
Moving the pre-values of data types to the configuration property;
Moving the invariant release and expiration dates of content to the (culture variant) schedule property;
Moving the 'allowed at root' and 'allowed child content types' of content/media/member types to the permissions property;
Migrating the Data Type configuration from pre-values to the correct configuration objects and new editor aliases for:
Umbraco.CheckBoxList
(pre-values to value list)
Umbraco.ColorPickerAlias
to Umbraco.ColorPicker
(pre-values to value list)
Umbraco.ContentPicker2
to Umbraco.ContentPicker
(removes invalid start node ID)
Umbraco.ContentPickerAlias
to Umbraco.ContentPicker
(removes invalid start node ID)
Umbraco.Date
to Umbraco.DateTime
Umbraco.DropDown
to Umbraco.DropDownListFlexible
(pre-values to value list, single item select)
Umbraco.DropDownListFlexible
(pre-values to value list, defaults to multiple item select)
Umbraco.DropdownlistMultiplePublishKeys
to Umbraco.DropDownListFlexible
(pre-values to value list, defaults to multiple item select)
Umbraco.DropdownlistPublishingKeys
to Umbraco.DropDownListFlexible
(pre-values to value list, defaults to single item select)
Umbraco.DropDownMultiple
to Umbraco.DropDownListFlexible
(pre-values to value list, defaults to multiple item select)
Umbraco.MediaPicker2
to Umbraco.MediaPicker
(removes invalid start node ID, defaults to single item select)
Umbraco.MediaPicker
(removes invalid start node ID)
Umbraco.MemberPicker2
to Umbraco.MemberPicker
Umbraco.MultiNodeTreePicker2
to Umbraco.MultiNodeTreePicker
(removes invalid start node ID)
Umbraco.MultiNodeTreePicker
(removes invalid start node ID)
Umbraco.MultipleMediaPicker
to Umbraco.MediaPicker
(removes invalid start node ID, defaults to multiple item select)
Umbraco.NoEdit
to Umbraco.Label
Umbraco.RadioButtonList
(pre-values to value list, change database type from integer to nvarchar)
Umbraco.RelatedLinks2
to Umbraco.MultiUrlPicker
Umbraco.RelatedLinks
to Umbraco.MultiUrlPicker
Umbraco.Textbox
to Umbraco.TextBox
Umbraco.TextboxMultiple
to Umbraco.TextArea
Umbraco.TinyMCEv3
to Umbraco.TinyMCE
Migrating pre-value property values for:
Umbraco.CheckBoxList
Umbraco.DropDown.Flexible
Umbraco.RadioButtonList
The following composer adds the required legacy artifact type resolver and migrators. It also adds a custom resolver that marks the specified Document Type alias testElement
as the element type. Element types are a concept added in Umbraco 8 and are required for Document Types that are used in Nested Content.
It is recommended to start by importing only the schema and schema files (by deselecting 'Content' and 'Content files' in the dialog). Then, you can proceed with importing all content and schema together. The order in which the artifacts are imported depends on their dependencies. By importing the schema first, we ensure that the schema is updated before any content is processed.
Umbraco Deploy for Umbraco 7 is no longer supported and was only available on Umbraco Cloud. It was not released for use on-premise.
As such if you are looking to migrate from an Umbraco Cloud project running on Umbraco 7, you already have Umbraco Deploy installed.
If you have an Umbraco 7 on-premises website, you can use this guide to migrate from on-premises to Umbraco Cloud, or to upgrade to a newer Deploy On-premises version. You need to obtain and install Umbraco Deploy for Umbraco 7 into your project, solely to use the export feature.
The export feature can be used without a license.
A license is required for the Umbraco project you are importing into. This can be a license that comes as part of an Umbraco Cloud subscription, or a Deploy On-premises one.
Use this guide to migrate from on-premises to Umbraco Cloud or to upgrade to a newer Deploy On-premises version.
Download the required dll
files for Umbraco Deploy for Umbraco 7 from the following links:
Install Umbraco Deploy with the Contrib and Export extensions.
Install Umbraco Deploy
, Deploy.Contrib
, and Deploy.Export
by copying the downloaded .dll
files into your Umbraco 7 site.
When copying the files over from Umbraco Deploy
you should not overwrite the following files (if you already had Umbraco Deploy installed):
Run the project to make sure it runs without any errors.
Update the web.config
file with the required references for Umbraco Deploy:
Export Content.
Export your content, schema, and files to zip.
With the Deploy Dashboard, we have made it possible to get an overview of your Umbraco Deploy installation and perform Deploy operations.
In this article, we will show the different sections on the Deploy dashboard and how they can be used.
Here you can see whether the latest deployment has been completed or failed. You can see the version of Umbraco Deploy you are running, and the last time an operation was run.
With the Deploy operations, you can run different operations in Umbraco Deploy.
Below you can read what each operation will do when run through the dashboard.
Running this operation will update the Umbraco Schema based on the information in the .uda
files on disk.
Running this operation will extract the schema from Umbraco and output it to the .uda
files on disk.
Running this operation will clear the cached artifact signatures from the Umbraco environment. This should not normally be necessary; however, it may resolve reports of schema mismatches when transferring content, even after the schema has been aligned.
This operation will set the cached artifact signatures for all entities within the Umbraco environment. Use this when signatures have been cleared and you want to ensure they are pre-generated before attempting a potentially longer restore or transfer operation.
Running this operation will download a zip file with all the Deploy artifacts representing the Umbraco schema in the form of .uda
files.
This operation is useful if you want to move to another Umbraco instance and migrate the data with you.
The Schema Comparison table shows the schema information managed by Umbraco Deploy.
You can see a comparison between the information that is held in Umbraco and the information in the .uda
files on disk.
The table shows:
The name of the schema
The file name
Whether the file exists in Umbraco
Whether the file exists
Whether the file is up-to-date
You can also view details about a certain element by selecting "View Details".
This will show the difference between entities stored in Umbraco and the .uda
file stored on disk.
Open source migrators may be built by HQ or the community for property editors found in community packages. They will be made available via the Umbraco.Deploy.Contrib
package.
Doc Type Grid Editor was a community package commonly used with the legacy Grid editor. If you are using this with Umbraco 7 and up, you can export and migrate into the Block Grid on Umbraco 13 or above.
Matryoshka was an Umbraco package that added tab support for Document Types in Umbraco. The feature was subsequently added to the product itself.
: Latest Deploy Version 2 release for Umbraco CMS Version 7 (officially for use on Cloud)
: Latest/only Deploy Contrib Version 2
: For exporting all content/schema in Version 7
In the Configuration details, you can see how Umbraco Deploy has been configured on your environment. You get an overview of the setting options, the current value(s), and notes to help you understand each of the settings. Updates to the configuration need to be applied in the appsettings.json
file.
How to upgrade Umbraco Deploy
As with all of our products, it is always recommended to run the latest version of Umbraco Deploy.
On the Umbraco Deploy page in the Packages section, you can see what the latest version is, as well as read the changelog.
Umbraco Deploy can be upgraded via NuGet.
Open the Package Manager Console in Visual Studio and type:
Update-Package Umbraco.Deploy.OnPrem
You will be prompted to overwrite files. You should choose "No to All" by pressing "L".
You can open the NuGet Package Manager and select the Updates pane to get a list of available updates. Choose the package called Umbraco.Deploy.OnPrem and click update. This will run through all the files and make sure you have the latest changes while leaving the files you have updated.
Contains version-specific documentation for when upgrading to new major versions of Umbraco Deploy.
How to partially restore content in Umbraco Deploy
In some cases you might not want to restore the entire content tree, but only the parts that you need. Partial restore is a feature that allows for restoring specific parts of your content instead of restoring everything.
You can use Partial Restore on
In this scenario you've cloned down your environment to your local machine or setup a new environment. In both cases the new environment will have an empty Content section as well as an empty Media section.
Be aware that this feature will also restore all dependencies on the selected content.
E.g. when you restore a content node which references media items as well as other content nodes, these will all be restored as well, including any parent nodes that these nodes depend on.
Follow these steps to perform a partial restore to get only the parts you need:
Go to the Content section of the Umbraco backoffice on your environment or locally.
Right-click the Content Tree, or click the three dots and select Do something else.
Choose Partial Restore.
Select the environment that you would like to restore content from.
Click "Select content to restore" - this will open a dialog with a preview of the content tree from the environment you selected.
Select the content node you would like to restore.
Decide whether you also want to restore any child nodes below the selected node.
Start the restore by clicking Restore.
To see the restored content, reload the content tree - right-click the Content tree to find this option.
Keep in mind if you select a content node deeper down the tree, all the parents above it, required for the node to exist, will be restored as well.
Partial Restores on empty environments are especially helpful when you have a large amount of content and media and do not necessarily need it all for the task you need to do.
Instead of having to restore everything which could potentially take a long time, doing a partial restore can be used to shorten the waiting time by only restoring the parts you need. This will ensure that you can quickly get on your way with the task at hand.
It is also possible to use the Partial Restore feature on environments where you already have content in the Content tree.
Imagine that you are working with your Umbraco project locally. One of your content editors updates a section in the content tree on the production environment. You would like to see how this updated content looks with the new code you are working on. Follow these steps to do a Partial Restore of the updated content node:
Go to the Content section of your local Umbraco backoffice
Right-click the content node which you know contains updates
Choose Partial Restore
Select the environment that you would like to restore content from
Decide whether you also want to restore any child nodes under the selected node
Start the restore by clicking Restore
When the restore is done, reload the content tree to see the changes
How to restore content in Umbraco Deploy
When you have content on your environment and you clone down your Umbraco project to your local machine, you will need to do an extra step, in order to see your content locally.
The restore option also comes in really handy when you have content editors creating content on different environments. You will be able to restore and work with that content on your different environments and locally.
There are four options when it comes to restoring content.
The first time you run your project locally you will have the option to restore your content and media before going to the Umbraco backoffice.
This will restore all content and media, plus any other entities configured for back-office transfer.
When your site is done spinning up, click the green Restore button - this will restore all content and media.
Wait till the process completes - this might take a while, depending on the amount of content and media you have on your Umbraco site.
When it's complete select Open Umbraco to go to the backoffice.
You will now see all your content and media in the Umbraco backoffice.
The second option for restoring your content and media is found in the Umbraco backoffice.
Go to the Umbraco backoffice on the environment you want to restore content and media to.
Click the three dots and select Do something else, or right-click the Content Tree.
Choose Workspace restore... from the menu.
You will now have the option to restore content from any environment that's to the right of the current environment in the deployment workflow.
To ensure the restore will succeed, make sure that your environments have the same meta data and structure files.
Click Restore from .. and wait until the process completes - this might take a while, depending on the amount of content and media you have on your project.
When it's done, right-click the Content tree again and choose Reload to see your content in the tree.
As well as content, any media and other entities configured for back-office transfer will also be restored in the process.
To see the media, go to the Media section and Reload the tree.
The operation is triggered in the same way as when restoring everything, but instead the Tree restore... menu option should be selected.
For example, if triggered from the content tree, only content entities will be restored. This will also restore any media that’s referenced in that content, but it won’t attempt to restore the full media library, nor any other entities.
By using the Partial Restore option, you can make sure that you only restore the part of the content that you need to work with. You can either restore a single item, or include all the descendants of that item too.
How deleting meta data and files work in Umbraco Deploy
With Umbraco Deploy, deletions are environment-specific. This means that in order to delete something entirely from your project, you need to delete it on all environments.
In this article you can read about the correct way of deleting files, schema and content when using Umbraco Deploy.
When you are using Umbraco Deploy, you might have more than one environment - including a local clone of the project. These environments each have their own database. The databases will contain references to all of your content and media, as well as to all of your schema files (e.g. Document Types, Templates etc).
The databases are environment specific. When you deploy from one environment to another, Umbraco Deploy will compare incoming schema files with references to these in the databases using both alias and GUID. If something doesn't add up - e.g. there is a mismatch between the database references and the files deployed - you will see an error. Learn more about this in the Troubleshooting section.
The workflow described above does not pick up deletions of content and schema from the database, which is why you'll need to delete the content and/or schema on all your environments, in order to fully complete the deletion.
The main reason Umbraco Deploy does not delete schema and content on deployments is that it could lead to unrecoverable loss of data. Imagine that you delete a Document Type on your Development environment, and push this deletion to your production environment where you have a lot of content nodes based on the deleted Document Type. When the deployment goes through, all of those content nodes would be instantly removed, with no option to roll back, as the Document Type they are based on no longer exists. To avoid anyone ending up in this unfortunate situation, deletes are not automatically handled and will require an active decision from you on each environment in order to take place.
Let's say you've deleted a Document Type on your Development environment, and now you want to deploy this deletion to the production environment.
Before you deploy the changes, in Git it will show that the following changes are ready to be committed and deployed:
Commit the changes and push them to your repository and trigger a deployment to your environment.
Once the deployment is complete, you will notice the following:
The Document Type you deleted on Development is still present in the backoffice on the production environment.
You might wonder why the Document Type that you have deleted is still there. The reason is that Deploy only deletes the associated UDA file, and not the actual Document Type in the database.
In order to completely delete the Document Type from your entire project, you need to delete it from the backoffice of any of the other environments you have as well. When the Document Type has been deleted from the backoffice of all environments and no UDA file exists, you can consider it completely gone.
You should, however, keep in mind that if you save your Document Type again at any point during the process, a UDA file will be regenerated. When you start deploying changes between environments, this will likely end up recreating your deleted Document Type.
Every file that's deleted will also be deleted on the next environment when you deploy. However, there are some differences depending on what you have deleted.
Here's an overview of what happens when you deploy various deletions to the next environment.
Deleted:
The associated .uda
file.
Not deleted:
The entry in the database.
The item will still be visible in the backoffice.
Deleted:
The associated .uda
file.
The associated .cshtml
file (the view file).
Not deleted:
The entry in the database.
The template file will be empty, but still be visible in the backoffice.
As these are only files, everything will be deleted on the next environment upon deployment.
Content and media deletions will not be picked up by deployments and will have to be deleted on each environment you wish to delete the content or media on.
Deleted:
The associated .uda
file
Not deleted:
The entry in the database
The language will still be visible in the backoffice/content dashboard (for multilingual content)
Deleting the language in the backoffice on the target environment will ensure the environments are in sync.
In this section we discuss some additional steps you can carry out to streamline your local development workflow.
Working in a team, it's common for developers to pull code from source control to update their local environment with the latest Umbraco schema.
They can do this by starting up the website, navigating to the Settings > Deploy dashboard and triggering a data extraction.
We can automate this step using a git hook.
When working with Umbraco Cloud, this step is configured automatically for you when you clone and run your project the first time. If working with Umbraco Deploy On-Premise, you can set it up yourself.
The process works by using the marker file Umbraco Deploy uses to trigger an update of the Umbraco schema from the .uda
files held in source control.
If a file named deploy-on-start
is found in the /umbraco/Deploy
folder, an update will run automatically when the site starts up.
Therefore, if we ensure that the file is created every time the source code is pulled from the remote repository, we can automate the update.
To do this, carry out the following steps:
Find the .git
folder within your solution root.
It might be a hidden folder, so if you don't see it, make sure your file browser is configured to show hidden files and folders.
Within that you'll find a hooks
folder, and inside that, a file called post-merge.sample
.
Rename the file to remove the .sample
extension and open it in a text editor.
Apply the following text and save the file (amending the path to the web project as appropriate for your solution structure):
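For example, assuming the web project lives in src/MySite.Web (a placeholder path), the hook could create the marker file like this:

```sh
#!/bin/sh

# Create the Deploy marker file so a schema update runs on the next start-up.
# Adjust the path to point at the umbraco/Deploy folder of your web project.
echo > ./src/MySite.Web/umbraco/Deploy/deploy-on-start
```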
Run a git pull origin <branchname>
.
Start up the website and you should find the Umbraco schema update has been triggered.
How to restore content in Umbraco Deploy using the deployment dashboard
Once all code and meta data is in sync between your environments, it's time to transfer your content and media. This is done from the Umbraco Backoffice.
Content and media transfers are flexible which means you have complete control over which content nodes and/or media items you want to transfer. You can transfer it all in one go, a few at a time or transfer only a single node.
Transferring content will overwrite any existing nodes on the target environment - content transfers will transfer the items that you select in the "source" environment to the "target" environment exactly the same as it was in the "source". This means that if you have some content on the target environment already, this will be replaced by the new content from the source environment.
Important: Content and Media transfers will only work if you've deployed all changes to your meta data beforehand. Please refer to our documentation on how to deploy meta data in the Deploying Content article.
Let’s go through a content transfer step by step. Imagine you’ve finished working on new content for your project locally and you are ready to transfer the changes to your development site.
You want to transfer the whole site. You start from the Home
node and choose to transfer everything under it:
Click on the ellipsis next to the Home
node in the Content tree.
Choose "Do something else".
There you get the choice of Queue for transfer.
If you’re currently editing the Home page you could also use the Actions dropdown to find Queue for transfer.
Choose if you want to include all pages under the chosen page or only transfer the chosen node.
If you wish to transfer all your content at once, right-click the top of the Content tree where you will also find Queue for transfer - this will queue all your content for transfer.
Select the language versions that you want to queue for transfer. Only languages for which you have permission to access will be selectable.
Set the publish date and time if you want to change when the transferred content should be published.
By default, the content will be transferred in its current published state. So if the content is published in the current environment, the changes will be deployed and the item immediately be published in the destination. If you prefer to schedule the publishing of the changes, you can do so by selecting a publish date.
Click Queue to add the content item to the transfer queue.
Go to the Deployment dashboard by clicking on the Content section header.
You will be able to see which items are currently ready to be transferred - this will include both content and media that you've queued for transfer.
Confirm by clicking Transfer to Development and monitor the progress of the transfer.
If everything went well, you will see the confirmation screen saying that the transfer has succeeded.
Media items are transferred the same way as content:
In the Media section, right-click the items you want to transfer and choose Queue for transfer.
Or right-click the top of the Media section to transfer all your media at once.
Go to the Deployment dashboard in the Content section to see the items you've queued for transfer and to transfer your items.
In order for Deploy to handle Forms data as content, you will need to ensure that the transferFormsAsContent
setting in configuration is set to true
. See details in the Deploy Settings for Umbraco 9+ article.
Once the setting has been added to the source and target environments, Forms can be transferred the same way as content and media:
In the Forms section, right-click the items you want to transfer and choose Queue for transfer.
Or right-click the top of the Forms section to transfer all your Forms at once.
Go to the Deployment dashboard in the Content section to see the items you've queued for transfer and to transfer your items.
This does not include entries submitted via the forms.
Sometimes a content transfer might not be possible. For example if you add a new property to the HomePage Document type and you don’t have that property in both environments, you’ll get an error with a hint on how to fix this.
If you are seeing this type of issue when trying to transfer content, head over to our article about Schema Mismatch errors, where you can read about how to resolve the issues.
Learn about the different settings and configurations available in Umbraco Deploy.
Most configuration for Umbraco Deploy is provided via dotnet configuration. This is most often stored in the appsettings.json
file found at the root of your Umbraco website. If the configuration has been customized to use another source, then the same keys and values discussed in this article can be applied there.
The convention for Umbraco configuration is to have package based options stored as a child structure below the Umbraco
element, and as a sibling of CMS
. Umbraco Deploy configuration follows this pattern, i.e.:
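In appsettings.json this looks as follows, with the individual settings discussed in this article placed inside the Settings section (unless noted otherwise):

```json
{
  "Umbraco": {
    "CMS": {
    },
    "Deploy": {
      "Settings": {
      }
    }
  }
}
```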
There are some required settings but most configuration for Umbraco Deploy is optional. In other words, values have defaults that will be applied if no configuration is available for a particular key.
For illustration purposes, the following structure represents the full set of options for configuration of Umbraco Deploy, along with the default values. This will help when you need to provide a different setting to understand where it should be applied.
Some configuration is applied via code rather than application settings. Where this is the case is also discussed in the sections to follow.
The ApiKey
is a random string (of at least 10 characters) set to the same value on all environments to authenticate HTTP requests between them. For improved security, set the ApiSecret
to a cryptographically random value of 64 bytes instead (using Base64-encoding).
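For example, with a placeholder value that you would replace with your own randomly generated secret (set identically on all environments):

```json
{
  "Umbraco": {
    "Deploy": {
      "Settings": {
        "ApiSecret": "<your-base64-encoded-64-byte-secret>"
      }
    }
  }
}
```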
The default value for this setting is Default
, which configures Umbraco Deploy to work according to how we expect most customers to use the product. Umbraco schema, such as Document and Data Types, are serialized to disk as .uda
files in save operations. These are checked into source control and used to update the schema in the upstream environments via a trigger from your CI/CD pipeline, or automatically if using Umbraco Cloud.
Items managed by editors - content, media and optionally forms, dictionary items and members - are deployed between environments using the transfer and restore options available in the backoffice.
It is possible to use this method for all Umbraco data, by setting the value of this setting to BackOfficeOnly
. With this in place, all data, including what is typically considered as schema, are available for transfer via the backoffice.
Our recommended approach is to leave this setting as Default
and use source control and a deployment pipeline to ensure that structural changes to Umbraco are always aligned with the code and template amends that use them.
However, we are aware that some customers prefer the option to use the backoffice for all data transfers. If that is the case, the BackOfficeOnly
setting will allow this.
This setting allows you to exclude certain types of entity from being deployed. Setting this is not recommended, but sometimes there may be issues with the way a custom media file provider works with your site, and you will need to exclude media files. Here is an example:
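The sketch below assumes the setting key is ExcludedEntityTypes and that media-file is the entity type you want to exclude:

```json
{
  "Umbraco": {
    "Deploy": {
      "Settings": {
        "ExcludedEntityTypes": ["media-file"]
      }
    }
  }
}
```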
This setting allows you to manage how relations are deployed between environments. You will need to specify an alias and a mode for each relation type. The mode can be either:
Exclude
- This causes the relation to be excluded and not transferred on deployments
Weak
- This causes the relation to be deployed if both content items are found on the target environment
Strong
- This requires that the related content item is set as a dependency, so anything added as a relation will also be added as a dependency
As of Deploy 10.1.2, 11.0.1 and higher, if this setting is left blank, the relation types used for usage tracking are omitted. These relations are rebuilt by the CMS following a save of an item in the target environment and so don't need to be transferred.
If a particular relation type is not listed, it's considered as a "weak" relation.
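A sketch of how this could look, assuming the setting is a RelationTypes dictionary keyed by relation type alias (the aliases shown are examples only):

```json
{
  "Umbraco": {
    "Deploy": {
      "Settings": {
        "RelationTypes": {
          "relateParentDocumentOnDelete": "Weak",
          "myCustomRelationType": "Exclude"
        }
      }
    }
  }
}
```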
This setting is used by package creators who want their custom editors to work with Deploy. The packages should be creating this setting automatically. There is a community-driven package that has value connectors for Deploy called Deploy Contrib.
Here is an example of how the setting can look:
Umbraco Deploy has a few built-in timeouts, which on larger sites might need to be modified. You will usually see these timeouts in the backoffice as an exception mentioning a timeout, typically as part of a full restore or a full deployment of an entire site. In the normal workflow, you should never hit these timeouts.
There are four settings available relating to backoffice deployment operations:
SessionTimeout
SourceDeployTimeout
HttpClientTimeout
DatabaseCommandTimeout
These timeout settings default to 20 minutes, but if you are transferring a lot of data you may need to increase them.
It's important that these settings are added to both the source and target environments in order to work.
A fifth timeout setting is available from Umbraco Deploy 9.5 and 10.1, allowing for the adjustment of the maximum time allowed for disk operations such as schema updates.
DiskOperationsTimeout
This setting defaults to 5 minutes.
All of these times are configured using standard timespan format strings.
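For example, to raise the four backoffice operation timeouts to 40 minutes (remember to apply the same values on both source and target environments):

```json
{
  "Umbraco": {
    "Deploy": {
      "Settings": {
        "SessionTimeout": "00:40:00",
        "SourceDeployTimeout": "00:40:00",
        "HttpClientTimeout": "00:40:00",
        "DatabaseCommandTimeout": "00:40:00"
      }
    }
  }
}
```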
Even with appropriate settings of the above timeouts, Deploy's backoffice transfer operations can hit a hard limit imposed by the hosting environment. For Azure, this is around 4 minutes. This will typically only be reached if deploying a considerable amount of items in one go. For example, a media folder with thousands of items can reach this limit.
An error message of "The remote API has returned a response indicating a platform timeout" will be reported.
If encountering this issue, there are two batch settings that can be applied with integer values (for example 500). This will cause Deploy to transfer items in batches, up to a maximum size. This will allow each individual batch to complete within the time available. The higher the value, the bigger the batches.
SourceDeployBatchSize
- applies a batch setting for the transfer of multiple selected items to an upstream environment (such as a media folder with many images)
PackageBatchSize
- applies a batch setting to the processing of a Deploy "package", which contains all the items selected for a Deploy operation, plus all the determined dependencies and relations
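For example, to process transfers in batches of up to 500 items:

```json
{
  "Umbraco": {
    "Deploy": {
      "Settings": {
        "SourceDeployBatchSize": 500,
        "PackageBatchSize": 500
      }
    }
  }
}
```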
In order for Deploy to handle Forms data as content, you'll need to ensure the TransferFormsAsContent
setting is set to true
. To transfer Forms data as schema, i.e. via .uda files committed to source control, use a value of false
.
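For example, to transfer Forms data as content via the backoffice:

```json
{
  "Umbraco": {
    "Deploy": {
      "Settings": {
        "TransferFormsAsContent": true
      }
    }
  }
}
```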
On changing this value from false
to true
, make sure to remove any .uda
files for Forms entities that have already been serialized to disk. These will no longer be updated. By deleting them you avoid any risk of them being processed in the future and inadvertently reverting a form to an earlier state.
In a similar way, Deploy can be configured to allow for backoffice transfers of dictionary items instead of using files serialized to disk, by setting TransferDictionaryAsContent
as true
.
Please see the note above under TransferFormsAsContent on the topic of removing any existing serialized files having changed this value to true
.
When deploying dictionary items, an exception will be thrown if a translation is provided for a language that doesn't exist in the target environment.
Normally this is a useful fail-safe to ensure translations aren't lost in the transfer operation.
If you have deleted languages that already had existing translations, you may want to temporarily remove this check. You can do that by setting this value to true
.
When this is in place a translation for a language that doesn't exist in the target environment will be ignored. A warning message will be output to the log.
When deploying dictionary items, Umbraco Deploy follows the approach used for all content: the values that are transferred are set on the target, even if they are empty.
If you transfer a dictionary item with an empty translation to another environment that already contains a translation, it will be overwritten.
Set this value to false
to not overwrite already populated values with empty strings.
It's also possible to transfer members and member groups via the backoffice between environments. This is disabled by default, so a deliberate decision to use the feature needs to be taken; for most installations it will make sense to have member data created and managed only in production. There are obvious potential privacy concerns to consider too. However, if being able to deploy and restore this information between environments makes sense for the specific workflow of your project, it's a supported scenario.
To enable, you can add or amend the AllowMembersDeploymentOperations
and TransferMemberGroupsAsContent
settings.
The AllowMembersDeploymentOperations
setting can take four values:
None
- member deployment operations are not enabled (the default value if the setting is missing)
Restore
- restore of members from upstream environments via the backoffice is enabled
Transfer
- transfer of members to upstream environments via the backoffice is enabled
All
- restore from and transfer to upstream environments via the backoffice are both enabled
With TransferMemberGroupsAsContent
set to true
, member groups can also be transferred via the backoffice, and groups identified as dependencies of members being transferred will be automatically deployed.
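For example, to enable both restore and transfer of members along with member groups:

```json
{
  "Umbraco": {
    "Deploy": {
      "Settings": {
        "AllowMembersDeploymentOperations": "All",
        "TransferMemberGroupsAsContent": true
      }
    }
  }
}
```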
Please see the note above under TransferFormsAsContent on the topic of removing any existing serialized files having changed this value to true
.
This setting is to be defined and set to false
only if you are using an external membership provider for your members. You will not want to export Member Groups that would no longer be managed by Umbraco but by an external membership provider.
Setting exportMemberGroups
to false
will no longer export Member Groups to .uda files on disk. The default for this setting is true
, as most sites use Umbraco's built-in membership provider and thus will want the membership groups exported.
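For example, when an external membership provider manages your members:

```json
{
  "Umbraco": {
    "Deploy": {
      "Settings": {
        "ExportMemberGroups": false
      }
    }
  }
}
```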
When restoring or transferring content, Umbraco Deploy will make checks to ensure that any dependent content, media or other items are either present in the target environment, or can be deployed from the source environment.
For example, you may have a media picker on a content item, referencing media that has been deleted or is in the recycle bin. In this situation the dependency won't be available in the target environment.
Deploy can halt at this point, so you get an error and the deployment won't complete until the issue is resolved. To fix, you would need to remove the reference to the deleted media item.
Alternatively, you can configure Deploy to ignore these issues and proceed with the transfer operation without warning.
To configure the behavior you prefer, amend this value to either None
, Transfer
, Restore
or All
.
For example, using the following settings, you will have an installation that ignores broken dependencies when restoring from an upstream environment. It will however still prevent deployment and report any dependency issues when attempting a transfer to an upstream environment.
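A sketch of that configuration, assuming the setting key is IgnoreBrokenDependenciesBehavior:

```json
{
  "Umbraco": {
    "Deploy": {
      "Settings": {
        "IgnoreBrokenDependenciesBehavior": "Restore"
      }
    }
  }
}
```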
When configuring for Deploy 9, an additional IgnoreBrokenDependencies
setting existed that took a value of true
or false
. To achieve the same result as the example above, the following configuration was required:
Some customers have reported intermittent issues related to Umbraco's memory cache following deployments, which are resolved by a manual reload of the cache via the Settings > Published Status > Caches dashboard. If you are running into such issues and are able to accommodate a cache clear after deployment, this workaround can be automated via the following setting:
By upgrading to the most recent available version of the CMS major you are running, you'll be able to benefit from the latest bug fixes and optimizations in this area. That should be your first option if encountering cache related issues. Failing that, or if a CMS upgrade is not an option, then this workaround can be considered.
Culture and hostname settings, defined per content item for culture invariant content, are not deployed between environments by default. They can be opted into via configuration.
To enable this, set the configuration value as appropriate for the types of domains you want to allow:
Culture
- the language setting for the content, defined under "Culture"
AbsolutePath
- values defined under "Domains" with an absolute path, e.g. "/en"
Hostname
- values defined under "Domains" with a full host name, e.g. "en.mysite.com"
Combinations of settings can be applied, e.g. Hostname,AbsolutePath
.
When deploying content items, public access rules based on member groups are transferred. You can amend this behavior using this setting.
None
- no public access rules will be transferred
AddOrUpdate
- public access rules added or updated in a source environment will be transferred to the destination
Remove
- public access rules removed in a source environment will be removed in the destination
All
- all public access information will be transferred
AddOrUpdate
is the default setting used if no value is configured.
Webhooks may be considered either environment-specific, or schema information that you would like to synchronize between environments. As such, by default, Umbraco Deploy does not include webhooks in schema deployment operations.
If you would like to include them, you can adjust this setting:
None
- webhooks are not deployed and are expected to be managed independently in each environment
All
- webhooks included in schema deployments
Specifies options for handling trashed content (documents, media and members) on export or import. You can amend this behavior using this setting:
None
- trashed content will not be exported or imported
Export
- trashed content will be included in an export
Import
- trashed content will be processed and moved to the recycle bin on import
All
- trashed content will be included in an export, processed and moved to the recycle bin on import
When using Umbraco Deploy with Umbraco Cloud, a development database is automatically created when restoring a project into a local environment for the first time.
For Umbraco 10, by default, a SQLite database is created.
If you would prefer to use SQL Server LocalDb when it's available on your local machine, set this value to true
. If LocalDB isn't reported as being available by Umbraco, it will fall back to using a SQLite database instead.
Deploy will do comparisons between the entities in different environments to determine if they match and decide whether to include them in the operation. By default, for media files, a check is made on a portion of the initial bytes of the file.
This corresponds to the default setting of PartialFileContents
.
If a lot of files need to be checked, this can be slow, and a faster option is available that uses the file metadata. The only downside of changing this option is a marginally increased chance of Deploy considering a media file hasn't changed when it has. This would omit it from the deployment.
To use this method, set the value to Metadata
.
When reviewing a set of items for a deployment operation, Deploy will retrieve and include relations. It does this either via single database lookups, or by bringing all relations into memory in one step, and retrieving them from there.
For small deployment operations, the former is the more optimal approach. It gets slow though when the number of items being transferred is large.
The cut-off before switching methods is set by this configuration value, and it defaults to an operation size of 100
items.
When restoring between different media systems exceptions can occur due to file paths. They can happen between a local file system and a remote system based on blob storage. What is accepted on one system may be rejected on another as the file path is too long. Normally this will only happen for files with particularly long names.
If you are happy to continue without throwing exceptions in these instances you can set this value to true
. For example, this may make sense if restoring to a local or development environment. If this is done such files will be skipped, and although the media item will exist there will be no associated file.
When a Deploy operation completes, cache refresher notifications are fired. These are used to update Umbraco's cache and search index.
In production this setting shouldn't be changed from its default value of false
, to ensure these additional data stores are kept up to date.
If attempting a one-off, large transfer operation, before a site is live, you could set this value to true
. That would omit the firing and handling of these notifications and remove their performance overhead. Following that, you would need to rebuild the cache and search index manually via the backoffice Settings dashboards.
With this setting assigned a value of true
, Umbraco Deploy will attempt to resolve users when transfers are made to new environments.
Users and user groups are maintained separately in different environments, so it isn't always the case that an editor has accounts across all environments. When an account matching by email address exists, Deploy will associate the changes made in upstream environments with the user that initiated the transfer. This allows the expected information about save and publish operations to be available in the audit log of the environment the data was transferred to.
When the setting is set to false
, or a matching account isn't found, the audit records will be associated with the super-user administrator account.
Deploy operations suspend scheduled publishing, Examine indexing, document cache and/or signature database update events by default. You can amend this behavior for all supported or specific operations using these settings.
Each setting within this section represents a Deploy operation. For each, the suspensions that are carried out can be amended with one or more of the following values:
DiskRead
- None, ScheduledPublishing, Examine, DocumentCache, All
PartialRestore
- None, ScheduledPublishing, Examine, DocumentCache, All
Restore
- None, ScheduledPublishing, Examine, DocumentCache, Signatures, All
Deploy
- None, ScheduledPublishing, All
Import
- None, ScheduledPublishing, Examine, DocumentCache, All
Export
- None, ScheduledPublishing, All
The default value for all suspension settings is All
.
So for example if you wanted to remove Examine indexing suspension and resumption during partial restore operations, you could set the following:
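A sketch of how that could look, assuming the flags can be supplied as a comma-separated string:

```json
{
  "Umbraco": {
    "Deploy": {
      "Settings": {
        "Suspensions": {
          "PartialRestore": "ScheduledPublishing, DocumentCache"
        }
      }
    }
  }
}
```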
It's also possible to set the values for all operations by setting Suspensions
to a value instead of an object, for example:
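For example, to apply the same suspension behavior to all operations:

```json
{
  "Umbraco": {
    "Deploy": {
      "Settings": {
        "Suspensions": "ScheduledPublishing"
      }
    }
  }
}
```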
If you prefer configuration in code, operators overloads on the settings class make this process straightforward, as shown in the following example:
If set to true
the configuration details shown on the setting's dashboard will be hidden.
If set to true
the version details shown on the setting's dashboard will be hidden.
A default notification handler for the ValidateArtifactImportNotification
is registered by Deploy that:
Adds warnings for dependencies that must match exactly but are neither included in the import nor present with matching checksums in the target environment
Adds warnings for all remaining content dependencies and errors for all schema dependencies that don't exist in the import
To prevent this handler from being registered, you can set this setting to false
.
Deploy can import content and/or schema previously exported from another Umbraco installation on start-up. This can be customized by changing the Umbraco:Deploy:ImportOnStartup
settings (note that this is directly below the Deploy
section and not nested below Settings
):
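A sketch of these settings with illustrative values is shown below; the individual keys are described in the list that follows, and values other than the stated defaults are placeholders:

```json
{
  "Umbraco": {
    "Deploy": {
      "ImportOnStartup": {
        "Enabled": true,
        "Files": ["umbraco\\Deploy\\import-on-startup.zip"],
        "FileAction": "Delete",
        "WarningsAsErrors": false,
        "Cultures": ["en-US"],
        "Username": "admin@example.com"
      }
    }
  }
}
```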
Enabled
- this feature is enabled by default, but can be disabled (e.g. to prevent importing on specific environments)
Files
- the files that are imported on start-up (relative to the project content root, defaults to umbraco\Deploy\import-on-startup.zip
), which are checked individually (files that do not exist are skipped and a warning will be logged)
FileAction
- None
will leave the file on disk (and potentially import it again on the next start-up), Archive
renames the file to end with .imported
and Delete
(the default) will remove the file on successful import
WarningsAsErrors
- indicates whether warnings should be considered as errors
EntityTypes
- sets the entity types to import, note that the default import validation will return warnings for entity types in the ZIP archive that are skipped due to this setting
Cultures
- the ISO codes of content variants that should be imported
Username
- the email address of the user that performs the import (used for auditing), uses the 'super-user' administrator account if not set
Umbraco Deploy can optionally register events that you can use with Umbraco webhooks. You can add them via code, for which we provide an extension method. The following example shows how you can use this within a composer.
With that in place you should see two new events available that you can use in creating your webhooks.
Deploy operation was completed
Deploy operation failed
An example of the payload sent is shown below:
How to import content and schema on startup and implement your own `IArtifactImportOnStartupProvider`
Deploy can import content and/or schema previously exported from another Umbraco installation on start-up. This allows for a quick setup of a baseline/starter kit or serves as a workaround for large ZIP archives that cannot be uploaded via the backoffice.
The default configuration will look for the ZIP archive umbraco\Deploy\import-on-startup.zip
on start-up and if it exists, run an import and delete the file on successful completion. If you want to customize the default behavior, do so via settings.
This feature is extensible via a provider-based model by implementing IArtifactImportOnStartupProvider
and registering it using builder.DeployArtifactImportOnStartupProviders()
. The default Umbraco.Deploy.Infrastructure.SettingsArtifactImportOnStartupProvider
implementation uses the above settings and inherits from Umbraco.Deploy.Infrastructure.ArtifactImportOnStartupProviderZipArchiveBase
(which can be used for your own custom implementation).
IArtifactImportOnStartupProvider
An example of an import on a start-up provider that imports from a physical directory (instead of ZIP archive) is shown below:
How to deploy changes between a local machine and an environment in Umbraco Deploy, with or without a Git GUI.
In this article you can learn more about how to deploy your code changes and meta data from a local instance to a remote environment.
When you are working with your Umbraco project locally all changes you make will automatically be identified and picked up by your local Git client.
Here's a quick step-by-step on how you deploy these changes to your environment:
You've cloned a site to your local machine to work on.
You've made some changes to a Document Type.
The corresponding .uda
file for this Document Type is now updated with the changes - the file is located in the /umbraco/Deploy/Revision
folder.
You've also created a new Data Type that's used on the Document Type. This Data Type is stored as a new file in the /umbraco/Deploy/Revision
folder as well.
Using Git, commit those two changed files and push them to your repository.
A deployment kicks in and the Document Type is updated and the new Data Type you created locally is now automatically created in the remote environment as well.
If you don't have a Git GUI client installed on your local machine, you can use Git from the command line or Git Bash. Run the following commands:
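For example (the commit message and branch name are placeholders):

```sh
git add umbraco/Deploy/Revision
git commit -m "Update Document Type and add new Data Type"
git push origin main
```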
When pulling new commits, it is a good idea to see if any of these commits contained changes to the schema (anything in umbraco/Deploy/Revision/
). To ensure your local schema is up-to-date, you can navigate to the umbraco/Deploy/
folder and create a deploy marker if it doesn't exist. From a command line type the following command:
/…mysite/umbraco/Deploy> echo > deploy
The local site should be running when you do this. The deploy marker will change to deploy-progress
while updating the site and to deploy-complete
when done. If there are any conflicts/errors you will see a deploy-failed
marker instead, which contains an error message with a description of what went wrong.
How to respond to deployment events using cache refresher notifications
When you deploy content or other Umbraco data between environments, some notifications that are normally fired in CMS operations are suppressed.
For example, you may handle a ContentPublishedNotification
to apply some custom logic when a content item is published. This code will run in a normal CMS publish operation. However, when deploying a content item into another environment and triggering its publishing there, the notification will not be issued, and the custom logic in the notification handler will not run.
This behavior is deliberate and done for performance and reliability reasons. A normal save and publish operation by an editor operates on one item at a time. With deployments, we may have many items, and publishing these notifications could lead at best to slow operations, and at worst to inconsistent data.
The notification will also already be published on the environment where the actual operation was carried out. So repeating this with each content transfer might also result in unwanted behavior.
However, what if you do want to run some code on an update, even if this is happening as part of a Deploy operation?
Not all notifications are suppressed by Umbraco Deploy. Some are batched and fired after the deploy operation is complete. These include those related to refreshing the Umbraco cache and rebuilding indexes.
You can use these cache refresher notifications to handle operations that should occur after a deployment is completed. All notifications issued by the CMS are made available, so when deploying a set of content items, a refresher notification will be issued for each one.
The following two code samples illustrate how this can be done.
The first handles a content cache refresher, which carries a payload from which the content ID can be extracted. Using that ID, the content can be retrieved in the local Umbraco instance and used as appropriate.
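Below is a minimal sketch of such a handler. It assumes the cache refresher notification types and payload shapes exposed by recent Umbraco CMS versions (`ContentCacheRefresherNotification` and `ContentCacheRefresher.JsonPayload`); the class name and the custom logic are illustrative only.

```csharp
using Umbraco.Cms.Core.Cache;
using Umbraco.Cms.Core.Events;
using Umbraco.Cms.Core.Models;
using Umbraco.Cms.Core.Notifications;
using Umbraco.Cms.Core.Services;
using Umbraco.Cms.Core.Sync;

public class ContentCacheRefresherNotificationHandler : INotificationHandler<ContentCacheRefresherNotification>
{
    private readonly IContentService _contentService;

    public ContentCacheRefresherNotificationHandler(IContentService contentService)
        => _contentService = contentService;

    public void Handle(ContentCacheRefresherNotification notification)
    {
        // Content cache refresher messages are sent as an array of payloads.
        if (notification.MessageType != MessageType.RefreshByPayload)
        {
            return;
        }

        foreach (ContentCacheRefresher.JsonPayload payload in (ContentCacheRefresher.JsonPayload[])notification.MessageObject)
        {
            // Retrieve the content item in the local Umbraco instance and use it as appropriate.
            IContent? content = _contentService.GetById(payload.Id);
            if (content is not null)
            {
                // Custom logic here...
            }
        }
    }
}
```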
For more details and further examples, please see the Umbraco CMS documentation on notifications.
The second example is similar, but handles an update to a dictionary item. With this one, the message consists of the item's ID. Again, we can retrieve the item and carry out some further processing.
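A sketch of the second handler could look as follows, assuming the dictionary cache refresher sends the integer ID via `MessageType.RefreshById` as described above; again, the class name and custom logic are illustrative.

```csharp
using Umbraco.Cms.Core.Events;
using Umbraco.Cms.Core.Models;
using Umbraco.Cms.Core.Notifications;
using Umbraco.Cms.Core.Services;
using Umbraco.Cms.Core.Sync;

public class DictionaryCacheRefresherNotificationHandler : INotificationHandler<DictionaryCacheRefresherNotification>
{
    private readonly ILocalizationService _localizationService;

    public DictionaryCacheRefresherNotificationHandler(ILocalizationService localizationService)
        => _localizationService = localizationService;

    public void Handle(DictionaryCacheRefresherNotification notification)
    {
        // The dictionary cache refresher provides the integer ID of the updated item.
        if (notification.MessageType != MessageType.RefreshById)
        {
            return;
        }

        IDictionaryItem? item = _localizationService.GetDictionaryItemById((int)notification.MessageObject);
        if (item is not null)
        {
            // Custom logic here...
        }
    }
}
```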
In both cases, as is usual for notification handlers, you will need to register them via a composer:
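A minimal sketch of such a composer, using the illustrative handler class names from the samples above:

```csharp
using Umbraco.Cms.Core.Composing;
using Umbraco.Cms.Core.DependencyInjection;
using Umbraco.Cms.Core.Notifications;

public class CacheRefresherNotificationHandlerComposer : IComposer
{
    public void Compose(IUmbracoBuilder builder)
        => builder
            .AddNotificationHandler<ContentCacheRefresherNotification, ContentCacheRefresherNotificationHandler>()
            .AddNotificationHandler<DictionaryCacheRefresherNotification, DictionaryCacheRefresherNotificationHandler>();
}
```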
How to extend Umbraco Deploy to synchronize custom data
Umbraco Deploy supports the deployment of CMS schema information and definitions from Umbraco HQ's Forms package, along with managed content and media. Additionally, it can be extended by package or custom solution developers. This allows the deployment of custom data, such as that stored in your own database tables.
As a package or solution developer, you can hook into the disk-based serialization and deployment, similar to that used for Umbraco Document Types and Data Types. It's also possible to provide the ability for editors to deploy custom data via the Backoffice, in the same way that Umbraco content and media can be queued for transfer and restored.
Entities are what you may be looking to transfer between two websites using Deploy. Within Umbraco, they are the Document Types, Data Types, documents, and so on. In a custom solution or a package, there may be representations of some other data that's stored separately from Umbraco content. These can still be managed in the Backoffice using custom trees and editors.
For the purposes of subsequent code samples, we'll consider an example entity as a Plain Old CLR Object (POCO) with a few properties.
The entity has no dependency on Umbraco or Umbraco Deploy; it can be constructed and managed however makes sense for the package or solution. The only requirement is that it has an ID that will be consistent across the environments (normally a Guid) and a name.
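As a sketch, such an entity could be as simple as the following hypothetical `Person` class (the name and shape are purely illustrative):

```csharp
// A hypothetical custom entity with no dependency on Umbraco or Umbraco Deploy.
public class Person
{
    // Must be consistent across environments.
    public Guid Id { get; set; }

    public string Name { get; set; } = string.Empty;
}
```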
Every entity Deploy works with, whether from Umbraco core or custom data, needs to have an artifact representation. You can consider an artifact as a container describing everything there is to know about how a particular entity is defined. Artifacts are used as a transport object for communicating between Deploy environments.
They can also be serialized to JSON representations. These are visible in the `.uda` files seen on disk in the `/data/revisions/` folder for schema transfers. It is also used when transferring content between different environments over HTTP requests.
Artifact classes must inherit from `DeployArtifactBase`. The following example shows an artifact representing the entity and its single property for transfer:
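A minimal sketch could look like the following. It assumes the constructor shape of `DeployArtifactBase<TUdi>` used in recent Deploy versions (check the base class in your installed version for the exact signature); the inherited `Name` property carries the entity's name.

```csharp
using Umbraco.Cms.Core;
using Umbraco.Cms.Core.Deploy;
// Plus a using for the namespace containing DeployArtifactBase in your installed Deploy version.

public class PersonArtifact : DeployArtifactBase<GuidUdi>
{
    public PersonArtifact(GuidUdi udi, IEnumerable<ArtifactDependency>? dependencies = null)
        : base(udi, dependencies)
    {
    }

    // Additional entity data would be exposed as serializable properties here.
}
```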
In most cases the default settings Umbraco Deploy uses for serialization will be appropriate. For example, it ensures that culture specific values such as dates and decimal numbers are rendered using an invariant culture. This ensures that any differences in regional settings between source and destination servers are not a concern.
If you do need more control, attributes can be applied to the artifact properties.
For example, to ensure a decimal value is serialized to a consistent number of decimal places, you can use the following. `RoundingDecimalJsonConverter` is found in the `Umbraco.Deploy.Serialization` namespace.
Service connectors are responsible for knowing how to handle the mapping between artifacts and entities. They know how to gather all the data required for the type of entity they correspond to, including figuring out what dependencies are needed (for example, how a Document Type depends on a Data Type in Umbraco). They are responsible for packaging an entity as an artifact, and for extracting an entity from an artifact and persisting it in a destination site.
Service connectors inherit from `ServiceConnectorBase` and are constructed with the artifact and entity as generic type arguments.
The class is decorated with a `UdiDefinition` attribute, via which the name of the entity type is provided. This needs to be unique across all entities, so it's likely worth prefixing it with something specific to your package or application.
The following example shows a service connector responsible for handling the artifact shown above and its related entity. There are no dependencies to consider here. More complex examples involve collating the dependencies and potentially handling extraction in more than one pass to ensure updates are made in the correct order.
An illustrative data service is provided via dependency injection. This will be whatever is appropriate to use for Create, Read, Update and Delete (CRUD) operations around reading and writing of entities.
In Deploy 9.5/10.1, to improve performance on deploy operations, we introduced a cache. This change required the addition of new methods to interfaces, allowing the passing in of a cache parameter. In order to introduce this without breaking changes, we created some new interfaces and base classes.
In the example below, if instead we inherited from `ServiceConnectorBase2`, which has a type parameter of `IServiceConnector2`, we would be able to implement `IArtifact? IServiceConnector2.GetArtifact(Udi udi, IContextCache contextCache)`. This would allow the connector to read and write to the cache and remove the use of the obsolete methods.
There's no harm in what is listed below though; it's only that the connectors won't be able to use the cache for any look-ups that are repeated in deploy operations. The obsolete methods won't be removed until Deploy 11. In that version we plan to return to the original interface and class names. We also plan to introduce the new method overloads, which will be a documented breaking change.
It's also necessary to provide an extension method to generate the appropriate identifier:
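A sketch of such an extension method, assuming the hypothetical `Person` entity and an entity type name of `example-person` (which must match the name used in the connector's `UdiDefinition` attribute):

```csharp
using Umbraco.Cms.Core;

public static class PersonExtensions
{
    // Builds the UDI used by Deploy to identify a Person across environments.
    public static GuidUdi GetUdi(this Person entity)
        => new GuidUdi("example-person", entity.Id);
}
```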
Umbraco entities often have dependencies on one another, and this may also be the case for any custom data you are looking to deploy. If so, you can add the necessary logic to your service connector so that dependencies are added. Umbraco Deploy will then ensure the appropriate dependencies are in place before initiating a transfer.
If the dependent entity is also deployable, it will be included in the transfer; if not, the deployment will be blocked and the reason presented to the user.
In the following illustrative example, if deploying a representation of a "Person", we ensure their "Department" dependency is added. This will indicate that it must exist to allow the transfer. We can also use `ArtifactDependencyMode.Match` to ensure the dependent entity not only exists but also matches in all properties.
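A sketch of how the dependency could be collected inside the connector code that builds the artifact is shown below. The `example-department` entity type name and the `departmentId` parameter are assumptions for illustration; in a real connector, the department reference would come from your own data store.

```csharp
using Umbraco.Cms.Core;
using Umbraco.Cms.Core.Deploy;

internal static class PersonArtifactBuilder
{
    // Sketch of artifact construction as it might appear inside a connector for the
    // hypothetical Person entity ("example-department" is an assumed entity type name).
    public static PersonArtifact GetArtifact(Person person, Guid departmentId)
    {
        var departmentUdi = new GuidUdi("example-department", departmentId);

        var dependencies = new List<ArtifactDependency>
        {
            // Exist: the department must exist in the target environment to allow the transfer
            // (the second argument indicates no ordering requirement).
            // Use ArtifactDependencyMode.Match to also require all of its properties to match.
            new ArtifactDependency(departmentUdi, false, ArtifactDependencyMode.Exist),
        };

        // Populate further artifact properties (such as Name) from the entity as needed.
        return new PersonArtifact(person.GetUdi(), dependencies);
    }
}
```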
As well as dependencies at the level of entities, we can also have dependencies within property values. In Umbraco, an example of this is the multi-node tree picker property editor. It contains references to other content items, which should also be deployed along with the content that hosts the property itself.
Value connectors are used to track these dependencies and can also be used to transform property data as it is moved between environments.
The following illustrative example considers a property editor that stores the integer ID of a media item. The integer ID of a media item is not consistent between environments, so we'll need to transform it, and we also want to ensure that the related media item itself is transferred along with the integer ID reference.
The third and final type of connector is the grid cell value connector. These are responsible for converting values and tracking dependencies for grid editors.
Umbraco Deploy comes with connectors for the built-in editors such as for media and macros.
It's also possible to create your own, as in this example.
The following grid editor is an illustrative example of a rudimentary custom image picker. It contains a package manifest file:
And an editor view:
The value stored by the grid editor will be the integer Id of a media item.
When transferring this value to an upstream environment using Umbraco Deploy, these tasks are required:
The integer Id needs to be converted to a Guid for transfer. Guids for content and media are the same across environments but integer Ids will differ.
On creation in the upstream environment, the Guid value needs to be converted back to an integer for storage in the grid editor.
A dependency on the selected media item needs to be tracked, such that it will be transferred to the upstream environment along with the changes to the grid content.
We can implement that via a grid cell value connector. The following code is for Umbraco Deploy 10.1. For earlier versions, or from Umbraco Deploy 11, the base class should be `GridCellValueConnectorBase`.
With the artifact and connectors in place, the final step necessary is to register your entity for deployment.
Connectors do not need to be registered. The fact that they inherit from particular interfaces known to Umbraco Deploy is enough to ensure that they will be used.
If custom entity types are introduced that will be handled by Umbraco Deploy, they need to be registered with Umbraco so that UDI references can be parsed. This is done via the following code, which can be triggered from an Umbraco component or an `UmbracoApplicationStartingNotification` handler.
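A sketch using a notification handler could look like this, where `example-person` is the assumed entity type name from the earlier connector sketches and `UdiParser.RegisterUdiType` is the CMS method for registering custom UDI types:

```csharp
using Umbraco.Cms.Core;
using Umbraco.Cms.Core.Events;
using Umbraco.Cms.Core.Notifications;

public class RegisterPersonUdiTypeNotificationHandler : INotificationHandler<UmbracoApplicationStartingNotification>
{
    public void Handle(UmbracoApplicationStartingNotification notification)
        // Allows UDI strings such as "umb://example-person/<guid>" to be parsed.
        => UdiParser.RegisterUdiType("example-person", UdiType.GuidUdi);
}
```

As with the earlier samples, the handler would be registered via `AddNotificationHandler` in a composer.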
To deploy the entity as schema, via disk-based representations held in `.uda` files, it's necessary to register the entity with the disk entity service. This is done in a component, where events are used to trigger a serialization of the entity to disk whenever one of them is saved.
In Umbraco Deploy 9.4 a schema comparison feature was added to the dashboard available under Settings > Deploy. This lists the Deploy-managed entities held in Umbraco and shows a comparison with the data held in the `.uda` files on disk.
All core Umbraco entities, such as Document Types and Data Types, will be shown.
To include entities from plugins, they need to be registered using a method overload, as shown above, that allows additional detail to be provided, for example:
The parameters are as follows:
The system name of the entity type (as used in the `UdiDefinition` attribute).
A human-readable, pluralized name for display in the schema comparison dashboard user interface.
A flag indicating whether the entity is an Umbraco one, which should be set to `false`.
A function that returns all entities of the type installed in Umbraco, mapped to an object exposing the Udi and name of the entity.
If the optimal deployment workflow for your entity is to have editors control the deployment operations, the transfer entity service should be used instead of registering with the disk entity service. The process is similar, but a bit more involved, as there's a need to also register details of the tree being used for editing the entities. In more complex cases, we also need to handle the situation where multiple entity types are managed within a single tree.
The following code shows the registration of an entity for Backoffice deployment, where we have the simplest case of a single tree for the entity.
The `RegisterTransferEntityType` method on the `ITransferEntityService` takes the following parameters:
The name of the entity type, as configured in the `UdiDefinition` attribute associated with your custom service connector.
A pluralized, human-readable name of the entity, which is used in the transfer queue visual presentation to users.
A set of options, allowing configuration of whether different deploy operations like queue for transfer and partial restore are made available from the tree menu for the entity.
A value indicating whether the entity is an Umbraco entity, queryable via the `IEntityService`. For custom solutions and packages, the value to use here is always `false`.
The alias of the tree used for creating and editing the entity data.
We then have three functions, which are used to determine if the registered entity matches a specific node and tree alias. For trees managing single entities, as we have here, we know that all nodes related to that tree alias will be for the registered entity and that each node ID is the Guid identifier for the entity. Hence we can use the following function definitions:
Return `true` for all route paths.
Return `true` for all node IDs.
Return `true` and parse the Guid identity value from the provided string.
For more complex cases we need the means to distinguish between entities. An example could be when a tree manages more than one entity type. Here we would need to identify whether the entity is being referenced by a particular route path or node ID. A common way to handle this is to prefix the GUID identifier with a different value for each entity. It can then be used to determine to which entity the value refers.
For example, as shown in the linked sample and video, we have entities for "Team" and "Rider", both managed in the same tree. When rendering the tree, a prefix of "team-" or "rider-" is added to the Guid identifier for the team or rider respectively. We then register the following functions, firstly for the team entity registration:
And then for the rider:
If access to services is required when parsing the entity ID, where the `HttpContext` is provided as a parameter, a service can be retrieved. For example:
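A sketch of an ID-parsing helper that resolves a hypothetical `IPersonService` from the request services (using `GetRequiredService` from `Microsoft.Extensions.DependencyInjection`) might look like this:

```csharp
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.DependencyInjection;

// Hypothetical service interface, for illustration only.
public interface IPersonService
{
}

internal static class PersonIdParser
{
    // Hypothetical helper used by the function registered for parsing the entity ID.
    public static Guid ParseId(string nodeId, HttpContext httpContext)
    {
        // Resolve services from the current request scope when needed.
        var personService = httpContext.RequestServices.GetRequiredService<IPersonService>();

        // Use the service as required, then parse and return the Guid identity.
        return Guid.Parse(nodeId);
    }
}
```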
The `remoteTree` optional parameter adds support for plugins to implement Deploy's "partial restore" feature. This gives the editor the option to select an item to restore from a tree picker displaying details from a remote environment. The parameter is of type `DeployRegisteredEntityTypeDetail.RemoteTreeDetail`, which defines three pieces of information:
A function responsible for returning a level of a tree.
The name of the entity (or entities) that can be restored from the remote tree.
The remote tree alias.
An example function that returns a level of a remote tree may look like this:
Finally, the `entitiesGetter` parameter allows you to pass a function that will return all entities for the registered type. This is necessary to allow support for the "set signatures" operation available via the backoffice settings dashboard. By providing a function that returns the collection of entities, the triggered operation will be able to prepare the signature for each one.
To complete the setup for partial restore support, an external tree controller needs to be added, attributed to match the registered tree alias. Using a base class available in `Umbraco.Deploy.Forms.Tree`, this can look like the following:
Most features that are available for the deployment of Umbraco entities will also be accessible to entities defined in custom solutions or packages. The "Transfer Now" option, available from the "Save and Publish" or "Save" button for content/media, is only available to custom entities if requested client-side.
You would do this in the custom AngularJS controller responsible for handling the edit operations on your entity. Inject the `pluginEntityService` and call the `addInstantDeployButton` function as shown in the following stripped-down sample (for the full code, see the sample data repository linked above):
Umbraco Deploy improves the efficiency of transfers by caching a signature for each artifact in the database for each environment. The signature is a string-based, hashed representation of the serialized artifact. When an update is made to an entity, this signature value should be refreshed.
Hooking this up can be achieved by applying code similar to the following, extending the `ExampleDataDeployComponent` shown above.
Another way is to use the Deploy Dashboard in the Settings section of the Umbraco backoffice. Here you can see the status of ongoing or completed deployment processes. The status will show whether an operation has been triggered and whether it is in progress, has completed, or has failed. The dashboard will show the status based on the marker files on disk, e.g. `deploy-progress`. From the Deploy Dashboard it is also possible to trigger processes. Learn more about this dashboard in the Deploy Dashboard article.
An introduction to this feature can be found in the second half of the video linked above. There's also a code sample, demonstrated in that video, available in the sample data repository mentioned earlier.
Version specific documentation for upgrading to new major versions of Umbraco Deploy.
This page covers specific upgrade documentation for when migrating to major 13 of Umbraco Deploy.
If you are upgrading to a minor or patch version of Deploy, you can find the details about the changes in the release notes article.
Version 13 of Umbraco Deploy has a minimum dependency on Umbraco CMS core of `13.0.0`. It runs on .NET 8.
Version 13 contains a number of breaking changes. We don't expect many projects to be affected by them as they are in areas that are not typical extension points. For reference though, the full details are listed here:
The default value for the configuration option `ResolveUserInTargetEnvironment` was changed to `true`.
Configuration of relations was changed from a list to a dictionary.
Configuration of value connectors was changed from a list to a dictionary.
The following updates describe the more significant changes to the codebase and public API:
Moved value connectors for core property editors from `Umbraco.Deploy.Contrib` into `Umbraco.Deploy.Infrastructure`.
Renamed the Deploy add-on for Umbraco Forms from `Umbraco.Deploy.Forms` to `Umbraco.Forms.Deploy`.
The following updates are more minor. We don't expect many projects to be affected by them, as they are in areas that are not typical extension points:
Removed the obsolete `IsHeadless` property from `UmbracoCloudClientConfigurationInfo`.
Made the `ProcessX` methods for each step of content connectors private.
Removed the obsolete overload of `SaveContentType` on `ContentTypeConnectorBase`.
An obsolete constructor was removed from `DictionaryItemConnector`.
`QueueItemDto` was moved into the `Umbraco.Deploy.Infrastructure.Persistence` namespace.
`DocumentConnector` has a changed constructor, such that we can use redirect tracking logic now exposed from the CMS.
Removed now-unnecessary interfaces and extension methods for rich text parsing. These were introduced to ensure backward compatibility in older versions: `IMacroParser2`, `IImageSourceParser2` and `ILocalLinkParser2`.
You can find the version specific upgrade notes for versions out of support in the Legacy documentation on GitHub.