We have a solution where we use DACPAC deployment in an Azure DevOps pipeline to maintain an Azure SQL database.
Apart from the usual issues (like adding non-null columns), this works fine.
But recently, the Query Store ran full without SQL Server deleting old plans (as it was configured to do).
I therefore increased the “Max Size” of the Query Store, which solved the problem. But some time later, the “Max Size” value had been reverted to its default – a measly 100 MB.
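For reference, the Query Store state and size limit can also be inspected and raised directly in T-SQL (a sketch – the 1 GB value is just an example; on Azure SQL Database, `ALTER DATABASE CURRENT` targets the connected database):

```sql
-- Check the current Query Store state and size limit.
SELECT actual_state_desc, current_storage_size_mb, max_storage_size_mb
FROM sys.database_query_store_options;

-- Raise the limit, e.g. to 1 GB.
ALTER DATABASE CURRENT SET QUERY_STORE (MAX_STORAGE_SIZE_MB = 1024);
```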
It turns out it is the DACPAC deployment that overwrites this setting, reverting it on every deploy.
The default value is specified in the database project settings in Visual Studio – under “Project Settings” / “Database Settings” / “Optional”.
All the default values can be found here, including six for the Query Store.
It seems possible to stop the DACPAC deployment from overwriting database options as a whole.
Unchecking “Deploy database properties” in the database project’s publish dialog does not have any effect, though.
But here, they mention a “ScriptDatabaseOptions” property, which can be set to false when doing a programmatic deploy using “DacServices”.
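A minimal sketch of such a programmatic deploy with the DacFx API (the `Microsoft.SqlServer.Dac` NuGet package); server, database, and file names are placeholders:

```csharp
using Microsoft.SqlServer.Dac;

var dacOptions = new DacDeployOptions
{
    // Keep the target's database-scoped options (including the
    // Query Store Max Size) instead of resetting them from the
    // defaults baked into the dacpac.
    ScriptDatabaseOptions = false
};

using var package = DacPackage.Load("MyDatabase.dacpac");
var services = new DacServices(
    "Server=myserver.database.windows.net;...");  // placeholder connection string
services.Deploy(package, "MyDatabase", upgradeExisting: true, dacOptions);
```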
Checking the SqlPackage options (which can also be specified in an ‘Azure SQL Database deployment’ task), this property is found:
/p:ScriptDatabaseOptions=(BOOLEAN ‘True’) Specifies whether target database properties should be set or updated as part of the publish action.
So, it should be enough to add this in the “Additional SqlPackage.exe Arguments” box in Azure DevOps:
/p:ScriptDatabaseOptions=False
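In a YAML pipeline, the same argument goes into the `AdditionalArguments` input of the `SqlAzureDacpacDeployment@1` task (service connection, server, and paths below are placeholders):

```yaml
- task: SqlAzureDacpacDeployment@1
  inputs:
    azureSubscription: 'my-service-connection'      # placeholder
    ServerName: 'myserver.database.windows.net'     # placeholder
    DatabaseName: 'MyDatabase'
    DacpacFile: '$(Pipeline.Workspace)/drop/MyDatabase.dacpac'
    # Leave the target's database-scoped options untouched:
    AdditionalArguments: '/p:ScriptDatabaseOptions=False'
```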
To me, it does not make sense to have the deployment task overwrite these kinds of settings – at the very least, you then need to plan your deployments accordingly, especially for “emergency”-style deployments that update such values.