Returning a JSON response from Azure Functions v3+

It can be a pain to return JSON responses from a Function App when you also want the proper Content-Type HTTP header to be returned, which of course it should be.

If you do it the straightforward way (this works the same with OkObjectResult) –

return new BadRequestObjectResult(jsonResponse);

if the response contains a JSON string, it will be returned OK, but the Content-Type will be “text/plain”.

If you instead try to set the Content-Type explicitly –

BadRequestObjectResult resp = new BadRequestObjectResult(jsonResponse);
resp.ContentTypes.Add(MediaTypeHeaderValue.Parse("application/json"));
resp.Value = jsonResponse;
return resp;

then, if the response contains a JSON string, the special characters (like “) will be escaped and it will no longer be valid JSON, but the Content-Type is correct (application/json).

If the Content-Type is not set in the above example, the special characters are not escaped.

If you return a JsonResult, as suggested in some places, the Content-Type will be correct, but the JSON will be escaped.

return new JsonResult(jsonResponse);

Instead, make a class like this –

using System.Text.Json;
using Microsoft.AspNetCore.Mvc;

namespace Azure.Function.Returning.Json.Response {
  // A ContentResult that serializes the value itself with System.Text.Json and sets the
  // Content-Type to application/json, so the framework does not re-serialize or escape it
  public class CustomJsonResult : ContentResult {
    private const string ContentTypeApplicationJson = "application/json";

    public CustomJsonResult(object p_value, JsonSerializerOptions p_options = null, int p_statusCode = 200) {
      ContentType = ContentTypeApplicationJson;
      Content = p_options == null ? JsonSerializer.Serialize(p_value) : JsonSerializer.Serialize(p_value, p_options);
      StatusCode = p_statusCode;
    }
  }
}

and return the response like this –

 return new CustomJsonResult(jsonResponse, new JsonSerializerOptions { PropertyNamingPolicy = null }, 400);

This way, correct JSON will be returned, as well as the correct Content-Type.
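
For context, here is a minimal sketch of how this could be used from an HTTP-triggered function. The function name, route and payload object are purely illustrative and not part of the original example –

using System;
using System.Text.Json;
using Azure.Function.Returning.Json.Response;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class GetStatusFunction {
  [FunctionName("GetStatus")]
  public static IActionResult Run(
      [HttpTrigger(AuthorizationLevel.Function, "get", Route = "status")] HttpRequest req) {
    // Any serializable object works - an anonymous type is used here as a stand-in payload
    var payload = new { Status = "OK", Timestamp = DateTime.UtcNow };

    // Serialized by CustomJsonResult and returned with Content-Type: application/json
    return new CustomJsonResult(payload, new JsonSerializerOptions { PropertyNamingPolicy = null });
  }
}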

Boomi – Java keystore

If you need to interact directly with the Java keystore (cacerts) that Boomi uses, the default password for it is “changeit”.

It does not seem to be documented anywhere, but it is included in a few of the example keytool commands that can be found in the Boomi Community – for instance the one below.

https://community.boomi.com/s/article/Connecting-to-an-IFS-Applications-server-using-a-self-signed-certificate
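
As an example, listing the contents of the cacerts keystore with the default password could look like this (the keystore path is a placeholder – it depends on where the Java runtime used by your Atom is installed) –

keytool -list -keystore <path-to-boomi-java>/lib/security/cacerts -storepass changeit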

 

How to compare two revisions of a Boomi component

Everything that you develop in Boomi (processes, maps, connectors, profiles, etc.) is a component, and a component in Boomi is always represented (behind the pretty GUI) as an XML document.

This means that if you have made a change in a Boomi component and want to verify exactly what you changed, you can do so by comparing two revisions of the ComponentXML for that component. This will be a text comparison and not a GUI one, but with a little practice it is easy enough to interpret.
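
As a rough illustration of the idea: once you have two revisions of the ComponentXML saved to disk, even a naive line comparison will show what changed. The file names below are placeholders, and any real diff tool will of course do a better job –

using System;
using System.IO;
using System.Linq;

class CompareComponentXml {
  static void Main() {
    // Two revisions of the same component, saved as XML files (placeholder file names)
    string[] rev1 = File.ReadAllLines("component_rev1.xml");
    string[] rev2 = File.ReadAllLines("component_rev2.xml");

    // Naive line-based comparison: print lines that only occur in one of the revisions
    foreach (string line in rev1.Except(rev2))
      Console.WriteLine("- " + line.Trim());
    foreach (string line in rev2.Except(rev1))
      Console.WriteLine("+ " + line.Trim());
  }
}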

Continue reading “How to compare two revisions of a Boomi component”

Excel – open as “ReadOnly” / Viewing by default

If you open Excel files from SharePoint, they open in Edit mode by default.

This sadly means that whenever people interact with the sheet – for instance by filtering – the filtering will be saved. Extreeeemely annoying.

This may very well not be what you want. Fortunately, you can still set your Excel file to open by default in “Viewing” mode.
Continue reading “Excel – open as “ReadOnly” / Viewing by default”

Miracast – Wireless display/screen

While Google Chromecast and products like the Xiaomi Mi TV Stick support wireless screen sharing and wireless streaming, they come with caveats that may not make them the best solution in all cases.

They require:

  • A mobile app to control them
  • A Google account to use them

Especially the Google account makes them a no-go for me.

However, if all you want is to stream Netflix etc., both of these products might be a better option for you.

But if all you are interested in is a wireless screen for your PC or mobile device, there are other, simpler options.

Continue reading “Miracast – Wireless display/screen”

Boomi ExecutionRecords – Leveraging Atomsphere API

Recently, I had to extract all the documents of a Boomi process run to investigate a support case. It was 2000+ documents, and I was in no mood to do it manually.

Boomi has detailed how to do it here, and luckily a lot of the code and techniques from the “Code Search” project could be reused. But it is by no means simple to do – 5 different API calls are needed, and for each document to be fetched, two of these need to be called repeatedly while handling HTTP 202s along the way. It is also quite slow. It took several tries, and my 2000 documents took around 2+ hours to download.
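
The HTTP 202 handling essentially boils down to polling the same call until the API answers 200. A rough sketch of that part (the URL and the retry interval are placeholders, not the actual Atomsphere document calls) –

using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class AtomspherePolling {
  static readonly HttpClient _client = new HttpClient();

  // Repeatedly call an endpoint until it stops answering 202 Accepted
  static async Task<string> GetWhenReadyAsync(string url, int maxAttempts = 30) {
    for (int attempt = 1; attempt <= maxAttempts; attempt++) {
      HttpResponseMessage response = await _client.GetAsync(url);

      if (response.StatusCode == HttpStatusCode.Accepted) {
        // 202: the result is not ready yet - wait a bit and try again
        await Task.Delay(TimeSpan.FromSeconds(2));
        continue;
      }

      response.EnsureSuccessStatusCode();
      return await response.Content.ReadAsStringAsync();
    }

    throw new TimeoutException("Result was still not ready after " + maxAttempts + " attempts.");
  }
}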

Continue reading “Boomi ExecutionRecords – Leveraging Atomsphere API”

Boomi Code Search – Leveraging Atomsphere API

The Boomi Atomsphere API

If you, like me, would like to search through all your Boomi components for a certain text, the choice is very straightforward, because there is only one.

You have to code it yourself.

For this and many other uses, Boomi has their Atomsphere API, which does indeed expose quite a lot of functionality, but not quite as much as you might want.

Possible uses of the API:

  • Search code for text
  • Test code for known coding issues with external tools
  • Start execution of processes
  • Query a specific process execution for its results
  • …….
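
To give a taste of the search use case, the core loop could look something like the sketch below. Note that the base URL format, the authentication and the Component endpoint shown here are assumptions used to illustrate the idea; verify them against the Atomsphere API reference.

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class ComponentTextSearch {
  static readonly HttpClient _client = new HttpClient();

  // componentIds would come from a separate metadata query; here they are simply an input.
  // baseUrl is assumed to be something like https://api.boomi.com/api/rest/v1/<accountId>
  static async Task SearchAsync(string baseUrl, string user, string apiToken, IEnumerable<string> componentIds, string searchText) {
    _client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
      "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes(user + ":" + apiToken)));

    foreach (string id in componentIds) {
      // Assumed endpoint for fetching a single component's XML - check the API reference
      string xml = await _client.GetStringAsync(baseUrl + "/Component/" + id);

      if (xml.IndexOf(searchText, StringComparison.OrdinalIgnoreCase) >= 0)
        Console.WriteLine("Match in component " + id);
    }
  }
}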

Continue reading “Boomi Code Search – Leveraging Atomsphere API”

Boomi – Web Server Listening port not showing in netstat

When you install a new Boomi Atom and the “Shared Web Server” is enabled on port 9090 (the default), you sort of expect the Java atom process to be listening on that port right away.

I was searching for a method to locally query the atom for its generic status, and this page says that a special Endpoint is always available

http://<host>:<port>/_admin/status

This Endpoint is not authenticated.
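
A quick way to test it is from a small console program (host and port are of course placeholders for your own atom's Shared Web Server settings) –

using System;
using System.Net.Http;
using System.Threading.Tasks;

class AtomStatusCheck {
  static async Task Main() {
    using (HttpClient client = new HttpClient()) {
      // Replace host/port with your atom's Shared Web Server settings (9090 is the default port)
      string status = await client.GetStringAsync("http://localhost:9090/_admin/status");
      Console.WriteLine(status);
    }
  }
}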
Continue reading “Boomi – Web Server Listening port not showing in netstat”

LogicApps – remember state of a subscribing trigger

LogicApps has two different kinds of triggers

  • Polling
  • Subscribing

The polling triggers are easy. Whenever the LogicApp is enabled and its trigger is scheduled, it will poll the source for new messages.

The subscribing triggers, like the CDS (Dataverse) and SQL triggers, are a little harder.

Those triggers have state and remember where they left off, so if you disable the LogicApp and re-enable it, it will trigger for all of the items it has not seen since the last time the trigger fired.

This can give you massive problems if you disabled the LogicApp for the purpose of mass-updating e.g. SQL rows without triggering the LogicApp.
Continue reading “LogicApps – remember state of a subscribing trigger”

Azure DevOps DACPAC deployment to Azure SQL overwrites database properties

We have a solution where we use DACPAC deployment in an Azure DevOps pipeline to maintain the Azure SQL database.

Apart from the normal issues (like adding not-null columns), this works ok.

But recently, the Query Store ran full without SQL Server deleting old plans (as it was configured to do).

I therefore increased the “Max Size” of the Query Store, which solved the problem, but some time later the “Max Size” value was reverted back to its default (a measly 100 MB).

It turns out that it is the DACPAC deployment that overwrites this setting and constantly reverts it.

The default value is specified in the database project settings in the VS project – under “Project settings” / “Database Settings” / “Optional”.

Here all the default values can be found, including the 6 settings for the Query Store.

It seems to be possible to prevent the DACPAC deployment from overwriting database options as a whole.

Unchecking “Deploy database properties” in the database project publish settings does not have any effect.

But here, they mention a “ScriptDatabaseOptions” option, which can be set to false when doing a programmatic deployment using “DacServices”.
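
For reference, a programmatic deployment with that option could look roughly like this. This is only a sketch: it assumes the DacFx NuGet package (namespace Microsoft.SqlServer.Dac) and uses a placeholder connection string, database name and dacpac path –

using Microsoft.SqlServer.Dac;

class DacpacDeploy {
  static void Main() {
    // Placeholder connection string and dacpac path
    var services = new DacServices("Server=tcp:myserver.database.windows.net;Database=MyDb;User ID=...;Password=...");

    using (DacPackage package = DacPackage.Load(@"C:\build\MyDb.dacpac")) {
      var options = new DacDeployOptions {
        // Do not script/overwrite database-level options (such as the Query Store settings) on publish
        ScriptDatabaseOptions = false
      };

      // Deploy the package to the target database, upgrading the existing database in place
      services.Deploy(package, "MyDb", true, options);
    }
  }
}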

Checking the SqlPackage options (which can also be specified in an ‘Azure SQL Database deployment’ task), this property is found

/p: ScriptDatabaseOptions=(BOOLEAN 'True') Specifies whether target database properties should be set or updated as part of the publish action.

So, it should be enough to add this in the “Additional SqlPackage.exe Arguments” box in Azure DevOps

/p:ScriptDatabaseOptions=False

To me, it does not make sense to have the deployment task overwrite these kinds of settings. At the very least, you then need to plan your deployments accordingly, especially “emergency”-like deployments made just to update values.