Orlando IntegrationHub Changes
The Orlando release has been in Early Availability for weeks and GA is right around the corner. I previously covered the Flow Designer changes in Orlando and in this post I will do the same for IntegrationHub.
Dynamic Outputs
New York introduced Dynamic Inputs and now with Orlando we have Dynamic Outputs. These allow developers to define the data structure of a complex object programmatically and then use that as the output of an action.
As with Dynamic Inputs, there must be a Data Gathering Action that can hit a REST endpoint and execute a script, ultimately to determine the structure of the output object. An example would be an action that exports to a remote table, on either another instance or another system. The Data Gathering Action would include a call to that system to determine the structure of the object on the other side. Then, when the action is used in authoring, the complex object for the output is filled out automatically.
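To make that concrete, here is a hedged sketch of the kind of logic a Data Gathering Action's script step might perform: deriving an output structure from a sample record returned by the remote system. The function and field names are hypothetical, not a real IntegrationHub API.

```javascript
// Illustrative only: derive a list of { name, type } pairs from a sample
// record, the way a Data Gathering Action might describe the remote table.
function deriveOutputStructure(sampleRecord) {
  return Object.keys(sampleRecord).map(function (field) {
    var value = sampleRecord[field];
    var type = typeof value === 'number' ? 'integer'
             : typeof value === 'boolean' ? 'boolean'
             : 'string';
    return { name: field, type: type };
  });
}

// Example: a sample record fetched from the remote table
var structure = deriveOutputStructure({ number: 'INC0010001', priority: 2, active: true });
```

The resulting structure is what would drive the complex object shown in the action's outputs.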
I can state from recent first-hand experience that while the power of returning complex objects is great, the act of building them out can be tedious. This is exactly the type of task I would prefer to see offloaded to robots if I could.
Note: this feature requires the IntegrationHub Enterprise Pack.
Data Stream Features
Data Streams were introduced in New York and were given a beefier feature set in Orlando. There are three big parts to this.
- Data Streams can be defined with templates. By starting with the “Limit/Offset” template, your Data Stream Action will be prepopulated with the logic and calls for the common pagination paradigms used by APIs. It almost always requires some tweaking (if the parameters aren’t named exactly “limit” and “offset”, for example) but it is much faster and easier than starting from scratch.
- Data Streams can be called from script. Calling the API executeDataStreamAction(String name, Map inputs, Number timeout) returns a ScriptableDataStream object. This is an iterator that can be used to act on each individual object returned from the Data Stream, across the whole data set. Goodbye, coding up pagination logic yourself!
- Test Data Stream functionality has been introduced. Now you can test these actions like any Flow or Action in the system. Do be aware that the test will only return the first 20 rows if there are more than that. I was confused by this at first: it is not your pagination logic, it is the way the test works for this feature. Depending on the system being tested against, the returned set could be very large, so only a representative sample is returned.
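The iterator shape described above can be sketched with a runnable mock. On an instance you would get the stream from executeDataStreamAction(…); here makeMockStream stands in for it, with pagination (two “pages” of data) hidden behind next(), the way the real ScriptableDataStream hides it from your loop. The mock itself is an assumption for illustration, not platform code.

```javascript
// Mock of a ScriptableDataStream-style iterator over paginated data.
function makeMockStream(pages) {
  var pageIndex = 0, itemIndex = 0;
  return {
    hasNext: function () {
      return pageIndex < pages.length && itemIndex < pages[pageIndex].length;
    },
    next: function () {
      var item = pages[pageIndex][itemIndex++];
      // When a page is exhausted, "fetch" the next one behind the scenes.
      if (itemIndex >= pages[pageIndex].length) { pageIndex++; itemIndex = 0; }
      return item;
    },
    close: function () { /* the real API releases resources here */ }
  };
}

var stream = makeMockStream([[{ id: 1 }, { id: 2 }], [{ id: 3 }]]);
var ids = [];
try {
  while (stream.hasNext()) {
    ids.push(stream.next().id); // act on each object; page boundaries are invisible
  }
} finally {
  stream.close(); // close the stream when done, as you would on an instance
}
```

The loop never mentions limits or offsets, which is the point of the feature.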
Import Sets via IntegrationHub
When setting up Import Sets, you can now drive the import via a REST step in IntegrationHub. When creating a new Data Source, “REST (IntegrationHub)” is an option for the type. When that is selected, the form shows the options needed, including selecting the Data Source Request Action, along with a quick link to create one.
If you click the link, you are taken to the Action creation form in Designer. When it is filled out and accepted, the Action editor opens pre-configured, with the necessary inputs already created and the REST step added.
If necessary, you can also check a box to have a script step added to the action automatically. If there is logic needed between the inputs and the REST step, this is where you would put it.
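As an example of what such a script step might hold, here is a hedged sketch of massaging action inputs into the shape a REST step expects. The input names and path are hypothetical, chosen only to illustrate the idea.

```javascript
// Illustrative logic between inputs and the REST step: combine separate
// action inputs into a single resource path with an encoded query.
function buildRequestPath(basePath, tableName, updatedSince) {
  var query = 'sys_updated_on>' + updatedSince; // hypothetical query condition
  return basePath + '/' + tableName + '?sysparm_query=' + encodeURIComponent(query);
}

var path = buildRequestPath('/api/now/table', 'incident', '2020-01-01');
```

The REST step would then consume the computed path instead of the raw inputs.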
I’m biased now that I work for the IntegrationHub team, but I find this exciting. In my new role I am using some of this, including building Data Stream Actions from the pagination templates, and I can testify it saved me a lot of time on pretty tedious work. In the future I hope to explore Dynamic Outputs a little more; I’ve run across some use cases that might call for it, so I have more to learn there.
I hope all the developers in the community can share a little of that excitement and use these features to deliver faster, at higher quality, and with a better UX for the creators of Flows. Happy developing, and let me know how it goes for you!