Chuck Tomasi

Data Sources Refresher

Before I cover the newest feature, I want to provide a quick refresher on what data sources are and the role they play in importing data into your ServiceNow instance.

Data sources are the first step in importing data from files, JDBC (databases), web services, and more. The data is stored in a staging table called an import set, a tabular representation of the data extracted from the data source. That data is then run through a transform map, which translates incoming fields (or columns) to fields on the target table and optionally performs additional transformations, such as date formatting and normalization, before saving the resulting record in the target table. Here’s a diagram to help visualize the process.

[Diagram: data source → import set (staging table) → transform map → target table]
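
To make the transform step concrete, here is the kind of field-level source script a transform map can run. This is a minimal sketch: the source field name `u_purchase_date` and its incoming date layout are assumptions for illustration, not something prescribed by the platform.

```javascript
// Transform map field script: ServiceNow supplies the skeleton and the
// 'source' GlideRecord; the returned value becomes the target field value.
// u_purchase_date and its MM/DD/YYYY layout are assumed for this sketch.
answer = (function transformEntry(source) {
    var parts = String(source.u_purchase_date).split('/'); // e.g. "08/11/2020"
    if (parts.length != 3)
        return ''; // malformed input: leave the target field empty
    // Normalize to the internal yyyy-MM-dd date format
    return parts[2] + '-' + parts[0] + '-' + parts[1];
})(source);
```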

Data Stream Refresher

Data streams are special actions built into Flow Designer’s “action designer”. They are great for large data sets (greater than 10 MB) and use pagination implicitly. As an example, say I want to get 100,000 vehicle records from a vendor via a REST API, but it takes 15 minutes to retrieve all of those records in one request. The vendor’s API times out after 30 seconds, and a single response holding 100K records would be too large to store and process. That’s where pagination comes in: the data stream action can request 100 records at a time, each page taking only about a second, and the flow processes the results one record at a time with no knowledge of the page size. That makes data streams easy to build and easy to consume in a flow.
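
For the curious, the pagination pattern a data stream automates looks roughly like the sketch below. This is not what Flow Designer generates internally; the vendor endpoint, its `limit`/`offset` parameters, and the page size are assumptions for illustration.

```javascript
// Rough sketch of the pagination loop a data stream handles for you.
// The vendor endpoint and its limit/offset parameters are hypothetical.
var PAGE_SIZE = 100;
var offset = 0;
var page;
do {
    var request = new sn_ws.RESTMessageV2();
    request.setHttpMethod('get');
    request.setEndpoint('https://vendor.example.com/api/vehicles' +
        '?limit=' + PAGE_SIZE + '&offset=' + offset);
    var response = request.execute();
    page = JSON.parse(response.getBody()); // assume the body is a JSON array
    for (var i = 0; i < page.length; i++) {
        // A flow would receive these one at a time, unaware of the page size
        gs.info('Processing vehicle ' + (offset + i) + ': ' + JSON.stringify(page[i]));
    }
    offset += PAGE_SIZE;
} while (page.length == PAGE_SIZE); // a short page means no more records
```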

Note: Data streams require an IntegrationHub license.

Using a Data Stream as a Data Source

With Paris, Data Sources now include the ability to use an existing Data Stream as a Data Source type.

[Screenshot: Data Source form with the Type set to Data Stream (IntegrationHub)]

Here are the steps to use a Data Stream as a Data Source. Note: this assumes you already have a working Data Stream configured with valid credentials.

  1. Navigate to System Import Sets > Administration > Data Sources.

  2. Click New to create a new data source.

  3. Provide a Name and an Import set table label (the import set table name will be generated automatically), and set the Type to Data Stream (IntegrationHub).

  4. The **Data in a single column** field offers a choice: store the incoming JSON as a JSON string in a single field, or parse the properties and put the values in individual fields (similar to what you would see from a spreadsheet import). If you take the single-column option, see the transform script sketch at the end of this post.

    Sample import set record WITH data in a single column

    [Screenshot: import set record with the entire JSON payload in a single data field]

    Sample import set record WITHOUT data in a single column

    [Screenshot: import set record with the JSON properties parsed into individual fields]

  5. In the Data Stream action field, pick your data stream.

  6. Save the record.

    If your Data Stream has any input variables, they will appear on the form at this point.

  7. Populate the variables, if needed.

  8. Save the record again.

  9. Click Test load 20 records.
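
At this point you can inspect the staged rows from the import set table’s list view, or with a quick background script like this sketch. The table name `u_vehicle_import` and field name `u_data` are placeholders; substitute the import set table that was generated for you.

```javascript
// Background script: peek at the most recent rows the test load created.
// Table and field names below are placeholders for this sketch.
var row = new GlideRecord('u_vehicle_import');
row.orderByDesc('sys_created_on');
row.setLimit(5);
row.query();
while (row.next()) {
    // In single-column mode the raw JSON payload lands in one field
    gs.info(row.getValue('u_data'));
}
```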

Inspect the imported data carefully. When I did this with the single-column option turned on, I noticed that the data field was only 40 characters long, far too small for my JSON string. You may have to adjust the dictionary entry on the ‘data’ field to make it large enough. This is typical of import sets in general: you often have to adjust field sizes and types.
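
If you hit the same limit, you can widen the field from its Dictionary Entry form, or with a short background script like this sketch (again, the table and field names are placeholder assumptions):

```javascript
// Widen the staging field so a full JSON payload fits.
// 'u_vehicle_import' and 'u_data' are placeholders; use your own names.
var dict = new GlideRecord('sys_dictionary');
dict.addQuery('name', 'u_vehicle_import');   // the import set table
dict.addQuery('element', 'u_data');          // the single-column data field
dict.query();
if (dict.next()) {
    dict.setValue('max_length', 4000); // roomy enough for the JSON string
    dict.update();
}
```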

Now that you’ve tested the Data Stream as a Data Source, you are ready to build your transform map and any additional logic to import records.
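
If you chose the single-column option, that transform logic has to parse the JSON itself. Here is a minimal onBefore transform script sketch; the staging field `u_data` and the target fields (`u_vin`, `u_model`) are assumptions for illustration.

```javascript
// onBefore transform script: parse the raw JSON from the single data
// column and copy values onto the target record before it is saved.
// All field names here are assumed for the sketch.
(function runTransformScript(source, map, log, target) {
    var vehicle;
    try {
        vehicle = JSON.parse(source.u_data);
    } catch (e) {
        ignore = true; // skip rows whose payload is not valid JSON
        return;
    }
    target.setValue('u_vin', vehicle.vin);
    target.setValue('u_model', vehicle.model);
})(source, map, log, target);
```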
