
Load CMDB Data via Email

Loading CMDB data via email isn’t the best option; you would usually be better off building a REST integration. However, there are times when working with vendors or partners that email is the “best” option for loading data in a particular situation.

RELATED LINKS

Here are some articles that helped me build this and can help you too!

IMPORT PROCESS

Important Note: It is best practice to run the import in a development instance first to catch any errors or mistakes. It can be difficult to remove bad data once it is in production.

The import process is as follows:

  1. Inbound Action. Processes the inbound email

  2. Script Include. Code that loads and schedules the import set

  3. Data Source. Points to the Excel (or CSV) file attached to the email

  4. Import Set. Data is imported to an Import Set, a temporary staging location for the data

  5. Transform Map. Data is transformed to the Target Table

  6. Target Table. The end result of the import. CIs, Users, Groups, etc.

EXAMPLE IMPORT

STEP 1: INBOUND ACTION

I am reusing the code from this community post. You’ll want to set the importSetTableName variable to the name of your Import Set table and the transformMapIDs variable to the sys_id of the Transform Map you create later.

The community post also covers how to do this with scoped applications, but I am going to skip that topic in this article.

Name: Load CMDB Data
Active: true
Stop Processing: true
Execution Order: 20
Type: New
When to Run
Conditions: Subject starts with CMDB Data Load
Script:
(function runAction(/*GlideRecord*/ current, /*GlideRecord*/ event, /*EmailWrapper*/ email, /*ScopedEmailLogger*/ logger, /*EmailClassifier*/ classifier) {
	var importSetTableName = "u_cmdb_data_load"; //Name of Import Set
	var transformMapIDs = "8a36e396db33330033ba8e4748961919";   //SysID of Transform Map
	var applicationScope = "Global";

	// Create the datasource record
	current.name = "File import from: " + email.from;   //Feel free to rename this as appropriate
	current.import_set_table_name = importSetTableName;
	current.file_retrieval_method = "Attachment";
	current.type = "File";
	current.format = "Excel"; // For Excel Files
	//current.format = "CSV"; // For CSV Files
	current.header_row = 1;
	current.sheet_number = 1;
	current.sys_package.setDisplayValue(applicationScope);
	current.sys_scope.setDisplayValue(applicationScope);
	var dataSourceID = current.insert();

	/*
    * Schedule Load of Attachment
    *
    * This inbound email action will generate an import data source, however the attachment isn't copied to the data source until
    * after the insert of the record.   Scheduling the import to happen 30 seconds later so that the attachment has time to be copied.
    */

	new global.EmailFileImportUtils().scheduleImport(dataSourceID, transformMapIDs);

})(current, event, email, logger, classifier);
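
If you only expect these files from a specific vendor or partner, you can also tighten the condition with a script instead of the condition builder. A rough sketch that checks the sender as well as the subject (vendor.example.com is just a placeholder domain; email.from is the same property the script above already uses):

// Condition script sketch: only run for mail from a trusted partner domain
// whose subject starts with "CMDB Data Load". Replace the placeholder domain.
email.from.toLowerCase().indexOf("@vendor.example.com") > -1 &&
	email.subject.toLowerCase().indexOf("cmdb data load") == 0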

STEP 2: SCRIPT INCLUDE

Name: EmailFileImportUtils
Accessible from: All application scopes
Script:
var EmailFileImportUtils = Class.create();
EmailFileImportUtils.prototype = {
	initialize: function() {
	},
	
	scheduleImport: function(dataSourceID, transformMapIDs) {
		/*
 		* Create scheduled job to process import
 		*
 		* The inbound email action will generate an import data source, however the attachment isn't copied to the data source until
 		* after the insert of the record.   The code below will create a scheduled job to process the import 30 seconds later
 		* so that the attachment has time to be copied to the data source from the email.
 		*/
		
		var schRec = new GlideRecord("sys_trigger");
		schRec.name = "Load Data Source: " + dataSourceID;
		schRec.trigger_type = 0;   // Run Once
		schRec.script = "new global.EmailFileImportUtils().loadImportSet('" + dataSourceID + "', '" + transformMapIDs + "')";
		
		var nextAction = new GlideDateTime();
		nextAction.addSeconds(30);   // 30 seconds should be enough time however this can be changed.
		schRec.next_action = nextAction;
		schRec.insert();
	},
	
	loadImportSet: function(dataSourceID, transformMapIDs) {
		// Get Datasource Record
		var dataSource = new GlideRecord("sys_data_source");
		dataSource.get(dataSourceID);
		
		// If CSV and header isn't on row 1, recreate attachment with empty rows removed
		if (dataSource.getValue("format") == "CSV" && dataSource.getValue("header_row") > 1) {
			var attachmentRec = new GlideRecord("sys_attachment");
			attachmentRec.addQuery("table_sys_id", dataSource.getValue("sys_id"));
			attachmentRec.query();
			if (attachmentRec.next()) {
				var oldAttachmentID = attachmentRec.getValue("sys_id");
				var inputStream = new GlideSysAttachment().getContentStream(oldAttachmentID);
				var textReader = new GlideTextReader(inputStream);
				var ln = " ";
				var newLine = "";
				
				var lineCounter = 0;
				var headerRow = parseInt(dataSource.getValue("header_row"));
				while((ln = textReader.readLine()) != null) {
					lineCounter += 1;
					if (lineCounter < headerRow)
						continue;   // Skip rows before the header so the header becomes the first row

					if (ln.length > 1) {
						newLine += ln + "\n";
					}
				}
				new GlideSysAttachment().write(dataSource, 'Changed ' + attachmentRec.getValue('file_name'), "text/csv", newLine);
			}
		}		
		
		// Process data source file
		var loader = new GlideImportSetLoader();
		var importSetRec = loader.getImportSetGr(dataSource);
		var ranload = loader.loadImportSetTable(importSetRec, dataSource);
		importSetRec.state = "loaded";
		importSetRec.update();
		
		// Transform import set
		var transformWorker = new GlideImportSetTransformerWorker(importSetRec.sys_id, transformMapIDs);
		transformWorker.setBackground(true);
		transformWorker.start();
	},
	
	type: 'EmailFileImportUtils'
	
};
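
If you want to exercise the Script Include without sending an email, you can call it from a background script against an existing data source record. A minimal sketch, with placeholder sys_ids:

// Background script sketch: manually load and transform an existing data source.
// Both sys_ids are placeholders for your own records.
var dataSourceID = "<sys_id of a sys_data_source record>";
var transformMapIDs = "<sys_id of your transform map>";
new global.EmailFileImportUtils().loadImportSet(dataSourceID, transformMapIDs);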

STEP 3: CREATE EXCEL FILE

Here is an example Excel file. You can import any number of columns and data types.

Class | Asset tag | Serial number | Manufacturer | Model ID | Assigned to
cmdb_ci_computer | P1000241 | 56WHL71 | Gateway | Gateway DX Series | Carol Coughlin
cmdb_ci_computer | P1000640 | FVG-200-L80989-GT | Apple | Apple MacBook Air 13" | Florine Willardson
cmdb_ci_computer | P1000512 | KIE-450-K88260-FO | Apple | Apple MacBook Pro 15" | Mellissa Sule
cmdb_ci_computer | P1000479 | BQP-854-D33246-GH | Apple | Apple MacBook Pro 15" | Miranda Hammitt
cmdb_ci_computer | P1000249 | RSB-980-E66113-CM | Dell Inc. | Dell Inc. PowerEdge T610 | Genevieve Kekiwi
cmdb_ci_computer | P1000251 | KWF-742-G95931-TF | Dell Inc. | Dell Inc. PowerEdge T610 | Emilia Oxley
cmdb_ci_computer | P1000250 | FBJ-435-J27123-DL | Dell Inc. | Dell Inc. PowerEdge T610 | Freida Michelfelder
cmdb_ci_computer | P1000252 | QVQ-746-C74363-IT | Dell Inc. | Dell Inc. PowerEdge T610 | Darrell Amrich
cmdb_ci_computer | P1000471 | ERU-672-W29354-HW | IBM | IBM Thinkpad T20 | Roger Seid
cmdb_ci_computer | P1000454 | GOU-623-U55281-RH | IBM | IBM Thinkpad T20 | Suzette Devaughan
cmdb_ci_computer | P1000467 | OGT-690-V53734-CD | IBM | IBM Thinkpad T20 | Tyree Courrege
cmdb_ci_computer | P1000465 | LSD-159-C38065-XA | IBM | IBM Thinkpad T20 | Tammie Schwartzwalde

STEP 4: LOAD DATA

Loading the data initially will create the import set table. I suggest loading just one row of data to create the import set table and columns. There are other ways to create the import set table, but this is probably the easiest.

1. Login as an administrator
2. Left Navigator Bar > System Import Sets > Load Data

  • Create Table

  • Import Set Table: CMDB Data Load [u_cmdb_data_load]

  • Source of the import: File

  • File: Select your spreadsheet

  • Sheet number: 1

  • Header row: 1

3. Click Submit
4. After the Import Set table is created, click the Create transform map link.

STEP 5: TRANSFORM MAP

TRANSFORM MAP

1. Fill in fields

  • Name: CMDB Data Load

  • Source Table: CMDB Data Load [u_cmdb_data_load]

  • Target Table: Configuration Item [cmdb_ci]

  • Run Business Rules: true

  • Run Script: true

  • Enforce Mandatory fields: No

  • Copy empty fields: false (set this to true if you want empty source values to blank out target fields)

  • Script: target.name = source.u_asset_tag; //Your naming method

2. Click Save
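
The one-line Script above simply names the CI after its asset tag. If you want a fallback when the asset tag is blank, a sketch like this works in the same map-level Script field (source and target are provided by the transform engine):

// Transform Map script sketch: name the CI after its asset tag,
// falling back to the serial number when the asset tag is blank.
if (!source.u_asset_tag.nil()) {
	target.name = source.u_asset_tag;
} else {
	target.name = source.u_serial_number;
}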

FIELD MAPS

The field maps map the fields from the import set to the target table. In this example, the following mappings are used:

Source | Target | Coalesce | Choice action
u_model_id | model_id | false | ignore
u_serial_number | serial_number | true | ignore
u_manufacturer | manufacturer | false | ignore
u_asset_tag | asset_tag | false | ignore
u_class | sys_class_name | false | ignore
u_assigned_to | assigned_to | false | ignore

COALESCE METHODS

Set coalesce to true on the field map for the field you want to match existing records on. A common matching field for configuration items is serial number.
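
If some rows are missing the coalesce value entirely, you may prefer to skip them rather than import them. A minimal onBefore transform script sketch, assuming serial number is the coalesce field as above:

// onBefore transform script sketch: skip any import row without a serial number.
if (source.u_serial_number.nil()) {
	ignore = true; // 'ignore' is provided by the transform engine and skips this row
}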

If you need something more complex, here is an article that can help:

TRANSFORM SCRIPTS

Often data isn’t perfect. You may need to write a transform script to manipulate certain data. A common thing to do is write a transform script for a reference field to find a sys_id.
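
For example, the Assigned to column in the spreadsheet holds display names, while assigned_to on the CI is a reference to sys_user. A source script on that field map along these lines can resolve the name to a sys_id (a sketch; match on whatever your data actually contains):

// Field map source script sketch: resolve the display name from the spreadsheet
// to a sys_user sys_id for the assigned_to reference field.
answer = (function() {
	var userRec = new GlideRecord("sys_user");
	userRec.addQuery("name", source.u_assigned_to);
	userRec.query();
	if (userRec.next()) {
		return userRec.getUniqueValue();
	}
	return ""; // Leave the reference empty if no matching user is found
})();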

Here is an article on how to write transform scripts if needed:

STEP 6: TEST

After your Import Set, Transform Map, and Transform Scripts are ready, email the CMDB data file into ServiceNow. Start with a small sample so that you can test the process.

STEP 7: REVIEW RESULTS

You can view the import set table (u_cmdb_data_load) to see what data was processed and whether any rows had errors.
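
If you would rather check programmatically than through the list view, a background script along these lines lists the rows that ended in error (sys_import_state and its comment field come from the base import set row table; u_serial_number is just the example column from this spreadsheet):

// Background script sketch: list import set rows that failed to transform,
// along with the comment explaining why.
var rowRec = new GlideRecord("u_cmdb_data_load");
rowRec.addQuery("sys_import_state", "error");
rowRec.query();
while (rowRec.next()) {
	gs.print(rowRec.getValue("u_serial_number") + ": " + rowRec.getValue("sys_import_state_comment"));
}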

Running an import in development allows you to test multiple times until it is correct.

Best of luck in your imports!

Tags Development, Integrations