Outbound Jobs
Overview
This section outlines how to create and manage your Outbound Jobs.
You can find your outbound jobs by clicking on the Outbound Data menu item.
Outbound Transfers Overview
App Filter – Filter jobs to specific connected apps. For example, click DataStation to only see jobs that were created via the DataStation.
View Mode – Click the toggle button to switch between showing and hiding the steps for each job.
Job Listing – Displays a list of all active jobs, and, if the View Mode is set to Expanded, also shows the steps under each job.
Job History – Displays recent job runs for all jobs.
Job Menu
The list of jobs contains actions that you can use to monitor and manage each job.
If the View Mode is set to Expanded, you can manage each job step as well.
Edit – Edit this job definition.
Delete – Delete this job definition, including all of its steps.
View Job History – Displays all previous runs for this job.
View Hashing Info – Displays information about the current state of the record hashes for this job, including the ability to check if an ID exists in the hash, delete a single hash, or delete all hashes.
Run Job Now – Immediately starts an ad-hoc run of the current job.
Edit Step – Edits this job step.
Delete Step – Deletes this job step.
Add Step – Adds a new step to this job. Use the Order field to control when the step is executed in the sequence.
Adding an Outbound Transfer
Press the Add Transfer Job button to add a new job. The Create Job form will appear.
Data Settings
Choose a data source for this job.
IQA
If "IQA" is selected, you must provide an IQA which meets the following criteria:
Returns at least one row (required initially so that the DataStation can read the column names; afterwards, the IQA may return 0 results)
Has no user-facing prompts (required or optional)
Filters are OK to add as long as Prompt is set to None.
Delta Hashing
If enabled, you must specify a Key Column Name that DataStation will use to uniquely identify each record in your dataset.
Good examples of key columns are:
iMIS ID or Contact Key (most common)
(For a list of events) Event Code
(For a list of groups) Group ID
(For a list of transactions) Transaction ID + Line Number
(For a list of batches) Batch Number
(For a list of activities) Activity Sequence Number
You may also elect to enable Send Empty Dataset, which sends an empty payload (e.g. an empty JSON array, or an empty or headers-only file) to the destination endpoint(s) when the dataset contains no records.
If this setting is off, no payload is built and no requests are made.
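Conceptually, delta hashing stores one hash per key and only re-sends records whose hash has changed since the last run. The Python sketch below is purely illustrative (DataStation's internal hashing may differ) and shows why the Key Column Name must uniquely identify each record:

import hashlib
import json

def row_hash(row, key_column):
    # Hash every column except the key itself, in a stable order.
    payload = {k: v for k, v in row.items() if k != key_column}
    return hashlib.sha256(json.dumps(payload, sort_keys=True, default=str).encode()).hexdigest()

def delta(rows, key_column, stored_hashes):
    # Return only the rows that are new or changed since the previous run.
    changed = []
    for row in rows:
        key = str(row[key_column])
        digest = row_hash(row, key_column)
        if stored_hashes.get(key) != digest:
            changed.append(row)
            stored_hashes[key] = digest
    return changed

# Example: the second run returns only the contact whose e-mail changed.
hashes = {}
delta([{"ID": 101, "Email": "a@example.org"}, {"ID": 102, "Email": "b@example.org"}], "ID", hashes)
delta([{"ID": 101, "Email": "new@example.org"}, {"ID": 102, "Email": "b@example.org"}], "ID", hashes)

If two records shared the same key, the later one would overwrite the earlier one's hash, which is why a non-unique key column can cause records to be skipped or re-sent unnecessarily.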
None
If "None" is selected, the job will run without any data source.
Tip!
Jobs with no data source are useful if you want to invoke an HTTP endpoint on a regular schedule, and you have a static payload that you want to send (that doesn't depend on any iMIS data).
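For example, a job with no data source behaves roughly like the Python sketch below on each scheduled run; the endpoint URL and payload here are hypothetical placeholders, not DataStation API calls:

import requests  # third-party HTTP client: pip install requests

STATIC_PAYLOAD = {"event": "nightly-ping", "source": "DataStation"}

def run_static_job():
    # Post the same fixed payload on every run; no iMIS data is read.
    response = requests.post(
        "https://example.org/api/notify",  # hypothetical destination endpoint
        json=STATIC_PAYLOAD,
        timeout=30,
    )
    response.raise_for_status()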
General Settings
Crontab Schedule
DataStation uses Quartz-style crontabs to define job schedules.
You can use an online Quartz cron generator to create or validate a Crontab expression, or click the Schedule Builder button to the right of the Crontab field to open the Schedule Builder, where you can fill out a form to define a one-time or recurring schedule.
Examples
Quartz-style crontabs contain two extra parameters (Second and Year) compared to a traditional Unix-style crontab expression.
In addition, either the day-of-month OR day-of-week fields MUST contain a "?".
The Quartz-style crontab is expressed as:
Second | Minute | Hour | Day-of-Month | Month | Day-of-Week | Year
Some examples:
0 0 6 * * ? * – At 6:00 AM, every day
0 0 16 ? * 1 * – At 4:00 PM, only on Sundays
0 30 2/6 ? * * * – At 2:30 AM, 8:30 AM, 2:30 PM, and 8:30 PM (every 6 hours, beginning at 2:30 AM), every day
0 15 22 8 6 ? 2021 – At exactly 10:15 PM on June 8, 2021 (runs only once)
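If you want to sanity-check an expression before saving a job, the short Python sketch below (not part of DataStation) splits a Quartz-style crontab into its named fields and enforces the "?" rule described above:

QUARTZ_FIELDS = ["Second", "Minute", "Hour", "Day-of-Month", "Month", "Day-of-Week", "Year"]

def describe_quartz_cron(expression):
    parts = expression.split()
    if len(parts) == 6:
        parts.append("*")  # the Year field is optional in Quartz
    if len(parts) != 7:
        raise ValueError("Expected 6 or 7 fields, got %d" % len(parts))
    if "?" not in (parts[3], parts[5]):
        raise ValueError("Either Day-of-Month or Day-of-Week must be '?'")
    return dict(zip(QUARTZ_FIELDS, parts))

print(describe_quartz_cron("0 0 6 * * ? *"))
# {'Second': '0', 'Minute': '0', 'Hour': '6', 'Day-of-Month': '*', 'Month': '*', 'Day-of-Week': '?', 'Year': '*'}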
Job Draft / Schedule Tip!
If you want to create a job, but not have the schedule run just yet, replace the year parameter in the Crontab expression with a year far in the future, such as 2050.
For example, if you wanted to draft a job that ran at 8AM every day, replace this crontab:
0 0 8 * * ? *
with this one:
0 0 8 * * ? 2050
And the job will not run until you update the schedule expression again.
To undo, simply replace "2050" with "*", which instructs the job to run every year.
Number of History Records to Keep
Specify the number of total job history records that you want the DataStation to retain.
Once this count is exceeded, DataStation automatically prunes old history records.
You may specify a value from 1 to 500.
Example
If you have a job that runs nightly, and you want to keep 6 months of history logs, enter 180.
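In other words, the retention setting keeps only the most recent N runs. As an illustrative sketch (not DataStation's actual code):

def prune_history(history, keep=180):
    # history is ordered oldest -> newest; drop the overflow from the front.
    overflow = len(history) - keep
    return history[overflow:] if overflow > 0 else history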
E-mail Alert Settings
E-mail Alert Setting
Specify when you would like to receive e-mail alerts after a job has completed:
None – This option disables all e-mail notifications.
Errors Only – You will only receive a summary e-mail if the job fails.
Always – You will receive an e-mail summary each time the job runs. The e-mail will indicate if the job succeeded or failed.
Email Address(es)
Specify one or more e-mail addresses where you would like to receive the job notifications.
Separate multiple e-mail addresses with a comma (,).
Finishing Up
When finished, press Save New Job.
If successful, you will see a message saying that your job definition has been created. You will immediately be taken to the "Add New Step" screen, where you can proceed to set up your first job step.
See Outbound Job Steps for more information about creating and managing individual job steps.
Troubleshooting/Guidelines
SELECT DISTINCT (unique results) causes performance issues and should be avoided unless absolutely necessary
iTransfer already performs deduplication of data under certain circumstances
"NOLOCK" should be used for increased iMIS performance while iTransfer jobs are running
"Limit by" should not be used at all since it is not respected by the REST API
There should be no visible "prompt" filters - only internal (Prompt=None) filters
All columns should have aliases
Do not use "Item" as this is a reserve word.
"Hidden" columns (starting with underscores) are not actually "hidden" in iTransfer so if you don't want a column, just don't include it
Avoid sorting; it can cause issues and is not always respected
Test the IQA both using the "Run" tab and the "Report" tab to ensure performance is good, and verify the number of total records matches what you expect