This section outlines how to create and manage your Outbound Jobs.
You can find your outbound jobs by clicking on the Outbound Data menu item.
Outbound Transfers Overview
- App Filter – Filter jobs to specific connected apps. For example, click DataStation to only see jobs that were created via the DataStation.
- View Mode – Click the toggle button to change between showing or hiding the steps for each job.
- Job Listing – Displays a list of all active jobs, and, if the View Mode is set to Expanded, also shows the steps under each job.
- Job History – Displays recent job runs for all jobs.
The list of jobs contains actions that you can use to monitor and manage each job.
If the View Mode is set to Expanded, you can also manage each job step.
- Edit – Edit this job definition.
- Delete – Delete this job definition, including all of its steps.
- View Job History – Displays all previous runs for this job.
- View Hashing Info – Displays information about the current state of the record hashes for this job, including the ability to check if an ID exists in the hash, delete a single hash, or delete all hashes.
- Run Job Now – Immediately starts an ad-hoc run of the current job.
- Edit Step – Edits this job step.
- Delete Step – Deletes this job step.
- Add Step – Adds a new step to this job. Use the Order field to control when the step is executed in the sequence.
Adding an Outbound Transfer
Press the Add Transfer Job button to add a new job. The Create Job form will appear.
Choose a data source for this job.
DataStation uses Quartz-style crontabs to define job schedules.
You can use this online tool to generate or validate a Crontab expression, or click the Schedule Builder button to the right of the Crontab field to open the Schedule Builder, where you can fill out a form to define a one-time or recurring schedule.
Quartz-style crontabs contain two extra parameters (Second and Year) compared to a traditional Unix-style crontab expression.
In addition, either the day-of-month or the day-of-week field must contain a "?".
The Quartz-style crontab fields are, in order: Second, Minute, Hour, Day-of-month, Month, Day-of-week, Year. Examples:
0 0 6 * * ? * – At 6:00 AM, every day
0 0 16 ? * 1 * – At 4:00 PM, only on Sundays
0 30 2/6 ? * * * – At 2:30 AM, 8:30 AM, 2:30 PM, and 8:30 PM (every 6 hours, beginning at 2:30 AM), every day
0 15 22 8 6 ? 2021 – At exactly 10:15 PM on June 8, 2021 (runs only once)
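The seven-field layout and the "?" rule above can be sketched as a small validation helper. This is an illustrative example only, not part of DataStation; the function name and behavior are assumptions for demonstration.

```python
# Hypothetical helper (not a DataStation API): split a Quartz-style
# crontab into its seven named fields and enforce the "?" rule.
QUARTZ_FIELDS = ["second", "minute", "hour", "day_of_month",
                 "month", "day_of_week", "year"]

def parse_quartz(expr):
    """Return a dict mapping field name -> value, or raise ValueError."""
    parts = expr.split()
    if len(parts) != len(QUARTZ_FIELDS):
        raise ValueError("expected 7 fields, got %d" % len(parts))
    fields = dict(zip(QUARTZ_FIELDS, parts))
    # Quartz requires a "?" in day-of-month OR day-of-week.
    if "?" not in (fields["day_of_month"], fields["day_of_week"]):
        raise ValueError('day-of-month or day-of-week must be "?"')
    return fields

# "At 6:00 AM, every day"
print(parse_quartz("0 0 6 * * ? *")["hour"])  # -> 6
```

A quick check like this can catch a malformed expression before you save the job.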
Job Draft / Schedule Tip!
If you want to create a job, but not have the schedule run just yet, replace the year parameter in the Crontab expression with a year far in the future, such as 2050.
For example, if you wanted to draft a job that ran at 8AM every day, replace this crontab:
0 0 8 * * ? *
with this one:
0 0 8 * * ? 2050
And the job will not run until you update the schedule expression again.
To undo, simply replace "2050" with "*", which instructs the job to run every year.
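The draft trick above amounts to swapping the seventh (Year) field of the expression. A minimal sketch, assuming a 7-field Quartz expression (the function name is hypothetical, not a DataStation feature):

```python
# Illustrative only: park or unpark a job by rewriting the Year field
# of its Quartz crontab expression.
def set_year(expr, year):
    """Replace the 7th (Year) field of a Quartz crontab expression."""
    parts = expr.split()
    if len(parts) != 7:
        raise ValueError("expected a 7-field Quartz expression")
    parts[6] = year
    return " ".join(parts)

parked = set_year("0 0 8 * * ? *", "2050")  # draft: won't run until 2050
print(parked)                  # 0 0 8 * * ? 2050
print(set_year(parked, "*"))   # undo: run every year again
```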
Number of History Records to Keep
Specify the number of total job history records that you want the DataStation to retain.
Once this count is exceeded, DataStation automatically prunes the oldest history records.
You may specify a value from 1 to 500.
If you have a job that runs nightly, and you want to keep 6 months of history logs, enter 180.
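The arithmetic behind that 180-record suggestion (runs per day x ~30 days x months, capped at the 500-record maximum) can be sketched as follows; the function is illustrative, not part of the product:

```python
# Back-of-the-envelope sizing for the history limit (illustrative only).
def records_to_keep(runs_per_day, months, cap=500):
    """Approximate history records needed, using 30-day months,
    capped at DataStation's maximum setting of 500."""
    needed = runs_per_day * 30 * months
    return min(needed, cap)

print(records_to_keep(1, 6))  # nightly job, 6 months -> 180
```

Note that a job running more than about twice a day cannot retain a full 6 months of history within the 500-record cap.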
E-mail Alert Settings
E-mail Alert Setting
Specify when you would like to receive e-mail alerts after a job has completed:
- None – This option disables all e-mail notifications.
- Errors Only – You will only receive a summary e-mail if the job fails.
- Always – You will receive an e-mail summary each time the job runs. The e-mail will indicate if the job succeeded or failed.
Specify one or more e-mail addresses where you would like to receive the job notifications.
Separate multiple e-mail addresses with a comma (,).
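For clarity, a comma-separated recipient value resolves to individual addresses like this (an illustrative sketch, not DataStation's actual parsing code):

```python
# Illustrative only: split a comma-separated recipient list into
# individual, whitespace-trimmed addresses.
def split_recipients(value):
    return [addr.strip() for addr in value.split(",") if addr.strip()]

print(split_recipients("ops@example.org, alerts@example.org"))
# -> ['ops@example.org', 'alerts@example.org']
```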
When finished, press Save New Job.
If successful, you will see a message saying that your job definition has been created. You will immediately be taken to the "Add New Step" screen, where you can proceed to set up your first job step.
See Outbound Job Steps for more information about creating and managing individual job steps.
IQA Best Practices
- SELECT DISTINCT (unique results) causes performance issues and should be avoided unless absolutely necessary
- iTransfer already performs deduplication of data under certain circumstances
- "NOLOCK" should be used for increased iMIS performance while iTransfer jobs are running
- Limit by should not be used at all since it is not respected by the REST API
- There should be no visible "prompt" filters - only internal (Prompt=None) filters
- All columns should have aliases
- Do not use "Item" as a column alias, as it is a reserved word.
- "Hidden" columns (starting with underscores) are not actually "hidden" in iTransfer so if you don't want a column, just don't include it
- Avoid sorting; it causes issues and the sort order is not always respected
- Test the IQA both using the "Run" tab and the "Report" tab to ensure performance is good, and verify the number of total records matches what you expect