Job Manager
Anything classified as a job in the software is tracked in the NU_JOB_MSTR table in the BusinessPlus database, including CDD Reports, tools launched from pages, and utilities. The Job Manager plugin in the Administrative Console (under Database Admin, Utilities) is used to monitor these jobs.
Jobs represent work that is being done on a user's behalf by the 7i server or the application server. There are several types of jobs. They are all tracked through the same mechanism and consolidated on the same Jobs list.
Possible job types:
Classic Job | Jobs processed by the application server.
CDD Report | Reports run from CDD installed on a client PC.
CDD 7i Report | Reports run through 7i.
Tool Execution | Tools requested from the options bar on 7i pages.
Utility Job | Utility jobs processed by the application server.
Workflow | Jobs processed by the 7i Workflow engine, submitted from multiple areas of 7i.
Menu Options
Refresh: Refreshes the Job Manager information. The Job Manager automatically refreshes based on user settings.
Filter By Type: Since the Job Manager can contain a significant amount of data, use this option to hide job types that are not of interest. For example, Classic Interact jobs are interactive jobs on the application server and may not be relevant, depending on what the user monitoring job status needs to see.
Filter By Status: Allows jobs to be filtered by their status. This is useful, for example, to remove large numbers of completed jobs and make the list more manageable.
Top-Level Group: Changes the top-level grouping in the Job Manager to be based on Status, Type, or User. Only one top-level grouping can be defined at any given time.
Cancel Job: If the currently selected job can be cancelled, this toolbar button and a Cancel Job context menu item are enabled. Selecting either sets the status of the job to CR (Cancel Requested). When the software reaches a point where it can safely cancel the job, the job is terminated and the status automatically changes to CA (Cancelled). Whether a job can be cancelled depends on both the type and the status of the job.
Search
By default, the Job Manager lists the jobs created with today's date. The Job Search panel allows the user to search for other jobs in the system based on the criteria listed in the panel. To return the Job Manager to its default, simply clear the search criteria and select Search.
Job Tracking
When jobs are introduced into the system, a record is created in the table nu_job_mstr. This record will track the progress of a job from introduction to completion, including any success or failure information and other relevant details. The information shown on the 7i Jobs page and the Job Manager screen is collected from these nu_job_mstr records.
Job records will remain in nu_job_mstr until they expire and are removed, based on the settings defined in the Configure Local Server console plugin. When the record of a job is removed, the corresponding job output may optionally be removed as well. Within nu_job_mstr, the status of a job is encoded in the column nuj_status with one of the following values:
CA | Cancelled (by user request or due to job limit)
FL | Failed (completed unsuccessfully or terminated)
IN | Initialized (newly created job)
IP | In progress (started by the Workflow Engine, processing)
OK | Successful completion
QU | Queued, pending (job not yet executing)
WF | Queued by Workflow (instance not yet processed by the Workflow Engine)
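To get a quick picture of how jobs are distributed across these status codes, the nuj_status column can be queried directly. The commands below are a minimal sketch rather than part of the product: they assume a SQL Server back end that is reachable with the sqlcmd utility, and SERVER, DATABASE, USER, and PASS are placeholders for local values.
# Count BusinessPlus jobs by status code (CA, FL, IN, IP, OK, QU, WF, plus any
# transient codes such as CR). SERVER, DATABASE, USER, and PASS are placeholders.
sqlcmd -S SERVER -d DATABASE -U USER -P PASS \
    -Q "SELECT nuj_status, COUNT(*) AS jobs FROM nu_job_mstr GROUP BY nuj_status ORDER BY nuj_status"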
Documents Online
From the Administrative Menu, the user can search by specific output. When one or more documents are found in Documents Online, a plus sign is shown next to the job. Click the plus sign to expand the list of available documents; a document can be opened by double-clicking it.
Job Output
There are two main types of output produced by BusinessPlus: reports and tail sheets. Reports are created by jobs that are specifically designed to produce output. Tail sheets are produced by the system as a record of the job itself, including details about the job request, the actual processing and the results, where relevant.
Job output is stored within Documents Online. In addition to being available through links on the Job Manager page, the job output and tail sheet may also be displayed using the Documents Online pages.
In the case of classic jobs, the View Tail Sheet option will attempt to contact the application server to obtain the first 100 lines of the tail sheet. This option requires some configuration on the server (x_print_cmds) to ensure that copies of all tail sheets are maintained in a known location.
In the case of Workflow Jobs that are in progress, the View Tail Sheet option will read the relevant ifas_output_dtl records from the database and present the incomplete tail sheet information for the job in progress.
The user can display and store the BT70 or workflow job output in plain text and XML.
View Tail Sheet
If the Job being displayed has tail sheet information available, it can be accessed either by double-clicking the job in the list or by right-clicking and selecting View Tail Sheet. Tail sheet information can come from output stored on the Application Server or from tail sheets archived in Documents Online.
Properties
To display the full property list for a given job, right-click the desired job and select the Properties menu option.
User Settings
To control how often the job information is refreshed, select the Preferences, Settings option from the toolbar. The Job Manager tab allows the user to select the frequency of the Job Status refresh queries.
Classic Job Output
Classic BusinessPlus jobs run on an application server may also have their output redirected to Documents Online, but this requires additional configuration. To use this feature, it is necessary to specify a particular printer (typically called Workflow), which must be defined in the $XPORTDIR/scripts/x_print_cmds file on the application server. Typically, that printer definition looks like this:
workflow)
    # Queue the spool file ($1) to Documents Online through wfqueue; errors
    # and diagnostics are appended to /tmp/wfqueue.log.
    wfqueue -v -u $3 -m $5 -j $8 $1 >>/tmp/wfqueue.log 2>&1
    ;;
In most cases, it is also desirable to store classic tail sheets in a location that is accessible from 7i. This requires using a Documents Online printer (Workflow) or copying all STDLIST output (the tail sheets) to a STDLISTS subdirectory. When the STDLISTS subdirectory is used, the tail sheet copies are retained on the application server and can be partially displayed (the first hundred lines) using the link on the Jobs page. In this configuration, the application server file $XPORTDIR/scripts/x_print_cmds should contain logic similar to the following:
case $5
in
STDLIST)
    # Copy each STDLIST tail sheet into a directory that 7i can search.
    export auditdir=$XPORTDIR/.spool/STDLISTS
    shortname=`basename $1`
    cp $1 $auditdir/$shortname
    # Pull the job mask from the ":JOB" line of the copied tail sheet and log the copy.
    mask=`grep ":JOB" $auditdir/$shortname|cut -c5-20|cut -f1 -d","`
    echo "$mask `date` $1 $3" >> $auditdir/STDLIST.log
    exit
    ;;
esac
This logic copies each classic tail sheet into the $XPORTDIR/.spool/STDLISTS directory, where it is searchable by the 7i Jobs page.
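Once the printer definition and the STDLIST logic are in place, it can be useful to confirm on the application server that output is actually flowing to both destinations. The commands below are a quick, hypothetical spot check using only the paths named above; the exact tail sheet file names will vary by site.
# Confirm wfqueue activity for the Documents Online (Workflow) printer.
tail -n 20 /tmp/wfqueue.log
# List the most recently archived classic tail sheets and the copy log.
ls -lt $XPORTDIR/.spool/STDLISTS | head -n 10
tail -n 5 $XPORTDIR/.spool/STDLISTS/STDLIST.log
# Preview the first 100 lines of an archived tail sheet, which is the portion
# the Jobs page link displays. <tail-sheet-file> is a placeholder.
head -n 100 $XPORTDIR/.spool/STDLISTS/<tail-sheet-file>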
Workflow Jobs
Many of the jobs available in the Home or Dashboard pages, as well as many of the tools within 7i pages, are implemented as Workflow jobs. Workflow jobs use specific Workflow models to process particular functions that run in the background and are tracked as a job. In most cases, the generic JOB model is used for Workflow job processing, but specific models may also be used for jobs with higher complexity or specific needs.
When a Workflow job is requested, a record will be created in the wf_instance table that stores Workflow instances. The Workflow engine is responsible for processing these instances. When the engine picks up an instance that is marked for job processing, it will automatically track the job in the nu_job_mstr table. As the job progresses through the various activities defined in the Workflow model, the job progress will be updated. A typical Workflow job model contains only a Process activity, which runs a particular 7i utility within a separate process (launcher.exe). When the activity is complete, the status of the Workflow instance is updated, indicating that the engine should again pick up the instance for any additional processing that is needed.
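Because Workflow-submitted jobs are tracked in nu_job_mstr like any other job, a quick way to see how many are still waiting on the engine or actively processing is to count the WF and IP statuses. This is the same kind of hedged sqlcmd sketch as the one shown earlier, with SERVER, DATABASE, USER, and PASS again standing in for local values.
# Count Workflow jobs that are queued for the engine (WF) or in progress (IP).
sqlcmd -S SERVER -d DATABASE -U USER -P PASS \
    -Q "SELECT nuj_status, COUNT(*) AS jobs FROM nu_job_mstr WHERE nuj_status IN ('WF','IP') GROUP BY nuj_status"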