Administration


Systems Integration

PlanTest supports several aspects of systems integration, some of which are discussed here. Further information is available from Seaconis staff.

Subjects in this document are:

  1. Using command line parameters to start PlanTest from Job Control Systems.
  2. Using services to supply data or analysis from other applications or systems.
  3. Integrating external test or step results into PlanTest workflows for viewing and decisions by examiners.

Job Control Integration


PlanTest can be used stand-alone or integrated with a Job Control System (JCS). JCS integration happens on several levels, but the most direct is how PlanTest is started by the JCS, with parameters that specify the data sources and the type of access. Data sources may be file based or supplied by services. Results from other examination or testing systems can also be integrated by loading data that follows the Seaconis external results schema.

Command Line Startup Options

PlanTest Desktop is often integrated into client job control systems. The most straightforward approach is to call PlanTest from a command line with arguments for the plan to be examined and other options that control the access type and startup behavior. This can be combined with PlanTest accessing client services for specific data.

This section describes how to start PlanTest from the command line and the options available.

Command line arguments for PlanTest startup can be either:

  1. A single argument that is either the source path for a file (CEXML or .ptxml) or an identifier for a source document from a service. For example, all of the following are valid uses.
    plantest "C:\myData\D122104.ptxml"  
    plantest "C:\myData\D122104.xml"  
    plantest "msx2345d"
  2. A more complex startup, where the command line arguments are of the format:

plantest fileid:identifier access:full autorun:true xreload:true

Argument   Description
fileid:    string that is a source identifier
access:    'full' gives full examination and save permissions; 'view' does not allow execute, decisions, or save
autorun:   'true' runs any auto-executable tests on startup, after any prior exam results are loaded
xreload:   'true' reloads ExternalResults even if previous exam results are found and loaded; 'false' loads ExternalResults only for a new examination

Examples:

"C:\Program Files (x86)\Seaconis\PlanTest\bin\plantest" fileid:"C:\testdata\Nsw\2018Q3\ManyLots\1225691.ptxml" access:full autorun:true xreload:true

"C:\Program Files (x86)\Seaconis\PlanTest\bin\plantest" fileid:doc123 access:full autorun:true xreload:true

Services Integration


This integration technique is frequently used when adapting PlanTest for a jurisdiction or site. Any test can be built to call a service for specific current data, or for analysis and return of results. It has also been used during data loading prior to examination. For more information, contact Seaconis.

External Workflow and Results Integration


You can bring workflow steps and results from other systems into PlanTest. This is a two-step process: include the steps in workflows, and configure the external output as PlanTest-readable files for import.

This can all be done by existing customization methods. No Seaconis development work is necessary.

You must create PlanTest workflow step elements (Name, Tip, Description, Messages) in the workflow files. This ensures the step operates in PlanTest as any other step would, with the exception of executing a test module (see Services Integration if you want this behavior).

Examination results are persisted in a proprietary Seaconis XML structure that contains all the information necessary to recreate the examination within PlanTest. External results require only a few of these elements.

External Results Structure

The following XML shows a test result from an examination results file; any elements not required for external results have been removed. Every external result must match a step or test in the PlanTest examination workflow.

    <TestResult>
      <TestName>ArbitraryTest</TestName>
      <DataHeaders>
        <ColumnName>Company</ColumnName>
        <ColumnName>Product</ColumnName>
        <ColumnName>Quantity</ColumnName>
        <ColumnName>Limit</ColumnName>
      </DataHeaders>
      <Records>
        <DataRecord>
          <isError>false</isError>
          <Row>
            <DataValue>Klaxon Services</DataValue>
            <DataValue>Decibels</DataValue>
            <DataValue>500</DataValue>
            <DataValue>or more</DataValue>
          </Row>
        </DataRecord>
        <DataRecord>
          <isError>true</isError>
          <Row>
            <DataValue>Fun Factory</DataValue>
            <DataValue>FunGun</DataValue>
            <DataValue>80</DataValue>
            <DataValue>100</DataValue>
          </Row>
        </DataRecord>
      </Records>
      <AnalysisPassFail>pass</AnalysisPassFail>
      <InfoMessages>
        <Info>
          A message that communicates the success condition of a test or step.
          For instance, Klaxon’s klaxons are really loud…
        </Info>
      </InfoMessages>
      <ErrorMessages>
        <Error>An error or information message from the test. </Error>
        <Error>The FunGun is less than pleasing. </Error>
      </ErrorMessages>
      <ExternalComments>Comments from user of external system. </ExternalComments>
    </TestResult>

The structure allows for a dynamic description of a table, and then provides the records for the table. The different messages serve as easily transferred reporting content.

These messages are not the same as the Messages used in the workflow XML, where they signify a specific requisition to be used by downstream applications in the case of a fail decision.
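An external system can emit this structure with any XML library. The following is a minimal sketch using Python's standard xml.etree.ElementTree; the element names all come from the sample above, but `build_test_result` itself is a hypothetical helper, not a Seaconis API, and its output should be validated against the Seaconis external results schema before import.

```python
import xml.etree.ElementTree as ET

def build_test_result(test_name, headers, rows, passed,
                      errors=(), infos=(), comments=""):
    """Build a <TestResult> element matching the sample structure.

    `rows` is a sequence of (is_error, values) pairs, one per DataRecord.
    """
    result = ET.Element("TestResult")
    ET.SubElement(result, "TestName").text = test_name
    headers_el = ET.SubElement(result, "DataHeaders")
    for h in headers:
        ET.SubElement(headers_el, "ColumnName").text = h
    records = ET.SubElement(result, "Records")
    for is_error, values in rows:
        rec = ET.SubElement(records, "DataRecord")
        ET.SubElement(rec, "isError").text = "true" if is_error else "false"
        row = ET.SubElement(rec, "Row")
        for v in values:
            ET.SubElement(row, "DataValue").text = str(v)
    ET.SubElement(result, "AnalysisPassFail").text = "pass" if passed else "fail"
    infos_el = ET.SubElement(result, "InfoMessages")
    for msg in infos:
        ET.SubElement(infos_el, "Info").text = msg
    errors_el = ET.SubElement(result, "ErrorMessages")
    for msg in errors:
        ET.SubElement(errors_el, "Error").text = msg
    if comments:
        ET.SubElement(result, "ExternalComments").text = comments
    return result

# Reproduce the sample result from the section above.
tr = build_test_result(
    "ArbitraryTest",
    ["Company", "Product", "Quantity", "Limit"],
    [(False, ["Klaxon Services", "Decibels", "500", "or more"]),
     (True, ["Fun Factory", "FunGun", "80", "100"])],
    passed=True,
    errors=["The FunGun is less than pleasing."],
)
xml_bytes = ET.tostring(tr)
```

Writing `xml_bytes` to a file (wrapped in whatever root element your site's schema expects) produces the import file for the matching workflow step.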