Transform Services User Guide
Service parameters
To run the service, specify the source object storage and identify the input data set.
REQUIRED: "source"
Note
Identify the transform source object storage, where the input resides. The source object storage details appear in the Model9 agent configuration file.
Required Keywords for "source"
{ "source": { "url":"<URL>", "api":"<API>", "bucket":"<USER_BUCKET>", "user":"<USERID>", "password":"<PASSWORD>", } }
Optional Keywords for "source"
{ "source": { "useS3V4Signatures":"false"|"true" } }
Keyword | Description | Required | Default |
---|---|---|---|
url | The object storage / proxy URL | YES | - |
api | The API protocol used by this object storage / proxy | YES | - |
bucket | The bucket defined within the object storage / proxy | YES | - |
user | The userid provided by the object storage / proxy | YES | - |
password | The password provided by the object storage / proxy | YES | - |
useS3V4Signatures | Whether to use S3 V4 signatures. Required for certain object storage providers, such as HCP Cloud Scale and Cohesity. Relevant for api "S3" only. | NO | false |
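For example, a minimal sketch of a "source" block for an S3-compatible object storage that requires V4 signatures; the URL and bucket below are placeholders, not values from a real configuration:
{ "source": { "url":"https://objectstore.example.com", "api":"<API>", "bucket":"prod-bucket", "user":"<USERID>", "password":"<PASSWORD>", "useS3V4Signatures":"true" } }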
OPTIONAL: "target"
Note
Identify the transform target object storage. Values not specified are taken from the "source" parameter.
{ "target": { "url":"<URL>", "api":"<API>", "bucket"<USER-BUCKET>", "user":"<USERID>", "password":"<PASSWORD>", "useS3V4Signatures":"false"|"true" } }
Keyword | Description | Required | Default |
---|---|---|---|
url | The object storage / proxy URL | NO | Taken from "source" |
api | The API protocol used by this object storage / proxy | NO | Taken from "source" |
bucket | The bucket defined within the object storage / proxy | NO | Taken from "source" |
user | The userid provided by the object storage / proxy | NO | Taken from "source" |
password | The password provided by the object storage / proxy | NO | Taken from "source" |
useS3V4Signatures | Whether to use S3 V4 signatures. Required for certain object storage providers, such as HCP Cloud Scale and Cohesity. Relevant for api "S3" only. | NO | false |
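For example, a minimal sketch of a "target" block that writes the transformed output to a different bucket; the url, api, user and password are inherited from "source", and the bucket name below is a placeholder:
{ "target": { "bucket":"transformed-bucket" } }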
REQUIRED: "input"
Note
If you specify VSAM keywords for a sequential input data set, the transform is performed and a warning message is issued.
Required Keywords for "input"
{ "input": { "name":"<DSN>", "complex":"<group-SYSPLEX>" } }
Optional Keywords for "input"
{ "input": { "type":"backup"|"archive"|"import"|"cloudcopy", "entry":"0|<N>", "prefix":"model9|<USER-PREFIX>", "recordBinary":"false|true", "recordCharset":"<CHARSET>", "vsam":{ "keyBinary":"false|true", "keyCharset":"<CHARSET>" } }
Keyword | Description | Default |
---|---|---|
name | Name of the original data set | MF data set legal name, case insensitive |
complex | The Model9 resource complex name as defined in the agent configuration file. | String representing the complex |
type | The type of the input data set, according to the Model9 Cloud Data Manager policy that created it: "backup", "archive", "import" or "cloudcopy" | "backup" (case insensitive) |
entry | When the type is “backup”, “entry” represents the generation. The default is “0”, meaning the latest backup copy. Entry “1” would be the backup copy that was taken prior to the latest copy, and so on. | “0” |
prefix | The environment prefix as defined in the agent configuration file. | “model9” |
recordBinary | Whether the record input is binary. Applies to all record input (PS, PDS, VSAM data) | "false" (case insensitive) |
recordCharset | If the record input is not binary, the character set of the input. Applies to all record input (PS, PDS, VSAM data) | "IBM-1047" |
keyBinary | If the input is a VSAM data set, whether the VSAM key is binary. A binary key is written to the output in Base64 format | "false" (case insensitive) |
keyCharset | If the input is a VSAM data set and the key is not binary, the character set of the VSAM key | "IBM-1047" |
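For example, a minimal sketch of an "input" block that requests the backup generation prior to the latest (entry "1") of a VSAM data set with a binary key; the data set and complex names are taken from the samples below:
{ "input": { "name":"SAMPLE.VSAM", "complex":"group-PLEX1", "type":"backup", "entry":"1", "vsam":{ "keyBinary":"true" } } }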
OPTIONAL: "output"
Note
The output is the transformed data of the MF data set, accessible as an S3 object.
Note
When transforming a file with the same name as an existing file in the target, the existing file is replaced by the newly transformed file.
Note that the service does not delete previously transformed files but rather overwrites files with the same name. When re-transforming a file using the "split" function, be sure to remove any previously transformed files, to avoid mixing split files of different versions.
When splitting a file, wait for the successful completion of the transform function before continuing with the processing, to ensure that all parts of the file were created.
Specifying "text" format for a "binary" input will cause the transform to fail.
{ "output": { "prefix":"model9|<USER-PREFIX>", "compression":"none|gzip", "format":"JSON|text|CSV", "charset":"UTF", "endWithNewLine":"false|true", "splitBySize":"< nnnnb/m/g>", "splitByRecords":"<n>" } }
Keyword | Description | Default |
---|---|---|
prefix | Prefix to be added to the object name: <prefix>/<object-name> | "transform" |
compression | Whether the output should be compressed: "none"|"gzip" | "gzip" (case insensitive) |
format | The format of the output file: "JSON"|"text"|"CSV" | "JSON" (case insensitive) |
charset | If the input is not binary, the character set of the output. Currently only "UTF" is supported | "UTF" |
endWithNewLine | Whether a newline is added at the end of the file. This is required by some applications. | false |
splitBySize | Whether to split the output into several files of the requested size, for example "3000b", "1000m", "1g". The output files are numbered <file-name>.1, .2, .3 and so on. | 0 (no split by size is performed) |
splitByRecords | Whether to split the output into several files, each containing at most the requested number of output records. The output files are numbered <file-name>.1, .2, .3 and so on. | 0 (no split by records is performed) |
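For example, a minimal sketch of an "output" block producing uncompressed CSV split into parts of at most 500 MB; the prefix below is a placeholder:
{ "output": { "prefix":"myreports", "compression":"none", "format":"CSV", "splitBySize":"500m" } }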
Service parameter samples
“Hello world”
Transform the latest backup of a plain text data set, charset IBM-1047, converted to UTF and compressed.
{ "input" : { "name" : "SAMPLE.TEXT", "complex" : "group-PLEX1" }, "output" : { "format" : "text" }, "source" : { "url" : "https://s3.amazonaws.com", "api" : "aws-s3", "bucket" : "prod-bucket", "user" : "sdsdDVDCsxadA43TERVGFBSDSSDff", "password" : "ddferdscsdW4REFEBA33DSffss344gbs4efe7" } }
Transforming an unloaded DB2 table
Transform the latest backup of an unloaded DB2 table, charset IBM-1047, converted to UTF and compressed, located with a specific prefix:
{ "input" : { "name" : "DB2.UNLOADED.SEQ", "complex" : "group-PLEX1" }, "output" : { "format" : "text", }, "source" : { "url" : "https://s3.amazonaws.com", "api" : "aws-s3", "bucket" : "prod-bucket", "user" : "sdsdDVDCsxadA43TERVGFBSDSSDff", "password" : "ddferdscsdW4REFEBA33DSffss344gbs4efe7" }, "output" :{ "prefix" : "DBprodCustomers" } }
Transforming a VSAM file using the defaults
When transforming a VSAM file, the defaults are a text key and binary data, transforming to a JSON output file:
{ "input" : { "name" : "SAMPLE.VSAM", "complex" : "group-PLEX1" }, "source" : { "url" : "https://s3.amazonaws.com", "api" : "aws-s3", "bucket" : "prod-bucket", "user" : "sdsdDVDCsxadA43TERVGFBSDSSDff", "password" : "ddferdscsdW4REFEBA33DSffss344gbs4efe7" } }
Transforming a VSAM text file to CSV
Specify that the data is text, transforming to a CSV output file:
{ "input" : { "name" : "SAMPLE.VSAM", "complex" : "group-PLEX1", "recordBinary" : "false", "vsam" : { "keyBinary" : "false", "keyCharset" : "IBM-1047" } }, "output" : { "format" : "CSV" }, "source" : { "url" : "https://s3.amazonaws.com", "api" : "aws-s3", "bucket" : "prod-bucket", "user" : "sdsdDVDCsxadA43TERVGFBSDSSDff", "password" : "ddferdscsdW4REFEBA33DSffss344gbs4efe7" } }
Service response and log
The transform service is invoked as an HTTP request. It returns an HTTP status code and an HTTP response body.
In case of a WARNING or an ERROR, the HTTP response will also contain the messages shown in the Log section below.
Note
Informational messages are printed only to the service log and not to the HTTP response. The service log can be viewed on the AWS console when executing the service from AWS, or in the docker log when executing the service on-premises.
HTTP status
Code | Description |
---|---|
200 | OK |
400 | Bad user input or unsupported data set |
500 | Unexpected error |
HTTP response
{ "status" :“OK|WARNING|ERROR”, "outputName" :“<OUTPUT-NAME>”, "inputName" :”<DSN>”, "outputCompression" :”none|gzip”, "outputSizeInBytes" :”<SIZE-IN_BYTES>”, "outputFormat" :”JSON|text|CSV” }
Log
{ "log": [ "<INFO-MESSAGE>", "<WARNING-MESSAGE>", "<ERROR-MESSAGE>", ] }
Output keyword | Description |
---|---|
status | The completion status of the service: "OK", "WARNING" or "ERROR" |
outputName | The object name as appears in the target object storage |
inputName | The input data set name |
outputCompression | The compression type as selected in the input parameters / default |
outputSizeInBytes | The size of the output object in bytes |
outputFormat | The format as selected in the input parameters / default |
Service response and log samples
Status OK sample
{ "status" : "OK", "outputName" : "transform/QA.SMS.MCBK.SG1QNOBK.DSERV.TXT.TMPPS!uuid=a641d670-2d05-41e7-9dd3-7815e1b2d4c4", "inputName" : "QA.SMS.MCBK.SG1QNOBK.DSERV.TXT.TMPPS", "outputCompression": "NONE", "outputSizeInBytes": 97, "outputFormat" : "JSON" }
Status WARNING sample
{ "log" : [ "ZM9K001I Transform service started", "ZM9K108W Specifying input parameter vsam is ignored for input data set with DSORG PS", "ZM9K002I Transform service completed successfully, output is transform/QA.SMS.MCBK.SG1QNOBK.DSERV.TXT.TMPPS!uuid=d779fbf9-da6b-495b-b6b9-de7583905f19" ], "status" : "WARNING", "outputName" : "transform/QA.SMS.MCBK.SG1QNOBK.DSERV.TXT.TMPPS!uuid=d779fbf9-da6b-495b-b6b9-de7583905f19", "inputName" : "QA.SMS.MCBK.SG1QNOBK.DSERV.TXT.TMPPS", "outputCompression": "NONE", "outputSizeInBytes": 97, "outputFormat" : "JSON" }
Status ERROR sample
{ "status": "ERROR", "log" : [ "ZM9K001I Transform service started", "ZM9K008E The input was not found: name QA.SMS.MCBK.DSERV.TXT.NON, archive false, entry (0)" ] }
Input format support
Supported formats
SMS-managed data sets
Non-SMS managed data sets
Sequential and extended-sequential data sets with the following RECFM:
V
VB
F
FB
Non-extended VSAM KSDS data sets
Unsupported formats
RRDS, VRRDS, LINEAR, ESDS
Extended format data sets with compression or encryption
PDS data sets
RECFM not mentioned above (U, FBA…)
Output format support
Supported types
Text
JSON
CSV