Migration Manual - Integration Framework
The new Job Framework provides features that don't exist in the Integration Framework, while some
features of the original Integration Framework do not yet exist in the Job Framework.
Target Audience
Developers, who will be concerned with the migration of all Workflow Components using the
provided tools
Site Administrators, who will be concerned with the migration of Workflow Schedules using the
provided tools
Business Users, who will be concerned with best practices for Job Framework schedule setup and
monitoring of execution (see also: Documentation: Job management and scheduling)
As such, the provided tools are not subject to official Salesforce Commerce Cloud support. In
case you discover discrepancies or issues, these may be reported in the context of Bitbucket. While
Salesforce Commerce Cloud forgoes any obligation for bug fixes, the existing Commerce Cloud
community, which includes Salesforce Commerce Cloud employees as well as any other
community developer, will apply enhancements as needed as part of the community effort and
spirit.
It is highly recommended to test migrated Job Step Types (Workflow Components) and schedules
thoroughly before you make them part of your daily procedures.
Migrating between the two frameworks does require additional education effort on best practices
and troubleshooting. Please familiarize yourself and your team with the provided tools, and
thoroughly plan and test your migration.
Knowledge
Dependencies
General Cleanup
Planning
(In case there are long-running jobs; if the migration takes longer or is rather complex; if you pursue
an iterative approach to mitigate risks)
Prioritization: Prepare a list that contains all Job Schedules, in the correct order for step-by-step
conversion.
Definition of migration schedule:
Workflow Components
Tests
Final timeframe for Job Schedule migration to avoid maintenance windows of either
Salesforce Commerce Cloud or involved 3rd party systems
1. Workflow Components will be converted into Job Step Types. Unlike Workflow Components,
which are defined via Custom Objects of type WorkflowComponentDefinition, Job Step Types are
defined through a steptypes.json file located in the root folder of each cartridge that carries the
corresponding implementation.
Using the automated process and provided adapter scripts, the current Integration Framework
implementation can be reused without being touched and can even be run in parallel via Job
Framework and Integration Framework.
2. Workflow Schedules (and their component configurations) will be converted into Job
Schedules (and their respective Job Steps)
Job Schedules are System Objects that are converted out of their Custom Object
WorkflowScheduleDefinition counterparts.
Job Schedules either rely on standard (i.e. built-in) components or on custom Job Step Type
implementations.
Custom Job Step Type implementations have to be defined and implemented through a
cartridge that is enabled in the cartridge path.
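For orientation, a custom script-module implementation boils down to a script module that exports the function named in steptypes.json and returns a dw.system.Status. The following minimal sketch is illustrative only; the module path and names are assumptions, not part of the migration cartridge:

var Status = require('dw/system/Status');

// Hypothetical module, e.g. bc_myCartridge/cartridge/scripts/steps/MyStep.js,
// referenced from steptypes.json via its "module" and "function" attributes.
exports.execute = function (parameters, stepExecution) {
    // "parameters" carries the values configured for this step in Business Manager.
    // The returned Status tells the Job Framework whether the step succeeded.
    return new Status(Status.OK);
};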
Please be aware: The steps outlined in this chapter depend on each other. Deviating from the stated
order will most likely lead to higher effort during migration, roll-out and configuration of the new
Job Schedules.
1. Deploy the Migration Cockpit cartridge
The migration cartridge can be found in the Integration Framework Community Suite Repository.
Download or pull the code
You will only need the bm_if_migration_cockpit cartridge (no need to update or change
other Integration Framework cartridges)
The Migration Cockpit cartridge has a dependency on bm_integrationframework
Deploy the cartridge to your target environment
Add the cartridge to the Business Manager cartridge path
2. Maintain permissions
Set the corresponding permission for the respective role: Go to Administration > Organization >
Roles > [your role here] > Business Manager Modules, locate "CS Integration Framework
Workflow Migration", make sure it is selected and click "Update". You need to do this in the
"Organization" context.
Note: Don't forget to revoke permissions when migration is complete
Workflow Components can be seen as configurable building blocks for recurring jobs. They consist
of their implementation (usually a pipeline) and configuration parameters (accessed through the
pipeline dictionary). In order to achieve a decent UX, those parameters can be filled with values via
the Business Manager UI. But to be able to show all configuration parameters (along with e.g. a short
description or a drop-down field for possible values), metadata is needed that declares all
parameters for generating a configuration UI in the Business Manager. Additionally, the Business
Manager needs to know which Pipeline/Controller has to be run when the job is triggered.
This is what the WorkflowComponentDefinition custom object was responsible for in the
Integration Framework. In the Job Framework, those components are now called Job Step Types.
In the Job Framework, the metadata contained in these objects will be delivered together with the
cartridge that contains the implementation. This way the configuration is tied to the corresponding
code more closely than through custom objects. Also, no metadata import is required when adding
new Job Steps - adding the cartridge to the cartridge path will do.
In this step, we use the Migration Cockpit to transfer the metadata from custom objects to the JSON
notation: a file called "steptypes.json" will go into each cartridge that contains job implementations
and will contain only their metadata. Since the Migration Cockpit cannot determine which cartridge
contains which Job Step implementation, you will have to split the resulting file and distribute
the correct parts of it to the corresponding cartridges manually.
1. Open the Migration Cockpit. In step "1. Export Workflow Components", click the download
button to generate the steptypes.json file.
2. View steptypes.json file in a text editor
The JSON contains one root object called step-types.
As there are only script-type Job Steps in this migration, the only step type used is script-
module-step.
script-module-step defines an array containing all the custom code Job Steps.
3. Split the JSON file into cartridge-based configurations
Select the right script-module-step child objects for each cartridge that contains Job
Steps
Move them out of the generated JSON file into a new one (using the same structure
of step-types and script-module-step objects; see the example steptypes.json at the
end of this document)
Put the resulting JSON file in the root directory (not the "cartridge" folder) of every
cartridge
4. Deploy all affected cartridges and activate new code version if necessary
5. Review Results
In Business Manager, go to Administration -> Job Schedules (Beta)
Create a new job (call it "Migration Test" or similar); you will now see the Job Editor
Click "Configure a Step"; you will see the list of all Job Step Types
Go through the list of the migrated Job Step Types, select them one by one, and check if
all parameters are listed in the form
The test job configuration can be deleted again after all Job Steps are verified
6. Job Step Types are now migrated to the Job Framework.
Important: Do not change any of the resulting Job Step IDs (@type-id attributes) at this point, as the
Job Step IDs will be used by the migrated schedules later (step 3 of this migration).
For example, if the migrated Job Steps are implemented in two cartridges, the split results in two files:
One will go to the root of bc_export and contain the definition for Export-Orders,
One will go to bc_import and contain the definitions for Import-OrderStatusUpdate and
Import-Inventory.
Our goal in this step is to convert those Workflow Schedules into Job Schedules (and their respective
Job Steps) in the new Job Framework, resulting in an as-complete-as-possible representation in the
new environment. Unfortunately, in some cases a perfect 1:1 conversion won't be possible, but using
the Migration Cockpit can help to get very close to it. (See Chapter 4 for a detailed feature
comparison.)
Fortunately, it is possible to operate the Integration Framework and the Job Framework next to each
other. While users must not enable jobs in both environments at the same time, it is possible to
disable a certain job in one environment and enable it in the other. This way it is possible to do a
step-by-step migration.
While this tutorial describes automatic migration, please consider manual migration (by re-
implementing or refactoring) for jobs that e.g. face requirement changes in the near future or just
need to be refactored "some day". Migrating to the Job Framework is a great opportunity to do some
cleanup - and to learn how to use the new framework at the same time, as there are several Script
API differences coming along with it.
In order to be able to run Integration Framework Jobs without any code changes, automatic
migration will make use of an Adapter class emulating the Integration Framework environment.
(See Chapter 4 for details.)
Logging
While the CS Integration Framework had different log levels, Job Schedules only have one log.
Log entries will be prefixed with the corresponding log level, but filtering/suppressing log entries by
severity is no longer possible out of the box.
The Migration Cockpit emulates a verbose mode that disables log levels below WARNING in order to
reduce excessive logging. Please monitor log files and adjust your log behavior appropriately if
necessary. Furthermore, defining your own log files is no longer supported (and not recommended).
The Job Framework creates one log file for each run, and that log file can easily be accessed through
the Business Manager.
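To make the consequence concrete, here is a minimal, illustrative sketch of logging inside a script-module step (the messages are placeholders): all entries end up in the single log file of the current run, and the level only determines the prefix of each entry:

var Logger = require('dw/system/Logger');
var Status = require('dw/system/Status');

exports.execute = function (parameters, stepExecution) {
    // Both entries land in the same per-run job log; the severity is
    // visible as a prefix but cannot be filtered into separate files.
    Logger.info('Step {0} started', stepExecution.stepID);
    Logger.warn('This entry is prefixed with WARN in the same log file');
    return new Status(Status.OK);
};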
Workflow Schedule
The SUSPEND exit status is not supported anymore. Jobs using this mechanism will have to be
re-implemented and cannot be migrated automatically. Since this exit status is defined as a return
value in Demandware Script, the Migration Cockpit cannot detect this kind of limitation. Job Step
Types and Job Steps will be exported normally, but the job behavior will change. Be sure to check
your code for usage of the SUSPEND exit status if you run into issues.
Additionally, there are three Integration Framework standard components (DateCondition,
TimeCondition, DateTimeCondition) that use this exit status. These components will not be
supported by the Job Framework, and Workflow Schedules using one of them will be marked with a
comment in the exported XML.
If you need the possibility to enable/disable certain Job Steps, please introduce a parameter that
tells the step whether it should execute or not, and implement conditional execution for that step,
as the sketch below illustrates.
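A minimal sketch of that pattern follows; the boolean parameter name "Enabled" is hypothetical and would have to be declared in the step's steptypes.json entry:

var Status = require('dw/system/Status');

exports.execute = function (parameters, stepExecution) {
    // Skip the actual work when the step is switched off via its parameter.
    if (!parameters.Enabled) {
        return new Status(Status.OK, 'SKIPPED', 'Step disabled via parameter');
    }
    // ... actual step logic ...
    return new Status(Status.OK);
};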
Workflow Plan
With the Job Framework, it is not possible to generate a plan of future job executions. So instead of
a plan, there is only the Job History, showing present and past executions.
The CS Integration Framework, as an actual framework, of course defined its own interfaces and
objects. Hence, all jobs that were implemented against the framework expect certain functionality
and have certain dependencies. In order to ensure that implementations do not have to be touched,
those dependencies have to be emulated after migrating to the Job Framework.
The main focus lies on the CurrentWorkflowComponentInstance object, which is passed through
the pipeline dictionary. This object is made available in the Job Framework by an adapter class.
Preparing/mocking the object happens through this Adapter class, which is used as the entry point
for every migrated Job Step. Besides the other custom job parameters, the target
Pipeline/Controller is passed through an additional step parameter (Action). The configured
pipeline will be called after mocking the object. This approach ensures that no code changes are
necessary for the migration. The Adapter class (along with all standard components provided by
the Integration Framework) is part of the Integration Framework cartridge, which therefore has to
stay in the cartridge path.
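Conceptually, and greatly simplified (the real PipelineStepRunner in bc_integrationframework differs in its details), the adapter's entry point behaves roughly like the following sketch, which uses the deprecated dw/system/Pipeline API for illustration:

var Pipeline = require('dw/system/Pipeline'); // deprecated API, used here for illustration only
var Status = require('dw/system/Status');

exports.execute = function (parameters, stepExecution) {
    // Emulate the object that legacy components expect in the pipeline dictionary.
    var componentInstance = {
        getParameterValue: function (name) { return parameters[name]; }
        // ... further emulated Integration Framework methods ...
    };
    // 'Action' carries the legacy Pipeline-StartNode, e.g. "StandardComponents-CleanUpFiles".
    var pdict = Pipeline.execute(parameters.Action, {
        CurrentWorkflowComponentInstance: componentInstance
    });
    return pdict.EndNodeName === 'ERROR' ? new Status(Status.ERROR) : new Status(Status.OK);
};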
While the migration takes advantage of a lot of benefits provided by the new Job Framework, there
are some drawbacks that can only be solved by re-implementation of Job Steps and manual re-
configuration of Job Schedules.
Job Steps
The maximum timeout for Script Pipelets within pipelines is limited to 60 min (3,600 sec).
Some legacy components may have workarounds implemented to deal with that limit
and, by doing that, may have accidentally or knowingly put extra load on servers, which in
turn could cause performance problems. Please consider re-implementation.
See
https://documentation.demandware.com/DOC2/topic/com.demandware.dochelp/Jobs/CreatingCustomJobSteps.html
for a detailed description of how to write custom job steps.
Job Schedules
Sometimes workflows on different levels (i.e. site vs. organization, different sites, etc.) had to
be chained. Typically, this was achieved by fragile timing between independent jobs. With
the new framework, it can be configured properly.
There was only one email distribution list for all status notifications. With the new
framework, you can assign different lists to different statuses.
The platform Job Step Types can be used to streamline processes: for example, once you have
gained some confidence in your product data replication processes from your back end to staging,
or if you run jobs every day to help prepare the data for production, you can now easily
combine them with data replication directly.
With the old framework, it wasn't quite possible to set appropriate resource locks, which
mark resources as exclusively locked by the current job. That sometimes led to inconsistent
data. The only way to prevent that from happening was to set the resource lock for all
platform Workflow Schedules (the ones that triggered the Integration Framework jobs); a
rather bad and not recommendable practice. With the new framework, you can set the
resource lock per schedule, which is way more flexible and safer.
See
https://documentation.demandware.com/DOC2/topic/com.demandware.dochelp/Jobs/CreatingaNewJobSchedule.html
for a detailed description of how to set up new job schedules.
Cleanup
Before executing this step, be sure that ALL jobs and steps are up and running correctly using the
Job Framework.
Backup site and metadata (especially regarding custom objects)
Delete all Integration Framework related custom objects (WorkflowComponent*,
WorkflowSchedule*)
Disable Business Manager Tools (by revoking access rights):
Workflow Schedules
Workflow Plan
Do not remove/delete Integration Framework Cartridges!
See 4.4 for details on Adapter Class
Remove Migration Cockpit cartridge
The Integration Framework cartridges can only be removed once no job depends on the adapter
anymore.
ExecutePipeline: Executes a pipeline. The name and start node of the pipeline have to be configured
via the parameter 'ExecutePipeline.Pipeline'. This Job Step can be used to migrate current platform
Job Schedules to Job Framework based ones; in this case the Job Schedule will contain only one flow
and step, using the parameters that have been added to the old ones.
IncludeStepsFromJob: Includes steps from another job. Can be used to "run" a job, identified by its
ID, in the context of this step, e.g. to avoid redundant configurations.
UndoPreconfiguredCodeReplicationProcess: Rolls back pre-configured code replication processes.
With the Job Framework, external parties can now trigger job executions via OCAPI
(https://documentation.demandware.com/DOC2/topic/com.demandware.dochelp/OCAPI/17.1/data/Resources/Jobs.html).
I.e. just like any other permitted back-end system, a Product Information Management system can
now push out new product data and immediately trigger the job inside SFCC that consumes the data
and replicates everything to production. Please make sure that only a small and trusted group of
external systems gets this type of access, as there is no fine-grained permission control on those
jobs: a system that got the permission to execute a job can basically execute all jobs, as long as it
knows them by ID.
Using the Jobs resource, it is also possible to monitor the execution of these jobs. That helps to make
external systems aware of the completion of jobs and lets them react on any given exit status. Using
this API, it would also be possible to create dashboards showing the current status of jobs, errors,
durations, etc.
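For illustration only, triggering and then polling a job from an external system could look like the following Node.js sketch; the hostname, OCAPI version, job ID and token handling are placeholders/assumptions, so consult the Jobs resource documentation linked above for the exact contract:

// Hypothetical external client (Node.js 18+, built-in fetch).
const HOST = 'https://example.demandware.net';
const TOKEN = '<oauth-access-token>'; // obtained via the client-credentials grant

async function triggerAndMonitor(jobId) {
    const base = `${HOST}/s/-/dw/data/v17_1/jobs/${encodeURIComponent(jobId)}/executions`;

    // Start a new execution of the job.
    const started = await fetch(base, {
        method: 'POST',
        headers: { Authorization: `Bearer ${TOKEN}`, 'Content-Type': 'application/json' },
        body: '{}'
    });
    const execution = await started.json();

    // Poll the execution until it leaves the pending/running states.
    let status = execution.execution_status;
    while (status === 'pending' || status === 'running') {
        await new Promise((resolve) => setTimeout(resolve, 5000));
        const res = await fetch(`${base}/${execution.id}`, {
            headers: { Authorization: `Bearer ${TOKEN}` }
        });
        status = (await res.json()).execution_status;
    }
    return status; // e.g. 'finished' or 'aborted'; exit_status carries the job's result
}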
[Figure: Job Editor showing a job that runs three workflows in parallel before the last workflow, the
one at the bottom, gets executed.]
Please note: Running workflows on the same level in parallel is the default. In case you need them
to run sequentially, e.g. because both streams would cannibalize each other's resources, you
explicitly need to configure it this way. Just use the toggle box (second from the right, on top of the
workflow boxes) to adjust the setting accordingly.
Known Issues
Support for the parameter data type password is missing. A corresponding ticket (APP-40717) has
been filed with Salesforce Commerce Cloud and we are hopeful to see it in one of the next
platform releases. Until that happens, all password parameters have to be treated like strings,
and the values are just as visible in the forms as strings. In case you find that unacceptable, you
may well wait with your migration until password data types are supported.
A pendant to the Integration Framework's Workflow Plan panel is missing. That means it is
hard to see for what time the jobs have been scheduled. This type of panel will most likely be
released in the course of 2017.
Example: steptypes.json
Shows the configuration for a cartridge that contains one Job Step (StandardComponents-CleanUpFiles)
and its parameters. The file would go to bc_myCartridge\steptypes.json.
{
    "step-types": {
        "script-module-step": [{
            "@type-id": "custom.CleanUpFiles",
            "module": "bc_integrationframework/cartridge/scripts/workflow/legacy/PipelineStepRunner",
            "function": "execute",
            "parameters": {
                "parameters": [{
                    "@name": "Action",
                    "@description": "Legacy action of the community suite IF, to be called through the new Step Execution",
                    "@type": "string",
                    "@required": "true",
                    "enum-values": {
                        "value": [
                            "StandardComponents-CleanUpFiles"
                        ]
                    }
                }, {
                    "@name": "FilePattern",
                    "@description": "File name pattern (Default is \".*\")",
                    "@type": "string",
                    "@required": false,
                    "@trim": true
                }, {
                    "@name": "DirectoriesToPurge",
                    "@description": "Directories to purge",
                    "@type": "string",
                    "@required": false,
                    "@trim": true
                }, {
                    "@name": "DirectoriesToArchive",
                    "@description": "Directories to archive",
                    "@type": "string",
                    "@required": false,
                    "@trim": true
                }]
            }
        }]
    }
}
[Figure: Business Manager form generated from this configuration (ID, Description and Action being
standard attributes).]