b2c-custom-job-steps


Custom Job Steps Skill


This skill guides you through creating new custom job steps for Salesforce B2C Commerce batch processing.

Running an existing job? If you need to execute jobs or import site archives via the CLI, use the b2c-cli:b2c-job skill instead.

When to Use


  • Creating a new scheduled job for batch processing
  • Building a data import job (customers, products, orders)
  • Building a data export job (reports, feeds, sync)
  • Implementing data sync between systems
  • Creating cleanup or maintenance tasks

Overview


Custom job steps allow you to execute custom business logic as part of B2C Commerce jobs. There are two execution models:
| Model | Use Case | Progress Tracking |
| --- | --- | --- |
| Task-oriented | Single operations (FTP, import/export) | Limited |
| Chunk-oriented | Bulk data processing | Fine-grained |

File Structure


```
my_cartridge/
├── cartridge/
│   ├── scripts/
│   │   └── steps/
│   │       ├── myTaskStep.js       # Task-oriented script
│   │       └── myChunkStep.js      # Chunk-oriented script
│   └── my_cartridge.properties
└── steptypes.json                  # Step type definitions (at cartridge ROOT)
```

Important: The steptypes.json file must be placed in the root folder of the cartridge, not inside the cartridge/ directory. Only one steptypes.json file is allowed per cartridge.

Step Type Definition (steptypes.json)


```json
{
    "step-types": {
        "script-module-step": [
            {
                "@type-id": "custom.MyTaskStep",
                "@supports-parallel-execution": "false",
                "@supports-site-context": "true",
                "@supports-organization-context": "false",
                "description": "My custom task step",
                "module": "my_cartridge/cartridge/scripts/steps/myTaskStep.js",
                "function": "execute",
                "timeout-in-seconds": 900,
                "parameters": {
                    "parameter": [
                        {
                            "@name": "InputFile",
                            "@type": "string",
                            "@required": "true",
                            "description": "Path to input file"
                        },
                        {
                            "@name": "Enabled",
                            "@type": "boolean",
                            "@required": "false",
                            "default-value": "true",
                            "description": "Enable processing"
                        }
                    ]
                },
                "status-codes": {
                    "status": [
                        {
                            "@code": "OK",
                            "description": "Step completed successfully"
                        },
                        {
                            "@code": "ERROR",
                            "description": "Step failed"
                        },
                        {
                            "@code": "NO_DATA",
                            "description": "No data to process"
                        }
                    ]
                }
            }
        ],
        "chunk-script-module-step": [
            {
                "@type-id": "custom.MyChunkStep",
                "@supports-parallel-execution": "true",
                "@supports-site-context": "true",
                "@supports-organization-context": "false",
                "description": "Bulk data processing step",
                "module": "my_cartridge/cartridge/scripts/steps/myChunkStep.js",
                "before-step-function": "beforeStep",
                "read-function": "read",
                "process-function": "process",
                "write-function": "write",
                "after-step-function": "afterStep",
                "total-count-function": "getTotalCount",
                "chunk-size": 100,
                "transactional": "false",
                "timeout-in-seconds": 1800,
                "parameters": {
                    "parameter": [
                        {
                            "@name": "CategoryId",
                            "@type": "string",
                            "@required": "true"
                        }
                    ]
                }
            }
        ]
    }
}
```

Task-Oriented Steps


Use for single operations like FTP transfers, file generation, or import/export.

Script (scripts/steps/myTaskStep.js)


```javascript
'use strict';

var Status = require('dw/system/Status');
var Logger = require('dw/system/Logger');

/**
 * Execute the task step
 * @param {Object} parameters - Job step parameters
 * @param {dw.job.JobStepExecution} stepExecution - Step execution context
 * @returns {dw.system.Status} Execution status
 */
exports.execute = function (parameters, stepExecution) {
    var log = Logger.getLogger('job', 'MyTaskStep');

    try {
        var inputFile = parameters.InputFile;
        var enabled = parameters.Enabled;

        if (!enabled) {
            log.info('Step disabled, skipping');
            return new Status(Status.OK, 'SKIP', 'Step disabled');
        }

        // Your business logic here
        log.info('Processing file: ' + inputFile);

        // Return success
        return new Status(Status.OK);

    } catch (e) {
        log.error('Step failed: ' + e.message);
        return new Status(Status.ERROR, 'ERROR', e.message);
    }
};
```

Status Codes


```javascript
// Success
return new Status(Status.OK);
return new Status(Status.OK, 'CUSTOM_CODE', 'Custom message');

// Error
return new Status(Status.ERROR);
return new Status(Status.ERROR, null, 'Error message');
```

Important: Custom status codes work only with the OK status. If you use a custom code with an ERROR status, it is replaced with ERROR. Custom status codes cannot contain commas, wildcards, or leading/trailing whitespace, and cannot exceed 100 characters.
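The constraints above are easy to check up front. This is a plain illustrative helper (not part of the B2C Commerce API; the function name is ours) that validates a candidate custom status code against the documented rules:

```javascript
'use strict';

/**
 * Illustrative helper (NOT a B2C API): checks a custom status code
 * against the documented constraints -- no commas, no wildcards,
 * no leading/trailing whitespace, at most 100 characters.
 * @param {string} code - Candidate custom status code
 * @returns {boolean} true if the code satisfies all constraints
 */
function isValidCustomStatusCode(code) {
    if (typeof code !== 'string' || code.length === 0) {
        return false;
    }
    if (code.length > 100) {
        return false;
    }
    // Commas and wildcards are not allowed
    if (code.indexOf(',') !== -1 || code.indexOf('*') !== -1) {
        return false;
    }
    // Leading/trailing whitespace is not allowed
    if (code !== code.trim()) {
        return false;
    }
    return true;
}
```

Running such a check during development avoids discovering at job runtime that a status code was silently rejected.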

Chunk-Oriented Steps


Use for bulk processing of countable data (products, orders, customers).
Important: You cannot define custom exit statuses for chunk-oriented steps. Chunk modules always finish with either OK or ERROR.

Required Functions


| Function | Purpose | Returns |
| --- | --- | --- |
| `read()` | Get the next item | Item, or nothing to signal end of data |
| `process(item)` | Transform the item | Processed item, or nothing to filter it out |
| `write(items)` | Save a chunk of items | Nothing |

Optional Functions


| Function | Purpose | Returns |
| --- | --- | --- |
| `beforeStep()` | Initialize (open files, run queries) | Nothing |
| `afterStep(success)` | Clean up (close files) | Nothing |
| `getTotalCount()` | Return the total item count for progress tracking | Number |
| `beforeChunk()` | Runs before each chunk | Nothing |
| `afterChunk()` | Runs after each chunk | Nothing |

Script (scripts/steps/myChunkStep.js)


```javascript
'use strict';

var ProductMgr = require('dw/catalog/ProductMgr');
var Logger = require('dw/system/Logger');
var File = require('dw/io/File');
var FileWriter = require('dw/io/FileWriter');

var log = Logger.getLogger('job', 'MyChunkStep');
var products;
var fileWriter;

/**
 * Initialize before processing
 */
exports.beforeStep = function (parameters, stepExecution) {
    log.info('Starting chunk processing');

    // Open resources
    var outputFile = new File(File.IMPEX + '/export/products.csv');
    fileWriter = new FileWriter(outputFile);
    fileWriter.writeLine('ID,Name,Price');

    // Query products
    products = ProductMgr.queryAllSiteProducts();
};

/**
 * Get total count for progress tracking
 */
exports.getTotalCount = function (parameters, stepExecution) {
    return products.count;
};

/**
 * Read next item
 * Return nothing to signal end of data
 */
exports.read = function (parameters, stepExecution) {
    if (products.hasNext()) {
        return products.next();
    }
    // Return nothing = end of data
};

/**
 * Process single item
 * Return nothing to filter out item
 */
exports.process = function (product, parameters, stepExecution) {
    // Filter: skip offline products
    if (!product.online) {
        return;  // Filtered out
    }

    // Transform
    return {
        id: product.ID,
        name: product.name,
        price: product.priceModel.price.value
    };
};

/**
 * Write chunk of processed items
 */
exports.write = function (items, parameters, stepExecution) {
    for (var i = 0; i < items.size(); i++) {
        var item = items.get(i);
        fileWriter.writeLine(item.id + ',' + item.name + ',' + item.price);
    }
};

/**
 * Cleanup after all chunks
 */
exports.afterStep = function (success, parameters, stepExecution) {
    // Close resources
    if (fileWriter) {
        fileWriter.close();
    }
    if (products) {
        products.close();
    }

    if (success) {
        log.info('Chunk processing completed successfully');
    } else {
        log.error('Chunk processing failed');
    }
};
```
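To build intuition for how the engine drives these functions, here is a simplified plain-Node sketch of the chunk loop. This is NOT the real B2C Commerce engine (which also handles transactions, progress reporting, and passes items as a `dw.util.List` rather than an array); it only illustrates the order in which read, process, and write are invoked for a given chunk size:

```javascript
'use strict';

// Simplified, illustrative chunk-oriented execution loop (not the real engine).
function runChunkStep(step, chunkSize) {
    if (step.beforeStep) step.beforeStep();
    var done = false;
    while (!done) {
        var chunk = [];
        // Fill the chunk until it is full or read() returns nothing
        while (chunk.length < chunkSize) {
            var item = step.read();
            if (item === undefined || item === null) {
                done = true;
                break;
            }
            // process() returning nothing filters the item out
            var processed = step.process(item);
            if (processed !== undefined && processed !== null) {
                chunk.push(processed);
            }
        }
        if (chunk.length > 0) {
            step.write(chunk);
        }
    }
    if (step.afterStep) step.afterStep(true);
}

// Example: double the even numbers 1..5 in chunks of 2
var data = [1, 2, 3, 4, 5];
var written = [];
runChunkStep({
    read: function () { return data.shift(); },
    process: function (n) { return n % 2 === 0 ? n * 2 : undefined; },
    write: function (items) { written = written.concat(items); }
}, 2);
// written is now [4, 8]
```

Note how filtered items never reach `write()`, and how the final partial chunk is still written, which is exactly the behavior the real engine exposes.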

Parameter Types


| Type | Description | Example Value |
| --- | --- | --- |
| `string` | Text value | `"my-value"` |
| `boolean` | true/false | `true` |
| `long` | Integer | `12345` |
| `double` | Decimal | `123.45` |
| `datetime-string` | ISO datetime | `"2024-01-15T10:30:00Z"` |
| `date-string` | ISO date | `"2024-01-15"` |
| `time-string` | ISO time | `"10:30:00"` |
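As an illustration, a `datetime-string` parameter might be declared like this in steptypes.json. The parameter name and values here are made up; the attribute names follow the schema shown in the earlier example:

```json
{
    "@name": "StartDate",
    "@type": "datetime-string",
    "@required": "false",
    "default-value": "2024-01-15T10:30:00Z",
    "description": "Process records created after this timestamp"
}
```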

Parameter Validation Attributes


| Attribute | Applies To | Description |
| --- | --- | --- |
| `@trim` | All | Trim whitespace before validation (default: `true`) |
| `@required` | All | Mark as required (default: `true`) |
| `@target-type` | datetime-string, date-string, time-string | Convert to `long` or `date` (default: `date`) |
| `pattern` | string | Regex pattern for validation |
| `min-length` | string | Minimum string length (must be ≥ 1) |
| `max-length` | string | Maximum string length (up to 1000 characters) |
| `min-value` | long, double, datetime-string, time-string | Minimum numeric value |
| `max-value` | long, double, datetime-string, time-string | Maximum numeric value |
| `enum-values` | All | Restrict to allowed values (dropdown in Business Manager) |
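Several of these attributes can be combined on a single parameter. This fragment is illustrative (the parameter name and enum values are made up, and the exact `enum-values` nesting should be verified against the steptypes.json schema for your platform version):

```json
{
    "@name": "ExportMode",
    "@type": "string",
    "@required": "true",
    "@trim": "true",
    "enum-values": {
        "value": ["FULL", "DELTA"]
    },
    "description": "Export mode, rendered as a dropdown in Business Manager"
}
```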

Configuration Options


steptypes.json Attributes


| Attribute | Required | Description |
| --- | --- | --- |
| `@type-id` | Yes | Unique ID (must start with `custom.`, max 100 chars) |
| `@supports-parallel-execution` | No | Allow parallel execution (default: `true`) |
| `@supports-site-context` | No | Available in site-scoped jobs (default: `true`) |
| `@supports-organization-context` | No | Available in organization-scoped jobs (default: `true`) |
| `module` | Yes | Path to the script module |
| `function` | Yes | Name of the function to execute (task-oriented) |
| `timeout-in-seconds` | No | Step timeout (recommended to set explicitly) |
| `transactional` | No | Wrap the step in a single transaction (default: `false`) |
| `chunk-size` | Yes* | Items per chunk (*required for chunk-oriented steps) |

Context constraints: `@supports-site-context` and `@supports-organization-context` cannot both be `true` or both be `false`; exactly one must be `true` and the other `false`.

Best Practices


  1. Use chunk-oriented steps for bulk data - better progress tracking and resumability
  2. Close resources in afterStep() - queries, files, connections
  3. Set explicit timeouts - the default may be too short
  4. Log progress - it helps debugging
  5. Handle errors gracefully - return proper Status objects
  6. Don't rely on transactional=true - use Transaction.wrap() for explicit control

Related Skills


  • b2c-cli:b2c-job - For running existing jobs and importing site archives via the CLI
  • b2c:b2c-webservices - When job steps need to call external HTTP services or APIs; use the webservices skill for service configuration and HTTP client patterns

Detailed Reference


  • Task-Oriented Steps - Full task step patterns
  • Chunk-Oriented Steps - Full chunk step patterns
  • steptypes.json Reference - Complete schema