Disclaimer
Your use of this download is governed by Stonebranch’s Terms of Use, which are available at https://www.stonebranch.com/integration-hub/Terms-and-Privacy/Terms-of-Use/
Overview
...
The Inter-Cloud Data Transfer integration allows you to transfer data to, from, and between any of the major private and public cloud providers like AWS, Google Cloud, and Microsoft Azure.
...
Field | Description |
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, copyto, list objects, move, moveto, sync, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Copy objects from source to target. |
Write Options | [ Do not overwrite existing objects, Replace existing objects, Create new with timestamp if exists, Use Custom Parameters ] Do not overwrite existing objects: Objects on the target with the same name will not be overwritten. Replace existing objects: Objects are always overwritten, even if the same object already exists on the target. Create new with timestamp if exists: If an object with the same name exists on the target, the object will be duplicated with a timestamp added to the name; the file extension will be kept. For example, if report1.txt exists on the target, a new file with a timestamp postfix will be created on the target; for example, report1_20220513_093057.txt. Use Custom Parameters: Only the parameters in the field "Custom Parameters" will be applied (other parameters will be ignored). |
error-on-no-transfer | The error-on-no-transfer flag lets the task fail in case no transfer was done. |
Source Storage Type | Select a source storage type. |
Target Storage Type | Select the target storage type. |
Connection File | In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Filter Type | [ include, exclude, none ] Define the type of filter to apply. |
Filter | Provide the patterns for file matching; for example, in a copy action: Filter Type: include. For more examples of the filter matching pattern, refer to Rclone Filtering. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials |
UAC Base URL | Universal Controller URL For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO| WARNING | ERROR | CRITICAL] |
max-depth ( recursion depth ) | Limits the recursion depth. Attention: If you change max-depth to a value greater than 1, a recursive action is performed. |
...
Before running a move, moveto, copy, copyto, or sync command, you can always try the command by setting the Dry-run option in the Task.
Field | Description |
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing |
Action | [ list directory, copy, copyto, list objects, move, moveto, sync, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] copyto copies an object from source to target and optionally renames the file on the target. |
Write Options | [ Do not overwrite existing objects, Replace existing objects, Create new with timestamp if exists, Use Custom Parameters ] Do not overwrite existing objects: Objects on the target with the same name will not be overwritten. Replace existing objects: Objects are always overwritten, even if the same object already exists on the target. Create new with timestamp if exists: If an object with the same name exists on the target, the object will be duplicated with a timestamp added to the name; the file extension will be kept. For example, if report1.txt exists on the target, a new file with a timestamp postfix will be created on the target; for example, report1_20220513_093057.txt. Use Custom Parameters: Only the parameters in the field "Custom Parameters" will be applied (other parameters will be ignored). |
error-on-no-transfer | The error-on-no-transfer flag lets the task fail in case no transfer was done. |
Source Storage Type | Select a source storage type. |
Target Storage Type | Select the target storage type. |
Connection File | In the connection file, configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of possible flags, refer to https://rclone.org/flags/. |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials |
UAC Base URL | Universal Controller URL For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO| WARNING | ERROR | CRITICAL] |
Example
The following example copies the file report4.txt from the Amazon S3 bucket stonebranchpm into the Azure container stonebranchpm.
The file will be renamed on the target to stonebranchpm/report4_${_date('yyyy-MM-dd_HH-mm-ss')}.txt; for example, report4_2022-05-16_08-00-29.txt.
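As a hedged sketch of the rename step above, the snippet below expands a timestamp token like ${_date('yyyy-MM-dd_HH-mm-ss')} into a target file name (keeping the extension) and assembles the kind of rclone copyto command line the task runs. The remote names and the exact CLI layout are assumptions for illustration; the real command is built by the task from the Connection File.

```python
from datetime import datetime
import shlex

def build_copyto_command(source, target_dir, filename, now=None):
    """Build an rclone copyto command that renames the object on the
    target with a timestamp postfix, preserving the file extension."""
    now = now or datetime.now()
    stem, dot, ext = filename.rpartition(".")
    # Mirror the ${_date('yyyy-MM-dd_HH-mm-ss')} token from the example
    renamed = f"{stem}_{now.strftime('%Y-%m-%d_%H-%M-%S')}.{ext}" if dot else filename
    return ["rclone", "copyto", f"{source}/{filename}", f"{target_dir}/{renamed}"]

cmd = build_copyto_command(
    "amazon_s3:stonebranchpm",                       # assumed remote name
    "microsoft_azure_blob_storage:stonebranchpm",    # assumed remote name
    "report4.txt",
    now=datetime(2022, 5, 16, 8, 0, 29))
print(shlex.join(cmd))
```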
Action: list objects
Field | Description |
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, copyto, list objects, move, moveto, sync, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] List objects in an OS directory or cloud object store. |
Storage Type | Select the storage type. |
Source Credential | Credential used for the selected Storage Type. |
Connection File | In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Filter Type | [ include, exclude, none ] Define the type of filter to apply. |
Filter | Provide the patterns for file matching; for example, in a copy action: Filter Type: include. For more examples of the filter matching pattern, refer to Rclone Filtering. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. |
Directory | Name of the directory you want to list the files in. For example, Directory: stonebranchpm/out lists all objects in the folder out of the bucket stonebranchpm. |
List Format | [ list size and path, list modification time, size and path, list objects and directories, list objects and directories (Json) ] This choice specifies how the output is formatted. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL. For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO| WARNING | ERROR | CRITICAL]. |
max-depth ( recursion depth ) | Limits the recursion depth. |
Example
The following example lists all objects starting with report in the S3 bucket stonebranchpm.
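To show what the "list size and path" output format can be turned into, here is a small, hedged parsing sketch. It assumes the rclone ls convention of a size in bytes followed by the object path; the sample lines are illustrative, not captured task output.

```python
def parse_size_and_path(output):
    """Parse 'list size and path' style output (a size in bytes
    followed by the object path) into (size, path) tuples."""
    objects = []
    for line in output.splitlines():
        line = line.strip()
        if not line:
            continue
        # The size comes first, so split on the first space only;
        # this keeps paths that themselves contain spaces intact.
        size, _, path = line.partition(" ")
        objects.append((int(size), path.strip()))
    return objects

sample = """\
     1234 report1.txt
      987 report2.txt
"""
for size, path in parse_size_and_path(sample):
    print(size, path)
```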
...
Field | Description |
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, copyto, list objects, move, moveto, sync, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Move objects from source to target. |
Source | Enter a source storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Target | Enter a target storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
error-on-no-transfer | The error-on-no-transfer flag lets the task fail in case no transfer was done. |
Source Storage Type | Select a source storage type. |
Target Storage Type | Select the target storage type. |
Connection File | In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Filter Type | [ include, exclude, none ] Define the type of filter to apply. |
Filter | Provide the patterns for file matching; for example, in a copy action: Filter Type: include. For more examples of the filter matching pattern, refer to Rclone Filtering. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL. For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO| WARNING | ERROR | CRITICAL]. |
max-depth ( recursion depth ) | Limits the recursion depth. Attention: If you change max-depth to a value greater than 1, a recursive action is performed. |
Example
The following example moves the objects matching report[3-6].txt from the Windows source directory C:\demo\out to the target Azure container stonebranchpm.
The Windows source is the server where the Universal Agent is installed.
No recursive copy will be done, because max-depth is set to 1.
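The non-recursive, filtered move above can be sketched as a command-line assembly. The flag spellings --include, --max-depth, and --dry-run follow rclone's documented options; the exact command the task issues is an assumption for illustration.

```python
import shlex

def build_move_command(source, target, pattern, max_depth=1, dry_run=False):
    """Sketch of an rclone move command line for a non-recursive,
    filtered move, as described in the example above."""
    cmd = ["rclone", "move", source, target,
           "--include", pattern,
           "--max-depth", str(max_depth)]  # 1 = no recursion into subfolders
    if dry_run:
        cmd.append("--dry-run")  # trial run with no permanent changes
    return cmd

print(shlex.join(build_move_command(
    r"C:\demo\out",                                  # local source on the Agent host
    "microsoft_azure_blob_storage:stonebranchpm",    # assumed remote name
    "report[3-6].txt",
    dry_run=True)))
```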
Action: moveto
The Action moveto moves a single object from source to target and allows you to rename the object on the target.
Before running a move, moveto, copy, copyto, or sync command, you can always try the command by setting the Dry-run option in the Task.
Field | Description |
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, copyto, list objects, move, moveto, sync, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] moveto moves a single object from source to target and optionally renames the object on the target. |
error-on-no-transfer | The error-on-no-transfer flag lets the task fail in case no transfer was done. |
Source Storage Type | Select a source storage type. |
Target Storage Type | Select the target storage type. |
Connection File | In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Filter Type | [ include, exclude, none ] Define the type of filter to apply. |
Filter | Provide the patterns for file matching; for example, in a copy action: Filter Type: include, Filter: report[1-3].txt means all reports with names matching report1.txt, report2.txt, and report3.txt will be copied. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Recommendation: Add the flag max-depth 1 to all copy, move, remove-object, and remove-object-store actions in the task field Other Parameters to avoid a recursive action. Attention: If the flag max-depth 1 is not set, a recursive action is performed. |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials |
UAC Base URL | Universal Controller URL For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL] |
Example
The following example moves the file report4.txt from the source Amazon S3 bucket stonebranchpm to the target Azure container stonebranchpm.
No recursive move will be done, because the flag --max-depth 1 is set.
The file will be renamed at the target to: stonebranchpm/report4_${_date('yyyy-MM-dd_HH-mm-ss')}.txt; for example, report4_2022-05-16_08-00-29.txt
Action: remove-object
Field | Description |
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Remove objects in an OS directory or cloud object store. |
Storage Type | Enter a storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux .. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
File Path | Path to the directory in which you want to remove the objects. For example: File Path: stonebranchpmtest, Filter: report[1-3].txt. This removes all S3 objects matching the filter report[1-3].txt (report1.txt, report2.txt, and report3.txt) from the S3 bucket stonebranchpmtest. |
Connection File | In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Recommendation: Add the flag max-depth 1 to all Copy, Move, remove-object and remove-object-store in the task field Other Parameters to avoid a recursive action. Attention: If the flag max-depth 1 is not set, a recursive action is performed. |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL. For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO| WARNING | ERROR | CRITICAL]. |
Example
The following example removes all S3 objects matching the filter: report[1-3].txt
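The bracket patterns used in the Filter field are glob-style. As a hedged preview aid (rclone's filter syntax has extensions beyond plain globs, so treat this as an approximation), Python's fnmatch applies the same [..] character-class rules and can show which objects a filter like report[1-3].txt would select:

```python
import fnmatch

# Illustrative object names; not actual bucket contents.
objects = ["report1.txt", "report2.txt", "report3.txt", "report4.txt", "summary.txt"]

# [1-3] is a character class: it matches exactly one of 1, 2, or 3,
# so report4.txt and summary.txt are excluded.
matched = [name for name in objects if fnmatch.fnmatch(name, "report[1-3].txt")]
print(matched)  # ['report1.txt', 'report2.txt', 'report3.txt']
```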
...
Field | Description |
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Remove an OS directory or cloud object store. |
Storage Type | Enter a storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux .. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Directory | Name of the directory you want to remove. The directory can be an object store or a file system OS directory. The directory needs to be empty before it can be removed. For example, Directory: stonebranchpmtest would remove the bucket stonebranchpmtest. |
Connection File | In the connection file, you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Recommendation: Add the flag max-depth 1 to all Copy, Move, remove-object and remove-object-store in the task field Other Parameters to avoid a recursive action. Attention: If the flag max-depth 1 is not set, a recursive action is performed. |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL. For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO| WARNING | ERROR | CRITICAL]. |
Example
The following example removes the S3 object store stonebranchdemo.
...
Field | Description |
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Create an OS directory or cloud object store. |
Storage Type | Enter a storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux .. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Connection File | In the connection file, you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Directory | Name of the directory you want to create. The directory can be an object store or a file system OS directory. For example, Directory: stonebranchpmtest would create the bucket stonebranchpmtest. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL. For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO| WARNING | ERROR | CRITICAL]. |
Example
The following example creates the S3 bucket stonebranchdemo.
...
Field | Description |
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Download a URL's content and copy it to the destination without saving it in temporary storage. |
Source | URL parameter. Download a URL's content and copy it to the destination without saving it in temporary storage. |
Target | Enter a target storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Connection File | In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Useful parameters for the copy-url command: |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials |
UAC Base URL | Universal Controller URL For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL] |
Example
The following example downloads a PDF file:
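To illustrate what "without saving it in temporary storage" means, the sketch below streams bytes from a source to a destination in fixed-size chunks, so the full download never lands in a temp file. The in-memory buffers stand in for the HTTP response body and the target object store; this is a conceptual sketch, not how the task itself is implemented.

```python
import io
import shutil

def stream_copy(src, dst, chunk_size=64 * 1024):
    """Stream bytes from src to dst in chunks; at most chunk_size
    bytes are held in memory at a time, and no temp file is used."""
    shutil.copyfileobj(src, dst, length=chunk_size)

src = io.BytesIO(b"%PDF-1.4 example payload")  # stands in for the URL response body
dst = io.BytesIO()                             # stands in for the target storage
stream_copy(src, dst)
print(dst.getvalue()[:8])  # b'%PDF-1.4'
```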
...
Field | Description |
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Monitor a file or object in an OS directory or cloud object store. |
Storage Type | Enter a storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux .. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Directory | Name of the directory to scan for the files to monitor. The directory can be an object store or a file system OS directory. For example: Directory: stonebranchpm/out would monitor the S3 bucket folder stonebranchpm/out for the object report1.txt. |
Connection File | In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Filter Type | [ include, exclude, none ] Define the type of filter to apply. |
Filter | Provide the patterns for file matching; for example: Filter Type: include. For more examples of the filter matching pattern, refer to Rclone Filtering. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Recommendation: Set the two flags --use-server-modtime and --max-depth 1. With the Parameter Flag --use-server-modtime, you can tell the Monitor to use the upload time to the AWS bucket instead of the last modification time of the file on the source (for example, Linux). The Parameter Flag max-depth 1 means only the current directory is in scope (no recursive monitoring). |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
Trigger on Existence | [ checked , unchecked ] If checked, the monitor completes successfully even if the file already exists when the monitor is started. |
Interval | [ 10s, 60s, 180s ] Monitor interval to check for the file(s) in the configured directory. For example, Interval: 60s means that every 60s the task checks whether the file exists in the scan directory. |
UAC Rest Credentials | Universal Controller Rest API Credentials |
UAC Base URL | Universal Controller URL For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO| WARNING | ERROR | CRITICAL] |
Example
The following example monitors S3 bucket folder stonebranchpm/in for the object test3.txt
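The monitor-object behavior described above can be sketched as a polling loop: at each interval the task lists the scan directory and succeeds once the object appears. Here list_directory is a hypothetical stand-in for the storage listing call (the real task issues an Rclone listing), and the sleep between polls is omitted.

```python
def monitor_object(list_directory, filename, attempts):
    """Poll the directory listing up to `attempts` times and report
    whether the monitored object appeared."""
    for _ in range(attempts):
        if filename in list_directory():
            return True
        # a real monitor would sleep for the configured Interval here
    return False

# Simulated listings: the object shows up on the third poll.
listings = iter([[], [], ["test3.txt"]])
found = monitor_object(lambda: next(listings), "test3.txt", attempts=5)
print(found)  # True
```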
...