Disclaimer
Your use of this download is governed by Stonebranch’s Terms of Use, which are available at https://www.stonebranch.com/integration-hub/Terms-and-Privacy/Terms-of-Use/
Overview
The Inter-Cloud Data Transfer integration allows you to transfer data to, from, and between any of the major private and public cloud providers like AWS, Google Cloud, and Microsoft Azure.
It also supports the transfer of data to and from a Hadoop Distributed File System (HDFS) and to major cloud applications like OneDrive and SharePoint.
An advantage of using the Inter-Cloud Data Transfer integration over other approaches is that data is streamed from one object store to another without the need for intermediate storage.
Integrations with this solution package include:
- AWS S3
- Google Cloud
- SharePoint
- Dropbox
- OneDrive
- Hadoop Distributed File System (HDFS)
Software Requirements
Software Requirements for Universal Agent
Universal Agent for Linux or Windows Version 7.0.0.0 or later is required.
Universal Agent needs to be installed with the Python option (--python yes).
Software Requirements for Universal Controller
Universal Controller 7.0.0.0 or later.
Software Requirements for the Application to be Scheduled
Rclone v1.57.1 or higher needs to be installed on the server where the Universal Agent is installed.
Rclone can be installed on Windows and Linux.
To install Rclone on Linux systems, run:
curl https://rclone.org/install.sh | sudo bash
Note
If the URL is not reachable from your server, the Linux installation can also be done from a pre-compiled binary.
To install Rclone on a Linux system from a pre-compiled binary:
Fetch and unpack
curl -O https://downloads.rclone.org/rclone-current-linux-amd64.zip
unzip rclone-current-linux-amd64.zip
cd rclone-*-linux-amd64
Copy binary file
sudo cp rclone /usr/bin/
sudo chown root:root /usr/bin/rclone
sudo chmod 755 /usr/bin/rclone
Install manpage
sudo mkdir -p /usr/local/share/man/man1
sudo cp rclone.1 /usr/local/share/man/man1/
sudo mandb
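After the Linux installation, it is good practice to confirm that the binary is on the PATH and meets the minimum version requirement; this check is a suggestion, not part of the official installation steps:
# Confirm the installed version is v1.57.1 or higher
rclone version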
To install Rclone on Windows systems:
Rclone is a Go program and comes as a single binary file.
Download the relevant binary from the rclone downloads page.
Extract the rclone or rclone.exe binary from the archive into a folder that is in the Windows PATH.
Key Features
Some details about the Inter-Cloud Data Transfer Task:
- Transfer data to, from, and between any cloud provider
- Transfer between any major storage applications like SharePoint or Dropbox
- Transfer data to and from a Hadoop Distributed File System (HDFS)
- Download a URL's content and copy it to the destination without saving it in temporary storage
- Data is streamed from one object store to another (no intermediate storage)
- Very fast transfers if the object stores are in the same region
- Always preserves timestamps and verifies checksums
- Supports encryption, caching, compression, and chunking
- Perform Dry-runs
- Dynamic Token updates for SharePoint connections
- Regular Expression based include/exclude filter rules
Supported actions are:
- List directory
- List objects
- Copy / Move
- Remove object / object store
- Perform Dry-runs
- Monitor object
- Copy URL
Import Inter-Cloud Data Transfer Universal Template
To use this Universal Template, you must first perform the following steps:
This Universal Task requires the Resolvable Credentials feature. Check that the Resolvable Credentials Permitted system property has been set to true.
Download the provided ZIP file.
In the Universal Controller UI, select Administration > Configuration > Universal Templates to display the current list of Universal Templates.
Click Import Template.
Select the template ZIP file and click Import.
When the template has been imported successfully, the Universal Template will appear on the list. Refresh your Navigation Tree to see these tasks in the Automation Center Menu.
Configure Inter-Cloud Data Transfer Universal Tasks
To configure a new Inter-Cloud Data Transfer task, two steps are required:
Configure the connection file
Create a new Inter-Cloud Data Transfer Task
Configure the Connection File
The connection file contains all required Parameters and Credentials to connect to the Source and Target Cloud Storage System.
The provided connection file, below, contains the basic connection Parameters (flags) to connect to AWS, Azure, Linux, OneDrive (SharePoint), Google and HDFS. Additional Parameters can be added if required. Refer to the rclone documentation for all possible flags: Global Flags.
The following connection file must be saved in the Universal Controller script library. This file is later referenced in the different Inter-Cloud Data Transfer tasks.
[amazon_s3_target]
type = s3
provider = AWS
env_auth = false
access_key_id = ${_credentialUser('${ops_var_target_credential}')}
secret_access_key = ${_credentialToken('${ops_var_target_credential}')}
region = us-east-2

[amazon_s3_source]
type = s3
provider = AWS
env_auth = false
access_key_id = ${_credentialUser('${ops_var_source_credential}')}
secret_access_key = ${_credentialToken('${ops_var_source_credential}')}
region = us-east-2

[microsoft_azure_blob_storage_sas_target]
type = azureblob
sas_url = ${_credentialPwd('${ops_var_target_credential}')}

[microsoft_azure_blob_storage_sas_source]
type = azureblob
sas_url = ${_credentialPwd('${ops_var_source_credential}')}

[microsoft_azure_blob_storage_source]
type = azureblob
account = ${_credentialUser('${ops_var_source_credential}')}
key = ${_credentialPwd('${ops_var_source_credential}')}

[microsoft_azure_blob_storage_target]
type = azureblob
account = ${_credentialUser('${ops_var_target_credential}')}
key = ${_credentialPwd('${ops_var_target_credential}')}

[google_cloud_storage_source]
type = google cloud storage
service_account_file = ${_credentialPwd('${ops_var_source_credential}')}
object_acl = bucketOwnerFullControl
project_number = clagcs
location = europe-west3

[google_cloud_storage_target]
type = google cloud storage
service_account_file = ${_credentialPwd('${ops_var_target_credential}')}
object_acl = bucketOwnerFullControl
project_number = clagcs
location = europe-west3

[onedrive_source]
type = onedrive
token = ${_credentialToken('${ops_var_source_credential}')}
drive_id = ${_credentialUser('${ops_var_source_credential}')}
drive_type = business
update_credential = token

[onedrive_target]
type = onedrive
token = ${_credentialToken('${ops_var_target_credential}')}
drive_id = ${_credentialUser('${ops_var_target_credential}')}
drive_type = business
update_credential = token

[hdfs_source]
type = hdfs
namenode = 172.18.0.2:8020
username = maria_dev

[hdfs_target]
type = hdfs
namenode = 172.18.0.2:8020
username = maria_dev

[linux_source]
type = local

[linux_target]
type = local
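Note that the section names above double as rclone remote names once the Controller has resolved the credential variables. As an illustration only (the Universal Task builds the actual command line internally), a manual connectivity test against one of these sections could look like this, assuming the connection file has been saved as cloud2cloud.conf:
# List the top-level buckets reachable through the amazon_s3_source section
rclone lsd amazon_s3_source: --config cloud2cloud.conf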
Considerations
Rclone supports connections to almost any storage system on the market:
Overview of Cloud Storage Systems
However, the current Universal Task has only been tested for the following storage types:
LINUX
AWS S3
Azure Blob Storage
Google GCS
Microsoft OneDrive incl. SharePoint
HDFS
HTTPS URL
Note
If you want to connect to a different system (for example, Dropbox), you should contact Stonebranch for support.
Create an Inter-Cloud Data Transfer Task
For Universal Task Inter-Cloud Data Transfer, create a new task and enter the task-specific Details that were created in the Universal Template.
The following Actions are supported:
Action | Description |
---|---|
list directory | List directories; for example, all buckets in an AWS account. |
copy | Copy objects from source to target. |
list objects | List objects in an OS directory or cloud object store. |
move | Move objects from source to target. |
remove-object | Remove objects in an OS directory or cloud object store. |
remove-object-store | Remove an OS directory or cloud object store. |
create-object-store | Create an OS directory or cloud object store. |
copy-url | Download a URL's content and copy it to the destination without saving it in temporary storage. |
monitor-object | Monitor a file or object in an OS directory or cloud object store. |
In the following sections, the fields for each task action are described and an example is provided.
Important Considerations
Before running a move or copy command, you can always try the command by setting the Dry-run option in the Task.
The following flags should be considered in any Copy, Move, remove-object and remove-object-store operations.
Flag | Description |
---|---|
max-depth 1 | Limits the recursion depth. max-depth 1 means only the current directory is in scope. Attention: If the flag is not set, a recursive action is performed. Recommendation: Add the flag max-depth 1 to all Copy, Move, remove-object, and remove-object-store tasks in the task field Other Parameters to avoid a recursive action. |
ignore-existing | Skips all files that exist on the destination. Recommendation: Add the flag --ignore-existing to all copy and move tasks; this avoids a file being deleted if the source is equal to the destination in a copy operation. |
error-on-no-transfer | The error-on-no-transfer flag lets the task fail in case no transfer was done. |
update | To skip files that are newer on the destination during a move or copy action, add the flag --update. |
Note
Refer to the rclone documentation for all possible flags: Global Flags.
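As an illustration of how these flags can be combined, a copy or move task that should stay non-recursive, never overwrite existing files, and fail when nothing was transferred could place the following in the task field Other Parameters (a suggested combination, not a requirement):
--max-depth 1 --ignore-existing --error-on-no-transfer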
Inter-Cloud Data Transfer Actions
Action: list directory
Field | Description |
---|---|
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] List directories; for example, all buckets in an AWS account. |
Storage Type | Enter a storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Connection File | In the connection file, you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL; for example, https://192.168.88.40/uc. |
Loglevel | Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL]. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Recommendation: Add the flag max-depth 1 to avoid a recursive action. |
Example
The following example lists all AWS S3 buckets in the AWS account configured in the cloud2cloud.conf file.
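For orientation, a roughly equivalent manual rclone invocation (illustrative only; the task constructs the real command internally) would be:
# List all buckets in the AWS account defined by the amazon_s3_target section
rclone lsd amazon_s3_target: --config cloud2cloud.conf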
Action: copy
Before running a move or copy command, you can always try the command by setting the Dry-run option in the Task.
Field | Description |
---|---|
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Copy objects from source to target. |
Source | Enter a source storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Target | Enter a target storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Connection File | In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Filter Type | [ include, exclude, none ] Define the type of filter to apply. |
Filter | Provide the pattern for file matching; for example, in a copy action with Filter Type: include, all files with names matching the Filter pattern are copied from source to target. For more examples of filter matching patterns, refer to Rclone Filtering. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Recommendation: Add the flag max-depth 1 to all copy and move tasks in the task field Other Parameters to avoid a recursive action. Attention: If the flag is not set, a recursive action is performed. |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL; for example, https://192.168.88.40/uc. |
Loglevel | Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL]. |
Example
The following example copies all files starting with report4 from the folder in of the Amazon S3 bucket stonebranchpm to the Azure container stonebranchpm.
No recursive copy will be done, because the flag --max-depth 1 is set.
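For reference, a roughly equivalent manual rclone command (illustrative only; the SAS-based Azure remote name is taken from the sample connection file above) would be:
# Copy files matching report4* non-recursively from S3 to Azure
rclone copy amazon_s3_source:stonebranchpm/in microsoft_azure_blob_storage_sas_target:stonebranchpm --include "report4*" --max-depth 1 --config cloud2cloud.conf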
Action: list objects
Field | Description |
---|---|
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] List objects in an OS directory or cloud object store. |
Storage Type | Enter a Storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Connection File | In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Filter Type | [ include, exclude, none ] Define the type of filter to apply. |
Filter | Provide the pattern for file matching; for example, with Filter Type: include, only objects with names matching the Filter pattern are listed. For more examples of filter matching patterns, refer to Rclone Filtering. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Recommendation: Add the flag max-depth 1 to avoid a recursive action. |
Directory | Name of the directory you want to list the files in. For example, Directory: stonebranchpm/out would mean to list all objects in the bucket stonebranchpm folder out. |
List Format | [ list size and path, list modification time, size and path, list objects and directories, list objects and directories (Json) ] The Choice box specifies how the output should be formatted. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL. For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL]. |
Example
The following example lists all objects starting with report in the S3 bucket stonebranchpm.
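The List Format options appear to correspond to rclone's listing commands (ls, lsl, lsf, lsjson). For example, a size-and-path listing of the matching objects could be produced manually with (illustrative only):
# List size and path of all objects starting with report
rclone ls amazon_s3_target:stonebranchpm --include "report*" --config cloud2cloud.conf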
Action: move
Field | Description |
---|---|
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Move objects from source to target. |
Source | Enter a source storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Target | Enter a target storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Connection File | In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Filter Type | [ include, exclude, none ] Define the type of filter to apply. |
Filter | Provide the pattern for file matching; for example, with Filter Type: include, all files with names matching the Filter pattern are moved from source to target. For more examples of filter matching patterns, refer to Rclone Filtering. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Recommendation: Add the flag max-depth 1 to all Copy, Move, remove-object, and remove-object-store tasks in the task field Other Parameters to avoid a recursive action. Attention: If the flag max-depth 1 is not set, a recursive action is performed. |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL. For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL]. |
Example
The following example moves the objects starting with report from the folder in of the source S3 bucket stonebranchpm to the target S3 bucket stonebranchpmtest.
No recursive move will be done, because the flag --max-depth 1 is set.
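A roughly equivalent manual command (illustrative only) would be:
# Move matching objects non-recursively between the two buckets
rclone move amazon_s3_source:stonebranchpm/in amazon_s3_target:stonebranchpmtest --include "report*" --max-depth 1 --config cloud2cloud.conf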
Action: remove-object
Field | Description |
---|---|
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Remove objects in an OS directory or cloud object store. |
Storage Type | Enter a storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
File Path | Path to the directory in which you want to remove the objects. For example: File Path: stonebranchpmtest, Filter: report[1-3].txt. This removes all S3 objects matching the filter report[1-3].txt (report1.txt, report2.txt, and report3.txt) from the S3 bucket stonebranchpmtest. |
Connection File | In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Recommendation: Add the flag max-depth 1 to all Copy, Move, remove-object, and remove-object-store tasks in the task field Other Parameters to avoid a recursive action. Attention: If the flag max-depth 1 is not set, a recursive action is performed. |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL. For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL]. |
Example
The following example removes all S3 objects matching the filter report[1-3].txt
(report1.txt, report2.txt, and report3.txt) from the S3 bucket stonebranchpmtest.
No recursive remove will be done, because the flag --max-depth 1 is set.
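A roughly equivalent manual command (illustrative only; rclone delete removes the files that match the active filters):
# Delete report1.txt, report2.txt, and report3.txt, non-recursively
rclone delete amazon_s3_target:stonebranchpmtest --include "report[1-3].txt" --max-depth 1 --config cloud2cloud.conf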
Action: remove-object-store
Field | Description |
---|---|
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Remove an OS directory or cloud object store. |
Storage Type | Enter a storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Directory | Name of the directory you want to remove. The directory can be an object store or a file system OS directory. The directory needs to be empty before it can be removed. For example, Directory: stonebranchpmtest would remove the bucket stonebranchpmtest. |
Connection File | In the connection file, you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Recommendation: Add the flag max-depth 1 to all Copy, Move, remove-object, and remove-object-store tasks in the task field Other Parameters to avoid a recursive action. Attention: If the flag max-depth 1 is not set, a recursive action is performed. |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL. For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL]. |
Example
The following example removes the S3 object store stonebranchdemo.
No recursive remove of object stores will be done, because the flag --max-depth 1 is set.
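A roughly equivalent manual command (illustrative only; like the task, rclone rmdir only removes a store that is already empty):
# Remove the empty bucket stonebranchdemo
rclone rmdir amazon_s3_target:stonebranchdemo --config cloud2cloud.conf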
Action: create-object-store
Field | Description |
---|---|
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Create an OS directory or cloud object store. |
Storage Type | Enter a storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Connection File | In the connection file, you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Directory | Name of the directory you want to create. The directory can be an object store or a file system OS directory. For example, Directory: stonebranchpmtest would create the bucket stonebranchpmtest. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL. For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL]. |
Example
The following example creates the S3 bucket stonebranchdemo.
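A roughly equivalent manual command (illustrative only):
# Create the bucket stonebranchdemo
rclone mkdir amazon_s3_target:stonebranchdemo --config cloud2cloud.conf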
Action: copy-url
Field | Description |
---|---|
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Download a URL's content and copy it to the destination without saving it in temporary storage. |
Source | The URL whose content will be downloaded and copied to the destination without saving it in temporary storage. |
Target | Enter a target storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Connection File | In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Useful parameters for the copy-url command: -a, --auto-filename (get the file name from the URL and use it for the destination file path); -h, --help (help for copyurl); --no-clobber (prevent overwriting a file with the same name); -p, --print-filename (print the resulting name from --auto-filename); --stdout (write the output to stdout rather than a file). |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL; for example, https://192.168.88.40/uc. |
Loglevel | Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL]. |
Example
The following example downloads a PDF file:
- From the web address: https://www.bundesbank.de/resource/../blz-loeschungen-aktuell-data.pdf
- To the Linux folder: /home/stonebranch/demo/in
The Linux folder is located on the server where the Agent ${AGT_LINUX} runs.
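A roughly equivalent manual command (illustrative only; the URL is abbreviated exactly as in the example above):
# Stream the PDF directly to the local folder, deriving the file name from the URL
rclone copyurl https://www.bundesbank.de/resource/../blz-loeschungen-aktuell-data.pdf linux_target:/home/stonebranch/demo/in --auto-filename --config cloud2cloud.conf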
Action: monitor-object
Field | Description |
---|---|
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Monitor a file or object in an OS directory or cloud object store. |
Storage Type | Enter a storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Directory | Name of the directory to scan for the files to monitor. The directory can be an object store or a file system OS directory. For example, Directory: stonebranchpm/out would monitor the S3 bucket folder stonebranchpm/out for the object report1.txt. |
Connection File | In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Filter Type | [ include, exclude, none ] Define the type of filter to apply. |
Filter | Provide the pattern for file matching; for example, with Filter Type: include, only objects with names matching the Filter pattern trigger the monitor. For more examples of filter matching patterns, refer to Rclone Filtering. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Recommendation: Set the two flags --use-server-modtime and --max-depth 1. With the Parameter Flag --use-server-modtime, you can tell the Monitor to use the upload time to the AWS bucket instead of the last modification time of the file on the source (for example, Linux). The Parameter Flag max-depth 1 means only the current directory is in scope (no recursive monitoring). |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
Trigger on Existence | [ checked , unchecked ] If checked, the monitor goes to success even if the file already exists when the monitor is started. |
Interval | [ 10s, 60s, 180s ] Monitor interval for checking the file(s) in the configured directory. For example, Interval: 60s means that every 60 seconds the task checks whether the file exists in the scan directory. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL; for example, https://192.168.88.40/uc. |
Loglevel | Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL]. |
Example
The following example monitors the S3 bucket folder stonebranchpm/in for the object test3.txt.
The object triggers the monitor (status: success) only if it is uploaded after the monitor has been started.
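Conceptually, each monitor interval performs a filtered listing along these lines and reports success once a match appears; this sketch illustrates the mechanics, not the task's literal implementation:
# Check for test3.txt using the object's upload time rather than its source modification time
rclone lsl amazon_s3_target:stonebranchpm/in --include "test3.txt" --use-server-modtime --max-depth 1 --config cloud2cloud.conf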
Inter-Cloud Data Transfer Example Cases
The following section provides examples for the Inter-Cloud Data Transfer Task actions:
Copy
Move
Remove-object
Monitor-object
N# | Name |
---|---|
Copy | |
C1 | Copy a file to target. The file does not exist at the target. |
C2 | Copy two files. One of the files already exists at the target. |
C3 | Copy a file to a target with Parameter ignore-existing set. A file with the same filename exists on the target. |
C4 | Copy a file to a target where it already exists, with Parameter error-on-no-transfer set. |
C5 | Copy all files except one, from one folder into another folder. |
C6 | Copy non-recursively a file from a folder into another folder using the parameter max-depth 1. |
C7 | Copy files using multiple Parameters. |
Move | |
M1 | Move without deleting if no transfer took place. |
M2 | Move a file with the same source and target - config error case. |
Remove-object | |
R1 | Delete a file in a specific folder by using the parameter: max-depth 1. |
R2 | Delete all files matching a given name recursively in all folders starting from a given entry folder. |
Monitor-object | |
FM01 | Monitor a file in AWS - file does not exist on start of monitor. |
FM02 | Monitor a file in AWS - file exists on start of monitor. |
Copy
The following file transfer examples will be described for the Copy action:
N# | Name |
---|---|
C1 | Copy a file to target. The file does not exist at the target. |
C2 | Copy two files. One of the files already exists at the target. |
C3 | Copy a file to a target with Parameter ignore-existing set. A file with the same filename exists on the target. |
C4 | Copy a file to a target where it already exists, with Parameter error-on-no-transfer set. |
C5 | Copy all files except one, from one folder into another folder. |
C6 | Copy non-recursively a file from a folder into another folder using the parameter max-depth 1. |
C7 | Copy files using multiple Parameters. |
Note
Before running a copy command, you can always try the command by setting the Dry-run option in the Task.
Copy a File to Target: File does not Exist at the Target
Copy the file report1.txt to target stonebranchpm/in/. The file does not exist at the target.
Configuration
Source Storage Type: linux_source
Source: /home/stonebranch/demo/out/
Target Storage Type: amazon_s3_target
Target: stonebranchpm/in/
Filter Type: include
Filter: report1.txt
Other Parameters: none
Result
Task Status: success
Output:
report1.txt: Copied (new)
Copy Two Files: One File Already Exists at the Target
Copy two files report2.txt and report3.txt to the target stonebranchpm/in/. The file report2.txt (same size and content) already exists at the target.
Configuration
Source Storage Type: linux_source
Source: /home/stonebranch/demo/out/
Target Storage Type: amazon_s3_target
Target: stonebranchpm/in/
Filter Type: include
Filter: report[2-3].txt
Other Parameters: none
Result
Task Status: success
Output:
report3.txt: Copied (new)
report2.txt: Unchanged skipping
Note
If the file report2.txt had the same name but a different size, the task would overwrite the file report2.txt on the target.
Copy a File to a Target with Parameter ignore-existing Set: A File with the Same Filename Exists on the Target
Copy a file report2.txt to the target stonebranchpm/in/ with Parameter ignore-existing set. A file with the same filename exists at the target (size or content can be the same or different).
When ignore-existing is set, all files that exist on the destination with the same filename are skipped. (No overwrite is done.)
Configuration
Source Storage Type: linux_source
Source: /home/stonebranch/demo/out/
Target Storage Type: amazon_s3_target
Target: stonebranchpm/in/
Filter Type: include
Filter: report2.txt
Other Parameters: --ignore-existing
Result
Task Status: success
Output:
There was nothing to transfer.
Exiting due to ERROR(s).