Disclaimer

Your use of this download is governed by Stonebranch’s Terms of Use, which are available at https://www.stonebranch.com/integration-hub/Terms-and-Privacy/Terms-of-Use/

Overview

The Inter-Cloud Data Transfer integration allows you to transfer data to, from, and between any of the major private and public cloud providers like AWS, Google Cloud, and Microsoft Azure.

It also supports the transfer of data to and from a Hadoop Distributed File System (HDFS) and to major cloud applications like OneDrive and SharePoint.

An advantage of using the Inter-Cloud Data Transfer integration over other approaches is that data is streamed from one object store to another without the need for intermediate storage. 

Integrations with this solution package include: 

  • AWS S3
  • Google Cloud
  • SharePoint
  • Dropbox
  • OneDrive
  • Hadoop Distributed File System (HDFS)

Software Requirements

Software Requirements for Universal Agent

  • Universal Agent for Linux or Windows Version 7.0.0.0 or later is required.

  • Universal Agent needs to be installed with the Python option (--python yes).

Software Requirements for Universal Controller

  • Universal Controller 7.0.0.0 or later.

Software Requirements for the Application to be Scheduled

  • Rclone v1.57.1 or higher needs to be installed on the server where the Universal Agent is installed.

  • Rclone can be installed on Windows and Linux.

  • To install Rclone on Linux systems, run:

    Code Block
    languagebash
    curl https://rclone.org/install.sh | sudo bash

    Note

    If the URL is not reachable from your server, the Linux installation can also be done from a pre-compiled binary.

  • To install Rclone on a Linux system from a pre-compiled binary:

    Fetch and unpack

    Code Block
    languagebash
    curl -O https://downloads.rclone.org/rclone-current-linux-amd64.zip
    unzip rclone-current-linux-amd64.zip
    cd rclone-*-linux-amd64

    Copy binary file

    Code Block
    languagebash
    sudo cp rclone /usr/bin/ 
    sudo chown root:root /usr/bin/rclone 
    sudo chmod 755 /usr/bin/rclone

    Install manpage

    Code Block
    languagebash
    sudo mkdir -p /usr/local/share/man/man1 
    sudo cp rclone.1 /usr/local/share/man/man1/ 
    sudo mandb 
  • To install Rclone on Windows systems (a verification snippet follows this list):

    • Rclone is a Go program and comes as a single binary file.

    • Download the relevant binary from https://downloads.rclone.org/.

    • Extract the rclone.exe binary from the archive into a folder that is in the Windows PATH.

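After installation, you can verify that the agent server can run Rclone; a quick sanity check from any shell:

Code Block
languagebash
# Print the installed Rclone version to confirm the binary is on the PATH
rclone version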
Key Features

Some details about the Inter-Cloud Data Transfer Task:

  • Transfer data to, from, and between any cloud provider

  • Transfer between any major storage applications like SharePoint or Dropbox

  • Transfer data to and from a Hadoop File System (HDFS)

  • Download a URL's content and copy it to the destination without saving it in temporary storage

  • Data is streamed from one object store to another (no intermediate storage)
  • Very fast if the object stores are in the same region

  • Always preserves timestamps and verifies checksums

  • Supports encryption, caching, compression, chunking

  • Perform Dry-runs

  • Dynamic Token updates for SharePoint connections

  • Regular Expression based include/exclude filter rules

  • Supported actions are:

    • List objects, list directory

    • Copy / Move

    • Remove object / object store

    • Perform Dry-runs

    • Monitor object

    • Copy URL


Import Inter-Cloud Data Transfer Universal Template

To use this Universal Template, you first must perform the following steps:

  1. This Universal Task requires the Resolvable Credentials feature. Check that the Resolvable Credentials Permitted system property has been set to true.
  2. Download the provided ZIP file.
  3. In the Universal Controller UI, select Configuration > Universal Templates to display the current list of Universal Templates.
  4. Click Import Template.
  5. Select the template ZIP file and click Import.

When the template has been imported successfully, the Universal Template will appear on the list. Refresh your Navigation Tree to see these tasks in the Automation Center Menu.

Configure Inter-Cloud Data Transfer Universal Tasks

To configure a new Inter-Cloud Data Transfer task, two steps are required:

  1. Configure the connection file

  2. Create a new Inter-Cloud Data Transfer Task

Configure the Connection File

The connection file contains all required Parameters and Credentials to connect to the Source and Target Cloud Storage System.

The provided connection file, below, contains the basic connection Parameters (flags) to connect to AWS, Azure, Linux, OneDrive (SharePoint), Google and HDFS. Additional Parameters can be added if required. Refer to the rclone documentation for all possible flags: Global Flags.

The following connection file must be saved in the Universal Controller script library. This file is later referenced in the different Inter-Cloud Data Transfer tasks.

Code Block
languagetext
titleconnections.conf
[amazon_s3_target]
type = s3
provider = AWS
env_auth = false
access_key_id = ${_credentialUser('${ops_var_target_credential}')}
secret_access_key = ${_credentialToken('${ops_var_target_credential}')}
region = us-east-2


[amazon_s3_source]
type = s3
provider = AWS
env_auth = false
access_key_id = ${_credentialUser('${ops_var_source_credential}')}
secret_access_key = ${_credentialToken('${ops_var_source_credential}')}
region = us-east-2


[microsoft_azure_blob_storage_sas_target]
type = azureblob
sas_url = ${_credentialPwd('${ops_var_target_credential}')}


[microsoft_azure_blob_storage_sas_source]
type = azureblob
sas_url = ${_credentialPwd('${ops_var_source_credential}')}


[microsoft_azure_blob_storage_source]
type = azureblob
account = ${_credentialUser('${ops_var_source_credential}')}
key = ${_credentialPwd('${ops_var_source_credential}')}


[microsoft_azure_blob_storage_target]
type = azureblob
account = ${_credentialUser('${ops_var_target_credential}')}
key = ${_credentialPwd('${ops_var_target_credential}')}


[google_cloud_storage_source]
type = google cloud storage
service_account_file = ${_credentialPwd('${ops_var_source_credential}')}
object_acl = bucketOwnerFullControl
project_number = clagcs
location = europe-west3


[google_cloud_storage_target]
type = google cloud storage
service_account_file = ${_credentialPwd('${ops_var_target_credential}')}
object_acl = bucketOwnerFullControl
project_number = clagcs
location = europe-west3


[onedrive_source]
type = onedrive
token = ${_credentialToken('${ops_var_source_credential}')}
drive_id = ${_credentialUser('${ops_var_source_credential}')}
drive_type = business
update_credential = token


[onedrive_target]
type = onedrive
token = ${_credentialToken('${ops_var_target_credential}')}
drive_id = ${_credentialUser('${ops_var_target_credential}')}
drive_type = business
update_credential = token


[hdfs_source]
type = hdfs
namenode = 172.18.0.2:8020
username = maria_dev


[hdfs_target]
type = hdfs
namenode = 172.18.0.2:8020
username = maria_dev


[linux_source]
type = local


[linux_target]
type = local
   
   

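As a quick way to validate a connection definition outside of the Controller, you can point Rclone at the file directly. This is illustrative only: the ${_credential...} variables resolve inside Universal Controller, so for a local test they would have to be replaced with literal values.

Code Block
languagebash
# List the top-level buckets/containers visible to the amazon_s3_source remote
rclone lsd amazon_s3_source: --config connections.conf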

Considerations

Rclone supports connections to almost any storage system on the market:

Overview of Cloud Storage Systems

However, the current Universal Task has only been tested for the following storage types:

  • LINUX

  • AWS S3

  • Azure Blob Storage

  • Google GCS

  • Microsoft OneDrive incl. SharePoint

  • HDFS

  • HTTPS URL

Note

If you want to connect to a different system (for example, Dropbox), contact Stonebranch for support.

Create an Inter-Cloud Data Transfer Task

For Universal Task Inter-Cloud Data Transfer, create a new task and enter the task-specific Details that were created in the Universal Template.

The following Actions are supported:

  • list directory: List directories; for example, list object stores like S3 buckets or Azure containers, and list OS directories from Linux, Windows, or HDFS.

  • copy: Copy objects from source to target.

  • list objects: List objects in an OS directory or cloud object store.

  • move: Move objects from source to target.

  • remove-object: Remove objects in an OS directory or cloud object store.

  • remove-object-store: Remove an OS directory or cloud object store.

  • create-object-store: Create an OS directory or cloud object store.

  • copy-url: Download a URL's content and copy it to the destination without saving it in temporary storage.

  • monitor-object: Monitor a file or object in an OS directory or cloud object store.

In the following sections, the fields for each task action are described and an example is provided.

Important Considerations

  1. Before running a move or copy command, you can always try the command by setting the Dry-run option in the Task.

  2. The following flags should be considered in any copy, move, remove-object, and remove-object-store operation (a combined command-line sketch follows the list).

  • max-depth 1: Limits the recursion depth. max-depth 1 means only the current directory is in scope.

    Attention: If the flag is not set, a recursive action is performed.

    Recommendation: Add the flag max-depth 1 to all copy, move, remove-object, and remove-object-store tasks in the task field Other Parameters to avoid a recursive action.

  • ignore-existing: Skips all files that exist on the destination.

    Examples:

    1. You move a file to a destination where it already exists. In this case, rclone does not perform the copy but still deletes the source file. If you set the flag --ignore-existing, the delete operation is not performed.

    2. In a copy operation, the --ignore-existing parameter prevents a file from being deleted if the source is equal to the destination.

    Recommendation: Add the flag --ignore-existing to all copy and move tasks.

  • error-on-no-transfer: Lets the task fail in case no transfer was done.

  • update: Skips files that are newer on the destination during a move or copy action; add the flag --update.

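For illustration, the same flags can be tried directly on the Rclone command line before putting them into the task's Other Parameters field. This is a sketch; the remote names and paths are assumptions based on the connection file above.

Code Block
languagebash
# Non-recursive copy that skips existing files and fails if nothing was transferred
rclone copy linux_source:/home/stonebranch/demo/out amazon_s3_target:stonebranchpm/in \
  --config connections.conf --max-depth 1 --ignore-existing --error-on-no-transfer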
Note

Refer to the rclone documentation for all possible flags: Global Flags.

Inter-Cloud Data Transfer Actions

Action: list directory

  • Agent: Linux or Windows Universal Agent to execute the Rclone command line.

  • Agent Cluster: Optional Agent Cluster for load balancing.

  • Action: [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

    List directories; for example, list object stores like S3 buckets or Azure containers, and list OS directories from Linux, Windows, or HDFS.

  • Storage Type: Enter a storage type name as defined in the connection file; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

  • Connection File: In the connection file, you configure all required parameters and credentials to connect to the source and target cloud storage systems. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection parameters for AWS S3 and Azure Blob Storage. For details on how to configure the connection file, refer to section Configure the Connection File.

  • UAC Rest Credentials: Universal Controller REST API credentials.

  • UAC Base URL: Universal Controller URL; for example, https://192.168.88.40/uc.

  • Loglevel: Universal Task logging settings [ DEBUG | INFO | WARNING | ERROR | CRITICAL ].

  • Other Parameters: This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags.

    Recommendation: Add the flag max-depth 1 to avoid a recursive action.

Example

The following example lists all AWS S3 buckets in the AWS account configured in the cloud2cloud.conf file.

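A conceptually equivalent Rclone command (illustrative only; the remote name is an assumption, since the contents of cloud2cloud.conf are not shown here):

Code Block
languagebash
# List all buckets for the S3 remote defined in cloud2cloud.conf
rclone lsd amazon_s3_source: --config cloud2cloud.conf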

Action: copy

Before running a move or copy command, you can always try the command by setting the Dry-run option in the Task.

  • Agent: Linux or Windows Universal Agent to execute the Rclone command line.

  • Agent Cluster: Optional Agent Cluster for load balancing.

  • Action: [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

    Copy objects from source to target.

  • Source: Enter a source storage type name as defined in the connection file; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

  • Target: Enter a target storage type name as defined in the connection file; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

  • Connection File: In the connection file, you configure all required parameters and credentials to connect to the source and target cloud storage systems. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection parameters for AWS S3 and Azure Blob Storage. For details on how to configure the connection file, refer to section Configure the Connection File.

  • Filter Type: [ include, exclude, none ]

    Define the type of filter to apply.

  • Filter: Provide the patterns for file matching; for example, in a copy action:

    Filter Type: include

    Filter: report[1-3].txt

    This means all reports with names matching report1.txt, report2.txt, and report3.txt will be copied. For more examples of the filter matching patterns, refer to Rclone Filtering.

  • Other Parameters: This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags.

    Recommendation: Add the flag max-depth 1 to all copy and move tasks in the task field Other Parameters to avoid a recursive action.

    Attention: If the flag is not set, a recursive action is performed.

  • Dry-run: [ checked, unchecked ]

    Do a trial run with no permanent changes.

  • UAC Rest Credentials: Universal Controller REST API credentials.

  • UAC Base URL: Universal Controller URL; for example, https://192.168.88.40/uc.

  • Loglevel: Universal Task logging settings [ DEBUG | INFO | WARNING | ERROR | CRITICAL ].

Example

The following example copies all files starting with report4 from folder in of the Amazon S3 bucket stonebranchpm into the Azure container stonebranchpm.

No recursive copy is done, because the flag --max-depth 1 is set.

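The corresponding Rclone invocation would look roughly like this (a sketch; remote names, paths, and the filter pattern are assumptions based on the example):

Code Block
languagebash
# Copy report4* from the S3 bucket folder to the Azure container, non-recursively
rclone copy amazon_s3_source:stonebranchpm/in microsoft_azure_blob_storage_sas_target:stonebranchpm \
  --config connections.conf --include "report4*" --max-depth 1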

Action: list objects

  • Agent: Linux or Windows Universal Agent to execute the Rclone command line.

  • Agent Cluster: Optional Agent Cluster for load balancing.

  • Action: [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

    List objects in an OS directory or cloud object store.

  • Storage Type: Enter a storage type name as defined in the connection file; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

  • Connection File: In the connection file, you configure all required parameters and credentials to connect to the source and target cloud storage systems. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection parameters for AWS S3 and Azure Blob Storage. For details on how to configure the connection file, refer to section Configure the Connection File.

  • Filter Type: [ include, exclude, none ]

    Define the type of filter to apply.

  • Filter: Provide the patterns for file matching; for example, in a copy action:

    Filter Type: include

    Filter: report[1-3].txt means all reports with names matching report1.txt, report2.txt, and report3.txt will be copied. For more examples of the filter matching patterns, refer to Rclone Filtering.

  • Other Parameters: This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags.

    Recommendation: Add the flag max-depth 1 to avoid a recursive action.

  • Directory: Name of the directory you want to list the files in. For example, Directory: stonebranchpm/out would mean to list all objects in the bucket stonebranchpm folder out.

  • Output format: [ list size and path, list modification time, size and path, list objects and directories, list objects and directories (Json) ]

    The choice box specifies how the output should be formatted.

  • UAC Rest Credentials: Universal Controller REST API credentials.

  • UAC Base URL: Universal Controller URL; for example, https://192.168.88.40/uc.

  • Loglevel: Universal Task logging settings [ DEBUG | INFO | WARNING | ERROR | CRITICAL ].

Example

The following example lists all objects starting with report in the S3 bucket stonebranchpm.

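Roughly equivalent on the Rclone command line (illustrative; the remote name and pattern are assumptions):

Code Block
languagebash
# List modification time, size, and path of matching objects
rclone lsl amazon_s3_source:stonebranchpm --config connections.conf --include "report*"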

Action: move

  • Agent: Linux or Windows Universal Agent to execute the Rclone command line.

  • Agent Cluster: Optional Agent Cluster for load balancing.

  • Action: [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

    Move objects from source to target.

  • Source: Enter a source storage type name as defined in the connection file; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

  • Target: Enter a target storage type name as defined in the connection file; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

  • Connection File: In the connection file, you configure all required parameters and credentials to connect to the source and target cloud storage systems. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection parameters for AWS S3 and Azure Blob Storage. For details on how to configure the connection file, refer to section Configure the Connection File.

  • Filter Type: [ include, exclude, none ]

    Define the type of filter to apply.

  • Filter: Provide the patterns for file matching; for example, in a copy action:

    Filter Type: include

    Filter: report[1-3].txt means all reports with names matching report1.txt, report2.txt, and report3.txt will be copied. For more examples of the filter matching patterns, refer to Rclone Filtering.

  • Other Parameters: This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags.

    Recommendation: Add the flag max-depth 1 to all copy, move, remove-object, and remove-object-store tasks in the task field Other Parameters to avoid a recursive action.

    Attention: If the flag max-depth 1 is not set, a recursive action is performed.

  • Dry-run: [ checked, unchecked ]

    Do a trial run with no permanent changes.

  • UAC Rest Credentials: Universal Controller REST API credentials.

  • UAC Base URL: Universal Controller URL; for example, https://192.168.88.40/uc.

  • Loglevel: Universal Task logging settings [ DEBUG | INFO | WARNING | ERROR | CRITICAL ].

Example

The following example moves the objects starting with report from folder in of the source S3 bucket stonebranchpm into the target S3 bucket stonebranchpmtest.

No recursive move is done, because the flag --max-depth 1 is set.

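A sketch of the equivalent Rclone command (names and pattern are assumptions based on the example):

Code Block
languagebash
# Move matching objects between buckets, non-recursively; sources are deleted after transfer
rclone move amazon_s3_source:stonebranchpm/in amazon_s3_target:stonebranchpmtest \
  --config connections.conf --include "report*" --max-depth 1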

Action: remove-object

  • Agent: Linux or Windows Universal Agent to execute the Rclone command line.

  • Agent Cluster: Optional Agent Cluster for load balancing.

  • Action: [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

    Remove objects in an OS directory or cloud object store.

  • Storage Type: Enter a storage type name as defined in the connection file; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

  • File Path: Path to the directory in which you want to remove the objects. For example:

    File Path: stonebranchpmtest

    Filter: report[1-3].txt

    This removes all S3 objects matching the filter report[1-3].txt (report1.txt, report2.txt, and report3.txt) from the S3 bucket stonebranchpmtest.

  • Connection File: In the connection file, you configure all required parameters and credentials to connect to the source and target cloud storage systems. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection parameters for AWS S3 and Azure Blob Storage. For details on how to configure the connection file, refer to section Configure the Connection File.

  • Other Parameters: This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags.

    Recommendation: Add the flag max-depth 1 to all copy, move, remove-object, and remove-object-store tasks in the task field Other Parameters to avoid a recursive action.

    Attention: If the flag max-depth 1 is not set, a recursive action is performed.

  • Dry-run: [ checked, unchecked ]

    Do a trial run with no permanent changes.

  • UAC Rest Credentials: Universal Controller REST API credentials.

  • UAC Base URL: Universal Controller URL; for example, https://192.168.88.40/uc.

  • Loglevel: Universal Task logging settings [ DEBUG | INFO | WARNING | ERROR | CRITICAL ].

Example

The following example removes all S3 objects matching the filter report[1-3].txt (report1.txt, report2.txt, and report3.txt) from folder in of the S3 bucket stonebranchpmtest.

No recursive remove is done, because the flag --max-depth 1 is set.

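Expressed directly in Rclone terms (a sketch; names are assumptions):

Code Block
languagebash
# Delete only the matching objects in the top level of the given folder
rclone delete amazon_s3_source:stonebranchpmtest/in --config connections.conf \
  --include "report[1-3].txt" --max-depth 1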

Action: remove-object-store

  • Agent: Linux or Windows Universal Agent to execute the Rclone command line.

  • Agent Cluster: Optional Agent Cluster for load balancing.

  • Action: [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

    Remove an OS directory or cloud object store.

  • Storage Type: Enter a storage type name as defined in the connection file; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

  • Directory: Name of the directory you want to remove. The directory can be an object store or a file system OS directory. The directory needs to be empty before it can be removed. For example, Directory: stonebranchpmtest would remove the bucket stonebranchpmtest.

  • Connection File: In the connection file, you configure all required parameters and credentials to connect to the source and target cloud storage systems. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection parameters for AWS S3 and Azure Blob Storage. For details on how to configure the connection file, refer to section Configure the Connection File.

  • Other Parameters: This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags.

    Recommendation: Add the flag max-depth 1 to all copy, move, remove-object, and remove-object-store tasks in the task field Other Parameters to avoid a recursive action.

    Attention: If the flag max-depth 1 is not set, a recursive action is performed.

  • Dry-run: [ checked, unchecked ]

    Do a trial run with no permanent changes.

  • UAC Rest Credentials: Universal Controller REST API credentials.

  • UAC Base URL: Universal Controller URL; for example, https://192.168.88.40/uc.

  • Loglevel: Universal Task logging settings [ DEBUG | INFO | WARNING | ERROR | CRITICAL ].

Example

The following example removes the S3 object store stonebranchdemo.

No recursive remove of object stores is done, because the flag --max-depth 1 is set.

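The Rclone equivalent would be along these lines (illustrative; rmdir only removes empty directories or buckets, matching the requirement above):

Code Block
languagebash
# Remove the (empty) bucket
rclone rmdir amazon_s3_source:stonebranchdemo --config connections.conf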

Action: create-object-store

  • Agent: Linux or Windows Universal Agent to execute the Rclone command line.

  • Agent Cluster: Optional Agent Cluster for load balancing.

  • Action: [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

    Create an OS directory or cloud object store.

  • Storage Type: Enter a storage type name as defined in the connection file; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

  • Connection File: In the connection file, you configure all required parameters and credentials to connect to the source and target cloud storage systems. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection parameters for AWS S3 and Azure Blob Storage. For details on how to configure the connection file, refer to section Configure the Connection File.

  • Directory: Name of the directory you want to create. The directory can be an object store or a file system OS directory. For example, Directory: stonebranchpmtest would create the bucket stonebranchpmtest.

  • Other Parameters: This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags.

  • Dry-run: [ checked, unchecked ]

    Do a trial run with no permanent changes.

  • UAC Rest Credentials: Universal Controller REST API credentials.

  • UAC Base URL: Universal Controller URL; for example, https://192.168.88.40/uc.

  • Loglevel: Universal Task logging settings [ DEBUG | INFO | WARNING | ERROR | CRITICAL ].

Example

The following example creates the S3 bucket stonebranchdemo.

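In plain Rclone terms (a sketch; the remote name is an assumption):

Code Block
languagebash
# Create the bucket if it does not exist yet
rclone mkdir amazon_s3_target:stonebranchdemo --config connections.conf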

Action: copy-url

  • Agent: Linux or Windows Universal Agent to execute the Rclone command line.

  • Agent Cluster: Optional Agent Cluster for load balancing.

  • Action: [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

    Download a URL's content and copy it to the destination without saving it in temporary storage.

  • Source: URL parameter. The URL's content is downloaded and copied to the destination without saving it in temporary storage.

  • Target: Enter a target storage type name as defined in the connection file; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

  • Connection File: In the connection file, you configure all required parameters and credentials to connect to the source and target cloud storage systems. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection parameters for AWS S3 and Azure Blob Storage. For details on how to configure the connection file, refer to section Configure the Connection File.

  • Other Parameters: This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags.

    Useful parameters for the copy-url command:

Code Block
languagetext
  -a, --auto-filename    Get the file name from the URL and use it for destination file path
  -h, --help             help for copyurl
      --no-clobber       Prevent overwriting file with same name
  -p, --print-filename   Print the resulting name from --auto-filename
      --stdout           Write the output to stdout rather than a file

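For example (illustrative; the URL and target path are assumptions):

Code Block
languagebash
# Stream a PDF from the web straight into the target folder, keeping the URL's file name
rclone copyurl https://example.com/report.pdf linux_target:/home/stonebranch/demo/in/ \
  --config connections.conf --auto-filename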
  • Dry-run: [ checked, unchecked ]

    Do a trial run with no permanent changes.

  • UAC Rest Credentials: Universal Controller REST API credentials.

  • UAC Base URL: Universal Controller URL; for example, https://192.168.88.40/uc.

  • Loglevel: Universal Task logging settings [ DEBUG | INFO | WARNING | ERROR | CRITICAL ].

Example

The following example downloads a PDF file:

The Linux folder is located on the server where the Agent ${AGT_LINUX} runs.


Action: monitor-object

  • Agent: Linux or Windows Universal Agent to execute the Rclone command line.

  • Agent Cluster: Optional Agent Cluster for load balancing.

  • Action: [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

    Monitor a file or object in an OS directory or cloud object store.

  • Storage Type: Enter a storage type name as defined in the connection file; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

  • Directory: Name of the directory to scan for the files to monitor. The directory can be an object store or a file system OS directory. For example:

    Directory: stonebranchpm/out
    Filter: report1.txt

    This would monitor the S3 bucket folder stonebranchpm/out for the object report1.txt.

  • Connection File: In the connection file, you configure all required parameters and credentials to connect to the source and target cloud storage systems. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection parameters for AWS S3 and Azure Blob Storage. For details on how to configure the connection file, refer to section Configure the Connection File.

  • Filter Type: [ include, exclude, none ]

    Define the type of filter to apply.

  • Filter: Provide the patterns for file matching; for example, in a copy action:

    Filter Type: include

    Filter: report[1-3].txt

    This means all reports with names matching report1.txt, report2.txt, and report3.txt will be copied. For more examples of the filter matching patterns, refer to Rclone Filtering.

  • Other Parameters: This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags.

    Recommendation: Set the two flags --use-server-modtime and --max-depth 1.

    With the parameter flag --use-server-modtime, you can tell the monitor to use the upload time to the AWS bucket instead of the last modification time of the file on the source (for example, Linux).

    The parameter flag max-depth 1 means only the current directory is in scope (no recursive monitoring).

  • Dry-run: [ checked, unchecked ]

    Do a trial run with no permanent changes.

  • Trigger on Existence: [ checked, unchecked ]

    If checked, the monitor goes to success even if the file already exists when the monitor was started.

  • Interval: [ 10s, 60s, 180s ]

    Monitor interval for checking for the file(s) in the configured directory. For example, Interval: 60s means that every 60 seconds the task checks whether the file exists in the scan directory.

  • UAC Rest Credentials: Universal Controller REST API credentials.

  • UAC Base URL: Universal Controller URL; for example, https://192.168.88.40/uc.

  • Loglevel: Universal Task logging settings [ DEBUG | INFO | WARNING | ERROR | CRITICAL ].

Example

The following example monitors the S3 bucket folder stonebranchpm/in for the object test3.txt.

Because Trigger on Existence is not set, the monitor only goes to success for an object that is uploaded after the monitor has been started.

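Conceptually, the monitor behaves like the following polling loop (purely illustrative; the task's actual implementation is not shown in this document, and the remote name is an assumption):

Code Block
languagebash
# Poll every 10 seconds until a matching object appears in the folder
until rclone lsf amazon_s3_source:stonebranchpm/in --config connections.conf \
      --include "test3.txt" --max-depth 1 | grep -q .; do
  sleep 10
done
echo "object found - monitor would go to success"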

Inter-Cloud Data Transfer Example Cases

The following section provides examples for the Inter-Cloud Data Transfer Task actions:

  • Copy

  • Move

  • Remove-object

  • Monitor-object

Copy

  • C1: Copy a file to the target. The file does not exist at the target.

  • C2: Copy two files. One of the files already exists at the target.

  • C3: Copy a file to a target with parameter ignore-existing set. A file with the same filename exists on the target.

  • C4: Copy a file to a target on which it already exists, with parameter error-on-no-transfer set.

  • C5: Copy all files except one from one folder into another folder.

  • C6: Copy a file non-recursively from one folder into another folder using the parameter max-depth 1.

  • C7: Copy files using multiple parameters.

Move

  • M1: Move without deleting if no transfer took place.

  • M2: Move a file with the same source and target (configuration error case).

Remove-object

  • R1: Delete a file in a specific folder by using the parameter max-depth 1.

  • R2: Delete all files matching a given name recursively in all folders starting from a given entry folder.

Monitor-object

  • FM01: Monitor a file in AWS - file does not exist on start of monitor.

  • FM02: Monitor a file in AWS - file exists on start of monitor.

Copy

The following file transfer examples will be described for the Copy action:

  • C1: Copy a file to the target. The file does not exist at the target.

  • C2: Copy two files. One of the files already exists at the target.

  • C3: Copy a file to a target with parameter ignore-existing set. A file with the same filename exists on the target.

  • C4: Copy a file to a target on which it already exists, with parameter error-on-no-transfer set.

  • C5: Copy all files except one from one folder into another folder.

  • C6: Copy a file non-recursively from one folder into another folder using the parameter max-depth 1.

  • C7: Copy files using multiple parameters.

Note

Before running a copy command, you can always try the command by setting the Dry-run option in the Task.

Copy a File to Target: File does not Exist at the Target

Copy the file report1.txt to the target stonebranchpm/in/. The file does not exist at the target.

Configuration

  • Source Storage Type: linux_source

  • Source: /home/stonebranch/demo/out/

  • Target Storage Type: amazon_s3_target

  • Target: stonebranchpm/in/

  • Filter Type: include

  • Filter: report1.txt

  • Other Parameters: none


Result

  • Task Status: success

  • Output: report1.txt: Copied (new)

Copy Two Files: One File Already Exists at the Target

Copy two files report2.txt and report3.txt to the target stonebranchpm/in/. The file report2.txt (same size and content) already exists at the target.

Configuration

  • Source Storage Type: linux_source

  • Source: /home/stonebranch/demo/out/

  • Target Storage Type: amazon_s3_target

  • Target: stonebranchpm/in/

  • Filter Type: include

  • Filter: report[2-3].txt

  • Other Parameters: none


Result

  • Task Status: success

  • Output:

    • report3.txt: Copied (new)

    • report2.txt: Unchanged skipping

Note

If the file report2.txt had the same name but a different size, the task would overwrite the file report2.txt on the target.

Copy a File to a Target with Parameter ignore-existing Set: A File with the Same Filename Exists on the Target

Copy a file report2.txt to the target stonebranchpm/in/ with parameter ignore-existing set. A file with the same filename exists at the target (size or content can be the same or different).

When ignore-existing is set, all files that exist on the destination with the same filename are skipped. (No overwrite is done.)

Configuration

  • Source Storage Type: linux_source

  • Source: /home/stonebranch/demo/out/

  • Target Storage Type: amazon_s3_target

  • Target: stonebranchpm/in/

  • Filter Type: include

  • Filter: report2.txt

  • Other Parameters: --ignore-existing


Result

  • Task Status: failure

  • Output:

    • There was nothing to transfer.

    • Exiting due to ERROR(s).

Copy a File to a Target on which it Already Exists, with Parameter Flag error-on-no-transfer Set

Copy a file report2.txt to a target on which it already exists, with parameter error-on-no-transfer set.

When error-on-no-transfer is set, the task fails in case no file is transferred.

Configuration

  • Source Storage Type: linux_source

  • Source: /home/stonebranch/demo/out/

  • Target Storage Type: amazon_s3_target

  • Target: stonebranchpm/in/

  • Filter Type: include

  • Filter: report2.txt

  • Other Parameters: --error-on-no-transfer


Result

  • Task Status: failure

  • Output:

    • report2.txt: Unchanged skipping

    • There was nothing to transfer

    • Exiting due to ERROR(s).

Copy all Files except One, from One Folder into Another Folder

Copy contents of folder in into folder out excluding the file index.html.

Note

If the folder out does not exist, it will be created.

Configuration

  • Source Storage Type: linux_source

  • Source: stonebranchpmtest/in

  • Target Storage Type: amazon_s3_target

  • Target: stonebranchpmtest/out

  • Filter Type: exclude

  • Filter: index.html

  • Other Parameters: none

Code Block
languagetext
stonebranchpmtest

├── in
│   ├── report1.txt
│   ├── report2.txt
│   ├── sub01
│   │   └── report1.txt <- this file will be copied incl. directory sub01
│   └── index.html <- will not be copied, because of the exclude filter

├── out
│   └── report10.txt

After execution of the copy action:

├── out
│   ├── report1.txt
│   ├── report2.txt
│   ├── sub01
│   │   └── report1.txt
│   └── report10.txt


Result

  • Task Status: success

  • Output:

    • report1.txt: Copied (new)

    • report2.txt: Copied (new)
    • sub1/report1.txt: Copied (new)

Copy Non-Recursively a File from a Folder into Another Folder by Using the Parameter max-depth 1. 

Copy the file report1.txt from the folder stonebranchpmtest/in into folder stonebranchpmtest2/. The Parameter max-depth 1 is used to copy non-recursively.

  • max-depth limits the recursion depth (default -1).

  • max-depth 1 means only the current directory is in scope.

Note

The folder in will not be created on the target; only the file is copied to the provided target directory.

Configuration

  • Source Storage Type: amazon_s3_source

  • Source: stonebranchpmtest/in

  • Target Storage Type: amazon_s3_target

  • Target: stonebranchpmtest2/

  • Filter Type: include

  • Filter: report1.txt

  • Other Parameters: --max-depth 1

Code Block
languagetext
stonebranchpmtest

├── in
│   ├── report1.txt <- this file will be copied
│   ├── report2.txt
│   ├── sub01
│   │   └── report1.txt <- this file will not be copied, because of --max-depth 1

├── report1.txt <- this file will not be copied

stonebranchpmtest2

├── report10.txt

After execution of the copy action:

stonebranchpmtest2

├── report1.txt
├── report10.txt


Result

  • Task Status: success

  • Output: report1.txt: Copied (new)

Copy Files Using Multiple Parameters

It is possible to provide multiple Parameters in the Other Parameters field of the Task.

Example:

Parameters: --max-depth 1 --error-on-no-transfer

In this case, both Parameters are applied.

Configuration

  • Source Storage Type: amazon_s3_source

  • Source: stonebranchpmtest/in

  • Target Storage Type: amazon_s3_target

  • Target: stonebranchpmtest2/out

  • Filter Type: include

  • Filter: report1.txt

  • Other Parameters: --max-depth 1, --error-on-no-transfer


Result

The Parameters: --max-depth 1 and --error-on-no-transfer are both applied.

Move

The following file transfer examples will be described for the Move action:

  • M1: Move without deleting if no transfer took place.

  • M2: Move a file with the same source and target (configuration error case).

Note

Before running a Move command, you can always try the command by setting the Dry-run option in the Task.

Move Without Deleting if no Transfer Took Place

If you want the source file to be deleted only when a file transfer took place, add the flag --ignore-existing in the field Other Parameters.

Example: You move a file to a destination where it already exists. In this case, rclone does not perform the copy but still deletes the source file.

If you set the flag --ignore-existing, the delete operation is not performed.

Note
  • The --ignore-existing parameter also prevents a file from being deleted if the source is equal to the destination in a copy operation.

  • The max-depth 1 parameter avoids a recursive move of files.

  • The error-on-no-transfer parameter lets the task fail in case no transfer was done.

Configuration

  • Source Storage Type: linux_source

  • Source: /home/stonebranch/demo/out/

  • Target Storage Type: amazon_s3_target

  • Target: stonebranchpm/in/

  • Filter Type: include

  • Filter: report4.txt

  • Other Parameters: --ignore-existing, --max-depth 1, --error-on-no-transfer


Result

  • Task Status: failed

    • The file report4.txt is not deleted at the source “/home/stonebranch/demo/out/”

  • Output:

    • There was nothing to transfer

    • Return Code: 9
    • Exiting due to ERROR(s).

Move a File with Same Source and Target - Config Error Case

Accidentally, the source has been configured to be the same as the target.

  • The --ignore-existing parameter prevents a file from being deleted if the source is equal to the destination in a copy operation.

  • The max-depth 1 parameter avoids a recursive move of files.

Configuration

  • Source Storage Type: linux_source

  • Source: /home/stonebranch/demo/out/

  • Target Storage Type: linux_source

  • Target: /home/stonebranch/demo/out/

  • Filter Type: include

  • Filter: report4.txt

  • Other Parameters: --ignore-existing, --max-depth 1


Result

  • Task Status: success (because --error-on-no-transfer was not set)

    • The file report4.txt is not deleted.

  • Output:

    • There was nothing to transfer

    • Return Code: 0

Remove-object

The following file examples will be described for the Remove-object action:

  • R1: Delete a file in a specific folder by using the parameter max-depth 1.

  • R2: Delete all files matching a given name recursively in all folders starting from a given entry folder.

Note

Before running a remove-object command, you can always try the command by setting the Dry-run option in the Task.

Delete a File in a Specific Folder by Using the Parameter max-depth 1

Delete only the file report1.txt in the folder stonebranchpmtest/in/.

To delete a specific file in a folder, provide the File Path to the file to delete and provide the Parameter: --max-depth 1 in the Other Parameters field.

  • max-depth limits the recursion depth (default -1).

  • max-depth 1 means only the current directory is in scope.

Configuration

  • File Path: stonebranchpmtest/in/

  • Storage Type: amazon_s3_source

  • Filter Type: include

  • Filter: report1.txt

  • Other Parameters: --max-depth 1

Code Block
languagetext
stonebranchpmtest

├── in
│   ├── report1.txt <- this file will be deleted
│   ├── report2.txt
│   ├── sub01
│   │   └── report1.txt <- this file will not be deleted, because of --max-depth 1

├── report1.txt

After execution of the delete action:

stonebranchpmtest

├── in
│   ├── report2.txt
│   ├── sub01
│   │   └── report1.txt

├── report1.txt


Result

  • Task Status: success

  • Output: report1.txt Deleted

Delete all Files Matching a Given Name Recursively in all Folders Starting from a Given Entry Folder

Delete report1.txt recursively in all folders starting from the given entry folder stonebranchpmtest/.

Configuration

  • File Path: stonebranchpmtest/

  • Storage Type: amazon_s3_source

  • Filter Type: include

  • Filter: report1.txt

  • Other Parameters: none

Code Block
languagetext
stonebranchpmtest

├── in
│   ├── report1.txt <- this file will be deleted
│   ├── report2.txt
│   ├── sub01
│   │   └── report1.txt <- this file will be deleted

├── report1.txt <- this file will be deleted

After execution of the delete action:

stonebranchpmtest

├── in
│   ├── report2.txt
│   ├── sub01


Result

  • Task Status: success

  • Output:

    • in/report1.txt Deleted

    • in/sub1/report1.txt Deleted
    • report1.txt Deleted

Monitor-object

The following examples describe how file monitoring can be performed.

  • FM01: Monitor a file in AWS - file does not exist on start of monitor.

  • FM02: Monitor a file in AWS - file exists on start of monitor.

Note

Before running a monitor-object command, you can always try the command by setting the Dry-run option in the Task.

Monitor a File in AWS - File does not Exist on Start of Monitor

With the Parameter Flag --use-server-modtime, you can tell the Monitor to use the upload time to the AWS bucket instead of the last modification time of the file on the source (for example, Linux).

max-depth 1 means only the current directory is in scope.

Configuration

  • Directory: stonebranchpmtest/in/

  • Storage Type: amazon_s3_source

  • Filter Type: include

  • Filter: test3.txt

  • Interval: 10s

  • Trigger on Existence: no

  • Other Parameters: --max-depth 1, --use-server-modtime


Result

  • Task Status: running

  • The monitor stays in status running until the file test3.txt is copied to stonebranchpmtest/in.

  • Output:

    • INFO - REST SB library version 1.0

    • INFO - ############  Monitor Objects Action ###############

    • INFO - ############  List Objects Action ###############

    • INFO - sleeping for 10 seconds now

Monitor a File in AWS - File Exists on Start of Monitor

With the Parameter Flag --use-server-modtime, you can tell the Monitor to use the upload time to the AWS bucket instead of the last modification time of the file on the source (for example, Linux).

max-depth 1 means only the current directory is in scope.

Configuration

  • Source Storage Type: linux_source

  • Source: /home/stonebranch/demo/out/

  • Target Storage Type: amazon_s3_target

  • Target: stonebranchpm/in/

  • Filter Type: include

  • Filter: report10.txt

  • Trigger on Existence: yes

  • Other Parameters: --ignore-existing, --max-depth 1, --error-on-no-transfer


Result

  • Task Status: success

  • The task goes to success because the file report10.txt already exists in stonebranchpmtest/in (Trigger on Existence is set to true).

  • Output:

    • ######  object in stonebranchpmtest/in/ ######

    • [{"Path":"report10.txt","Name":"report10.txt","Size":15,"MimeType":"text/plain","ModTime":"2021-12-08T16:26:18.000000000Z","IsDir":false,"Tier":"STANDARD"}]

    • Found, Already exists: {'Path': 'report10.txt', 'Name': 'report10.txt', 'Size': 15, 'MimeType': 'text/plain', 'ModTime': '2021-12-08T16:26:18.000000000Z', 'IsDir': False, 'Tier': 'STANDARD'}