...


Warning: This integration has been sunset

It is replaced by:
- Inter-Cloud Data Transfer: Focused on non-monitor-related actions such as Copy, Move, Sync, List, Create, and Delete.
- Inter-Cloud Data Monitor: Focused only on monitoring actions, leveraging Universal Events.

...

Field Descriptions

Agent

Linux or Windows Universal Agent to execute the Rclone command line.

Agent Cluster

Optional Agent Cluster for load balancing.

Action

[ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

List directories; for example:

  • List object stores such as S3 buckets or Azure containers.
  • List OS directories on Linux, Windows, or HDFS (see the illustrative command below).
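
As an illustration only, the agent presumably runs an rclone directory listing along these lines; the remote name amazon_s3_source, the bucket path, and the config location are assumed examples, not fixed values:

  # hypothetical rclone call behind the "list directory" action
  rclone lsd amazon_s3_source:stonebranchpm --config rclone.conf --max-depth 1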

Storage Type

Select the storage type:

  • linux_source
  • amazon_s3_source
  • microsoft_azure_blob_storage_source
  • microsoft_azure_blob_storage_sas_source
  • google_cloud_storage_Source
  • ondrive_source
  • hdfs_source
  • datalakegen2_storage_source
  • datalakegen2_storage_sp_source
  • windows_source
  • sftp_source
  • copy_url
  • linux_target
  • amazon_s3_target
  • microsoft_azure_blob_storage_target
  • microsoft_azure_blob_storage_sas_target
  • google_cloud_storage_target
  • ondrive_target
  • hdfs_target
  • datalakegen2_storage_source
  • datalakegen2_storage_sp_source
  • windows_source
  • sftp_source
  • copy_url
Source Credential

Credential used for the selected Storage Type.

Connection File

In the connection file, you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System.

For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage.

For details on how to configure the Connection File, refer to the section Configure the Connection File.
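
A minimal sketch of such a connection file, assuming it follows standard rclone.conf syntax; the section names, keys, and placeholder values are illustrative and must match the Storage Types you select:

  # source remote: AWS S3
  [amazon_s3_source]
  type = s3
  provider = AWS
  access_key_id = <aws-access-key>
  secret_access_key = <aws-secret-key>
  region = eu-central-1

  # target remote: Azure Blob Storage
  [microsoft_azure_blob_storage_target]
  type = azureblob
  account = <storage-account-name>
  key = <storage-account-key>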

UAC Rest Credentials

Universal Controller Rest API Credentials.

UAC Base URL

Universal Controller URL; for example, https://192.168.88.40/uc.

Loglevel

Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL].

Other Parameters

This field can be used to apply additional flag parameters to the selected action.

For a list of all possible flags, refer to Global Flags.

Recommendation: Add the flag max-depth 1 to avoid a recursive action.

max-depth (recursion depth)

Limits the recursion depth.

max-depth 1 means only the root directories are displayed, without subdirectories. This is the default value.
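
For illustration, assuming the contents of Other Parameters are passed to rclone as command-line flags, a value such as the following would limit recursion and tune retries and parallel transfers (values are examples only):

  --max-depth 1 --retries 5 --transfers 8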

...

Field Descriptions

Agent

Linux or Windows Universal Agent to execute the Rclone command line.

Agent Cluster

Optional Agent Cluster for load balancing.

Action

[ list directory, copy, copyto, list objects, move, moveto, sync, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

Copy objects from source to target.

Write Options

[ Do not overwrite existing objects, Replace existing objects, Create new with timestamp if exists, Use Custom Parameters ]

Do not overwrite existing objects

Objects on the target with the same name will not be overwritten.

Replace existing objects

Objects on the target are always overwritten, even if the same object already exists on the target.

Create new with timestamp if exists

If an object with the same name already exists on the target, the object will be duplicated with a timestamp added to the name. The file extension will be kept:

filename_<YYYYMMDD_HHMMSS>.fileextension

For example, if report1.txt exists on the target, a new file with a timestamp postfix will be created on the target; for example, report1_20220513_093057.txt.

Use Custom Parameters

Only the parameters in the field "Custom Parameters" will be applied (other parameters will be ignored).

error-on-no-transfer

The error-on-no-transfer flag causes the task to fail if no transfer was performed.

Source Storage Type

Select a source storage type:

  • linux_source
  • amazon_s3_source
  • microsoft_azure_blob_storage_source
  • microsoft_azure_blob_storage_sas_source
  • google_cloud_storage_Source
  • ondrive_source
  • hdfs_source
  • datalakegen2_storage_source
  • datalakegen2_storage_sp_source
  • windows_source
  • sftp_source
  • copy_url

Target Storage Type

Select the target storage type:

  • linux_target
  • amazon_s3_target
  • microsoft_azure_blob_storage_target
  • microsoft_azure_blob_storage_sas_target
  • google_cloud_storage_target
  • ondrive_target
  • hdfs_target
  • datalakegen2_storage_source
  • datalakegen2_storage_sp_source
  • windows_source
  • sftp_source
  • copy_url

Connection File

In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System.

For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage.

For details on how to configure the Connection File, refer to the section Configure the Connection File.

Filter Type

[ include, exclude, none ]

Define the type of filter to apply.

Filter

Provide the patterns for file matching; for example, in a copy action:

Filter Type: include

Filter: report[1-3].txt means all reports with names matching report1.txt, report2.txt, and report3.txt will be copied.

For more examples on the filter matching pattern, refer to Rclone Filtering.
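
Assuming the task assembles a standard rclone copy command from these fields, the filter example above would roughly translate to the following; remote names, paths, and the rclone.conf location are placeholders:

  # copy report1-3 from S3 to Azure Blob, non-recursively, as a trial run first
  rclone copy amazon_s3_source:stonebranchpm/out microsoft_azure_blob_storage_target:reports --config rclone.conf --include "report[1-3].txt" --max-depth 1 --error-on-no-transfer --dry-run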

Other Parameters

This field can be used to apply additional flag parameters to the selected action.

For a list of all possible flags, refer to Global Flags.

Dry-run

[ checked , unchecked ]

Do a trial run with no permanent changes.

UAC Rest Credentials

Universal Controller Rest API Credentials

UAC Base URL

Universal Controller URL

For example, https://192.168.88.40/uc

Loglevel

Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL].

max-depth (recursion depth)

Limits the recursion depth.

max-depth 1 means only the root directories are displayed, without subdirectories. This is the default value.

Attention: If you change max-depth to a value greater than 1, a recursive action is performed.

...

Field Descriptions

Agent

Linux or Windows Universal Agent to execute the Rclone command line.

Agent Cluster

Optional Agent Cluster for load balancing.

Action

[ list directory, copy, copyto, list objects, move, moveto, sync, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

List objects in an OS directory or cloud object store.

Storage Type

Select the storage type:

  • linux_source
  • amazon_s3_source
  • microsoft_azure_blob_storage_source
  • microsoft_azure_blob_storage_sas_source
  • google_cloud_storage_Source
  • ondrive_source
  • hdfs_source
  • datalakegen2_storage_source
  • datalakegen2_storage_sp_source
  • windows_source
  • sftp_source
  • copy_url
  • linux_target
  • amazon_s3_target
  • microsoft_azure_blob_storage_target
  • microsoft_azure_blob_storage_sas_target
  • google_cloud_storage_target
  • ondrive_target
  • hdfs_target
  • datalakegen2_storage_source
  • datalakegen2_storage_sp_source
  • windows_source
  • sftp_source
  • copy_url
Source Credential

Credential used for the selected Storage Type.

Connection File

In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System.

For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage.

For details on how to configure the Connection File, refer to the section Configure the Connection File.

Filter Type

[ include, exclude, none ]

Define the type of filter to apply.

Filter

Provide the patterns for file matching; for example, in a copy action:

Filter Type: include

Filter: report[1-3].txt means all reports with names matching report1.txt, report2.txt, and report3.txt will be copied.

For more examples on the filter matching pattern, refer to Rclone Filtering.

Other Parameters

This field can be used to apply additional flag parameters to the selected action.

For a list of all possible flags, refer to Global Flags.

Directory

Name of the directory you want to list the files in.

For example, Directory: stonebranchpm/out lists all objects in the folder out of the bucket stonebranchpm.

List Format

[ list size and path, list modification time, size and path, list objects and directories, list objects and directories (Json) ]

This choice specifies how the output is formatted.
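
As an assumption for illustration, these formats presumably correspond to rclone's listing commands roughly as follows (remote and path are placeholders):

  rclone ls     amazon_s3_source:stonebranchpm/out    # size and path
  rclone lsl    amazon_s3_source:stonebranchpm/out    # modification time, size and path
  rclone lsf    amazon_s3_source:stonebranchpm/out    # objects and directories
  rclone lsjson amazon_s3_source:stonebranchpm/out    # objects and directories (JSON)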

UAC Rest Credentials

Universal Controller Rest API Credentials.

UAC Base URL

Universal Controller URL.

For example, https://192.168.88.40/uc

Loglevel

Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL].

max-depth (recursion depth)

Limits the recursion depth.

max-depth 1 means only the root directories are displayed, without subdirectories. This is the default value.

...

Field Descriptions

Agent

Linux or Windows Universal Agent to execute the Rclone command line.

Agent Cluster

Optional Agent Cluster for load balancing.

Action

[ list directory, copy, copyto, list objects, move, moveto, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

Move objects from source to target.

error-on-no-transfer

The error-on-no-transfer flag causes the task to fail if no transfer was performed.
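
For illustration, assuming the move action wraps a standard rclone move command built from the fields below, a move from S3 to a local Linux directory could look roughly like this (remote names, paths, and config location are placeholders):

  rclone move amazon_s3_source:stonebranchpm/out linux_target:/data/inbox --config rclone.conf --max-depth 1 --error-on-no-transfer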

Source Storage Type

Select a source storage type:

  • linux_source
  • amazon_s3_source
  • microsoft_azure_blob_storage_source
  • microsoft_azure_blob_storage_sas_source
  • google_cloud_storage_Source
  • ondrive_source
  • hdfs_source
  • datalakegen2_storage_source
  • datalakegen2_storage_sp_source
  • windows_source
  • sftp_source
  • copy_url

Target Storage Type

Select the target storage type:

  • linux_target
  • amazon_s3_target
  • microsoft_azure_blob_storage_target
  • microsoft_azure_blob_storage_sas_target
  • google_cloud_storage_target
  • ondrive_target
  • hdfs_target
  • datalakegen2_storage_source
  • datalakegen2_storage_sp_source
  • windows_source
  • sftp_source
  • copy_url

Connection File

In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System.

For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage.

For details on how to configure the Connection File, refer to the section Configure the Connection File.

Filter Type

[ include, exclude, none ]

Define the type of filter to apply.

Filter

Provide the patterns for file matching; for example, in a copy action:

Filter Type: include

Filter: report[1-3].txt means all reports with names matching report1.txt, report2.txt, and report3.txt will be copied.

For more examples on the filter matching pattern, refer to Rclone Filtering.

Other Parameters

This field can be used to apply additional flag parameters to the selected action.

For a list of all possible flags, refer to Global Flags.

Dry-run

[ checked , unchecked ]

Do a trial run with no permanent changes.

UAC Rest Credentials

Universal Controller Rest API Credentials.

UAC Base URL

Universal Controller URL.

For example, https://192.168.88.40/uc

Loglevel

Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL].

max-depth (recursion depth)

Limits the recursion depth.

max-depth 1 means only the root directories are displayed, without subdirectories. This is the default value.

Attention: If you change max-depth to a value greater than 1, a recursive action is performed.

...

Field Descriptions

Agent

Linux or Windows Universal Agent to execute the Rclone command line.

Agent Cluster

Optional Agent Cluster for load balancing.

Action

[ list directory, copy, copyto, list objects, move, moveto, sync, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

Remove objects in an OS directory or cloud object store.

Storage Type

Select the storage type:

  • linux_source
  • amazon_s3_source
  • microsoft_azure_blob_storage_source
  • microsoft_azure_blob_storage_sas_source
  • google_cloud_storage_Source
  • ondrive_source
  • hdfs_source
  • datalakegen2_storage_source
  • datalakegen2_storage_sp_source
  • windows_source
  • sftp_source
  • copy_url
  • linux_target
  • amazon_s3_target
  • microsoft_azure_blob_storage_target
  • microsoft_azure_blob_storage_sas_target
  • google_cloud_storage_target
  • ondrive_target
  • hdfs_target
  • datalakegen2_storage_source
  • datalakegen2_storage_sp_source
  • windows_source
  • sftp_source
  • copy_url
File Path

Path to the directory in which you want to remove the objects.

For example:

File Path: stonebranchpmtest

Filter: report[1-3].txt

This removes all S3 objects matching the filter report[1-3].txt (report1.txt, report2.txt, and report3.txt) from the S3 bucket stonebranchpmtest.
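
Assuming the action wraps rclone delete, the example above would roughly correspond to the following; the remote name and config path are placeholders, and --dry-run is shown for a safe trial:

  rclone delete amazon_s3_source:stonebranchpmtest --config rclone.conf --include "report[1-3].txt" --max-depth 1 --dry-run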

Filter Type

[ include, exclude, none ]

Define the type of filter to apply.

Filter

Provide the Patterns for file matching; for example, in a sync action:

Filter Type: include

Filter report[1-3].txt

This means all reports with names matching report1.txt, report2.txt, and report3.txt will be removed.

For more examples on the filter matching pattern, refer to Rclone Filtering (https://rclone.org/filtering/).

Connection File

In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System.

For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage.

For details on how to configure the Connection File, refer to the section Configure the Connection File.

Other Parameters

This field can be used to apply additional flag parameters to the selected action.

For a list of all possible flags, refer to Global Flags.

Recommendation: Add the flag max-depth 1 to all copy, move, remove-object, and remove-object-store actions in the task field Other Parameters to avoid a recursive action.

Attention: If the flag max-depth 1 is not set, a recursive action is performed.

Dry-run

[ checked , unchecked ]

Do a trial run with no permanent changes.

UAC Rest Credentials

Universal Controller Rest API Credentials.

UAC Base URL

Universal Controller URL.

For example, https://192.168.88.40/uc

Loglevel

Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL].

max-depth (recursion depth)

Limits the recursion depth.

max-depth 1 means only the current directory is in scope; in this case, only files in the current directory are affected by the action. This is the default value.

Attention: If you change max-depth to a value greater than 1, a recursive action is performed.

...

Field Descriptions

Agent

Linux or Windows Universal Agent to execute the Rclone command line.

Agent Cluster

Optional Agent Cluster for load balancing.

Action

[ list directory, copy, copyto, list objects, move, moveto, sync, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

Remove an OS directory or cloud object store.

Storage Type

Select the storage type:

  • linux_source
  • amazon_s3_source
  • microsoft_azure_blob_storage_source
  • microsoft_azure_blob_storage_sas_source
  • google_cloud_storage_Source
  • ondrive_source
  • hdfs_source
  • datalakegen2_storage_source
  • datalakegen2_storage_sp_source
  • windows_source
  • sftp_source
  • copy_url
  • linux_target
  • amazon_s3_target
  • microsoft_azure_blob_storage_target
  • microsoft_azure_blob_storage_sas_target
  • google_cloud_storage_target
  • ondrive_target
  • hdfs_target
  • datalakegen2_storage_source
  • datalakegen2_storage_sp_source
  • windows_source
  • sftp_source
  • copy_url
Source Credential

Credential used for the selected Storage Type.

Directory

Name of the directory you want to remove.

The directory can be an object store or a file system OS directory.

The directory needs to be empty before it can be removed.

For example, Directory: stonebranchpmtest would remove the bucket stonebranchpmtest.
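
Assuming the action wraps rclone rmdir (which only removes empty directories or buckets), the example above would roughly correspond to the following; the remote name and config path are placeholders:

  rclone rmdir amazon_s3_source:stonebranchpmtest --config rclone.conf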

Connection File

In the connection file, you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System.

For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage.

For details on how to configure the Connection File, refer to the section Configure the Connection File.

Other Parameters

This field can be used to apply additional flag parameters to the selected action.

For a list of all possible flags, refer to Global Flags.

Dry-run

[ checked , unchecked ]

Do a trial run with no permanent changes.

UAC Rest Credentials

Universal Controller Rest API Credentials.

UAC Base URL

Universal Controller URL.

For example, https://192.168.88.40/uc

Loglevel

Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL].

...

Field Descriptions

Agent

Linux or Windows Universal Agent to execute the Rclone command line.

Agent Cluster

Optional Agent Cluster for load balancing.

Action

[ list directory, copy, copyto, list objects, move, moveto, sync, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

Create an OS directory or cloud object store.

Storage Type

Select the storage type:

  • linux_source
  • amazon_s3_source
  • microsoft_azure_blob_storage_source
  • microsoft_azure_blob_storage_sas_source
  • google_cloud_storage_Source
  • ondrive_source
  • hdfs_source
  • datalakegen2_storage_source
  • datalakegen2_storage_sp_source
  • windows_source
  • sftp_source
  • copy_url
  • linux_target
  • amazon_s3_target
  • microsoft_azure_blob_storage_target
  • microsoft_azure_blob_storage_sas_target
  • google_cloud_storage_target
  • ondrive_target
  • hdfs_target
  • datalakegen2_storage_source
  • datalakegen2_storage_sp_source
  • windows_source
  • sftp_source
  • copy_url
Source Credential

Credential used for the selected Storage Type.

Connection File

In the connection file, you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System.

For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage.

For details on how to configure the Connection File, refer to the section Configure the Connection File.

Directory

Name of the directory you want to create.

The directory can be an object store or a file system OS directory.

For example, Directory: stonebranchpmtest would create the bucket stonebranchpmtest.
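
Assuming the action wraps rclone mkdir, the example above would roughly correspond to the following; the remote name and config path are placeholders:

  rclone mkdir amazon_s3_target:stonebranchpmtest --config rclone.conf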

Other Parameters

This field can be used to apply additional flag parameters to the selected action.

For a list of all possible flags, refer to Global Flags.

Dry-run

[ checked , unchecked ]

Do a trial run with no permanent changes.

UAC Rest Credentials

Universal Controller Rest API Credentials.

UAC Base URL

Universal Controller URL.

For example, https://192.168.88.40/uc

Loglevel

Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL].

...

Field Descriptions

Agent

Linux or Windows Universal Agent to execute the Rclone command line.

Agent Cluster

Optional Agent Cluster for load balancing.

Action

[ list directory, copy, copyto, list objects, move, moveto, sync, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

Download a URL's content and copy it to the destination without saving it in temporary storage.

Source

The URL whose content should be downloaded and copied to the destination.

Storage Type

For the action copy_url, the value must be copy_url.

Source Credential

Credential used for the selected Storage Type

Target

Enter a target storage type name as defined in the Connection File.

For example: amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux.

For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

Connection File

In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System.

For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage.

For details on how to configure the Connection File, refer to the section Configure the Connection File.

Other Parameters

This field can be used to apply additional flag parameters to the selected action.

For a list of all possible flags, refer to Global Flags.

Useful parameters for the copy-url command:

  -a, --auto-filename    Get the file name from the URL and use it for destination file path
  -h, --help             help for copyurl
      --no-clobber       Prevent overwriting file with same name
  -p, --print-filename   Print the resulting name from --auto-filename
      --stdout           Write the output to stdout rather than a file
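
Putting these together, a copy-url action presumably results in a command along the following lines; the URL, target remote, and config path are placeholders:

  # download report1.txt straight into the S3 bucket, keeping the file name from the URL
  rclone copyurl https://example.com/reports/report1.txt amazon_s3:stonebranchpm/in/ --config rclone.conf --auto-filename --no-clobber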


Dry-run

[ checked , unchecked ]

Do a trial run with no permanent changes.

UAC Rest Credentials

Universal Controller Rest API Credentials

UAC Base URL

Universal Controller URL

For example, https://192.168.88.40/uc

Loglevel

Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL].

...

Field Descriptions

Agent

Linux or Windows Universal Agent to execute the Rclone command line.

Agent Cluster

Optional Agent Cluster for load balancing.

Action

[ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

Monitor a file or object in an OS directory or cloud object store.

Storage Type

Select the storage type:

  • linux_source
  • amazon_s3_source
  • microsoft_azure_blob_storage_source
  • microsoft_azure_blob_storage_sas_source
  • google_cloud_storage_Source
  • ondrive_source
  • hdfs_source
  • datalakegen2_storage_source
  • datalakegen2_storage_sp_source
  • windows_source
  • sftp_source
  • copy_url
  • linux_target
  • amazon_s3_target
  • microsoft_azure_blob_storage_target
  • microsoft_azure_blob_storage_sas_target
  • google_cloud_storage_target
  • ondrive_target
  • hdfs_target
  • datalakegen2_storage_source
  • datalakegen2_storage_sp_source
  • windows_source
  • sftp_source
  • copy_url
Source Credential

Credential used for the selected Storage Type

Directory

Name of the directory to scan for the files to monitor.

The directory can be an object store or a file system OS directory.

For example:

Directory: stonebranchpm/out
Filter: report1.txt

This would monitor the S3 bucket folder stonebranchpm/out for the object report1.txt.
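
Each polling interval presumably performs a filtered listing similar to the following sketch; the remote, path, and flags are illustrative, and --use-server-modtime corresponds to the Use Server Modtime option:

  rclone lsf amazon_s3_source:stonebranchpm/out --config rclone.conf --include "report1.txt" --max-depth 1 --use-server-modtime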

Connection File

In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System.

For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage.

For details on how to configure the Connection File, refer to the section Configure the Connection File.

Filter Type

[ include, exclude, none ]

Define the type of filter to apply.

Filter

Provide the patterns for file matching; for example, in a copy action:

Filter Type: include

Filter: report[1-3].txt

This means all reports with names matching report1.txt, report2.txt, and report3.txt will be copied.

For more examples on the filter matching pattern, refer to Rclone Filtering.

Other Parameters

This field can be used to apply additional flag parameters to the selected action.

For a list of all possible flags, refer to Global Flags.

continuous monitoring

[ checked , unchecked ]

After the monitor has identified a matching file, it continues to monitor for new files.

skip_if_active

[ checked , unchecked ]

If the monitor has launched Task(s), it will only launch new Task(s) if the previously launched Task(s) are no longer active.

Trigger on Existence

[ checked , unchecked ]

If checked, the monitor goes to success even if the file already exists when the monitor is started.

Use Server Modtime

Use Server Modtime tells the Monitor to use the upload time to the AWS bucket instead of the last modification time of the file on the source (for example, Linux).

Tasks

Comma-separated list of Tasks that will be launched when the monitoring criteria are met.

If the field is empty, no Task(s) will be launched. In this case, the Task acts as a pure monitor.

For example, Task1,Task2,Task3

Interval

[ 10s, 60s, 180s ]

Monitor interval for checking the file(s) in the configured directory.

For example, Interval: 60s means that every 60 seconds the task checks whether the file exists in the scan directory.

UAC Rest Credentials

Universal Controller Rest API Credentials

UAC Base URL

Universal Controller URL

For example, https://192.168.88.40/uc

Loglevel

Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL].

max-depth (recursion depth)

Limits the recursion depth.

max-depth 1 means only the current directory is in scope; in this case, only files in the current directory are monitored. This is the default value.

Attention: If you change max-depth to a value greater than 1, a recursive action is performed.

...