
Panel

Table of Contents

Disclaimer

Your use of this download is governed by Stonebranch’s Terms of Use, which are available at https://www.stonebranch.com/integration-hub/Terms-and-Privacy/Terms-of-Use/

Overview

The Inter-Cloud Data Transfer integration allows you to transfer data to, from, and between any of the major private and public cloud providers like AWS, Google Cloud, and Microsoft Azure.

It also supports the transfer of data to and from a Hadoop Distributed File System (HDFS) and to major cloud applications like OneDrive and SharePoint.

An advantage of using the Inter-Cloud Data Transfer integration over other approaches is that data is streamed from one object store to another without the need for intermediate storage. 

Integrations with this solution package include: 

  • AWS S3
  • Google Cloud
  • SharePoint
  • Dropbox
  • OneDrive
  • Hadoop Distributed File System (HDFS)

Software Requirements

Software Requirements for Universal Agent

  • Universal Agent for Linux or Windows Version 7.0.0.0 or later is required.

  • Universal Agent needs to be installed with the Python option (--python yes).

Software Requirements for Universal Controller

  • Universal Controller 7.0.0.0 or later.

Software Requirements for the Application to be Scheduled

  • Rclone: v1.57.1 or higher needs to be installed on the server where the Universal Agent is installed.

  • Rclone can be installed on Windows and Linux.

  • To install Rclone on Linux systems, run:

    Code Block
    languagebash
    linenumberstrue
    curl https://rclone.org/install.sh | sudo bash
    Note
    titleNote

    If the URL is not reachable from your server, the Linux installation can also be done from a pre-compiled binary.

  • To install Rclone on a Linux system from a pre-compiled binary:

    Fetch and unpack

    Code Block
    languagebash
    linenumberstrue
    curl -O https://downloads.rclone.org/rclone-current-linux-amd64.zip
    unzip rclone-current-linux-amd64.zip
    cd rclone-*-linux-amd64

    Copy binary file

    Code Block
    languagebash
    linenumberstrue
    sudo cp rclone /usr/bin/
    sudo chown root:root /usr/bin/rclone
    sudo chmod 755 /usr/bin/rclone

    Install manpage

    Code Block
    languagebash
    linenumberstrue
    sudo mkdir -p /usr/local/share/man/man1 
    sudo cp rclone.1 /usr/local/share/man/man1/ 
    sudo mandb 
  • To install Rclone on Windows systems:

    • Rclone is a Go program and comes as a single binary file.

    • Download the relevant binary here.

    • Extract the rclone or rclone.exe binary from the archive into a folder that is in the Windows PATH.
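Once Rclone is installed, you can verify the installation from the command line. The following sketch assumes Rclone is on the PATH; the list of remotes will be empty until a Connection File (or rclone config) has been set up:

```shell
# Check the installed Rclone version (v1.57.1 or higher is required)
rclone version

# List the remotes currently configured for this installation
rclone listremotes
```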

...

Flag

Description

max-depth 1

Limits the recursion depth.

max-depth 1 means only the current directory is in scope.

Attention: If the flag is not set, a recursive action is performed.

Recommendation: Add the flag max-depth 1 to all Copy, Move, remove-object, and remove-object-store tasks in the task field Other Parameters to avoid a recursive action.

ignore-existing

Skips all files that exist on destination.

Examples:

  1. You move a file to a destination where it already exists. In this case, rclone will not perform the copy but instead deletes the source file. If you set the flag --ignore-existing, the delete operation will not be performed.

  2. The --ignore-existing parameter avoids a file being deleted if the source is equal to the destination in a copy operation.

Recommendation: Add the flag --ignore-existing to all copy and move tasks, which avoids a file being deleted if the source is equal to the destination in a copy operation.

error-on-no-transfer

The error-on-no-transfer flag lets the task fail in case no transfer was done.

update

To skip files that are newer on the destination during a move or copy action, you can add the flag --update.
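As a sketch, the flags above map directly to Rclone command-line options. The remote names linux_source and amazon_s3_target and the paths below are illustrative placeholders taken from the examples later in this document:

```shell
# Hypothetical copy combining the recommended flags:
#   --max-depth 1           only the current directory is in scope (no recursion)
#   --ignore-existing       skip files that already exist on the destination
#   --error-on-no-transfer  fail the task if nothing was transferred
#   --update                skip files that are newer on the destination
rclone copy linux_source:/home/stonebranch/demo/out/ \
    amazon_s3_target:stonebranchpm/in/ \
    --max-depth 1 --ignore-existing --error-on-no-transfer --update
```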

...

FieldDescription

Agent

Linux or Windows Universal Agent to execute the Rclone command line.

Agent Cluster

Optional Agent Cluster for load balancing.

Action

[ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

Move objects from source to target.

Source

Enter a source storage Type name as defined in the Connection File; for example,

amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux ..

For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

Target

Enter a target storage Type name as defined in the Connection File; for example,

amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux ..

For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

Connection File

In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System.

For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage.

For details on how to configure the Connection File, refer to section Configure the Connection File.

Filter Type

[ include, exclude, none ]

Define the type of filter to apply.

Filter

Provide the patterns for file matching; for example, in a copy action:

Filter Type: include

Filter report[1-3].txt means all reports with names matching report1.txt, report2.txt, and report3.txt will be copied.

For more examples on the filter matching pattern, refer to Rclone Filtering.
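For illustration, the Filter Type and Filter fields correspond to Rclone's own filter flags; the remote names below are placeholders:

```shell
# Filter Type: include, Filter: report[1-3].txt corresponds to --include;
# Filter Type: exclude would correspond to --exclude instead.
rclone copy linux_source:/home/stonebranch/demo/out/ \
    amazon_s3_target:stonebranchpm/in/ \
    --include "report[1-3].txt"
```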

Other Parameters

This field can be used to apply additional flag parameters to the selected action.

For a list of all possible flags, refer to Global Flags.

Recommendation: Add the flag max-depth 1 to all Copy, Move, remove-object, and remove-object-store tasks in the task field Other Parameters to avoid a recursive action.

Attention: If the flag max-depth 1 is not set, a recursive action is performed.

Dry-run

[ checked , unchecked ]

Do a trial run with no permanent changes.

UAC Rest Credentials

Universal Controller Rest API Credentials.

UAC Base URL

Universal Controller URL.

For example, https://192.168.88.40/uc

Loglevel

Universal Task logging settings [DEBUG | INFO| WARNING | ERROR | CRITICAL].

...

No recursive move will be done, because the flag --max-depth 1 is set.

Action: remove-object

FieldDescription

Agent

Linux or Windows Universal Agent to execute the Rclone command line.

Agent Cluster

Optional Agent Cluster for load balancing.

Action

[ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

Remove objects in an OS directory or cloud object store.

Storage Type

Enter a storage Type name as defined in the Connection File; for example,

amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux ..

For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

File Path

Path to the directory in which you want to remove the objects.

For example:

File Path: stonebranchpmtest

Filter: report[1-3].txt

This removes all S3 objects matching the filter report[1-3].txt (report1.txt, report2.txt, and report3.txt) from the S3 bucket stonebranchpmtest.

Connection File

In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System.

For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage.

For details on how to configure the Connection File, refer to section Configure the Connection File.

Other Parameters


This field can be used to apply additional flag parameters to the selected action.

For a list of all possible flags, refer to Global Flags.

Recommendation: Add the flag max-depth 1 to all Copy, Move, remove-object, and remove-object-store tasks in the task field Other Parameters to avoid a recursive action.

Attention: If the flag max-depth 1 is not set, a recursive action is performed.

Dry-run

[ checked , unchecked ]

Do a trial run with no permanent changes.

UAC Rest Credentials

Universal Controller Rest API Credentials.

UAC Base URL

Universal Controller URL.

For example, https://192.168.88.40/uc

Loglevel

Universal Task logging settings [DEBUG | INFO| WARNING | ERROR | CRITICAL].

Example

The following example removes all S3 objects matching the filter report[1-3].txt (report1.txt, report2.txt, and report3.txt) from the S3 bucket stonebranchpmtest.

No recursive remove will be done, because the flag --max-depth 1 is set.

Action: remove-object-store

FieldDescription

Agent

Linux or Windows Universal Agent to execute the Rclone command line.

Agent Cluster

Optional Agent Cluster for load balancing.

Action

[ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

Remove an OS directory or cloud object store.

Storage Type

Enter a storage Type name as defined in the Connection File; for example,

amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux ..

For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

Directory

Name of the directory you want to remove.

The directory can be an object store or a file system OS directory.

The directory needs to be empty before it can be removed.

For example, Directory: stonebranchpmtest would remove the bucket stonebranchpmtest.

Connection File

In the connection file, you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System.

For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage.

For details on how to configure the Connection File, refer to section Configure the Connection File.

Other Parameters

This field can be used to apply additional flag parameters to the selected action.

For a list of all possible flags, refer to Global Flags.

Recommendation: Add the flag max-depth 1 to all Copy, Move, remove-object, and remove-object-store tasks in the task field Other Parameters to avoid a recursive action.

Attention: If the flag max-depth 1 is not set, a recursive action is performed.

Dry-run

[ checked , unchecked ]

Do a trial run with no permanent changes.

UAC Rest Credentials

Universal Controller Rest API Credentials.

UAC Base URL

Universal Controller URL.

For example, https://192.168.88.40/uc

Loglevel

Universal Task logging settings [DEBUG | INFO| WARNING | ERROR | CRITICAL].

Example

The following example removes the S3 object store stonebranchdemo.

No recursive remove of object stores will be done, because the flag --max-depth 1 is set.

Image Added

Action: create-object-store

FieldDescription

Agent

Linux or Windows Universal Agent to execute the Rclone command line.

Agent Cluster

Optional Agent Cluster for load balancing.

Action

[ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

Create an OS directory or cloud object store.

Storage Type

Enter a storage Type name as defined in the Connection File; for example,

amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux ..

For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

Connection File

In the connection file, you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System.

For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage.

For details on how to configure the Connection File, refer to section Configure the Connection File.

Directory

Name of the directory you want to create.

The directory can be an object store or a file system OS directory.

For example, Directory: stonebranchpmtest would create the bucket stonebranchpmtest.

Other Parameters

This field can be used to apply additional flag parameters to the selected action.

For a list of all possible flags, refer to Global Flags.

Dry-run

[ checked , unchecked ]

Do a trial run with no permanent changes.

UAC Rest Credentials

Universal Controller Rest API Credentials.

UAC Base URL

Universal Controller URL.

For example, https://192.168.88.40/uc

Loglevel

Universal Task logging settings [DEBUG | INFO| WARNING | ERROR | CRITICAL].

Example

The following example creates the S3 bucket stonebranchdemo.

Image Added

Action: copy-url

FieldDescription

Agent

Linux or Windows Universal Agent to execute the Rclone command line.

Agent Cluster

Optional Agent Cluster for load balancing.

Action

[ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

Download a URL's content and copy it to the destination without saving it in temporary storage.

Source

URL parameter: the URL whose content is downloaded and copied to the target.

Target

Enter a target storage Type name as defined in the Connection File; for example,

amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux ..

For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

Connection File

In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System.

For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage.

For details on how to configure the Connection File, refer to section Configure the Connection File.

Other Parameters

This field can be used to apply additional flag parameters to the selected action.

For a list of all possible flags, refer to Global Flags.

Useful parameters for the copy-url command:

Code Block
languagetext
linenumberstrue
  -a, --auto-filename    Get the file name from the URL and use it for destination file path
  -h, --help             help for copyurl
      --no-clobber       Prevent overwriting file with same name
  -p, --print-filename   Print the resulting name from --auto-filename
      --stdout           Write the output to stdout rather than a file
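For illustration, a copy-url action corresponds to an rclone copyurl call such as the following; the URL and the target remote name are placeholders:

```shell
# Download the URL content and store it under its own file name (--auto-filename),
# without overwriting an existing file of the same name (--no-clobber)
rclone copyurl https://example.com/reports/report1.pdf \
    amazon_s3_target:stonebranchpm/in/ --auto-filename --no-clobber
```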

Dry-run

[ checked , unchecked ]

Do a trial run with no permanent changes.

UAC Rest Credentials

Universal Controller Rest API Credentials

UAC Base URL

Universal Controller URL

For example, https://192.168.88.40/uc

Loglevel

Universal Task logging settings [DEBUG | INFO| WARNING | ERROR | CRITICAL]

Example


The following example downloads a PDF file from the web address https://www.bundesbank.de/resource/../blz-loeschungen-aktuell-data.pdf to the Linux folder /home/stonebranch/demo/in.

The Linux folder is located on the server where the Agent ${AGT_LINUX} runs.

Image Added

Action: monitor-object

FieldDescription

Agent

Linux or Windows Universal Agent to execute the Rclone command line.

Agent Cluster

Optional Agent Cluster for load balancing.

Action

[ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ]

Monitor a file or object in an OS directory or cloud object store.

Storage Type

Enter a storage Type name as defined in the Connection File; for example,

amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux ..

For a list of all possible storage types, refer to Overview of Cloud Storage Systems.

Directory

Name of the directory to scan for the files to monitor.

The directory can be an object store or a file system OS directory.

For example:

Directory: stonebranchpm/out
Filter: report1.txt

This would monitor in the s3 bucket folder stonebranchpm/out for the object report1.txt.

Connection File

In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System.

For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage.

For details on how to configure the Connection File, refer to section Configure the Connection File.

Filter Type

[ include, exclude, none ]

Define the type of filter to apply.

Filter

Provide the patterns for file matching; for example, in a copy action:

Filter Type: include

Filter report[1-3].txt 

This means all reports with names matching report1.txt, report2.txt, and report3.txt will be copied.

For more examples on the filter matching pattern, refer to Rclone Filtering.

Other Parameters

This field can be used to apply additional flag parameters to the selected action.

For a list of all possible flags, refer to Global Flags.

Recommendation: Set the two flags --use-server-modtime and --max-depth 1.

With the flag --use-server-modtime, you can tell the monitor to use the upload time to the AWS bucket instead of the last modification time of the file on the source (for example, Linux).

The flag max-depth 1 means only the current directory is in scope (no recursive monitoring).

Example:

To skip files that are newer on the destination during a move or copy action, you could add the flag --update.

Dry-run

[ checked , unchecked ]

Do a trial run with no permanent changes.

Trigger on Existence

[ checked , unchecked]

If checked, the monitor goes to success even if the file already exists when it was started.

Interval

[ 10s, 60s, 180s ]

Monitor interval for checking for the file(s) in the configured directory.

For example, Interval: 60s would mean that every 60 seconds the task checks whether the file exists in the scan directory.

UAC Rest Credentials

Universal Controller Rest API Credentials

UAC Base URL

Universal Controller URL

For example, https://192.168.88.40/uc

Loglevel

Universal Task logging settings [DEBUG | INFO| WARNING | ERROR | CRITICAL]

Example

The following example monitors s3 bucket folder stonebranchpm/out for the object report1.txt.

If the object is found, the monitor goes to success.


The following example monitors the S3 bucket folder stonebranchpm/in for the object test3.txt.

The object triggers the monitor (status success) only if the object is uploaded after the monitor has been started.

Image Added

Inter-Cloud Data Transfer Example Cases

The following section provides examples for the Inter-Cloud Data Transfer Task actions:

  • Copy

  • Move

  • Remove-object

  • Monitor-object

N#

Name

Copy

C1

Copy a file to target. The file does not exist at the target.

C2

Copy two files. One of the files already exists at the target.

C3

Copy a file to a target with Parameter ignore-existing set. A file with the same filename exists on the target.

C4

Copy a file to a target where it already exists, with Parameter error-on-no-transfer set.

C5

Copy all files except one from one folder into another folder.

C6

Copy non-recursively a file from a folder into another folder using the parameter max-depth 1.

C7

Copy files using multiple Parameters.

Move

M1

Move without deleting if no transfer took place.

M2

Move a file with the same source and target (configuration error case).

Remove-object

R1

Delete a file in a specific folder by using the parameter: max-depth 1.

R2

Delete all files matching a given name recursively in all folders starting from a given entry folder.

Monitor-object

FM01

Monitor a file in AWS - file does not exist on start of monitor.

FM02

Monitor a file in AWS - file exists on start of monitor.

Copy

The following file transfer examples will be described for the Copy action:

N#

Name

C1

Copy a file to target. The file does not exist at the target.

C2

Copy two files. One of the files already exists at the target.

C3

Copy a file to a target with Parameter ignore-existing set. A file with the same filename exists on the target.

C4

Copy a file to a target where it already exists, with Parameter error-on-no-transfer set.

C5

Copy all files except one from one folder into another folder.

C6

Copy non-recursively a file from a folder into another folder using the parameter max-depth 1.

C7

Copy files using multiple Parameters.

Note
titleNote

Before running a copy command, you can always try the command by setting the Dry-run option in the Task.
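On the Rclone command line, the Dry-run option corresponds to the --dry-run flag. A minimal sketch, with placeholder remote names:

```shell
# Show what would be copied without making any changes on source or target
rclone copy linux_source:/home/stonebranch/demo/out/ \
    amazon_s3_target:stonebranchpm/in/ --dry-run
```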

Copy a File to Target: File does not Exist at the Target

Copy the file report1.txt to the target stonebranchpm/in/. The file does not exist at the target.

Configuration

  • Source Storage Type: linux_source

  • Source: /home/stonebranch/demo/out/

  • Target Storage Type: amazon_s3_target

  • Target: stonebranchpm/in/

  • Filter Type: include

  • Filter: report1.txt

  • Other Parameters: none


Image Added

Result

  • Task Status: success

  • Output: report1.txt: Copied (new)


Copy Two Files: One File Already Exists at the Target

Copy two files report2.txt and report3.txt to the target stonebranchpm/in/. The file report2.txt (same size and content) already exists at the target.

Configuration

  • Source Storage Type: linux_source

  • Source: /home/stonebranch/demo/out/

  • Target Storage Type: amazon_s3_target

  • Target: stonebranchpm/in/

  • Filter Type: include

  • Filter: report[2-3].txt

  • Other Parameters: none

Image Added

Result

  • Task Status: success

  • Output:

    • report3.txt: Copied (new)

    • report2.txt: Unchanged skipping

Note
titleNote

If the file report2.txt had the same name but a different size, then the task would overwrite the file report2.txt on the target.

Copy a File to a Target with Parameter ignore-existing Set: A File with the Same Filename Exists on the Target

Copy a file report2.txt to the target stonebranchpm/in/ with Parameter ignore-existing set. A file with the same filename exists at the target (size or content can be the same or different).

When ignore-existing is set, all files that exist on destination with the same filename are skipped. (No overwrite is done.)

Configuration

  • Source Storage Type: linux_source

  • Source: /home/stonebranch/demo/out/

  • Target Storage Type: amazon_s3_target

  • Target: stonebranchpm/in/

  • Filter Type: include

  • Filter: report2.txt

  • Other Parameters: --ignore-existing

Image Added

Result

  • Task Status: success

  • Output:

    • There was nothing to transfer.

    • Exiting due to ERROR(s).