Disclaimer

...

Code Block: connections.conf
[amazon_s3_target]
type = s3
provider = AWS
env_auth = false
access_key_id = ${_credentialUser('${ops_var_target_credential}')}
secret_access_key = ${_credentialToken('${ops_var_target_credential}')}
region = us-east-2


[amazon_s3_source]
type = s3
provider = AWS
env_auth = false
access_key_id = ${_credentialUser('${ops_var_source_credential}')}
secret_access_key = ${_credentialToken('${ops_var_source_credential}')}
region = us-east-2


[microsoft_azure_blob_storage_sas_target]
type = azureblob
sas_url = ${_credentialPwd('${ops_var_target_credential}')}


[microsoft_azure_blob_storage_sas_source]
type = azureblob
sas_url = ${_credentialPwd('${ops_var_source_credential}')}


[microsoft_azure_blob_storage_source]
type = azureblob
account = ${_credentialUser('${ops_var_source_credential}')}
key = ${_credentialPwd('${ops_var_source_credential}')}


[microsoft_azure_blob_storage_target]
type = azureblob
account = ${_credentialUser('${ops_var_target_credential}')}
key = ${_credentialPwd('${ops_var_target_credential}')}


[google_cloud_storage_source]
type = google cloud storage
service_account_file = ${_credentialPwd('${ops_var_source_credential}')}
object_acl = bucketOwnerFullControl
project_number = clagcs
location = europe-west3


[google_cloud_storage_target]
type = google cloud storage
service_account_file = ${_credentialPwd('${ops_var_target_credential}')}
object_acl = bucketOwnerFullControl
project_number = clagcs
location = europe-west3


[onedrive_source]
type = onedrive
token = ${_credentialToken('${ops_var_source_credential}')}
drive_id = ${_credentialUser('${ops_var_source_credential}')}
drive_type = business
update_credential = token


[onedrive_target]
type = onedrive
token = ${_credentialToken('${ops_var_target_credential}')}
drive_id = ${_credentialUser('${ops_var_target_credential}')}
drive_type = business
update_credential = token


[hdfs_source]
type = hdfs
namenode = 172.18.0.2:8020
username = maria_dev


[hdfs_target]
type = hdfs
namenode = 172.18.0.2:8020
username = maria_dev


[linux_source]
type = local


[linux_target]
type = local
   
   

Creation of the Connection File

The connection file can be created manually, using the sample connection file cloud2cloud.conf as a template, or interactively with the rclone configuration tool: rclone config.
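Since the connection file uses standard INI syntax, it can be sanity-checked with any INI parser before it is handed to rclone. The following Python sketch is only an illustration (remote names are taken from the sample above; the credential placeholders are omitted so the snippet stays self-contained):

```python
import configparser

# Minimal excerpt of connections.conf (remote names from the sample above;
# credential placeholder lines omitted for self-containment).
sample = """\
[amazon_s3_source]
type = s3
provider = AWS
env_auth = false
region = us-east-2

[linux_target]
type = local
"""

parser = configparser.ConfigParser()
parser.read_string(sample)

# Each rclone remote is one INI section; the 'type' key selects the backend.
for name in parser.sections():
    print(f"{name}: type={parser[name]['type']}")
```

A parse error here would also surface when rclone reads the file, so this is a cheap way to catch a malformed section before running a transfer.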

If you do not want secret keys and passwords to appear in clear text in the connection file, you can store them in a Universal Controller credential and reference that credential in the script. For example, to avoid exposing the Amazon S3 secret_access_key, set up a Universal Controller credential AWS_SECRET_ACCESS_KEY_<D050320> and reference it in the script:

Code Block
secret_access_key = ${_credentialPwd('AWS_SECRET_ACCESS_KEY_D050320')}


Considerations

Rclone supports connections to almost any storage system on the market:

Overview of Cloud Storage Systems

However, the current Universal Task has only been tested for the following storage types:

  • LINUX

  • AWS S3

  • Azure Blob Storage

  • Google GCS

  • Microsoft One Drive incl. Share Point

  • HDFS

  • HTTPS URL


Note
titleNote

If you want to connect to a different system (for example, Dropbox), you should test it thoroughly before taking it to production, or contact Stonebranch for support.

Create an Inter-Cloud Data Transfer Task

For Universal Task Inter-Cloud Data Transfer, create a new task and enter the task-specific Details that were created in the Universal Template.

The following Actions are supported:

Action

Description

list directory

List directories; for example:

  • List object stores such as S3 buckets or Azure containers.
  • List OS directories from Linux, Windows, or HDFS.

copy

Copy objects from source to target.

list objects

List objects in an OS directory or cloud object store.

move

Move objects from source to target.

remove-object

Remove objects in an OS directory or cloud object store.

remove-object-store

Remove an OS directory or cloud object store.

create-object-store

Create an OS directory or cloud object store.

copy-url

Download a URL's content and copy it to the destination without saving it in temporary storage.

monitor-object

Monitor a file or object in an OS directory or cloud object store.

In the following sections, the fields for each task action are described and an example is provided.
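This page does not state which rclone subcommand each task action invokes. The mapping below is an assumption based on common rclone usage (lsd, ls, copy, move, delete, purge, mkdir, and copyurl are standard rclone subcommands) and is shown only for orientation:

```python
# Assumed rclone equivalents for the task actions (not confirmed by this page).
action_to_rclone = {
    "list directory": "rclone lsd",
    "list objects": "rclone ls",
    "copy": "rclone copy",
    "move": "rclone move",
    "remove-object": "rclone delete",
    "remove-object-store": "rclone purge",
    "create-object-store": "rclone mkdir",
    "copy-url": "rclone copyurl",
    "monitor-object": "rclone lsf",  # assumption: monitoring by polling a listing
}

for action, cmd in sorted(action_to_rclone.items()):
    print(f"{action:22s} -> {cmd}")
```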

Important Considerations

  1. Before running a move or copy command, you can first test it by setting the Dry-run option in the task.

  2. The following flags should be considered for all copy, move, remove-object, and remove-object-store operations.


Flag

Description

--max-depth 1

Limits the recursion depth. --max-depth 1 means that only the current directory is in scope.

Attention: If the flag is not set, the action is performed recursively.

Recommendation: Add --max-depth 1 to all copy, move, remove-object, and remove-object-store tasks in the task field Other Parameters to avoid a recursive action.

--ignore-existing

Skips all files that already exist on the destination.

Examples:

  1. If you move a file to a destination where it already exists, rclone does not perform the copy but still deletes the source file. With --ignore-existing set, the delete operation is not performed.

  2. In a copy operation, --ignore-existing avoids the file being deleted when the source is equal to the destination.

Recommendation: Add --ignore-existing to all copy and move tasks to avoid a file being deleted when the source is equal to the destination.

--error-on-no-transfer

Causes the task to fail if no transfer was performed.

--update

Skips files that are newer on the destination during a move or copy action.
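Putting the recommendations together, a copy between two of the remotes defined in connections.conf could be invoked roughly as follows. The remote names exist in the sample file above, but the bucket and container paths are hypothetical, and the snippet only assembles the command line for inspection:

```python
import shlex

# Hypothetical example: the remote names come from the sample connections.conf,
# but the bucket and container paths below are made up for illustration.
source = "amazon_s3_source:my-bucket/reports"
target = "microsoft_azure_blob_storage_target:backup"

flags = [
    "--max-depth", "1",        # only the current directory, no recursion
    "--ignore-existing",       # skip files that already exist on the destination
    "--error-on-no-transfer",  # fail the task if nothing was transferred
    "--update",                # skip files that are newer on the destination
    "--dry-run",               # try the command first without changing anything
]

cmd = ["rclone", "copy", source, target, *flags]
print(shlex.join(cmd))
```

In the Universal Task itself, these flags would go into the Other Parameters field rather than being typed on a command line; the sketch only shows how the pieces combine.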





Inter-Cloud Data Transfer Actions

...