...

  • Transfer data to, from, and between any cloud provider

  • Transfer between major storage applications such as SharePoint or Dropbox

  • Transfer data to and from a Hadoop File System (HDFS)

  • Download a URL's content and copy it to the destination without saving it in temporary storage

  • Data is streamed from one object store to another (no intermediate storage)

  • Very fast if the object stores are in the same region

  • Always preserves timestamps and verifies checksums

  • Supports encryption, caching, compression, chunking

  • Perform Dry-runs

  • Dynamic Token updates for SharePoint connections

  • Regular-expression-based include/exclude filter rules (see the command sketch after this list)

  • Supported actions are:

    • List objects, list directories

    • Copy / Move

    • Remove object / object store

    • Perform Dry-runs

    • Monitor object

    • Copy URL


Import the Inter-Cloud Data Transfer Universal Template

...

Universal Template

To use this downloadable Universal Template, you must first perform the following steps:

  1. This Universal Task requires the Resolvable Credentials feature (/wiki/spaces/UC71x/pages/5178443). Check that the /wiki/spaces/UC71x/pages/5177877 system property has been set to true.

  2. Download the provided ZIP file.

  3. In the Universal Controller UI, select Administration > Configuration > Universal Templates to display the current list of Universal Templates (/wiki/spaces/UC71x/pages/5178054).

  4. Click Import Template.

  5. Select the template ZIP file and click Import.

...


When the template has been imported successfully, the Universal Template will appear on the list. Refresh your Navigation Tree to see these tasks in the Automation Center Menu.

Configure Inter-Cloud Data Transfer Universal Tasks

...

Configure the Connection File

The connection file contains all required Parameters and Credentials to connect to the Source and Target Cloud Storage Systems; for example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for both AWS S3 and Azure Blob Storage.

The provided connection file below contains the basic connection Parameters (flags) needed to connect to AWS, Azure, Linux, OneDrive (SharePoint), Google, and HDFS. Additional Parameters can be added if required (see the example below); refer to the rclone documentation for all possible flags: Global Flags.

The following connection file must be saved in the Universal Controller Script Library; for example, as cloud2cloud.conf. This file is later referenced in the different Inter-Cloud Data Transfer tasks.

connections.conf


[amazon_s3_target]
type = s3
provider = AWS
env_auth = false
access_key_id = ${_credentialUser('${ops_var_target_credential}')}
secret_access_key = ${_credentialToken('${ops_var_target_credential}')}
region = us-east-2


[amazon_s3_source]
type = s3
provider = AWS
env_auth = false
access_key_id = ${_credentialUser('${ops_var_source_credential}')}
secret_access_key = ${_credentialToken('${ops_var_source_credential}')}
region = us-east-2


[microsoft_azure_blob_storage_sas_target]
type = azureblob
sas_url = ${_credentialPwd('${ops_var_target_credential}')}


[microsoft_azure_blob_storage_sas_source]
type = azureblob
sas_url = ${_credentialPwd('${ops_var_source_credential}')}


[microsoft_azure_blob_storage_source]
type = azureblob
account = ${_credentialUser('${ops_var_source_credential}')}
key = ${_credentialPwd('${ops_var_source_credential}')}


[microsoft_azure_blob_storage_target]
type = azureblob
account = ${_credentialUser('${ops_var_target_credential}')}
key = ${_credentialPwd('${ops_var_target_credential}')}


[google_cloud_storage_source]
type = google cloud storage
service_account_file = ${_credentialPwd('${ops_var_source_credential}')}
object_acl = bucketOwnerFullControl
project_number = clagcs
location = europe-west3


[google_cloud_storage_target]
type = google cloud storage
service_account_file = ${_credentialPwd('${ops_var_target_credential}')}
object_acl = bucketOwnerFullControl
project_number = clagcs
location = europe-west3


[onedrive_source]
type = onedrive
token = ${_credentialToken('${ops_var_source_credential}')}
drive_id = ${_credentialUser('${ops_var_source_credential}')}
drive_type = business
update_credential = token


[onedrive_target]
type = onedrive
token = ${_credentialToken('${ops_var_target_credential}')}
drive_id = ${_credentialUser('${ops_var_target_credential}')}
drive_type = business
update_credential = token


[hdfs_source]
type = hdfs
namenode = 172.18.0.2:8020
username = maria_dev


[hdfs_target]
type = hdfs
namenode = 172.18.0.2:8020
username = maria_dev


[linux_source]
type = local


[linux_target]
type = local






Creation of the Connection File

...