Inter-Cloud Data Transfer - Obsolete

This integration has been sunset.

It is replaced by:
- Inter-Cloud Data Transfer: Focused on non-monitoring actions such as Copy, Move, Sync, List, Create, and Delete.
- Inter-Cloud Data Monitor: Focused only on monitoring actions, leveraging Universal Events.

Disclaimer

Your use of this download is governed by Stonebranch’s Terms of Use, which are available at https://www.stonebranch.com/integration-hub/Terms-and-Privacy/Terms-of-Use/

Overview

The Inter-Cloud Data Transfer integration allows you to transfer data to, from, and between any of the major private and public cloud providers like AWS, Google Cloud, and Microsoft Azure.

It also supports the transfer of data to and from a Hadoop Distributed File System (HDFS) and to major cloud applications like OneDrive and SharePoint.

An advantage of using the Inter-Cloud Data Transfer integration over other approaches is that data is streamed from one object store to another without the need for intermediate storage. 

Integrations with this solution package include: 

  • AWS S3
  • Google Cloud
  • SharePoint
  • Dropbox
  • OneDrive
  • Hadoop Distributed File System (HDFS)

Software Requirements

Software Requirements for Universal Agent

  • Universal Agent for Linux or Windows Version 7.2.0.0 or later is required.

  • Universal Agent must be installed with the Python option (--python yes).

Software Requirements for Universal Controller

  • Universal Controller 7.2.0.0 or later.

Software Requirements for the Application to be Scheduled

  • Rclone v1.58.1 or higher must be installed on the server where the Universal Agent is installed.

  • Rclone can be installed on Windows and Linux.

  • To install Rclone on Linux systems, run:

    curl https://rclone.org/install.sh | sudo bash

    Note

    If the URL is not reachable from your server, Rclone can also be installed on Linux from a pre-compiled binary.

  • To install Rclone on a Linux system from a pre-compiled binary:

    Fetch and unpack

    curl -O https://downloads.rclone.org/rclone-current-linux-amd64.zip
    unzip rclone-current-linux-amd64.zip 
    cd rclone-*-linux-amd64

    Copy binary file

    sudo cp rclone /usr/bin/ 
    sudo chown root:root /usr/bin/rclone 
    sudo chmod 755 /usr/bin/rclone

    Install manpage

    sudo mkdir -p /usr/local/share/man/man1 
    sudo cp rclone.1 /usr/local/share/man/man1/ 
    sudo mandb 
  • To install Rclone on Windows systems:

    • Rclone is a Go program and comes as a single binary file.

    • Download the relevant binary from the rclone downloads page (https://rclone.org/downloads/).

    • Extract the rclone.exe binary from the archive into a folder that is on the Windows PATH (a quick verification command follows).
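
On either platform, you can confirm that Rclone is installed and reachable from the account that runs the Universal Agent by printing its version:

    rclone version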

Key Features

The Inter-Cloud Data Transfer Task provides the following capabilities:

  • Transfer data to, from, and between any cloud provider

  • Transfer between any major storage applications like SharePoint or Dropbox

  • Transfer data to and from a Hadoop File System (HDFS)

  • Download a URL's content and copy it to the destination without saving it in temporary storage

  • Data is streamed from one object store to another (no intermediate storage)
  • Horizontally scalable: multiple parallel file transfers can run, up to the number of cores on the machine (see the example command after this list)
  • Supports chunking: larger files can be divided into multiple chunks
  • Always preserves timestamps and verifies checksums

  • Supports encryption, caching, and compression

  • Perform Dry-runs

  • Dynamic Token updates for SharePoint connections

  • Regular-expression-based include/exclude filter rules

  • Supported actions are:

    • List objects

    • List directory

    • Copy / Move
    • Copy To / Move To

    • Sync Folders
    • Remove object / object store

    • Perform Dry-runs

    • Monitor object, including triggering of Tasks and Workflows

    • Copy URL
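
The features above map to standard rclone options that the task passes through to rclone. As a rough sketch (the remote names come from the connection file shown later on this page; the bucket and container names are hypothetical), a copy with parallel transfers, checksum verification, an include filter, and a dry-run preview would look roughly like this when run directly:

    # Preview copying CSV files from an S3 bucket to an Azure container,
    # using 4 parallel transfers and checksum-based comparison
    rclone copy amazon_s3_source:my-bucket/in microsoft_azure_blob_storage_target:my-container/in \
      --config connections.conf --transfers 4 --checksum --include "*.csv" --dry-run

In the Universal Task itself, these settings are supplied through the task fields rather than typed on a command line.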


Import Inter-Cloud Data Transfer Universal Template

To use this downloadable Universal Template, you first must perform the following steps:

  1. This Universal Task requires the Resolvable Credentials feature. Check that the Resolvable Credentials Permitted system property has been set to true.
  2. To import the Universal Template into your Controller, follow the instructions here.
  3. When the files have been imported successfully, refresh the Universal Templates list; the Universal Template will appear on the list.

Configure Inter-Cloud Data Transfer Universal Tasks

To configure a new Inter-Cloud Data Transfer task, two steps are required:

  1. Configure the connection file.

  2. Create a new Inter-Cloud Data Transfer Task.

Configure the Connection File

The connection file contains all Parameters and Credentials required to connect to the source and target cloud storage systems.

The connection file provided below contains the basic connection Parameters (flags) to connect to AWS, Azure, Linux, OneDrive (SharePoint), Google, and HDFS.

Additional Parameters can be added if required. Refer to the rclone documentation for all possible flags: Global Flags.
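
For example, the rclone S3 backend option chunk_size could be added to the S3 target section to raise the multipart upload chunk size (chunk_size is a standard rclone S3 setting; the value below is only an illustration):

    [amazon_s3_target]
    # ... existing parameters ...
    chunk_size = 64M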

The following connection file must be saved in the Universal Controller script library. This file is later referenced in the different Inter-Cloud Data Transfer tasks.

connections.conf (v4)
# Script Name: connectionv4.conf
# 
# Description: 
# Connection Script file for Inter-Cloud Data Transfer Task
# 
#
# 09.03.2022   Version 1   Initial Version requires UA 7.2
# 09.03.2022   Version 2   SFTP support
# 27.04.2022   Version 3   Azure target mistake corrected
# 13.05.2022   Version 4   Copy_Url added

#
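# Note on variable resolution:
# ${_credentialUser('...')} and ${_credentialPwd('...')} are Universal Controller
# resolvable-credential functions; at run time they return the user id and password of
# the Credential whose name is passed in via the task's source/target credential fields
# (ops_var_source_credential / ops_var_target_credential).
# ${_scriptPath('...')} resolves to the local path of a script stored in the Controller
# script library (used below for the Azure service principal file).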

[amazon_s3_target]
type = s3
provider = AWS
env_auth = false
access_key_id = ${_credentialUser('${ops_var_target_credential}')}
secret_access_key = ${_credentialPwd('${ops_var_target_credential}')}
region = us-east-2
acl = bucket-owner-full-control


[amazon_s3_source]
type = s3
provider = AWS
env_auth = false
access_key_id = ${_credentialUser('${ops_var_source_credential}')}
secret_access_key = ${_credentialPwd('${ops_var_source_credential}')}
region = us-east-2
acl = bucket-owner-full-control
role_arn = arn:aws:iam::552436975963:role/SB-AWS-FULLX


[microsoft_azure_blob_storage_sas_source]
type = azureblob
sas_url = ${_credentialPwd('${ops_var_source_credential}')}


[microsoft_azure_blob_storage_sas_target]
type = azureblob
sas_url = ${_credentialPwd('${ops_var_target_credential}')}


[microsoft_azure_blob_storage_source]
type = azureblob
account = ${_credentialUser('${ops_var_source_credential}')}
key = ${_credentialPwd('${ops_var_source_credential}')}


[microsoft_azure_blob_storage_target]
type = azureblob
account = ${_credentialUser('${ops_var_target_credential}')}
key = ${_credentialPwd('${ops_var_target_credential}')}


[datalakegen2_storage_source]
type = azureblob
account = ${_credentialUser('${ops_var_source_credential}')}
key = ${_credentialPwd('${ops_var_source_credential}')}


[datalakegen2_storage_target]
type = azureblob
account = ${_credentialUser('${ops_var_target_credential}')}
key = ${_credentialPwd('${ops_var_target_credential}')}


[datalakegen2_storage_sp_source]
type = azureblob
account = ${_credentialUser('${ops_var_source_credential}')}
service_principal_file = ${_scriptPath('azure-principal.json')} 
# service_principal_file = C:\virtual_machines\Setup\SoftwareKeys\Azure\azure-principal.json


[datalakegen2_storage_sp_target]
type = azureblob
account = ${_credentialUser('${ops_var_target_credential}')}
service_principal_file = ${_scriptPath('azure-principal.json')} 
# service_principal_file = C:\virtual_machines\Setup\SoftwareKeys\Azure\azure-principal.json


[google_cloud_storage_source]
type = google cloud storage
service_acco