...
The following connection file must be saved in the Universal Controller script library. This file is later referenced in the different Inter-Cloud Data Transfer tasks.
connections.conf (v4)
```ini
# Script Name: connectionv4.conf
#
# Description:
# Connection Script file for Inter-Cloud Data Transfer Task
#
# 09.03.2022 Version 1 Initial Version requires UA 7.2
# 09.03.2022 Version 2 SFTP support
# 27.04.2022 Version 3 Azure target mistake corrected
# 13.05.2022 Version 4 Copy_Url added

[amazon_s3_target]
type = s3
provider = AWS
env_auth = false
access_key_id = ${_credentialUser('${ops_var_target_credential}')}
secret_access_key = ${_credentialPwd('${ops_var_target_credential}')}
region = us-east-2
acl = bucket-owner-full-control

[amazon_s3_source]
type = s3
provider = AWS
env_auth = false
access_key_id = ${_credentialUser('${ops_var_source_credential}')}
secret_access_key = ${_credentialPwd('${ops_var_source_credential}')}
region = us-east-2
acl = bucket-owner-full-control
role_arn = arn:aws:iam::552436975963:role/SB-AWS-FULLX

[microsoft_azure_blob_storage_sas_source]
type = azureblob
sas_url = ${_credentialPwd('${ops_var_source_credential}')}

[microsoft_azure_blob_storage_sas_target]
type = azureblob
sas_url = ${_credentialPwd('${ops_var_target_credential}')}

[microsoft_azure_blob_storage_source]
type = azureblob
account = ${_credentialUser('${ops_var_source_credential}')}
key = ${_credentialPwd('${ops_var_source_credential}')}

[microsoft_azure_blob_storage_target]
type = azureblob
account = ${_credentialUser('${ops_var_target_credential}')}
key = ${_credentialPwd('${ops_var_target_credential}')}

[datalakegen2_storage_source]
type = azureblob
account = ${_credentialUser('${ops_var_source_credential}')}
key = ${_credentialPwd('${ops_var_source_credential}')}

[datalakegen2_storage_target]
type = azureblob
account = ${_credentialUser('${ops_var_target_credential}')}
key = ${_credentialPwd('${ops_var_target_credential}')}

[datalakegen2_storage_sp_source]
type = azureblob
account = ${_credentialUser('${ops_var_source_credential}')}
service_principal_file = ${_scriptPath('azure-principal.json')}
# service_principal_file = C:\virtual_machines\Setup\SoftwareKeys\Azure\azure-principal.json

[datalakegen2_storage_sp_target]
type = azureblob
account = ${_credentialUser('${ops_var_target_credential}')}
service_principal_file = ${_scriptPath('azure-principal.json')}
# service_principal_file = C:\virtual_machines\Setup\SoftwareKeys\Azure\azure-principal.json

[google_cloud_storage_source]
type = google cloud storage
service_account_file = ${_credentialPwd('${ops_var_source_credential}')}
object_acl = bucketOwnerFullControl
project_number = clagcs
location = europe-west3

[google_cloud_storage_target]
type = google cloud storage
service_account_file = ${_credentialPwd('${ops_var_target_credential}')}
object_acl = bucketOwnerFullControl
project_number = clagcs
location = europe-west3

[onedrive_source]
type = onedrive
token = ${_credentialToken('${ops_var_source_credential}')}
drive_id = ${_credentialUser('${ops_var_source_credential}')}
drive_type = business
update_credential = token

[onedrive_target]
type = onedrive
token = ${_credentialToken('${ops_var_target_credential}')}
drive_id = ${_credentialUser('${ops_var_target_credential}')}
drive_type = business
update_credential = token

[hdfs_source]
type = hdfs
namenode = 172.18.0.2:8020
username = maria_dev

[hdfs_target]
type = hdfs
namenode = 172.18.0.2:8020
username = maria_dev

[linux_source]
type = local

[linux_target]
type = local

[windows_source]
type = local

[windows_target]
type = local

[sftp_source]
type = sftp
host = 3.19.76.58
user = ubuntu
pass = ${_credentialToken('${ops_var_source_credential}')}

[sftp_target]
type = sftp
host = 3.19.76.58
user = ubuntu
pass = ${_credentialToken('${ops_var_target_credential}')}

[copy_url]
```
...
connection.conf in the Universal Controller script library
Considerations
...
- Linux
- AWS S3
- Azure Blob Storage
- Google GCS
- Microsoft OneDrive, including SharePoint
- HDFS
- HTTPS URL
- SFTP
Note: If you want to connect to a different system (for example, Dropbox), contact Stonebranch for support.
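If such a system were added, it would follow the section-naming pattern of the connection file above. As a purely hypothetical sketch (the `dropbox` type and `token` key exist in Rclone's Dropbox backend, but this pair is not part of the delivered connection file):

```ini
# Hypothetical sketch only - not part of the delivered connection file.
[dropbox_source]
type = dropbox
token = ${_credentialToken('${ops_var_source_credential}')}

[dropbox_target]
type = dropbox
token = ${_credentialToken('${ops_var_target_credential}')}
```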
...
Action | Description |
list directory | List directories. |
copy | Copy objects from source to target. |
copy-to | Copies a single object from source to target and allows renaming the object on the target. |
move | Move objects from source to target. |
move-to | Moves a single object from source to target and allows renaming the object on the target. |
sync | Sync the source to the destination, changing the destination only. |
list objects | List objects in an OS directory or cloud object store. |
remove-object | Remove objects in an OS directory or cloud object store. |
remove-object-store | Remove an OS directory or cloud object store. |
create-object-store | Create an OS directory or cloud object store. |
copy-url | Download a URL's content and copy it to the destination without saving it in temporary storage. |
monitor-object | Monitor a file or object in an OS directory or cloud object store and, optionally, launch Task(s) when an object is identified by the monitor. |
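Since the task executes the Rclone command line on the selected Agent, each action presumably corresponds to a standard Rclone subcommand. The mapping below is an assumption for orientation only, and the composed command is echoed rather than executed (no rclone installation or credentials are assumed here):

```shell
# Assumed action-to-subcommand mapping (illustrative, not authoritative):
#   list directory      -> rclone lsd          list objects -> rclone ls
#   copy / copy-to      -> rclone copy/copyto
#   move / move-to      -> rclone move/moveto
#   sync                -> rclone sync         copy-url     -> rclone copyurl
#   remove-object       -> rclone delete
#   remove-object-store -> rclone rmdir
#   create-object-store -> rclone mkdir
# Compose (but do not run) an example command line; remote names come
# from the connection file, bucket/container names are hypothetical:
CMD="rclone copy amazon_s3_source:mybucket microsoft_azure_blob_storage_target:mycontainer --dry-run"
echo "$CMD"
```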
...
Important Considerations
Before running a move, move-to, copy, copy-to, or sync command, you can always try the command first by setting the Dry-run option in the task.
The field max-depth (recursion depth) limits the recursion depth. max-depth 1 means that only the current directory is in scope; this is the default value.

Attention: If you change max-depth to a value greater than 1, a recursive action is performed. This should be considered in any copy, move, sync, remove-object, and remove-object-store operations.
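The scope rule can be sketched with a local directory tree; `find -maxdepth` plays the role of max-depth here (Rclone itself is not invoked):

```shell
# Sketch of max-depth semantics using a local directory tree and find;
# rclone's --max-depth flag behaves analogously for remotes.
tmp=$(mktemp -d)
mkdir -p "$tmp/sub"
touch "$tmp/top.txt" "$tmp/sub/nested.txt"

# max-depth 1: only objects directly in the directory are in scope
shallow=$(find "$tmp" -maxdepth 1 -type f | wc -l)

# greater depth: the action becomes recursive and sees nested objects
deep=$(find "$tmp" -type f | wc -l)

echo "shallow=$shallow deep=$deep"
rm -rf "$tmp"
```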
...
Field | Description | ||
Agent | Linux or Windows Universal Agent to execute the Rclone command line. | ||
Agent Cluster | Optional Agent Cluster for load balancing. | ||
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] List directories. | ||
Storage Type | Enter a storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. | ||
Source Credential | Credential used for the selected Storage Type. | ||
Connection File | In the connection file, you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. | ||
UAC Rest Credentials | Universal Controller Rest API Credentials. | ||
UAC Base URL | Universal Controller URL; for example, https://192.168.88.40/uc. | ||
Loglevel | Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL]. | ||
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Recommendation: Add the flag max-depth 1 to avoid a recursive action. | ||
max-depth (recursion depth) | Limits the recursion depth. max-depth 1 (the default) means that only the current directory is in scope. | ||
Example
The following example lists all AWS S3 buckets in the AWS account configured in the connection.conf file. No sub-directories are displayed, because max-depth is set to 1.
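An equivalent Rclone invocation can be sketched as below; the remote name amazon_s3_source comes from the connection file, and the command is composed but not executed (rclone and AWS credentials are not assumed here):

```shell
# List the top-level S3 buckets of the configured account (sketch only):
CMD="rclone lsd amazon_s3_source: --max-depth 1"
echo "$CMD"
```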
Action: copy
...
Field | Description |
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Copy objects from source to target. |
Source | Enter a source storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Target | Enter a target storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Connection File | In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Filter Type | [ include, exclude, none ] Define the type of filter to apply. |
Filter | Provide the patterns for file matching; for example, in a copy action: Filter Type: include. For more examples of the filter matching pattern, refer to Rclone Filtering. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Recommendation: Add the flag max-depth 1 to all copy, move, remove-object, and remove-object-store actions in the task field Other Parameters to avoid a recursive action. Attention: If the flag max-depth 1 is not set, a recursive action is performed. |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL. For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL]. |
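Assuming the fields above translate into Rclone flags, a copy from S3 to Azure Blob Storage with an include filter and a trial run might be composed as follows (bucket and container names are hypothetical; the command is echoed, not executed):

```shell
# Remote names come from the connection file; paths are hypothetical.
SRC="amazon_s3_source:stonebranchpmtest"
DST="microsoft_azure_blob_storage_target:stonebranchpmtest"
# include filter, non-recursive scope, and a trial run (Dry-run checked)
CMD="rclone copy $SRC $DST --include report[1-3].txt --max-depth 1 --dry-run"
echo "$CMD"
```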
...
Field | Description |
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Remove objects in an OS directory or cloud object store. |
Storage Type | Enter a storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
File Path | Path to the directory in which you want to remove the objects. For example: File Path: stonebranchpmtest, Filter: report[1-3].txt. This removes all S3 objects matching the filter report[1-3].txt (report1.txt, report2.txt, and report3.txt) from the S3 bucket stonebranchpmtest. |
Connection File | In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Recommendation: Add the flag max-depth 1 to all copy, move, remove-object, and remove-object-store actions in the task field Other Parameters to avoid a recursive action. Attention: If the flag max-depth 1 is not set, a recursive action is performed. |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL. For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL]. |
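Following the File Path and Filter example above, the equivalent removal could plausibly be composed as a filtered `rclone delete` (sketch only; the command is echoed rather than executed):

```shell
# Remove report1.txt..report3.txt from bucket stonebranchpmtest (sketch):
CMD="rclone delete amazon_s3_source:stonebranchpmtest --include report[1-3].txt --max-depth 1 --dry-run"
echo "$CMD"
```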
...
Field | Description |
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Remove an OS directory or cloud object store. |
Storage Type | Enter a storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Directory | Name of the directory you want to remove. The directory can be an object store or a file system OS directory. The directory needs to be empty before it can be removed. For example, Directory: stonebranchpmtest would remove the bucket stonebranchpmtest. |
Connection File | In the connection file, you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Recommendation: Add the flag max-depth 1 to all copy, move, remove-object, and remove-object-store actions in the task field Other Parameters to avoid a recursive action. Attention: If the flag max-depth 1 is not set, a recursive action is performed. |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL. For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL]. |
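The empty-directory requirement mirrors the behavior of rmdir on a local file system (the linux_source/linux_target remotes in the connection file are of type local). A runnable local sketch:

```shell
# Local analogue of create-object-store / remove-object-store.
tmp=$(mktemp -d)
mkdir "$tmp/stonebranchpmtest"   # create-object-store
rmdir "$tmp/stonebranchpmtest"   # remove-object-store: succeeds, it is empty

mkdir -p "$tmp/full/inner"       # a non-empty store...
if rmdir "$tmp/full" 2>/dev/null; then
  result="removed"
else
  result="refused"               # ...is not removed until emptied
fi
echo "$result"
rm -rf "$tmp"
```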
...