...
Flag | Description |
---|---|
--max-depth 1 | Limits the recursion depth. --max-depth 1 means only the current directory is in scope. Attention: If the flag is not set, a recursive action is performed. Recommendation: Add the flag --max-depth 1 to all copy, move, remove-object, and remove-object-store tasks in the task field Other Parameters to avoid a recursive action. |
--ignore-existing | Skips all files that already exist on the destination. Recommendation: Add the flag --ignore-existing to all copy and move tasks; this avoids a file being deleted if the source is equal to the destination in a copy operation. |
--error-on-no-transfer | Lets the task fail in case no transfer was done. |
--update | Skips files that are newer on the destination during a move or copy action. |
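These flags are ordinary rclone global flags, so they combine freely on one command line. A minimal sketch of a guarded copy (the remote names linux_source and amazon_s3_target stand for whatever storage types your Connection File defines; the paths are illustrative):

```shell
# Non-recursive copy: skip files already on the target, and fail
# the task when nothing at all was transferred.
rclone copy linux_source:/home/stonebranch/demo/out/ \
    amazon_s3_target:stonebranchpm/in/ \
    --max-depth 1 --ignore-existing --error-on-no-transfer
```

In the Universal Task, the three flags would go into the Other Parameters field; the source and target come from the Source/Target task fields.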
...
Field | Description |
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Move objects from source to target. |
Source | Enter a source storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Target | Enter a target storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Connection File | In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Filter Type | [ include, exclude, none ] Define the type of filter to apply. |
Filter | Provide the patterns for file matching; for example, in a copy action: Filter Type: include. For more examples of the filter matching patterns, refer to Rclone Filtering. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Recommendation: Add the flag max-depth 1 to all Copy, Move, remove-object and remove-object-store in the task field Other Parameters to avoid a recursive action. Attention: If the flag max-depth 1 is not set, a recursive action is performed. |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL. For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL]. |
...
Field | Description |
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Remove objects in an OS directory or cloud object store. |
Storage Type | Enter a storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
File Path | Path to the directory in which you want to remove the objects. For example: File Path: stonebranchpmtest, Filter: report[1-3].txt. This removes all S3 objects matching the filter report[1-3].txt (report1.txt, report2.txt, and report3.txt) from the S3 bucket stonebranchpmtest. |
Connection File | In the connection file you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Recommendation: Add the flag max-depth 1 to all Copy, Move, remove-object and remove-object-store in the task field Other Parameters to avoid a recursive action. Attention: If the flag max-depth 1 is not set, a recursive action is performed. |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL. For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL]. |
...
Field | Description |
Agent | Linux or Windows Universal Agent to execute the Rclone command line. |
Agent Cluster | Optional Agent Cluster for load balancing. |
Action | [ list directory, copy, list objects, move, remove-object, remove-object-store, create-object-store, copy-url, monitor-object ] Remove an OS directory or cloud object store. |
Storage Type | Enter a storage Type name as defined in the Connection File; for example, amazon_s3, microsoft_azure_blob_storage, hdfs, onedrive, linux. For a list of all possible storage types, refer to Overview of Cloud Storage Systems. |
Directory | Name of the directory you want to remove. The directory can be an object store or a file system OS directory. The directory needs to be empty before it can be removed. For example, Directory: stonebranchpmtest would remove the bucket stonebranchpmtest. |
Connection File | In the connection file, you configure all required Parameters and Credentials to connect to the Source and Target Cloud Storage System. For example, if you want to transfer a file from AWS S3 to Azure Blob Storage, you must configure the connection Parameters for AWS S3 and Azure Blob Storage. For details on how to configure the Connection File, refer to section Configure the Connection File. |
Other Parameters | This field can be used to apply additional flag parameters to the selected action. For a list of all possible flags, refer to Global Flags. Recommendation: Add the flag max-depth 1 to all Copy, Move, remove-object and remove-object-store in the task field Other Parameters to avoid a recursive action. Attention: If the flag max-depth 1 is not set, a recursive action is performed. |
Dry-run | [ checked , unchecked ] Do a trial run with no permanent changes. |
UAC Rest Credentials | Universal Controller Rest API Credentials. |
UAC Base URL | Universal Controller URL. For example, https://192.168.88.40/uc |
Loglevel | Universal Task logging settings [DEBUG | INFO | WARNING | ERROR | CRITICAL]. |
...
Copy
Move
Remove-object
Monitor-object
N# | Name |
Copy | |
C1 | Copy a file to target. The file does not exist at the target. |
C2 | Copy two files. One of the files already exists at the target. |
C3 | Copy a file to a target with Parameter ignore-existing set. A file with the same filename exists on the target. |
C4 | Copy a file that already exists at the target, with Parameter error-on-no-transfer set. |
C5 | Copy all files except one, from one folder into another folder. |
C6 | Copy a file non-recursively from a folder into another folder using the parameter max-depth 1. |
C7 | Copy files using multiple Parameters. |
Move | |
M1 | Move without deleting if no transfer took place. |
M2 | Move a file with same source and target - config error case. |
Remove-object | |
R1 | Delete a file in a specific folder by using the parameter: max-depth 1. |
R2 | Delete all files matching a given name recursively in all folders starting from a given entry folder. |
Monitor-object | |
FM01 | Monitor a file in AWS - file does not exist on start of monitor. |
FM02 | Monitor a file in AWS - file exists on start of monitor. |
Copy
The following file transfer examples will be described for the Copy action:
N# | Name |
C1 | Copy a file to target. The file does not exist at the target. |
C2 | Copy two files. One of the files already exists at the target. |
C3 | Copy a file to a target with Parameter ignore-existing set. A file with the same filename exists on the target. |
C4 | Copy a file that already exists at the target, with Parameter error-on-no-transfer set. |
C5 | Copy all files except one, from one folder into another folder. |
C6 | Copy a file non-recursively from a folder into another folder using the parameter max-depth 1. |
C7 | Copy files using multiple Parameters. |
Note: Before running a copy command, you can always try the command by setting the Dry-run option in the Task.
...
Source Storage Type: linux_source
Source: /home/stonebranch/demo/out/
Target Storage Type: amazon_s3_target
Target: stonebranchpm/in/
Filter Type: include
Filter: report2.txt
Other Parameters: --ignore-existing
Result
Task Status: success
Output:
There was nothing to transfer.
Copy a File that Already Exists at the Target, with the Parameter Flag error-on-no-transfer Set
Copy the file report2.txt to a target where it already exists, with the parameter --error-on-no-transfer set.
When --error-on-no-transfer is set, the task will fail in case no file is transferred.
Configuration
Source Storage Type: linux_source
Source: /home/stonebranch/demo/out/
Target Storage Type: amazon_s3_target
Target: stonebranchpm/in/
Filter Type: include
Filter: report2.txt
Other Parameters: --error-on-no-transfer
Result
Task Status: failure
Output:
report2.txt: Unchanged skipping
There was nothing to transfer
Exiting due to ERROR(s).
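Assuming the Connection File defines rclone remotes named linux_source and amazon_s3_target, the configuration above corresponds roughly to this command line (a sketch, not the literal command the agent builds):

```shell
# C4: report2.txt is unchanged at the target, so rclone skips it;
# --error-on-no-transfer then turns "nothing transferred" into a failure.
rclone copy linux_source:/home/stonebranch/demo/out/ \
    amazon_s3_target:stonebranchpm/in/ \
    --include report2.txt --error-on-no-transfer
```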
Copy all Files except One, from One Folder into Another Folder
Copy the contents of the folder in into the folder out, excluding the file index.html.
Note: If the folder out does not exist, it will be created.
Configuration
Source Storage Type: linux_source
Source: stonebranchpmtest/in
Target Storage Type: amazon_s3_target
Target: stonebranchpmtest/out
Filter Type: exclude
Filter: index.html
Other Parameters: none
```
stonebranchpmtest
├── in
│   ├── report1.txt
│   ├── report2.txt
│   └── sub01
│       ├── report1.txt   <- this file will be copied incl. directory sub01
│       └── index.html    <- will not be copied, because of the exclude filter
├── out
│   └── report10.txt

After execution of the copy action:

├── out
│   ├── report1.txt
│   ├── report2.txt
│   ├── sub01
│   │   └── report1.txt
│   └── report10.txt
```
Result
Task Status: success
Output:
report1.txt: Copied (new)
report2.txt: Copied (new)
sub1/report1.txt: Copied (new)
Copy Non-Recursively a File from a Folder into Another Folder by Using the Parameter max-depth 1
Copy the file report1.txt from the folder stonebranchpmtest/in into the folder stonebranchpmtest2/. The parameter --max-depth 1 is used to copy non-recursively.
--max-depth limits the recursion depth (default -1).
--max-depth 1 means only the current directory is in scope.
Note: The folder in will not be created on the target; only the file is copied to the provided target directory.
Configuration
Source Storage Type: amazon_s3_source
Source: stonebranchpmtest/in
Target Storage Type: amazon_s3_target
Target: stonebranchpmtest2/out
Filter Type: include
Filter: report1.txt
Other Parameters: --max-depth 1
```
stonebranchpmtest
├── in
│   ├── report1.txt   <- this file will be copied
│   ├── report2.txt
│   └── sub01
│       └── report1.txt   <- this file will not be copied, because of --max-depth 1
├── report1.txt   <- this file will not be copied
stonebranchpmtest2
├── report10.txt

After execution of the copy action:

stonebranchpmtest2
├── report1.txt
├── report10.txt
```
Result
Task Status: success
Output:
report1.txt: Copied (new)
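The C6 configuration corresponds roughly to the following rclone call (a sketch; the remote names amazon_s3_source and amazon_s3_target are the ones assumed from the Connection File):

```shell
# C6: only the top level of stonebranchpmtest/in is scanned, so
# sub01/report1.txt is never considered for the transfer.
rclone copy amazon_s3_source:stonebranchpmtest/in \
    amazon_s3_target:stonebranchpmtest2/out \
    --include report1.txt --max-depth 1
```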
Copy Files Using Multiple Parameters
It is possible to provide multiple Parameters in the Other Parameters field of the Task.
Example:
Parameters: --max-depth 1 --error-on-no-transfer
In this case, both Parameters are applied.
Configuration
Source Storage Type: amazon_s3_source
Source: stonebranchpmtest/in
Target Storage Type: amazon_s3_target
Target: stonebranchpmtest2/out
Filter Type: include
Filter: report1.txt
Other Parameters: --max-depth 1, --error-on-no-transfer
Result
The Parameters: --max-depth 1 and --error-on-no-transfer are both applied.
Move
The following file transfer examples will be described for the Move action:
N# | Name |
M1 | Move without deleting if no transfer took place. |
M2 | Move a file with same source and target - config error case. |
Note: Before running a move command, you can always try the command by setting the Dry-run option in the Task.
Move Without Deleting if no Transfer Took Place
If you want the source file to be deleted only when a file transfer actually took place, add the flag --ignore-existing in the field Other Parameters.
Example: You move a file to a destination where it already exists. In this case, rclone will not perform the copy but still deletes the source file.
If you set the flag --ignore-existing, the delete operation will not be performed.
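The configuration below corresponds roughly to this rclone move call (a sketch; linux_source and amazon_s3_target are the remote names assumed from the Connection File):

```shell
# M1: --ignore-existing makes the move skip the existing file, so nothing
# is transferred and the source file is NOT deleted; --error-on-no-transfer
# then reports the no-op as a failure.
rclone move linux_source:/home/stonebranch/demo/out/ \
    amazon_s3_target:stonebranchpm/in/ \
    --include report4.txt --ignore-existing --max-depth 1 --error-on-no-transfer
```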
Configuration
Source Storage Type: linux_source
Source: /home/stonebranch/demo/out/
Target Storage Type: amazon_s3_target
Target: stonebranchpm/in/
Filter Type: include
Filter: report4.txt
Other Parameters: --ignore-existing, --max-depth 1, --error-on-no-transfer
Result
Task Status: failure
The file report4.txt is not deleted at the source /home/stonebranch/demo/out/.
Output:
There was nothing to transfer
- Return Code: 9
Exiting due to ERROR(s).
Move a File with Same Source and Target - config error Case
Accidentally, the source has been configured the same as the target.
The --ignore-existing parameter avoids a file being deleted if the source is equal to the destination in a copy operation.
The --max-depth 1 parameter avoids a recursive move of files.
Configuration
Source Storage Type: linux_source
Source: /home/stonebranch/demo/out/
Target Storage Type: linux_source
Target: /home/stonebranch/demo/out/
Filter Type: include
Filter: report4.txt
Other Parameters: --ignore-existing, --max-depth 1
Result
Task Status: success (because --error-on-no-transfer was not set)
The file report4.txt is not deleted.
Output:
There was nothing to transfer
- Return Code: 0
Remove-object
The following file examples will be described for the Remove-object action:
N# | Name |
R1 | Delete a file in a specific folder by using the parameter max-depth 1. |
R2 | Delete all files matching a given name recursively in all folders starting from a given entry folder. |
Note: Before running a remove-object command, you can always try the command by setting the Dry-run option in the Task.
Delete a File in a Specific Folder by Using the Parameter max-depth 1
Delete only the file report1.txt in the folder stonebranchpmtest/in/.
To delete a specific file in a folder, provide the File Path to the file to delete and provide the parameter --max-depth 1 in the Other Parameters field.
--max-depth limits the recursion depth (default -1).
--max-depth 1 means only the current directory is in scope.
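The remove-object action maps roughly onto rclone's delete subcommand (a sketch; amazon_s3_source is the remote name assumed from the Connection File):

```shell
# R1: delete only files matching the filter in the top level of
# stonebranchpmtest/in/ (no recursion into sub01).
rclone delete amazon_s3_source:stonebranchpmtest/in/ \
    --include report1.txt --max-depth 1
```

rclone delete removes matching files but leaves the directory structure in place, which is why the sub01 folder survives in the example below.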
Configuration
File Path: stonebranchpmtest/in/
Storage Type: amazon_s3_source
Filter Type: include
Filter: report1.txt
Other Parameters: --max-depth 1
```
stonebranchpmtest
├── in
│   ├── report1.txt   <- this file will be deleted
│   ├── report2.txt
│   └── sub01
│       └── report1.txt   <- this file will not be deleted because of --max-depth 1
├── report1.txt

After execution of the delete action:

stonebranchpmtest
├── in
│   ├── report2.txt
│   └── sub01
│       └── report1.txt
├── report1.txt
```
Result
Task Status: success
Output:
report1.txt Deleted
Delete all Files Matching a Given Name Recursively in all Folders Starting from a Given Entry Folder
The file report1.txt is deleted recursively in all folders starting from the given entry folder stonebranchpmtest/.
Configuration
File Path: stonebranchpmtest/
Storage Type: amazon_s3_source
Filter Type: include
Filter: report1.txt
Other Parameters: none
```
stonebranchpmtest
├── in
│   ├── report1.txt   <- this file will be deleted
│   ├── report2.txt
│   └── sub01
│       └── report1.txt   <- this file will be deleted
├── report1.txt   <- this file will be deleted

After execution of the delete action:

stonebranchpmtest
├── in
│   ├── report2.txt
│   └── sub01
```
Result
Task Status: success
Output:
in/report1.txt Deleted
in/sub1/report1.txt Deleted
report1.txt Deleted
Monitor-object
The following examples describe how file monitoring can be performed.
N# | Name |
FM01 | Monitor a file in AWS - file does not exist on start of monitor. |
FM02 | Monitor a file in AWS - file exists on start of monitor. |
Note: Before running a monitor-object command, you can always try the command by setting the Dry-run option in the Task.
Monitor a File in AWS - File does not Exist on Start of Monitor
With the Parameter Flag --use-server-modtime, you can tell the Monitor to use the upload time to the AWS bucket instead of the last modification time of the file on the source (for example, Linux).
--max-depth 1 means only the current directory is in scope.
Configuration
Directory: stonebranchpmtest/in/
Storage Type: amazon_s3_source
Filter Type: include
Filter: test3.txt
Interval: 10s
Trigger on Existence: no
Other Parameters: --max-depth 1, --use-server-modtime
Result
Task Status: running
The monitor will stay in the running state until the file test3.txt is copied to stonebranchpmtest/in.
Output:
INFO - REST SB library version 1.0
INFO - ############ Monitor Objects Action ###############
INFO - ############ List Objects Action ###############
INFO - sleeping for 10 seconds now
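The "List Objects Action" step in the output can be approximated with rclone lsjson, whose JSON listing format is what appears in the FM02 output (a sketch; the monitor's actual polling loop is internal to the task, and amazon_s3_source is the assumed remote name):

```shell
# FM01: list matching objects once; an empty JSON array means the
# monitored file has not arrived yet, so the task sleeps and retries.
rclone lsjson amazon_s3_source:stonebranchpmtest/in/ \
    --include test3.txt --max-depth 1 --use-server-modtime
```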
Monitor a File in AWS - File Exists on Start of Monitor
With the Parameter Flag --use-server-modtime, you can tell the Monitor to use the upload time to the AWS bucket instead of the last modification time of the file on the source (for example, Linux).
--max-depth 1 means only the current directory is in scope.
Configuration
Source Storage Type: linux_source
Source: /home/stonebranch/demo/out/
Target Storage Type: amazon_s3_target
Target: stonebranchpm/in/
Filter Type: include
Filter: report10.txt
Trigger on Existence: yes
Other Parameters: --ignore-existing, --max-depth 1, --error-on-no-transfer
Result
Task Status: success
The task goes to success because the file report10.txt already exists in stonebranchpmtest/in (Trigger on Existence is set to yes).
Output:
###### object in stonebranchpmtest/in/ ######
[{"Path":"report10.txt","Name":"report10.txt","Size":15,"MimeType":"text/plain","ModTime":"2021-12-08T16:26:18.000000000Z","IsDir":false,"Tier":"STANDARD"}]
Found, Already exists: {'Path': 'report10.txt', 'Name': 'report10.txt', 'Size': 15, 'MimeType': 'text/plain', 'ModTime': '2021-12-08T16:26:18.000000000Z', 'IsDir': False, 'Tier': 'STANDARD'}