Tuning Parameters
Bacula Enterprise Only
This solution is only available for Bacula Enterprise. For subscription inquiries, please reach out to sales@baculasystems.com.
This set of parameters is common to several other plugins and modifies general behavior not directly associated with the S3 Plugin. These are advanced parameters and, in general, should not be modified.
They can be used to tune the plugin's behavior in particularly bad network environments, when there is significant job concurrency, and in similar situations.
| Option | Required | Default | Values | Example | Description |
|---|---|---|---|---|---|
| backup_queue_size | No | 30 | 0-50 | 1 | Maximum number of queued internal operations between the service's static internal threads (there are 3 communicating through queues of the set size: fetcher, opener, and general publisher to the Bacula core). This can affect the number of concurrent S3 API requests and, consequently, S3 throttling. In general, this parameter only needs to be modified if you are going to run different jobs in parallel |
| concurrent_threads | No | 5 | 0-10 | 1 | Maximum number of concurrent backup threads running in parallel to fetch or open data for download actions. Every service fetcher and service opener will spawn this number of child threads, which affects the number of concurrent S3 API requests. The S3 API may throttle requests depending on a variety of circumstances. In general, this parameter only needs to be modified if you are going to run different jobs in parallel. If you want precise control of your concurrency across different jobs, set this value to 1. Also, be careful with memory requirements: multi-threaded jobs can significantly increase job memory consumption |
| general_network_retries | No | 5 | Positive integer (number of retries) | 10 | Number of retries for the general external retry mechanism |
| general_network_delay | No | 50 | Positive integer (seconds) | 100 | General plugin delay, in seconds, between retries |
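As an illustration, these options are typically appended to the plugin string in the FileSet definition of the Director configuration. The sketch below is an assumption for illustration only: the `region` and `bucket` values, and the exact set of non-tuning options your plugin string requires, depend on your environment and the main S3 Plugin configuration.

```
FileSet {
  Name = "S3-Tuned"
  Include {
    Options { Signature = MD5 }
    # Tuning options from the table above appended to the plugin string;
    # region/bucket values are placeholders, not defaults.
    Plugin = "s3: region=eu-west-1 bucket=mybucket concurrent_threads=1 backup_queue_size=10 general_network_retries=10 general_network_delay=100"
  }
}
```

Setting `concurrent_threads=1` here follows the table's advice for keeping precise control of concurrency when several jobs run in parallel.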
S3 provides very reasonable bandwidth for running concurrent requests against any bucket, and throttling should generally not be an issue. However, some limits still exist; you can learn more about them in the following link:
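To make the interaction of `general_network_retries` and `general_network_delay` concrete, here is a minimal conceptual sketch of a fixed-delay retry loop. This is not the plugin's actual code, only an illustration of the semantics the two parameters describe: with the defaults (5 retries, 50 seconds), a persistently failing operation adds at most 5 × 50 = 250 seconds of waiting before the error is reported.

```python
import time

def with_retries(operation, retries=5, delay=50):
    """Conceptual sketch of a fixed-delay retry mechanism, mirroring
    general_network_retries / general_network_delay. Not the plugin's
    actual implementation."""
    last_error = None
    # One initial attempt plus `retries` additional attempts.
    for attempt in range(retries + 1):
        try:
            return operation()
        except OSError as e:  # e.g. a transient network error
            last_error = e
            if attempt < retries:
                time.sleep(delay)  # wait `delay` seconds before retrying
    raise last_error
```

Raising `general_network_retries` makes jobs more resilient to flaky networks at the cost of longer worst-case job times; raising `general_network_delay` spaces the attempts further apart, which can help when throttling is the cause of the failures.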
See also
Go back to: S3 Plugin: Configuration.