Troubleshooting AWS Provider #
8100 - Emory was not able to upload files to the destination in the provider #
Backups fail when uploading to S3 and you see the following error in the emory log file:
time="2021-03-31T10:54:04+02:00" level="error" msg="Backup Error" BackupID="1617180357145" Code="8100" Description="Emory was not able to upload files to the destination in the provider" EBID="81405" ErrorMsg="aws.UploadObject: Error while uploading objects to AWS. Additional Information: MultipartUpload: upload multipart failed upload id: _EHp7lQOJEXA4TJB9xhDXmzb06UCNftERuTegG7.DT2QM_SlCq0ffV0h......UXIz6lJaYTl5SYWn5UAtPycxiTQaU.jaaFyckX5.v4gV1T9ClBzw-- caused by: TotalPartsExceeded: exceeded total allowed configured MaxUploadParts (10000). Adjust PartSize to fit in this limit" Hint="Check that the server has the correct permissions" Hostname="XXXXXXXXX" Level="COMPLETE" Name="/usr/sap/XXX/SYS/global/hdb/backint/DB_ZZZ/20210331_104557_databackup_3_3" PID="102497"
The AWS S3 SDK upload function can handle a file whose size is at most MemoryBufferSize * 10,000 bytes, because S3 multipart uploads are limited to 10,000 parts. Your MemoryBufferSize configuration parameter therefore determines the maximum file size that emory can upload correctly.
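The relationship above can be sketched as follows. This is an illustrative calculation, not emory code; the 10,000-part limit matches the MaxUploadParts value in the error message, and `max_file_size` is a hypothetical helper name:

```python
# S3 multipart uploads are capped at 10,000 parts; each part is
# MemoryBufferSize bytes, which caps the total file size.
MAX_UPLOAD_PARTS = 10_000  # S3 hard limit, as reported in the error message

def max_file_size(memory_buffer_size: int) -> int:
    """Largest file, in bytes, that fits in 10,000 parts of the given size."""
    return memory_buffer_size * MAX_UPLOAD_PARTS

# With the 5,242,880-byte minimum part size, the cap is ~48.8 GiB:
print(max_file_size(5_242_880))  # 52428800000
```

Any backup file larger than this product triggers the TotalPartsExceeded error shown above.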
By default, depending on your database engine, this parameter will be:
|Engine|Size (Bytes)|Maximum file size to upload (GBytes)|
|---|---|---|
The minimum value for MemoryBufferSize is 5,242,880 bytes (5 MiB, the S3 minimum part size).
Depending on the database:
You can configure your HANA database engine to open more than one channel when backing up or restoring the database. This launches several emory processes in parallel, each handling a smaller amount of data, since the whole backup is split into smaller parts.
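As a sketch of this approach, the number of backint channels in SAP HANA is controlled by the `parallel_data_backup_backint_channels` parameter in the `[backup]` section of `global.ini`; the channel count of 4 below is an example value, so check the HANA documentation for your version before applying it:

```sql
-- Example only: open 4 backint channels per service so each emory process
-- streams a smaller portion of the backup.
ALTER SYSTEM ALTER CONFIGURATION ('global.ini', 'SYSTEM')
  SET ('backup', 'parallel_data_backup_backint_channels') = '4'
  WITH RECONFIGURE;
```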
You can get more information in the following SAP Community blog:
Another solution is to increase MemoryBufferSize until it meets your maximum file size requirements.
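A minimal sketch of sizing MemoryBufferSize for a given backup file, assuming only the two S3 limits already described (10,000 parts, 5 MiB minimum part size); `required_buffer_size` is a hypothetical helper name:

```python
MAX_UPLOAD_PARTS = 10_000   # S3 limit on parts per multipart upload
MIN_PART_SIZE = 5_242_880   # S3 minimum part size (5 MiB)

def required_buffer_size(file_size: int) -> int:
    """Smallest MemoryBufferSize that keeps the upload within 10,000 parts."""
    return max(MIN_PART_SIZE, -(-file_size // MAX_UPLOAD_PARTS))  # ceiling division

# A 100 GiB backup file needs a part size of at least ~10.7 MB:
print(required_buffer_size(100 * 1024**3))  # 10737419
```

Round the result up to a convenient value and set it as MemoryBufferSize in the emory configuration.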