agowa338

PHP error with Nextcloud big file uploads

Hi,

I've had some issues with the official Nextcloud (apache) container. When uploading some bigger files (I assume they're bigger; I don't know the size of each individual one, but the client shows 156 GB in 13 files) with the desktop sync client, the upload keeps failing and retrying indefinitely.
Does anyone know what I could try to debug this issue further?
I suspect an issue with either the apache2 configuration or HAProxy regarding chunking or caching.

apache2 log output:


BirdyB Nov 01, 2021 at 06:15:14 (UTC)
Hi,

did you check the maximum upload size in your PHP config?
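
(For reference, the effective limits inside the official container can be checked roughly like this; "nextcloud" is just an example container name:)

# Print the PHP upload limits as seen inside the running container
docker exec nextcloud php -r 'echo ini_get("upload_max_filesize"), PHP_EOL, ini_get("post_max_size"), PHP_EOL;'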

Kind regards
agowa338 Nov 01, 2021, updated at Nov 02, 2021 at 08:39:30 (UTC)
Thanks, I completely overlooked that one because the client sent the file completely. I just assumed that a PHP upload limit would cancel the upload rather than receive it completely and drop it afterward...

upload_max_filesize was set to 512M via an environment variable.

Therefore I've now just set the corresponding environment variables to higher values (even though the limit doesn't appear to actually prevent anything, since the whole file was received anyway)...
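
Roughly like this (the exact values I used aren't reproduced here; PHP_UPLOAD_LIMIT and PHP_MEMORY_LIMIT are the official image's settings, and the values below are only examples):

# Example only: raise the limits via the official image's environment variables
docker run -d --name nextcloud \
  -e PHP_UPLOAD_LIMIT=16G \
  -e PHP_MEMORY_LIMIT=1G \
  nextcloud:apache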

Thanks for your help :-)

Edit: Nope, it wasn't that. Now it also works with the default limit of 512M (localhost deployment, though)?!? I think I have to do some more testing :-(

Edit2: Another few hours later, I found the problematic module: it's related to the S3 module.
As soon as I switch to the S3 bucket instead of local files, the errors appear.
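
(For context: the switch to S3 primary storage goes through the image's objectstore environment variables, roughly as sketched below. Host, bucket, and credentials are placeholders, not my real setup; double-check the exact variable names against the image's README.)

# Placeholder sketch of pointing the official container at S3 instead of local storage
docker run -d --name nextcloud \
  -e OBJECTSTORE_S3_HOST=s3.example.com \
  -e OBJECTSTORE_S3_BUCKET=nextcloud-data \
  -e OBJECTSTORE_S3_KEY=EXAMPLE_ACCESS_KEY \
  -e OBJECTSTORE_S3_SECRET=EXAMPLE_SECRET_KEY \
  nextcloud:apache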

Edit3: I noticed why I initially thought the problem was solved. If you just let the client "do its thing", it incorrectly shows that everything synced successfully until you hit "Force sync now" or open the folder and look at the status icon of the failing file...

Edit4: I think we're done here. I've opened an issue upstream. It looks like error responses from the S3 bucket's API aren't handled properly. It can be reproduced (even though slightly differently) with another S3 bucket (localstack).
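
For anyone who wants to reproduce it, a throwaway S3-compatible endpoint with localstack can be started roughly like this (port and bucket name are just examples; awslocal is bundled with the localstack image):

# Example: local S3-compatible endpoint for reproducing the issue
docker run -d --name localstack -p 4566:4566 localstack/localstack
docker exec localstack awslocal s3 mb s3://nextcloud-test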