The problem

uploadFormFile() loads the whole file into memory before sending. That's fine for small files, but when you're uploading 50-100MB files on an embedded device with 512MB of RAM, a single upload eats a sizable fraction of available memory.

I hit this while building a 3D printer touchscreen UI. Uploading large G-code files was causing the whole system to choke.

What this adds

New function uploadLargeFormFile() - like uploadFormFile(), but streams the file in fixed-size chunks instead of buffering it (same pattern as uploadLargeFile() vs uploadFile()).

// Basic usage (filename derived from path)
auto resp = requests::uploadLargeFormFile(
    "http://server/upload",
    "file",                    // form field name
    "/path/to/huge_file.bin",
    params,                    // extra form fields
    [](size_t sent, size_t total) {
        printf("%.1f%%\n", 100.0 * sent / total);
    }
);

// With custom upload filename (when local path differs from desired name)
auto resp = requests::uploadLargeFormFile(
    "http://server/upload",
    "file",
    "/tmp/modified_version.bin",
    "original_name.bin",       // upload filename override
    params,
    progress_cb
);

Memory usage for a 100MB file:

  • uploadFormFile(): ~100MB
  • uploadLargeFormFile(): ~40KB (just the chunk buffer)
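
The streamed path stays small because only the multipart framing strings and a fixed chunk buffer ever live in memory; the file body goes straight from disk to the socket. A rough sketch of that framing (the boundary string and header layout here are illustrative, not the library's actual internals):

// Hypothetical multipart framing for a streamed upload - the real header
// layout and boundary generation live inside the library.
std::string boundary = "----StreamUploadBoundary";
std::string head =
    "--" + boundary + "\r\n"
    "Content-Disposition: form-data; name=\"file\"; filename=\"huge_file.bin\"\r\n"
    "Content-Type: application/octet-stream\r\n"
    "\r\n";
std::string tail = "\r\n--" + boundary + "--\r\n";

// Content-Length is known before anything is sent:
//   head.size() + <file size on disk> + tail.size()   (+ any extra form fields)
// so the request can write head, stream the file in ~40KB chunks, then write tail.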

Implementation

I extracted a shared helper to avoid duplicating the streaming loop from uploadLargeFile():

  • New: internal::streamFileToConnection() helper - extracted from uploadLargeFile()
  • Refactor: uploadLargeFile() now uses the shared helper (same behavior, no breaking change)
  • New: uploadLargeFormFile() for multipart uploads, also uses the shared helper

No breaking changes - existing code works exactly as before.
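
Conceptually, the shared streaming loop is just a fixed-buffer read/write loop. The sketch below is illustrative only; the actual internal::streamFileToConnection() signature in the library may differ.

// Illustrative sketch - names, buffer size, and signature are hypothetical,
// not the library's actual internal API.
#include <cstdio>
#include <functional>
#include <string>

static bool streamFileToConnection(const std::string& path,
                                   size_t total_size,   // file size, determined up front
                                   const std::function<bool(const char*, size_t)>& write,
                                   const std::function<void(size_t, size_t)>& progress)
{
    std::FILE* f = std::fopen(path.c_str(), "rb");
    if (!f) return false;

    char buf[40 * 1024];              // the only sizable buffer: ~40KB, reused every iteration
    size_t sent = 0;
    while (size_t n = std::fread(buf, 1, sizeof(buf), f)) {
        if (!write(buf, n)) {         // push this chunk to the open connection
            std::fclose(f);
            return false;
        }
        sent += n;
        if (progress) progress(sent, total_size);
    }
    std::fclose(f);
    return sent == total_size;        // short read => treat as failure
}

With a helper along these lines, uploadLargeFile() and uploadLargeFormFile() would differ only in the headers and framing they write before and after the streamed body.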
