[desktop] Fix an OOM on large library imports

Should reduce the occurrences reported in the following (this should make things
better, but there might be other causes of the OOM too):

- https://github.com/ente-io/ente/issues/2500
- https://github.com/ente-io/ente/discussions/3420

---

Here, the issue is that the combineChunksToFormUploadPart function, while not
incorrect, is terribly inefficient in how it combines Uint8Arrays byte by
byte. This apparently causes an allocation pattern that Oilpan, the C++ heap
garbage collector embedded in V8, doesn't like, and crashes the renderer
process with:

    [main] <--- Last few GCs --->
    [main]
    [main] [17639:0x13000e90000]    39409 ms: Mark-Compact (reduce) 48.1 (57.8) -> 47.7 (52.8) MB, pooled: 0 MB, 35.08 / 0.04 ms  (average mu = 0.857, current mu = 0.906) CppHeap allocation failure; GC in old space requested
    [main]
    [main]
    [main] <--- JS stacktrace --->
    [main]
    [main] [17639:1025/145540.195043:ERROR:v8_initializer.cc(811)] V8 process OOM (Oilpan: Large allocation.).

The effort was primarily spent in getting this to a reproducible-ish state. I
can now sporadically reproduce it by watching a folder full of large videos and
setting the network throttling in DevTools to 3G. For real users, what probably
happens is that, depending on network speed, there is a window in which up to 4
multipart uploads may start within the same GC cycle (but I'm guessing here,
since the setup I have for reproducing this is still very sporadic).
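For a rough sense of scale (back-of-the-envelope numbers, not measurements from
the app): each upload part combines 25 chunks of 4 MiB, and pushing every byte
into a plain JS number array costs at least 4 bytes per element in V8's packed
small-integer representation, before counting the repeated reallocations that
push() performs as the backing store grows.

```typescript
// Back-of-the-envelope cost of combining one upload part byte by byte.
// Assumption: a packed small-integer array element takes >= 4 bytes in V8.
const CHUNK = 4 * 1024 * 1024; // ENCRYPTION_CHUNK_SIZE: one 4 MiB chunk
const PARTS = 5 * 5; // chunks combined per upload part
const data = CHUNK * PARTS; // actual bytes being combined
const tempArrayLowerBound = data * 4; // temporary number[] backing store
console.log(
    `data: ${data / 2 ** 20} MiB, temp array: >= ${tempArrayLowerBound / 2 ** 20} MiB`,
);
```

So each part briefly needs a multiple of the 100 MiB it actually carries, which
lines up with the "Large allocation" failure in the log above.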

Here is a smaller isolated example. This code, when repeatedly invoked in a
setTimeout (independent of any uploads or anything else in the app), causes the
renderer to OOM within a minute.

    import { wait } from "@/utils/promise";

    // Mimics the problematic code path: combine 25 chunks into a single
    // Uint8Array by pushing each chunk's bytes, one by one, into a plain
    // JS array.
    async function combineChunksToFormUploadPart() {
        const combinedChunks = [];
        for (let i = 0; i < 5 * 5; i++) {
            const { done, value: chunk } = await readDo();
            if (done) {
                break;
            }
            for (let index = 0; index < chunk.length; index++) {
                combinedChunks.push(chunk[index]!);
            }
        }
        return Uint8Array.from(combinedChunks);
    }

    // Stands in for the stream reader: resolves with a fresh 4 MiB chunk
    // after a small delay.
    const readDo = async () => {
        await wait(10);
        const ENCRYPTION_CHUNK_SIZE = 4 * 1024 * 1024;
        return {
            done: false,
            value: Uint8Array.from(
                Array(ENCRYPTION_CHUNK_SIZE).fill(Math.random()),
            ),
        };
    };
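The fix is to collect whole chunks and merge them in a single pass. A sketch of
what a mergeUint8Arrays-style helper can look like (the real helper lives in
"@/utils/array"; this is not its actual implementation): preallocate one buffer
of the total length and copy each chunk into place with `set`, avoiding
per-byte allocations entirely.

```typescript
// Sketch of a mergeUint8Arrays-style helper: one allocation of the final
// size, then a bulk copy per chunk.
const mergeUint8Arrays = (as: Uint8Array[]) => {
    // Total length of all chunks.
    const len = as.reduce((len, a) => len + a.length, 0);
    const result = new Uint8Array(len);
    // `set` copies each chunk into the result at its running offset.
    as.reduce((offset, a) => (result.set(a, offset), offset + a.length), 0);
    return result;
};

const merged = mergeUint8Arrays([
    Uint8Array.from([1, 2]),
    Uint8Array.from([3]),
]);
console.log(Array.from(merged)); // [ 1, 2, 3 ]
```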

---

Some flags that helped in debugging:

    app.commandLine.appendSwitch("js-flags", "--expose_gc --trace_gc --trace_gc_verbose");
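With --expose_gc in effect, V8 defines a global gc() that forces a collection,
which is handy for checking whether the repro is sensitive to GC timing. A
small guarded sketch (the guard lets the same code also run without the flag;
Node spells the flag --expose-gc):

```typescript
// With --expose_gc, V8 exposes a global gc() that forces a collection.
// Guard the call so the snippet also runs when the flag is absent.
const maybeGC = (globalThis as { gc?: () => void }).gc;
if (maybeGC) {
    maybeGC();
    console.log("forced a GC");
} else {
    console.log("gc() not exposed; pass --expose_gc in js-flags");
}
```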
Author: Manav Rathi
Date:   2024-10-25 20:15:49 +05:30
Parent: 24009842d7
Commit: 3511fcf723

    @@ -37,6 +37,7 @@ import {
     } from "@/new/photos/services/upload/types";
     import { detectFileTypeInfoFromChunk } from "@/new/photos/utils/detect-type";
     import { readStream } from "@/new/photos/utils/native-stream";
    +import { mergeUint8Arrays } from "@/utils/array";
     import { ensure, ensureInteger, ensureNumber } from "@/utils/ensure";
     import { CustomError, handleUploadError } from "@ente/shared/error";
     import { addToCollection } from "services/collectionService";
    @@ -1573,9 +1574,7 @@ async function combineChunksToFormUploadPart(
             if (done) {
                 break;
             }
    -        for (let index = 0; index < chunk.length; index++) {
    -            combinedChunks.push(chunk[index]);
    -        }
    +        combinedChunks.push(chunk);
         }
    -    return Uint8Array.from(combinedChunks);
    +    return mergeUint8Arrays(combinedChunks);
     }