• git [he/him, comrade/them]@hexbear.net

Data duplication as an optimisation only really makes sense for optical media, where you have exact control over where data is pressed or burned on the disc. The Halo games are a good example: each map file contains basically the whole game plus that level's geometry/scenario in a compiled format, minus the bits not needed for that particular level, so the DVD drive can load all the data for a given level sequentially.
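
Roughly what that looks like as a build step, if you're curious. This is just a minimal sketch of the idea, not how the actual Halo tooling works; the file names and the length-prefixed layout are made up for illustration:

```python
# Hypothetical sketch of the optical-media approach: each level's "map" file gets
# its own copy of every shared asset it needs, laid out contiguously so the drive
# can stream it in one long sequential read instead of seeking around the disc.
# Asset file names and the format are invented; assumes those files exist on disk.
import struct
from pathlib import Path

SHARED_ASSETS = ["shaders.bin", "ui_textures.bin", "common_sounds.bin"]

def build_level_map(level_name: str, level_assets: list[str], out_dir: Path) -> Path:
    """Pack shared + level-specific assets into one contiguous per-level file."""
    out_path = out_dir / f"{level_name}.map"
    with out_path.open("wb") as out:
        for asset in SHARED_ASSETS + level_assets:   # duplication happens here
            data = Path(asset).read_bytes()
            out.write(struct.pack("<I", len(data)))  # simple length-prefixed layout
            out.write(data)
    return out_path
```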

In the context of hard drives, where you can't control that layout for various reasons, it's just a meme parroted by devs who don't bother with, or can't implement, a PC-oriented asset loading pipeline (e.g. pre-caching lists, centralised shared assets, virtual file systems/pack formats) and instead kick storage costs down to end users: disk sizes are so large nowadays that it's easy to just statically bundle assets that share parts with each other, e.g. materials and shaders. Of course, siloed development teams make it very tempting to ship games this way, as does pressure from management, so I do sympathise.
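
For comparison, here's a sketch of the "store it once" alternative: a tiny content-addressed pack where each unique asset is written a single time and every level just references it by hash through an index, standing in for a proper virtual file system/pack format. The layout and names are invented for illustration, not any real engine's scheme:

```python
# Rough sketch of a deduplicated shared pack: assets are stored once under their
# content hash, and each level gets a manifest of hashes instead of its own copy.
import hashlib
import json
from pathlib import Path

def build_shared_pack(levels: dict[str, list[Path]], out_dir: Path) -> None:
    """Write each unique asset once and record per-level manifests of hashes."""
    blobs = out_dir / "blobs"
    blobs.mkdir(parents=True, exist_ok=True)
    manifests = {}
    for level, assets in levels.items():
        entries = []
        for asset in assets:
            data = asset.read_bytes()
            digest = hashlib.sha256(data).hexdigest()
            blob = blobs / digest
            if not blob.exists():          # dedup: shared assets stored only once
                blob.write_bytes(data)
            entries.append({"name": asset.name, "hash": digest})
        manifests[level] = entries
    (out_dir / "index.json").write_text(json.dumps(manifests, indent=2))
```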

File duplication on a hard drive saves a few seconds in the best-case scenario with a defragmented disk, and even that is within the margin of error, because guess what, the OS and filesystem will do whatever the fuck they like in a random-access scenario.