In early May of 2019, Google submitted patches to merge support for the Incremental File System into the Linux kernel. According to the documentation that Google submitted, Incremental FS is a “special-purpose Linux virtual file system that allows execution of a program while its binary and resource files are still being lazily downloaded over the network, USB etc.” The purpose of this feature is “to allow running big Android apps before their binaries and resources are fully downloaded to an Android device.”
Isn’t this already possible in various other ways, though? I mean, PlayStation 4 games can be played well before they’re entirely downloaded, as can Blizzard games, to name a few. I’m pretty sure those just load early-game assets first, so I’m not sure if that aligns with what Google is doing here, but this kind of feels like a solved problem.
I can already imagine people making speedruns of said games, just to try to break this system.
[email protected],
Running remotely isn’t really novel; Windows has done this forever, and almost any network file system on Linux supports it. Apps generally don’t care where they run from; it’s just a matter of performance. The worst-case scenario is that the data isn’t locally cached, so it works like a network file system and depends on the bandwidth between you and the server.
Google’s file system downloads on demand to satisfy reads immediately, with a low-priority task opportunistically downloading the rest of the data in the background; I can see the uses for this. Google’s read-only implementation is much easier to code than a read/write version. IMHO they should have included an invalidation mechanism for content updates. I realize that “write-once” keeps it simpler, although it limits potential mirroring use cases.
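To illustrate what I mean, here’s a rough userspace sketch of that read path (plain Python, nothing to do with the actual kernel code; the LazyFile class, the fetch_block callback, and the 4 KiB block size are all made up for illustration): a read only blocks on the specific chunks it touches, while a background thread keeps pulling the remaining chunks.

```python
import threading

BLOCK_SIZE = 4096  # made-up block size for the sketch


class LazyFile:
    """Toy model of a read-only file whose content is still being downloaded."""

    def __init__(self, num_blocks, fetch_block):
        self.fetch_block = fetch_block      # callable: block index -> bytes (the "network" fetch)
        self.blocks = [None] * num_blocks   # local cache, filled in as blocks arrive
        self.lock = threading.Lock()
        # Low-priority background task that opportunistically completes the download.
        threading.Thread(target=self._prefetch_all, daemon=True).start()

    def _ensure_block(self, i):
        with self.lock:
            if self.blocks[i] is None:
                self.blocks[i] = self.fetch_block(i)   # fetch on demand
            return self.blocks[i]

    def _prefetch_all(self):
        for i in range(len(self.blocks)):
            self._ensure_block(i)

    def read(self, offset, size):
        """Satisfy a read immediately, fetching only the blocks it actually touches."""
        first = offset // BLOCK_SIZE
        last = (offset + size - 1) // BLOCK_SIZE
        data = b"".join(self._ensure_block(i) for i in range(first, last + 1))
        start = offset - first * BLOCK_SIZE
        return data[start:start + size]


# Fake "network": block i is 4 KiB of the byte value i.
f = LazyFile(8, lambda i: bytes([i]) * BLOCK_SIZE)
print(f.read(5000, 16))   # returns right away; only block 1 had to be fetched first
```

That’s the whole trick: the app sees a normal file, and the only difference between “installed” and “still downloading” is how often a read has to wait for a fetch.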
I personally would push back on Google’s patch until they address whatever code limitations are responsible for it being “focused on incremental delivery for a small number (under 100) of big files (more than 10 megabytes)” and for not supporting “more than a couple hundred files and directories.” That limit is completely artificial. There’s no excuse for not efficiently handling thousands of files of arbitrary sizes other than inefficient data structures and laziness on Google’s part. I wouldn’t want this mainlined until they address it.
I’m guessing the difference is that the PlayStation/Blizzard games need to be written that way in the first place, or need major work to retrofit a lazy-loading system, whilst what Google is suggesting sounds like it may require no (or minimal) changes to existing games.
PS: to yoshi314 – it is a *really* bad idea to use your email address as a nickname on a public forum. At best it will get scraped and you’ll be spammed to death.
sloth,
You may not realize this, but all of our email addresses are getting leaked through the gravatar links, which expose the md5sum of our emails. This poses two risks:
1) It’s possible to brute-force emails (see the sketch after this list for the basic idea). A while back I successfully reversed several of the osnews staff email addresses using hashcat with GPU acceleration to demonstrate this risk.
2) It’s trivial for companies like google to identify and track user accounts across the web using gravatar links.
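To make risk #1 concrete, here’s a rough sketch of the attack (plain Python; the candidate names, domains, and target address are all made up for the example). It’s just a dictionary search over the MD5 that gravatar embeds in its avatar URLs:

```python
import hashlib

def gravatar_hash(email):
    # Gravatar URLs embed the MD5 of the trimmed, lowercased email address.
    return hashlib.md5(email.strip().lower().encode("utf-8")).hexdigest()

# Pretend this is a hash scraped from a gravatar avatar URL on a forum.
target = gravatar_hash("jane.doe@example.com")

# Dictionary attack: combine common local parts with common domains.
names = ["alice", "bob", "jsmith", "jane.doe"]
domains = ["example.com", "gmail.com", "hotmail.com"]

for name in names:
    for domain in domains:
        candidate = f"{name}@{domain}"
        if gravatar_hash(candidate) == target:
            print("recovered:", candidate)   # prints jane.doe@example.com
```

Hashcat does the same thing, just with mask and rule generators instead of a hard-coded list, and at billions of MD5 candidates per second on a GPU.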
I’ve brought this up on osnews in the past; alas, gravatar is still being used. Past researchers have demonstrated the ability to reverse a significant portion of gravatar hashes. It’s something everyone needs to keep in mind if they consider emails private information.
https://www.wordfence.com/blog/2016/12/gravatar-advisory-protect-email-address-identity/