"Always Online" software is an interesting paradigm. I haven't seen much written about where the logical split between offline and online software should be, so I thought it might make for a good topic.
As a rule of thumb, if you have a hard dependency on some resource, you might as well go all the way with that dependency.
Let's say you have a cryptocurrency miner. The miner is worthless to you if it can't reach its mining pool, and reaching the mining pool generally requires Internet access.
So let's define two possible models for the mining software.
Miner design #1
The miner boots Ubuntu and has a service which launches xmrig to do the actual mining. While none of the boot process requires Internet access, the system is still useless to you if it's not mining. This design also requires some kind of hard drive, which slightly increases your per-miner cost and heat output.
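The service in question could be a minimal systemd unit along these lines. This is only a sketch: the binary path, pool URL, and wallet placeholder are assumptions, not real endpoints.

```ini
# /etc/systemd/system/miner.service -- minimal sketch; the pool URL
# and wallet value below are placeholders, not real endpoints.
[Unit]
Description=Launch xmrig at boot
After=network-online.target
Wants=network-online.target

[Service]
ExecStart=/usr/bin/xmrig --url pool.example.com:3333 --user WALLET_ADDRESS
Restart=always
RestartSec=10

[Install]
WantedBy=multi-user.target
```

`Restart=always` papers over pool outages by retrying forever, but it doesn't change the fundamental point: the box does nothing useful until the network is back.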
Miner design #2
Miner boots into a MinerOS via iPXE. Your router is configured to hand out iPXE images and boot parameters over PXE. Alternatively, the miner might have a USB stick which has iPXE and relevant configuration already loaded.
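The router side of this can be done with dnsmasq's standard iPXE chainloading pattern: plain PXE firmware is first handed the iPXE binary over TFTP, and iPXE (which marks itself with DHCP option 175) is then handed an HTTP boot script. A sketch, where the filenames and boot URL are assumptions:

```
# /etc/dnsmasq.conf excerpt -- sketch of chainloading PXE into iPXE;
# the tftp-root contents and boot URL are assumptions, not a tested setup.
enable-tftp
tftp-root=/srv/tftp
dhcp-match=set:ipxe,175            # iPXE identifies itself via DHCP option 175
dhcp-boot=tag:!ipxe,undionly.kpxe  # plain PXE clients chainload iPXE over TFTP
dhcp-boot=tag:ipxe,http://boot.example.com/mineros.ipxe  # iPXE fetches the real script
```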
The miner fetches the latest, fastest MinerOS in a couple of minutes and starts mining shortly after. While the boot time is longer, you no longer have to worry about keeping software updated on the machine itself. Whenever it boots, it's running the latest, fastest software.
In this case, a cryptocurrency miner is a good application for mandating total Internet access. Why have an operating system installed on some drive that needs to be maintained and upgraded? Why not use an upstream MinerOS that does it for you and is always ready to go? Depending on how you have it set up, launching a new miner is as simple as connecting an Ethernet cable and powering it on.
Not everything is a cryptocurrency miner
Let's say I wanted to run the latest operating system on my desktop and it was also iPXE booted. Every time it boots, it's on the latest software. This is quite clever while it works. However, one day my router is cooked or my ISP is having issues, and I can't tell which. I turn on my desktop and it's completely unusable; I can't even diagnose my connection from it.
I've now demoted my desktop to a brick.
Let's say I pull out my installation media and try to install some kind of BSD or Linux distribution on the desktop so I can find out what's going on. A lot of installation media is designed for online installs. It's much smaller and fetches the latest packages, which is nice when everything is working. Why not install the latest packages and have a slimmer installer when you can?
The problem is that my "online install" media is still mostly a dud. Maybe I can get a slim shell on my desktop with it, but it might not even have ping installed. My desktop, a machine capable of putting a man on the moon and then some, is basically a paperweight. Not even an effective calculator. All because I don't have any installation media that works offline.
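Ironically, even a bare Python interpreter can stand in for ping here. A sketch that separates "my router is unreachable" from "the wider Internet is unreachable"; the gateway address is an assumption you'd substitute for your own:

```python
import socket


def reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    # 192.168.1.1 is an assumption -- substitute your own gateway address.
    gateway_ok = reachable("192.168.1.1", 80)
    upstream_ok = reachable("1.1.1.1", 53)  # a public DNS resolver
    if not gateway_ok:
        print("Can't reach the router: the problem is likely local.")
    elif not upstream_ok:
        print("Router is up but upstream isn't: likely an ISP issue.")
    else:
        print("Connectivity looks fine.")
```

Of course, this only helps if the rescue media ships a Python interpreter, which is exactly the kind of assumption an offline-capable installer shouldn't make you gamble on.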
At one point, there were more systems offline than online. I'm not sure when that changed, but the shift is telling: DVDs gave way to streaming services, and store-bought games in a box gave way to Steam and the like. At one point, you could be in Antarctica and play whatever video game you wanted if you had a generator. Now, you might have to be online just to unlock the game's DRM. Again, depending on what you want your machine to do, it quickly becomes a paperweight.
Of all dependencies, the Internet is the biggest
There's something "perfect" about standalone software, like an old Palm Pilot with its built-in calendaring system. Now, on some phones, the only calendar might be Google Calendar, and you can't even tell what day your parents' birthdays fall on if you're not online. To be fair, this means you have a calendar that can be updated from any of your devices and is always kept in sync as long as you're online. So the drawbacks aren't without benefits.
Many software developers go to great lengths to have as few dependencies as possible. The pinnacle is an application with no shared libraries or local services that need to run. Yet many accept software that refuses to function if another server, running tens of thousands more lines of code, doesn't have a valid SSL certificate, an unexpired domain, and DNS pointing to its IP address. Somehow, that requirement is seen as acceptable over and over. In reality, it's sometimes not a single service that's depended upon but tens of them; if any one of them goes down, the entire application can break, depending on how it's written.
While software is updated at faster rates than ever before, it's also much more fragile. What if a meteor hits one of AWS' major datacenters? What if a single bug erases every single S3 bucket? It's unlikely but also completely possible.
With hardware more advanced than ever and storage cheaper than ever, we're able to self-host far more than ever before. Yet our software tends to have more single points of failure than ever, even at installation time. What if GitHub went offline? What if PyPI went offline? Would you be left with a bunch of paperweights? If major resources went offline, getting back to where we started would be a chicken-and-egg problem.
I won't say that self-hosting doesn't have tremendous drawbacks. It's not for everyone, and sometimes cloud services are by far the most sensible option. I think, though, that self-hosting can be brought up to modern, "cloud level" standards. This is already the case with much Go software, which is designed to be easily compiled and to have as few dependencies as possible.
go get leaves you with all the code you need locally to alter your software completely. I like that the default workflow also happens to be the most resilient, apocalypse-ready one.
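Go's standard tooling can make that resilience explicit by copying every dependency's source into the repository itself, so a build needs nothing from the network. A sketch of that workflow, assuming a module that already builds:

```shell
# Copy the source of every dependency into ./vendor (standard Go tooling).
go mod vendor

# Build entirely from the local tree. Modern Go uses a vendor directory
# automatically when present; -mod=vendor just makes that explicit.
go build -mod=vendor ./...
```

Commit the vendor directory and the repository becomes self-contained: it compiles even if the module proxy and every upstream host are gone.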
Anyway, these are just things I've been thinking about while I write software and hoard data. It's a bit of an excessive exercise, but I'd like to know that I could function as "normally as possible" if I were cut off from the rest of the Internet, or if it were cut off from me. Of course, I'd be quite curious about how to connect back up, which could prove quite the feat, but that's a very different mental exercise. Probably worthy of consideration.
You can have every movie you'd ever want to watch on a 12TiB external hard drive that goes with you anywhere, or you can pay a streaming service and be bound to its terms and rates for life. The former you can use with a $100 generator and some gasoline (or propane). The latter is so far removed from your hands that whether it works is out of your control.
Thanks for reading.