I enjoyed the article, but I’m skeptical of the “democratize via hardware + networking” path. Most people won’t run a Pi, manage updates/backups, or debug home networking, and that’s fine (as you note).
But I do think we’re reaching a turning point on the software side. The barrier to building custom, personalized apps is trending toward 0. I’m not naive enough to think every grandma will suddenly start asking ChatGPT to “build me an app to do XYZ,” but with the right UX it can be implicit. Imagine you tell an assistant: “My doctor says my blood sugar is high. Research tips to reduce it.” -> it not only replies with tips, it also proactively builds a custom app (that you own and control) for tracking your blood sugar (measurements, meals, reminders, charts, etc.). You can edit it by describing changes (“add a weekly trend graph,” “don’t nag me after 8pm,” etc.).
This doesn’t fully solve your Big Co control issue (they own the flagship models today), but open-weight + local options keep improving. I'm hopeful we have a chance to tip the scales back toward co-owner and participant.
> You can edit it by describing changes
Even this is hard. Most people don't know what they want, and even when they do, they can't describe or imagine it. They don't even know what a trend graph is.
They just want someone else to do the mental effort of creating a nice product. Hence iOS > android for most people. They don't want to customise basically anything other than colours.
That's why I predict Lovable/Replit etc. will not go mainstream. And why ChatGPT will mostly just offer you its own UIs. Artifacts weren't a big hit.
Truly democratizing the web requires that the "compute server" become a typical home appliance, no more difficult to use than an oven or a furnace, including widespread access to vocational technicians who will come to your house and fix it for you.
I mean -- my (completely non-technical) mother, after a few hours of my guidance, has started vibe-coding apps and websites for her local community organizations. And, like -- it works.
"it also proactively builds a custom app"
Does it deploy it as well?
This post resonated with me. I have tried self-hosting multiple times over the years and always gave up because it was so hard to manage, and there was always an online service that was good enough.
This weekend I vibe coded (don't shoot me) a homelab platform that hosts a bunch of useful services on a Mac mini and lets me deploy my own apps on top of it. Using Tailscale I can access the apps from my phone. I have multiple users, each with their own SSO, to control access. I even have a Pi as part of the network that hosts public-facing content. All done with Claude Code and OpenClaw (as a kind of devops tool)... hardly any code written by me.

It's been a seriously fun experiment that I will try to progress somehow... if only because I love the dream of "digital sovereignty", even if the reality is it's unlikely to happen again. It got me thinking, though: if I could get inference hardware and a good-enough open LLM to work with my setup, it might just be possible. The OP advocates a form of basic computing that is understandable, but once we are able to host our own LLMs we could end up in a very different but more capable paradigm.
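For anyone curious what a setup like this looks like on disk, the core of it can be sketched as a single Docker Compose file. To be clear, this is a hypothetical illustration, not the actual homelab config: the image choices (Authelia for SSO), service names, and ports are all my assumptions.

```yaml
# Hypothetical sketch of a Mac mini homelab: an SSO gateway in front of
# self-built apps, reached privately over a tailnet. Illustrative only.
services:
  auth:
    image: authelia/authelia      # example SSO provider; any OIDC proxy works
    ports:
      - "9091:9091"
    volumes:
      - ./authelia:/config        # users, access rules, session settings

  notes-app:                      # a vibe-coded app (name is made up)
    build: ./apps/notes
    ports:
      - "3000:3000"

# Tailscale runs on the host itself, so phones and laptops on the tailnet
# reach these ports privately; nothing here is exposed to the internet.
# Public-facing content lives on the separate Pi, outside this file.
```

The nice property of keeping it all in one Compose file is that "deploy my own apps on top of it" reduces to adding another service stanza, which is exactly the kind of edit an LLM agent handles well.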
The repo for the homelab, for anyone who's interested: https://github.com/briancunningham6/homelab