You can go a long way with just Termux. You can upcycle old phones by installing or building software in Termux, turning them into a small compute grid: AI inference nodes, file servers, web servers.
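For the file/web-server case, a minimal sketch of what that setup looks like, assuming the stock Termux repos (package names taken from there; run inside the Termux app itself):

```shell
# One-time setup: SSH access plus a static web server.
pkg update -y
pkg install -y openssh nginx

# Termux's sshd listens on port 8022 by default (unprivileged ports only).
sshd

# Termux's nginx build serves on port 8080 by default.
nginx
```

From there the phone is reachable over the LAN like any other headless box; the main caveat is keeping Android from killing the Termux process (the Termux wiki covers wake locks and battery-optimization exemptions).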
I was actually just about to do that with an old Galaxy S24. There seems to be no easy way to add something like Docker; the best I can find is using QEMU to get a full Linux VM.
Do you happen to know what kind of performance to expect? Or is there a better way?
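For reference, here is roughly what the QEMU route looks like in Termux. This is a hedged sketch: the package name and firmware path are from the Termux repos and may differ by version, and the ISO filename is a placeholder for an image you download separately.

```shell
# Install QEMU (headless aarch64 system emulator) and disk tools.
pkg install -y qemu-system-aarch64-headless qemu-utils

# Create a virtual disk for the guest.
qemu-img create -f qcow2 disk.qcow2 16G

# Boot an aarch64 Linux installer ISO (obtained separately).
# $PREFIX/share/qemu/edk2-aarch64-code.fd is the UEFI firmware
# shipped with Termux's QEMU package.
qemu-system-aarch64 \
  -machine virt -cpu max -smp 4 -m 2048 \
  -bios "$PREFIX/share/qemu/edk2-aarch64-code.fd" \
  -drive file=disk.qcow2,if=virtio \
  -cdrom alpine-virt-aarch64.iso \
  -nographic
```

On performance: stock Android doesn't expose /dev/kvm to apps, so even an aarch64-on-aarch64 guest runs under QEMU's TCG software emulation rather than hardware virtualization. That's typically fine for light services but painful for heavy builds or running Docker workloads inside the VM.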
> AI inference nodes
Are phones any good for that? (I agree with the rest, and I'm a big fan of Termux; I just wouldn't have thought of a phone, especially an old one, as a useful way to run AI.)
https://www.analyticsinsight.net/gadgets/old-android-phone-r...
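For a concrete test of the "AI inference node" idea, llama.cpp builds directly in Termux. A rough sketch, assuming the upstream repo and a small quantized GGUF model downloaded separately:

```shell
# Build llama.cpp on-device with Termux's toolchain.
pkg install -y git cmake clang
git clone https://github.com/ggerganov/llama.cpp
cmake -S llama.cpp -B llama.cpp/build
cmake --build llama.cpp/build -j4

# Then run a small quantized model (model file is a placeholder):
# llama.cpp/build/bin/llama-cli -m model.gguf -p "Hello"
```

Small quantized models (1-3B parameters) are the realistic target on old phone hardware; throughput is modest, but for a node that just needs to answer occasional requests it can be good enough.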