Your current laptop is still a fine thin client. Unless you program in the woods, it's probably cheapest to build a home inference box and reach it over Tailscale or something.
Or just run it as an API server that all your other devices connect to.