> Consider these ideas at broader scale too - What if you could amortize the cost of that power-hungry timer thread for thousands of gamers instead of just 1? Are there maybe some additional benefits to moving 100% of the game state to the server and shipping x265 frames to the end users?
I don't know of any online game that doesn't track the official game state on the server. This is required for anti-cheat reasons, if nothing else. However, networking would typically be optimized to send small packets of information, and typically over UDP. If a packet is missed, the next packet will just be overwriting it anyway, so no big deal. Clients would basically just locally simulate game state and then reconcile that against the game state being received from the server.
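That simulate-then-reconcile loop can be sketched roughly like this (a minimal illustration, assuming a 1-D position driven by velocity inputs; all names here are made up, not from any real engine):

```python
class PredictedClient:
    def __init__(self):
        self.position = 0.0
        self.seq = 0
        self.pending = []  # (seq, velocity, dt) inputs not yet acknowledged by the server

    def apply_input(self, velocity, dt=1.0):
        """Simulate locally right away; queue the input for the server."""
        self.seq += 1
        self.pending.append((self.seq, velocity, dt))
        self.position += velocity * dt
        return self.seq  # this sequence number would ride along in the UDP packet

    def on_server_state(self, ack_seq, server_position):
        """Authoritative state arrives (possibly stale): rewind to the
        server's position, then replay inputs the server hasn't seen yet."""
        self.pending = [p for p in self.pending if p[0] > ack_seq]
        self.position = server_position
        for _, velocity, dt in self.pending:
            self.position += velocity * dt
```

The point of the replay step is that a stale server packet doesn't yank the player backwards: the client snaps to the authoritative state and immediately re-applies its unacknowledged inputs, so the result only diverges from pure local simulation when the server actually disagreed.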
Also, rendering has to be done per-client anyway, since each one has a different camera state. So there's no economy of scale for doing that on the server rather than the client. In fact, there's probably anti-economy: Servers that aren't rendering don't need expensive video cards unless they're using them for physics.
My understanding is that some more modern games do use TCP sockets for certain traffic. And modern bandwidth has obviously made streaming rendered frames practical, hence the recent emergence of game-streaming services.
"Serverless" as in one of the peers is nominated as the Source of Truth? Or truly decentralized state? Because the first option is just changing who owns the server.