Let me clarify: We have a certain amount of latency when streaming games from both local and internet servers. In either case, how do we improve that latency and what limits will we run into as the technology progresses?
Theoretically, the latency between the streamer and viewers could be zero or near zero.
For playing games online, the minimum possible latency is the speed of light delay. We’re pretty much already at the limit for that one, and we’re even using a lot of pretty clever techniques to mitigate latency such as lag compensation.
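Lag compensation is worth unpacking, since it's the main trick servers use to hide latency. A minimal sketch (names and numbers are illustrative, not any particular engine's implementation): the server keeps a short history of player positions, and when a shot arrives it rewinds to where targets were at the shooter's timestamp before hit-testing.

```python
import bisect
from dataclasses import dataclass

@dataclass
class Snapshot:
    time_ms: int
    x: float
    y: float

class PositionHistory:
    """Server-side position history for one player."""

    def __init__(self):
        self.snapshots = []  # kept in time order as positions are recorded

    def record(self, time_ms, x, y):
        self.snapshots.append(Snapshot(time_ms, x, y))

    def rewind(self, time_ms):
        """Return the last known position at or before time_ms."""
        times = [s.time_ms for s in self.snapshots]
        i = bisect.bisect_right(times, time_ms) - 1
        return self.snapshots[max(i, 0)]

history = PositionHistory()
history.record(0, x=0.0, y=0.0)
history.record(50, x=5.0, y=0.0)
history.record(100, x=10.0, y=0.0)

# A shot arrives at server time 100 ms from a player with 60 ms of latency:
# hit-test against where the target was at ~40 ms, not where it is *now*.
seen = history.rewind(100 - 60)
print(seen.x)  # the position the shooter actually saw on their screen
```

This is also why you sometimes get shot "after" reaching cover: the server ruled in favour of what the shooter saw.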
Ooh, we're not at the speed of light as a limit yet, are we? Do you mean "point A to point B" on fibre, or do you actually mean full on "routed-over-the-internet"?
Even with fibre (which is slower than the speed of light), you're never going in a straight line. And, at least where I live, you're often back-tracking across the continent before your traffic makes it to the end destination, with ISPs caring more about saving money than routing traffic quickly.
For most of us, there is no difference though; you get what you get.
I live in a nice neighborhood but I won’t ever get fiber… we have underground utilities and this area is served by coaxial cable. There’s no way in hell they are digging up miles of streets to lay fiber; you get what you get.
My ISP latency is like 16-20ms but when sim racing it just depends on where the race server is (and where my competitors are). As someone on the US west coast, if I’m matched with folks in EU and some others in AUS/NZ, the server will likely be in EU and my ping will be > 200. My Aussie competitors will be dealing with 300-400.
It’s not impossible to share a track at those latencies, but for close racing or a competitive shooter… errrr that just doesn’t work.
The fact that I’m always at around 200ms for EU servers might be improved if we could run a single strand of fiber from my house to the EU server (37ms!) but there would still be switching delays, etc. So yeah, the speed of light is the limit, but to your point, there’s a lot of other stuff that adds overhead.
Even with fibre (which is slower than the speed of light)
This makes no sense. Are you referring to the speed of light in a vacuum? Fiber transmits data using photons, which travel at the speed of light. While, yes, there is often some slowing of signals depending on whether the fiber is single-mode or multi-mode and whether it has intentionally been doped, it’s close enough to the theoretical maximum speed that it’s not really worth splitting hairs (heh) over.
There are additionally some delays added during signal processing (modulation and demodulation from the carrier to layer 3) but again this is so fast at this point it’s not really conceivably going to get much faster.
The bottleneck really is contention vs. throughput, rather than the media or modulation/demodulation slash encoding/decoding.
At least to the best of my knowledge!
you’re often back-tracking across the continent before your traffic makes it to the end destination, with ISPs caring more about saving money than routing traffic quickly
That’s generally not how routing works - your packets might take different routes depending on different conditions. Just like how you might take a different road home if you know that there’s roadworks or if the schools are on holiday, it can be genuinely much faster for your packets to take a diversion to avoid, say, a router that’s having a bad day.
Routing protocols are very advanced and capable, taking many metrics into consideration for how traffic is routed. Under ideal conditions, yes, they’d take the physically shortest route possible, but in most cases, because electricity moves so fast, it’s better to take a route that’s hundreds of miles longer to avoid some router that got hacked and is currently participating in some DDoS attack.
I played on Google Stadia from day 1 until it got shut down. I mainly played racing games like F1 and GRID, with the occasional session in RDR2 or The Division 2. Latency was never a problem for me.
The main problem that occurred over and over in the community was people's slow or broken internet connection at home or their WiFi setup.
I would say the technology for cloud gaming is here today, but the home internet connections of a lot of people aren't ready yet.
Many people don't understand the continued importance of a home wired LAN. WiFi is, and probably always will be, a fraction of the performance of an ethernet connection.
WiFi is, and probably always will be, a fraction of the performance of an ethernet connection
In terms of bandwidth, sure, but not in terms of latency. In fact, theoretically, WiFi could be faster than Ethernet: WiFi uses radio waves, which propagate through air at close to the speed of light in a vacuum, faster than signals travel in copper or photons in glass.
The limitation for WiFi is really at the physical layer, i.e. encoding/decoding. With that said, we do already have WiFi with transcoding fast enough to give sufficient performance for fast-paced gaming. While you’re totally correct that, at the moment, Ethernet is more capable in terms of bandwidth and latency, that’s not necessarily going to be true forever, and WiFi is good enough for typical home use. The biggest issues are interference and attenuation, e.g. thick walls and sources of electromagnetic interference.
I would say the technology for cloud gaming is here today, but the home internet connections of a lot of people aren't ready yet.
You witness this a lot with video conferencing. People tell one person their audio/video is shitty, and that person just shrugs and says "yeah, I have bad internet." In my head I'm screaming "Well, what have you tried?!" or "I see you sitting beside the refrigerator there!"
Yeah... or microphones... I really wish they'd start putting the noise cancelling as an option on the receiving end... lots of people don't care to set up their audio right and then you get god awful static, crunching, or breathing in your ears.
It's especially prevalent in gaming where headset mics dominate. 🙃
Those games are quite well matched with cloud streaming. An example of a game which isn’t suitable for cloud gaming would be a competitive FPS such as Rainbow Six Siege, where the additional delay imposed by the connection between the player and the game can be quite a significant disadvantage. The only way that this would be low enough to become acceptable would be if you live close enough to the host device that the latency is very low, or the host device is very close to the game server itself.
I had Stadia too and played a lot of Destiny 2. I must say that I was highly impressed by the low latency. I literally couldn't notice that I wasn't playing locally, unless my internet went down.
Only when I took Stadia with me to a random airbnb did I start noticing any type of latency, and then we just played Mortal Kombat or other fighting games where you can just mash the buttons.
The speed of light, so 50ms or so assuming locations on Earth. In practice a bit more because you have to go around it rather than through the core. Servers already have to make retroactive calls, which is why it looks like you hit but then you didn't sometimes.
Interestingly enough, Starlink can have lower latency than fiber over long distances despite the longer path, because light travels slower than c through glass fiber.
Obviously light doesn't have to travel quite as far, but 50ms is not a bad estimation for a worst case. Also you have to add processing delays at each router, which makes everything far slower.
The base limit is the speed of light/electricity: it takes X amount of time for a signal to travel a given distance, and that's your base latency. For example, it takes about 70ms for light to travel halfway around the world (it has to go around, not through). This can be improved by talking to servers that are closer to you and by taking links that are direct, but it can't be improved beyond the rules of physics.
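That 70ms figure checks out with back-of-the-envelope arithmetic (the 1.47 refractive index for fiber and the 20,000 km half-circumference are assumed round numbers):

```python
C_VACUUM_KM_S = 299_792              # speed of light in vacuum
C_FIBER_KM_S = C_VACUUM_KM_S / 1.47  # typical signal speed in glass fiber

def one_way_ms(distance_km, speed_km_s):
    """One-way propagation delay in milliseconds."""
    return distance_km / speed_km_s * 1000

HALF_EARTH_KM = 20_000  # roughly half Earth's circumference, along the surface

print(round(one_way_ms(HALF_EARTH_KM, C_VACUUM_KM_S)))  # ~67 ms in vacuum
print(round(one_way_ms(HALF_EARTH_KM, C_FIBER_KM_S)))   # ~98 ms in fiber
```

Note this is one-way; a round trip (what ping measures) doubles it, before any routing or processing overhead.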
On top of this you get really small amounts of processing delays as data is passed through various routers/computers on the way to the destination.
The real problem comes from congestion - if there is a lot of data being transferred between two destinations, the infrastructure between them might not be able to cope. This may result in messages being queued (causing a delay) or dropped (your controls don't make it to the server!) To avoid this, the network will route your message via somewhere else with less demand, increasing the distance and delay (but spreading the load)
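The queueing effect is easy to see in a toy simulation (all numbers invented): when packets arrive faster than a link can forward them, each one waits longer than the one before it.

```python
ARRIVAL_GAP_MS = 1   # a new packet arrives every 1 ms
SERVICE_MS = 2       # the congested link needs 2 ms to forward one packet

link_free_at = 0     # time at which the link finishes its current packet
delays = []
for n in range(10):
    arrival = n * ARRIVAL_GAP_MS
    wait = max(0, link_free_at - arrival)            # time spent queued
    link_free_at = max(link_free_at, arrival) + SERVICE_MS
    delays.append(wait)

print(delays)  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9] -- delay grows every packet
```

In a real router the queue has a finite size, so once it fills, packets start getting dropped instead, which is the "your controls don't make it to the server" case.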
Unfortunately, if that overloaded cable is the one bringing data into your neighborhood, then there likely isn't an alternative route. In the UK at least, we are (finally) building out a fibre-to-the-premises internet network that effectively fixes any local bottlenecks.
If you want to see where your latency is coming from, you can run a traceroute using various applications (or even directly in Windows with tracert). This will show you the latency to each router that your data is traveling through on its route to its destination.
Edit addition: for game streaming the network delays are added onto the natural delays of running the game (controls -> computer -> processing -> display/speakers).
The other big additional delay for streaming is that, in order to reduce the network load, the image is compressed and encoded before being sent to you (much more than is done for your monitor cable).
This is a computationally intensive operation that can take a good few ms. The better the computers at either end, the faster this can be done. However, the big way forward here is hardware encoding/decoding. By using hardware that is made to just do encoding/decoding and nothing else, this can be done much faster.
These encoders are commonly on graphics cards and the graphics parts of CPUs. As newer encoding formats are created and hardware encoders built (and actually included), this area will become much faster.
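To see why the compression step is unavoidable, compare raw video bandwidth against a typical stream bitrate (the 20 Mbit/s figure is an illustrative assumption):

```python
# Raw 1080p60 video at 24 bits per pixel, versus a compressed game stream.
width, height, fps, bits_per_pixel = 1920, 1080, 60, 24

raw_mbps = width * height * fps * bits_per_pixel / 1e6
stream_mbps = 20  # a plausible bitrate for an H.264/H.265 game stream

print(round(raw_mbps))                # ~2986 Mbit/s uncompressed
print(round(raw_mbps / stream_mbps))  # ~149x compression needed
```

Your monitor cable simply carries those ~3 Gbit/s raw, which is why it has no encode delay at all.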
Source: programmer with a computer science degree and a vague interest in networking.
The lag has several components: input lag between the peripherals and your computer, the network transmission to the server, the regular rendering of the game, live transcoding of the game, the network again, and decoding the stream on your device. The rest are pretty much insignificant.
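Those components can be summed into a rough end-to-end budget. All the figures below are illustrative assumptions, not measurements:

```python
# Rough button-press-to-pixels budget for one cloud-streamed input.
budget_ms = {
    "peripheral -> client":        1,
    "client -> server network":   15,
    "game simulation + render":   16,  # one frame at ~60 fps
    "video encode on server":      5,
    "server -> client network":   15,
    "video decode + display":      8,
}

total = sum(budget_ms.values())
print(total)  # ~60 ms end to end under these assumptions
```

The point of writing it out like this is that shaving a few ms off any single stage barely moves the total; the two network legs together dominate.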
The biggest way to reduce lag I can think of is if the server is literally in your city, and the connection between it and you has the least amount of nodes between you and the server. Some video streaming services will partner with ISPs to put their servers in the same place to reduce overhead and improve the user experience. I'd assume that gaming would benefit from that too, but this is harder to implement.
Another way to improve networking lag is by prioritising game streaming data over other data, QoS (quality of service), is really important both for the home network and on the ISP side.
This should be obvious, but don't use a VPN.
For the video transcoding, it can be pretty quick, but dedicated hardware like NVENC will be faster than using the CPU, not just in terms of FPS, but also in latency at the same FPS (via an FPS cap).
Higher FPS. The more frames per second, the lower the input lag, though it only matters if you eliminate network lag first.
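A common rough rule is that frame pacing adds about half a frame time of latency on average (a simplification; real pipelines add further stages), which is why higher FPS helps:

```python
def half_frame_ms(fps):
    """Average latency added by waiting for the next frame, in ms."""
    return 1000 / fps / 2

for fps in (30, 60, 120):
    print(fps, "fps ->", round(half_frame_ms(fps), 1), "ms average added")
```

Going from 30 to 120 fps saves only ~12 ms on average, which is why this only matters once the much larger network lag is dealt with.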
I should mention that I have never used any game streaming service, and I don't have the equipment to test lag either.
I think we are constantly progressing in that field. One issue for latency was that controllers used to contact your device, and then the server. Now they can connect directly to the server. Things will improve, like it or not.
For right now, I think the biggest hurdle is with ISPs.
Data caps are quite common in many countries, essentially creating a huge limit on how much you can play (if at all).
Most people’s router and access point hardware needs upgrading. A lot of the stock all-in-one routers from ISPs are really bad, creating a bottleneck before the data even reaches the servers.
Another hurdle I can see is companies profit sharing. Everyone wants a large cut, so I’d expect multiple streaming options… and many failures, like what we’re seeing on the movies/series streaming model… just with games it’ll be soooo much worse.
Steam's solution is about as simple as it gets: install Steam on both devices (or use the Steam Link app / a physical Steam Link box), pair a controller, log in, hit play.