Work on Microsoft's next gaming consoles is already underway, with reports suggesting a tiered family of Xbox devices under the Xbox Scarlett codename. While a high-end flagship comparable to Xbox One X is likely, we also expect a low-cost alternative, paired with the newly unveiled Project xCloud game-streaming service.
A new report from Wccftech further expands on the rumored cloud-backed device, stating plans to use a semi-custom AMD chipset, based on its Picasso Accelerated Processing Unit (APU) series. Bundling its CPU and GPU into a single die, the chip will ensure a healthy balance of performance and power consumption, aiding a compact form factor. An earlier report also claimed Microsoft plans to use the Picasso series in its next Surface Laptop refresh.
The report also backs our prior Project xCloud coverage, detailing a planned hybridized solution to improve remote play. While latency-sensitive elements of titles would be locally processed, graphics-intensive components would be handed off to the cloud backbone. With the help of deep learning, player actions can be predicted too, aiming to cut latency further.
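The hybrid split described above can be sketched in a few lines. This is a toy illustration under my own assumptions, not Microsoft's actual pipeline: latency-sensitive work (input, camera, HUD) runs locally every frame, the expensive scene render is mocked as a cloud round trip, and the client composites the two. All function names are hypothetical.

```python
# Toy sketch of a hybrid local/cloud frame pipeline (illustrative only).
import time

def local_update(state, user_input):
    """Runs on the console every frame: must be fast and responsive."""
    state["camera"] += user_input  # e.g. camera turn responds instantly
    return state

def cloud_render(state):
    """Stands in for a round trip to a datacenter GPU for the heavy scene."""
    time.sleep(0.0)  # network + remote render latency would land here
    return f"scene@camera={state['camera']}"

def composite(local_hud, remote_scene):
    """Client overlays the locally drawn HUD onto the streamed scene."""
    return f"{remote_scene} + {local_hud}"

state = {"camera": 0}
state = local_update(state, user_input=5)  # input handled with no round trip
frame = composite("hud(ammo=12)", cloud_render(state))
```

The point of the split is that a dropped or late `cloud_render` frame degrades image quality, not input responsiveness, because `local_update` never waits on the network.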
Project xCloud is shaping up to be a critical part of Microsoft's future gaming efforts, expected to launch in 2020 or earlier. Expanding on its success with Xbox Game Pass, Microsoft plans to use xCloud to reach new gamers, banking on its mobile extension.
So, gameplay-related stuff (CPU) runs locally, and graphical, non-gameplay-related stuff runs partially (partially + AI input assist) on the server, so it won't affect gameplay much, e.g. latency... Ha, finally. I think I now have a picture of how this works... an interesting, clever trick actually. And I suppose this trick only works on Xbox and, with a patch, PC/WoA running XPA games.
As for Android and iOS, everything runs on the server, I suppose. But then, you don't need 4K/8K on a small screen. Well, seeing is believing.
I'm curious how the input prediction would work. Is this the game playing for you? If it's just predicting what the input would be, then I don't see the benefit beyond maybe 1 to 3 ms. Of course the system would then need to verify its prediction, so is there really any savings in processing time? If it's playing for you instead, that's a slippery slope to climb on...
I would think it would probably work similarly to speculative execution in CPUs (I think most of us got a decent education on that from the exploits): the AI would map out a set of probable inputs, ignoring the others, and when the actual one was entered it would use it. Since all of them were already in "memory", so to speak, any of them would be faster than one that wasn't.
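The speculation analogy above can be made concrete with a minimal sketch. This is my own illustration, loosely modeled on the idea described in the comment (and in the Outatime paper mentioned below), not a real xCloud API: the server pre-renders a frame for each probable input, and the client checks whether the real input was among them. All names and the toy game state are assumptions.

```python
# Toy sketch of speculative input handling (illustrative, not xCloud's API).

def apply_input(state, inp):
    """Toy game state: a position on a line, moved by the input."""
    deltas = {"forward": 1, "left": -1, "right": 2}
    return state + deltas.get(inp, 0)

def render_frame(state):
    """Stand-in for an expensive server-side render."""
    return f"frame(state={state})"

def server_speculate(state, probable_inputs):
    """Pre-render one frame per predicted input, before the real input arrives."""
    return {inp: render_frame(apply_input(state, inp)) for inp in probable_inputs}

def client_resolve(speculated, actual_input, state):
    """On a prediction hit the frame is ready immediately (no round trip);
    on a miss, fall back to rendering for the actual input."""
    if actual_input in speculated:
        return speculated[actual_input], True   # hit: latency hidden
    return render_frame(apply_input(state, actual_input)), False  # miss

state = 0
frames = server_speculate(state, ["forward", "left", "right"])
frame, hit = client_resolve(frames, "left", state)
```

On a hit, the saving is the whole server round trip (tens of milliseconds), not just the prediction time, because the matching frame was already in flight; on a miss, you pay the normal latency, which is why the predictor only needs to be right most of the time to help.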
No, I don't see why it wouldn't run the same way; the processing of the stream doesn't change for less capable devices. Less powerful mobile devices will simply not be able to play streamed games. The main focus for this device is to allow an inexpensive but powerful-enough box to be hooked up to a display (e.g. a smart TV) that itself has no real CPU/GPU power beyond specialized chips. And I highly doubt they will be doing 4K any time soon.

I do see barriers for many tablets and phones playing streamed games, no matter whose streaming service it is, because it will require heavy local CPU processing. For example, Google's recently tested streaming tech (running in the web browser) had fairly demanding CPU requirements. It didn't work on many Chrome devices that were barely a year old, because they lacked the required CPU/GPU capability. I suspect the same will happen with xCloud gameplay on similar devices: game streaming will require considerable CPU power locally, and the cloud won't just spin up more resources because you're using a $99 Android tablet you bought in 2011.

But since phones and mobile devices, especially iPads and Surface-like devices, are already fairly powerful and will only grow more so over the coming year or two, the local processing an xCloud service needs will be less of a barrier in two years than it would be today for the aging devices we carry around, which will be replaced by the time this service arrives.
The MS Research project "Outatime" was the foundational work for this service.
Originally conducted under the codename "DeLorean", the team managed to get Doom 3 to run hybridized. The research paper(s) are available for review.
Was a fun read, thank you.
What I find most interesting is the claim that the Picasso chip will be used in the next Surface Laptop. If true, it would mean there is finally proper cross-departmental collaboration, and it would indicate that AMD is finally making headway in the laptop segment in terms of power savings and longer run time on a single charge.
I hope Microsoft isn't putting too many eggs in this basket.