I think VR's big thing is plug-and-play ease of use and making everything compatible so big fat slobs can do it from the couch, which, from what Aaz has told me, is the gateway that AR holds the key to.
Plug and play wasn't really an option 4 years ago, and I'd say even today it's still mostly enthusiast territory. Unless people got the chance to try it at a friend's place or at an event or something, most of them would just sleep on it. I'm guilty of that as well: I've been following the VR scene on and off since its reintroduction to the market but never had the ideal circumstances to set one up for myself.
That's my marketing strategy: BIG. FAT. SLOBS.
Remember me after you get rich.
Now, on the other hand, something can look great in its time, and even hold up against today's games as long as you're not directly comparing, but once you see the difference you can never unsee it.
To be completely honest, Dark Souls III has never really looked that good, even when it first came out. Graphics weren't exactly the strong point or main focus of any FromSoftware game, so comparing it with this remake, which clearly puts much more emphasis on that aspect of the experience, is not ideal.
Don't expect anything as dramatic as that UE5 demo in the future.
Of course not, the engine itself is not even out yet, but they could create something similar specifically to show off how the new features look on a PS5, like graphics card manufacturers commonly do.
Microsoft has done some side-by-side stuff for the loading, and I think Sony might have as well during Cerny's ultra boring tech presentation at the beginning of the year, and that's probably the extent of what a side-by-side comparison will show: better loading times and slightly finer visuals that don't amount to much unless you're hyper-focused on them.
I haven't snooped on Microsoft in a while, but I watched Cerny's tech presentation almost in full when it came out, and there was no side-by-side gameplay, which is what I'm thinking of: taking old and new Spider-Man or Horizon and comparing the way they look and run, something like that. Although the differences might not be that obvious for those, since they're transitional launch titles, which is why a dedicated demo taking full advantage of the hardware would be a better idea.
By the way, that UE5 demo ran at 1440p with 30 fps, and featured no ray tracing. It did a good job showcasing new UE features, but I found it too focused on Quixel, which is of limited use for AAA titles.
Not surprised by that. As for the Quixel assets, it's their demo after all, it only makes sense for them to use in-house software to make it, but I get your concern.
I don't think I am. And I mean, you do agree with me that "partial ray tracing" won't make a difference on its own, right? I don't want to get into the technicals of how ray tracing works and the limitations of what's currently labeled that way, but I really don't think it'll matter much for this gen. And I say that as someone using a 2080 Ti. "RTX On" is a marketing gimmick.
I'm not putting all my expectations into the ray tracing alone. That said, compared to Nvidia's RTX-On crap, which doesn't work properly without optimization, and even DXR, games made specifically to run on the PS5 and make use of its GI and RT features might not be too bad actually. As I said though, it's this stuff combined with the other functionalities, like the shading and rendering of more detailed assets and textures in real time without sacrificing performance or memory, which is in no small part made possible by the SSD, by the way.
However, I do think moving on from HDDs will make for a notable quality of life improvement. From that perspective, I found the Ratchet and Clank demo to be a good showcase, with seamless world loading.
The seamless loading is significant and will make designing games easier (the Ratchet and Clank gameplay was indeed a pretty cool demonstration of that, now that you mention it), but the custom flash controller will also help on the visual front, as I detailed above, so it's really more than just quality of life stuff.
Now obviously, more powerful/faster components (CPU, GPU, RAM) will allow for better graphics and more objects on screen at once, but I don't foresee that being as big of a deal as, say, moving up from PS2 to PS3. And that will be especially true at first, given that Sony has chosen a strategy of slowly transitioning from PS4 to PS5, with games working on both at first (same for Microsoft).
Raw graphics quality improvements nowadays are hardly ever drastic. The real advancements will be more on the functional side of things, which will influence visual immersiveness in other ways than just a sharper image.
The PS5 will be backwards compatible with the PS4, but not the other way around. The first games will be transitional indeed, but you won't be able to just pop them into a PS4. I might have misunderstood you though, so feel free to clarify.
You know, speaking of that, I'll be curious to see which PS5 games will truly run in 4K at 60 fps. I suspect a non-trivial number will stick to either an upscaled 1440p or 30 fps, but time will tell.
I agree, that will likely be the case for most AAA titles, especially the ones that choose to utilize the next-gen functionalities. True 4K at 60 FPS is probably gonna be reserved for indie games.
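To put rough numbers on that trade-off, here's a back-of-the-envelope sketch. It assumes shading cost scales linearly with resolution times frame rate, which ignores upscaling, variable rate shading, and fixed per-frame costs, so treat the ratio as illustrative only:

```python
# Rough pixel-throughput comparison between two render targets.
# Assumes cost scales with (resolution * frame rate), which is a
# simplification: upscaling and per-frame fixed costs are ignored.

def pixel_rate(width, height, fps):
    """Pixels shaded per second at a given resolution and frame rate."""
    return width * height * fps

native_4k_60 = pixel_rate(3840, 2160, 60)  # "true" 4K at 60 fps
qhd_30 = pixel_rate(2560, 1440, 30)        # 1440p at 30 fps, like the UE5 demo

print(f"4K60:    {native_4k_60:,} px/s")
print(f"1440p30: {qhd_30:,} px/s")
print(f"ratio:   {native_4k_60 / qhd_30:.1f}x")  # -> 4.5x
```

So native 4K60 demands roughly 4.5 times the pixel throughput of 1440p30, which is why studios that spend their budget on next-gen features will likely settle for one or the other.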
Anyway, like Griff said above, I do think UHD is objectively less of a step up than SD to HD was, especially if you don't go for a much bigger TV. And it'll be even less dramatic when we go from 4K to 8K. It's normal, it's diminishing returns given the amount of information the retina can take. That's also why smartphone screens don't move up to 4K, because it doesn't make a difference at such a small screen size.
Technologically speaking, it's actually a bigger step up, but because of the diminishing returns you mentioned, which I had also wanted to bring up, it's not perceived by the general public as the kind of revolutionary shift that HD was in the late 90s. If I had to choose what's more important, I would go with higher refresh rates and color accuracy rather than pixel counts, but again, this is not why I mentioned Ultra HD in the first place.
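The diminishing-returns point can be made concrete with a pixels-per-degree estimate. This is a rough sketch: the screen sizes and viewing distances are assumptions picked for illustration, and ~60 px/deg is a commonly cited ballpark for the limit of 20/20 acuity:

```python
import math

def pixels_per_degree(h_pixels, diagonal_in, distance_in, aspect=16/9):
    """Approximate horizontal pixels per degree of visual angle."""
    # Screen width derived from the diagonal and aspect ratio.
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    # Visual angle subtended by the full screen width.
    angle_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return h_pixels / angle_deg

# Assumed setups: a 55" TV viewed from ~8 ft, a 6" phone (landscape) at ~12 in.
for label, px, diag, dist in [
    ("55in TV, 1080p @ 8ft", 1920, 55, 96),
    ("55in TV, 4K    @ 8ft", 3840, 55, 96),
    ("55in TV, 8K    @ 8ft", 7680, 55, 96),
    ("6in phone, 1080p @ 12in", 1920, 6, 12),
]:
    print(f"{label}: {pixels_per_degree(px, diag, dist):.0f} px/deg")
```

At 8 ft, 1080p already lands around the ~60 px/deg acuity ballpark, 4K overshoots it, and 8K is far beyond anything the eye resolves from the couch, while a 1080p phone at arm's length also clears the threshold, which is why phones skip 4K.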
Misleading trailers and promotional pictures (commonly called "bullshots") are a staple of the video game industry, especially with Sony. Remember Killzone 2? It's not some unfortunate thing they couldn't avoid before.
A downgrade due to the wide performance gap between consoles and PC, which is why I brought up Dark Souls II as an infamous example, and which we were saying will hopefully not be a problem anymore, is a different thing. As for the bullshit screenshots and misleading promotional material, there's an argument to be made that the better games look on their own, the fewer reasons there are to embellish them, and the fewer embellishments can be piled on top without making it too obvious. So even if they are enhanced, the original is probably not much worse.