Ok, I think we should take another look at this whole scenario:
Quote: Originally Posted by The JareBear
I don't think we know for sure if HFR mode has downsampling or not.
I think SCEA_Chris posted that the base mode on Pro has downsampling, but I dunno if that also applies to HFR mode. For 1080p displays on the Pro, you may have to pick between stock framerate and increased image quality (base mode) or higher framerate and stock image quality (HFR mode).
This isn't official though, and nobody outside of SCEA knows for sure yet. We might just have to wait for GamingBolt, Nexus, Digital Foundry, one of those sites to post a video breakdown.
And yes, downsampling basically takes the internal resolution on the Pro, be it 1440p, 1800p, or 4k checkerboard/native, and shrinks it down to your 1080p display, basically acting as increased anti-aliasing: a sharper, cleaner image, fewer jaggies, less shimmering, stuff like that.
Ok, I am gonna take another stab at this.
Yesterday I was super excited to get impressions from all the fellas. After Philly's post about enhanced visuals requiring 2.2, I was confused to the point of feeling down on myself, because I couldn't understand why I was struggling so hard to comprehend it. I know the difference between HDR and resolution, I know how the Pro works, and I have followed this whole deal since the very beginning.
Something about the way it seems to be set up, as Philly explained in his quoted post, doesn't make sense to me. By the way, this isn't attacking Philly, I was just quoting his post since his is the most recent/relevant. The way he describes it seems to be true, but I can't help but wonder why.
NO PS4 game with Pro mode support, to this point, has required HDMI 2.2 for downsampling on 1080p displays.
Downsampling is handled internally on the Pro. The 4k, or 1800p, or 1440p, or whatever resolution, is rendered on the Pro then downsampled to your 1080p display.
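To make it concrete, here's a minimal numpy sketch of the basic idea: render at a high internal resolution, then average blocks of pixels down to the display resolution. This is just an illustrative box-filter downscale, not the Pro's actual scaler (which almost certainly uses a fancier filter, and the 1800p-to-1080p ratio isn't a clean integer like this example).

```python
import numpy as np

def downsample_box(frame: np.ndarray, factor: int) -> np.ndarray:
    """Average each factor x factor block of pixels into one output pixel."""
    h, w, c = frame.shape
    return frame.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

# A frame rendered internally at "4K" (3840x2160, RGB)...
frame_4k = np.random.rand(2160, 3840, 3)

# ...shrunk to 1080p: each output pixel averages 4 rendered samples,
# which is the extra anti-aliasing that smooths jaggies and shimmering.
frame_1080p = downsample_box(frame_4k, 2)
print(frame_1080p.shape)  # (1080, 1920, 3)
```

The key point is that every output pixel is built from multiple rendered samples, which is why a downsampled 1080p image looks cleaner than one rendered at native 1080p.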
As a 1080p PS4 Pro gamer, this is the first I have heard anything about HDMI 2.2. I know 2.2 is required for 4k/60hz signal, but that has never applied to me as of yet.
Based on SCEA_Chris' post, and a whisper he sent me on Twitch in response to a question I asked him directly (thank you, Chris, sincerely, I appreciate your time and attention), I think we may be looking at this incorrectly.
It seems to me that "enhanced visuals" mode has nothing to do with 1080p displays, period. Why would a 1080p display need HDMI 2.2 for a 1440p mode? Again, if this is the case, it is something SCEA deliberately did for this game, and it would be the ONLY example of such a setup on the Pro so far.
I think "enhanced visuals" is a mode specifically for 4k users: those who want higher IQ than the base mode but a better framerate than the native 4k mode (which, according to impressions, may be a little choppy here and there).
If SCEA_Chris is correct, there is downsampling on 1080p displays for Pro users. It is not clear whether it applies only to the base visual mode on Pro, only to the HFR mode, or to both.
Does anyone who has the game early play on a 1080p, non-2.2 set? If so, can you switch between the base visual rendering mode and the HFR mode and try to notice any difference in clarity?
Does anyone who has the game early play on a 1080p, HDMI 2.2 set? If so, can you confirm whether it lets you pick enhanced visuals mode? I don't think it will, as I explained above; that just seems to make sense, and otherwise this would be the only case on Pro so far with this kind of setup for 1080p displays.
Or, maybe you guys with 4k sets can change your PS4 output to 1080p in the system settings and see if you can still select enhanced visuals mode?
Edit:
The hangup for me is what HDMI 2.2 has to do with enhanced visuals mode. When I think of every other PS4 Pro game with options, like Nioh, Shadow of Mordor, or Tomb Raider, you can get all the downsampling you want with no issues on HDMI 1.4 1080p displays.