The 4K Difference
Post by KostaAndreadis @ 03:25pm 18/04/18 | Comments
What the Latest in Display Technology Brings to Media, And How It Works.

Sponsored by Universal Sony

Walk into any electronics store these days and you’d be hard-pressed to find a new display that doesn’t output a 4K or UHD image. That is, the latest advancement in HDTV technology, an evolution that began over a decade ago with the introduction of the first 720p digital screens. Full-HD 1080p displays arrived a few years later, and today heralds the 4K, 2160p, or Ultra High-Definition era.

When visualised like this, the 4K difference is clear

Okay, so that’s quite a few numbers with ‘p’ added to the end of them. So, let’s get to what they mean. Each of these numbers refers to a resolution, that is, the number of pixels that make up the image on an HD display. A full-HD or 1080p image is made up of 1920 (horizontal) x 1080 (vertical) pixels. In the traditional upgrade sense, one might expect that a 4K image would simply double that pixel count, as it’s made up of 3,840 (horizontal) x 2,160 (vertical) pixels. 1080p vs 2160p.

That’s not the case though, as a true 4K image offers four times the pixel count - when all the maths is said and done. What do all these numbers mean in practice? Well, with a 4K source, whether it be a 4K Blu-ray like recent blockbuster Spider-Man: Homecoming or an Xbox One X game like the excellent Forza Motorsport 7 – a 4K image will instantly look cleaner, sharper, and more detailed than its full-HD counterpart.
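The maths above is easy to verify yourself. A quick back-of-the-envelope sketch in Python, using only the resolutions quoted here, shows where the "four times" figure comes from:

```python
# Pixel counts for the two resolutions discussed in the article.
# Plain multiplication, nothing vendor-specific assumed.
full_hd = 1920 * 1080   # 2,073,600 pixels
uhd_4k = 3840 * 2160    # 8,294,400 pixels

print(full_hd, uhd_4k)
print(uhd_4k / full_hd)  # exactly 4.0 - four times the pixels, not double
```

Doubling both the horizontal and the vertical pixel count is what quadruples the total, which is why the jump from 1080p to 2160p is bigger than the numbers alone suggest.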

More pixels mean more definition. Especially on today’s larger displays.

When Size Matters

A great superhero movie that's made all the better in 4K

The introduction of 4K TVs in the past couple of years, and the subsequent current-day affordability of 4K displays, comes at the perfect time for home theatre enthusiasts. Namely, 4K’s arrival coincides with the steady increase in the typical size of an HDTV over the past decade: from 32 inches to 46 to 55 and now the impressive 65-inch screen. Without a doubt the increased wall real-estate has led to a rise in the sheer immersive value of the home cinema experience. Larger displays coupled with surround sound mean that watching the explosive spectacle of Transformers: The Last Knight or the high-speed escapism of Baby Driver in 4K can just about rival heading out to your local theatre.

And hey, when they start selling affordable Frozen Coke machines, one might never have to leave the house. Except to buy more Frozen Coke making materials, of course.

Many of the big screen makers, from LG to Samsung to Sony, have simply stopped offering anything but a 4K image once displays reach a certain size. And for good reason too. The initial allure of 4K and 4K media is the clarity of the image. The crisp detail, when put next to a standard 1080p image, makes even a standard Blu-ray look blurry or out of focus in comparison. No doubt there are people out there, as there were with the advent of DVD, who may not be able to notice a difference based on resolution alone.

Or, even when they do, who can shrug off the benefit of increased clarity as good but not essential. Well, that’s where HDR comes in.

HDR, The Second Half of the Equation

One of the great 4K HDR game experiences Assassin's Creed Origins

HDR or High-Dynamic Range lighting is the Scottie Pippen to 4K’s Michael Jordan. The milk to the UHD tea. The, err, tomato sauce on a meat pie. Without HDR, 4K simply isn’t as impressive. And that comes down to the fact that HDR essentially means more colour, increased contrast between the darker and brighter parts of an image, and a more vibrant and overall brighter picture. And by more colour detail, we mean verging into the realm of real life. HDR technology, though, does come at a cost. And the quality of HDR implementation is usually the reason why one 4K TV might cost considerably more than another. It all comes down to how bright a display can get without sacrificing black levels. Which means if one section of the image is the bright light of a spotlight in the dark, then the dark should look, well, dark. Black even.

As seen on top-of-the-line display technology featuring OLED per-pixel lighting from LG or Sony, HDR results are often breathtaking, providing the sort of wow factor you got the first time you saw an HDTV in action. Playing Assassin’s Creed Origins in 4K on an Xbox One X, exploring a tomb at night and then firing up a torch, will produce a bright fiery light that looks and behaves like the real thing. Shadow detail will astound, as will the sheer wonder of a blue sky when you emerge into the sun. Pop in a 4K Blu-ray like Atomic Blonde and the bright neon glow of a night-club will sparkle in a fashion that will make you feel like someone from the 1940s, sitting there wondering how they shrunk all those people and locations down to fit them inside this magic black window thing on your wall.

From games to movies to streaming services like Netflix, all the major players are embracing 4K with HDR. But this shift, and improved image quality, does mean that HDTV standards and technology have also changed along the way. So, there are some additional things you need to know.

The New HDMI Standard, And What They Don’t Tell You

Atomic Blonde's vibrant colours leap off the screen thanks to HDR

Getting a 4K HDR image from an Xbox One X, PlayStation 4 Pro, or UHD 4K Blu-ray player to a 4K display requires the use of the latest HDMI standard – HDMI 2.0, with the 2.0a revision adding the HDR metadata. It’s a subtle and often misunderstood evolution, mainly because the cables still look the same, as do the ports on the back of a TV. The reason for the 2.0 upgrade though is to carry all the additional data required for a 4K signal with HDR information. But you’ll also need a cable rated for that extra bandwidth – one that looks identical to every other HDMI cable you own, but isn’t.
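To see why the extra bandwidth matters, a rough uncompressed data-rate estimate helps. The sketch below is simplified, it ignores blanking intervals and encoding overhead, so real-world figures run higher, but it shows why a 4K60 HDR signal outgrows the roughly 10.2Gbps ceiling of older HDMI 1.4 gear while fitting within HDMI 2.0's nominal 18Gbps:

```python
# Rough uncompressed video data rate: width x height x frames x bits per pixel.
# Blanking intervals and link-encoding overhead are deliberately ignored.
def data_rate_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

# 4K at 60fps: 8-bit RGB (24 bits/pixel) vs 10-bit HDR (30 bits/pixel)
print(round(data_rate_gbps(3840, 2160, 60, 24), 1))  # ~11.9 Gbps
print(round(data_rate_gbps(3840, 2160, 60, 30), 1))  # ~14.9 Gbps
```

Either way the raw signal sails past what HDMI 1.4 hardware and older cables can carry, which is why both the port and the cable need to be up to the newer spec.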

And then, to get the proper 4K image, you’ll need to ensure that you’re connected to the correct HDMI port on your TV and that you enable “wide colour gamut”, which is basically what switches on HDR.

We mention this because, strangely, all the big TV makers still require this to be manually set or activated when setting up a new display. Another small downside is that, thanks to 4K and HDR, adjusting the picture is now akin to rocket science. As a rule of thumb, 4K HDR is best experienced with a TV’s contrast at near-maximum settings, with brightness used to adjust black levels and shadow detail. Colours should be vibrant but never unnatural, with most TV makers falling into the trap of overly saturating the image to instantly wow you, before you realise that people’s skin tones aren’t usually comparable to tropical fruits.

The Future is 4K

Shot with IMAX cameras, Transformers: The Last Knight looks great in 4K

With sales of 4K TVs on the rise, and companies like LG, Sony, and others investing heavily in improving the quality of HDR – the gradual shift we saw from DVD to Blu-ray and full-HD streaming is already underway for 4K. Just about every new major film release is getting a 4K Blu-ray release alongside its standard Blu-ray counterpart, and in almost every case the difference is immediate and impressive. Whether a film was shot on digital or film cameras, in many cases 4K represents a like-for-like conversion from the big screen to the small, whereas a standard 1080p full-HD release requires down-converting and lowering the quality of the source material.

The same can be said for games. With the advent of 4K consoles like the PlayStation 4 Pro and Xbox One X many games are now able to showcase their art, textures, and effects at full detail without compromise. It’s the difference we’ve been waiting for.

Latest Comments
Posted 07:54pm 18/4/18
HDMI 2.0a (or later) is what gives you HDR, my TV is only HDMI 2.0 so I don't get HDR over HDMI but I do with streaming or on the USB input.

Also it would have been good to mention that even today most amplifiers don't have HDMI 2.0a so it pays to check this if you're looking at a HDCP 2.2 compliant receiver (or any other HDMI hub device). Even HDMI 3.0 is on the cards soon and will have a higher bandwidth than the latest DisplayPort standard (which can currently carry an 8K signal at 60Hz).

Guess why this complicated mess exists and the legitimate consumer loses out again? Yep, DRM.
Posted 09:48pm 18/4/18
Ugh so complicated.