HDMI is the number one video and audio interface on the market. It has evolved over the years, and the latest version of the standard can transfer data at monstrous rates and handle 8K video. Here is everything you need to know about this standard that has become a must-have.
HDMI, hard to miss! Along with USB, it is undoubtedly one of the most useful ports that equip our electronic devices. But between the different types, standards and certifications, it's easy to get lost. With this post, you have all the keys in hand to decipher the performance of the HDMI ports of your devices (computer, TV, camera...) and choose the cable that best suits your needs.
Read more: Understanding The HDMI Cables Certifications
HDMI is the abbreviation for "High Definition Multimedia Interface". It is a digital standard capable of transmitting video data (uncompressed) and audio data (compressed or not) from one device to another. It was developed by a consortium of manufacturers including Hitachi, Matsushita Electric Industrial (Panasonic), Philips, Sony, Thomson (RCA), Toshiba and Silicon Image.
HDMI enables video and audio streams to be transferred from a computer, console or media device to an external display. In particular, it is used to connect an independent device to a screen: monitor or television. It can also be used to connect a sound system.
The standard was created in 2002, but the first products and cables using HDMI did not appear until 2003. Like any new technology, it did not immediately gain widespread acceptance among the general public. But as consumers gradually replaced their hardware, HDMI spread, and today it has become the reference.
In particular, it has replaced the aging VGA, an analog standard that is far less capable and carries only video, not audio. The same goes for DVI, which is used less and less in favor of HDMI. HDMI itself has recently been challenged by the DisplayPort standard, created by several manufacturers who were also behind HDMI. But in 2020, the most popular interface is still HDMI.
Types of HDMI Connectors
There are five families of HDMI connectors, which are briefly described below:
HDMI Type A: This is the most common standard and one you've probably dealt with before. It has 19 pins and is found on TVs, media boxes, game consoles and Blu-Ray players.
HDMI Type B (Dual Link): Much rarer, it has 29 pins and theoretically offers speeds up to 20.4 Gb/s. It is used for very high resolution transfers, but is absent from the consumer market.
HDMI Type C (mini-HDMI): This is essentially a miniaturized HDMI Type A for equipment that requires a smaller port. It is therefore found on many mobile products: tablets, digital cameras, camcorders...
HDMI Type D (micro-HDMI): An evolution of mini-HDMI that appeared with the 1.4 standard, micro-HDMI is even more discreet than its Type C counterpart. As manufacturers design ever-thinner devices, HDMI Type D has become indispensable. It can be found on a variety of mobile devices such as tablets and ultrabooks.
HDMI Type E: You won't hear much about this one: it is not intended for consumers and is used only by the automotive industry.
Those are the different types of HDMI connectors available. Remember types A, C and D in particular, as they are almost certainly the only ones you will encounter.
Now that you understand the differences between the types of HDMI connectors, let's move on to the standards themselves. Fortunately, HDMI has evolved over the years to adapt to new needs: it is impossible, for example, to transmit a 4K video stream over the first version of the standard. Each new version has brought a host of additional features, compatibility and performance improvements. The goal here is not to give you a history lesson, but to help you as a consumer. So we will focus on the three standards in use today that you need to know about: HDMI 1.4, HDMI 2.0 and HDMI 2.1. If you are going to invest in hardware, these are the ones that matter.
HDMI 2.1
This is the latest version of the standard, ratified in 2017 by the HDMI Forum consortium. Only the most recent devices are therefore equipped with an HDMI 2.1 port. Of course, HDMI 2.1 remains backwards compatible with previous generations.
HDMI 2.1 supports a maximum bandwidth of 48 Gb/s, enough to carry 8K at 60 frames per second or 4K at 120 frames per second. Gamers' dreams are coming true, but of course you need a display that supports this level of performance, and a PC configuration powerful enough to render such a graphical feat.
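A back-of-the-envelope calculation shows why 8K at 60 fps pushes right up against that 48 Gb/s ceiling. This is only a rough sketch: the function name is our own, and it counts raw pixel data at 24 bits per pixel, ignoring blanking intervals and link-encoding overhead, which make the real bandwidth requirement higher.

```python
def raw_video_rate_gbps(width, height, fps, bits_per_pixel=24):
    """Raw pixel data rate in Gb/s (ignores blanking intervals and
    link-encoding overhead, so actual link requirements are higher)."""
    return width * height * fps * bits_per_pixel / 1e9

print(raw_video_rate_gbps(7680, 4320, 60))   # 8K at 60 fps  -> ~47.8 Gb/s
print(raw_video_rate_gbps(3840, 2160, 120))  # 4K at 120 fps -> ~23.9 Gb/s
```

Even before overheads, 8K60 alone accounts for nearly the entire 48 Gb/s budget, which is why HDMI 2.1 also introduced optional compression (DSC) for the most demanding modes.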
The new standard also supports dynamic HDR technologies, such as Dolby Vision and HDR10+, for improved color, contrast and detail. On the audio side, HDMI 2.1 supports the eARC standard, which covers Dolby Atmos and DTS:X.
HDMI 2.1 also enables two features that are getting a lot of attention right now, as they are announced for the next-generation consoles, the Xbox Series X and the PS5. The first is Auto Low Latency Mode, which reduces in-game input lag. The second is Variable Refresh Rate, which matches the display's refresh rate to the source to improve smoothness and eliminate stutter and tearing, an equivalent of AMD's FreeSync and Nvidia's G-Sync technologies.
HDMI 2.0
Prior to HDMI 2.1, we had a welcome update in 2013 with the HDMI 2.0 standard, probably the most common standard today. It offers bandwidth of up to 18 Gb/s, enough to allow 4K at 60 fps. It also introduced support for 21:9 screen aspect ratios, which are becoming increasingly popular (this is the ideal format for cinema).
As far as sound is concerned, the standard allows simultaneous transmission of 32 audio channels, with a sampling frequency of up to 1536 kHz. The Consumer Electronics Control (CEC) feature lets you control a chain of HDMI-connected devices with a single remote control.
HDMI 1.4
When it was introduced in 2009, HDMI 1.4 was a revolution. HDMI did not previously support 4K; version 1.4 was the first to do so (limited to 24 frames per second, however). This standard also brought new colorimetry features with support for the sYCC601, Adobe RGB and Adobe YCC601 color spaces.
There is also support for 3D video streams, Audio Return Channel (ARC) and HDMI Ethernet Channel (HEC). HEC avoids multiplying Ethernet cables, since the network connection is carried over HDMI: if your PC is connected to a multimedia box via both Ethernet and HDMI, for example, the separate Ethernet cable to the box becomes unnecessary. ARC is useful for connecting an external sound system: audio and video travel over the same HDMI cable, in both directions, so there is no need for an additional coaxial cable for sound.
Be careful: while support for 3D video streams is inherent to the HDMI 1.4 standard (all ports and cables are compatible), the same is not true of ARC and HEC, which are not present on all HDMI 1.4 products.
How to choose your HDMI cable
As you can imagine, the newer an HDMI cable's standard, the more expensive it is. If budget is not a concern, you can go straight for HDMI 2.1, which is backwards compatible with older HDMI ports. Even if you don't own an HDMI 2.1 device yet, the standard is becoming the reference and you will probably soon have devices that support it; remember, the Xbox Series X and PS5 will both be equipped with such a port.
If you'd rather save a few dollars, the best thing to do is simply check which HDMI ports are present on the devices you want to connect, and buy a cable matching the lowest standard among them. To enjoy the full functionality of a standard, every part of the chain must support it: device A, the HDMI cable and device B. If you buy an HDMI 2.1 cable but your computer has an HDMI 2.0 port and your TV is HDMI 1.4, you will only get HDMI 1.4 performance.
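The rule above boils down to taking the minimum across the chain. Here is a trivial sketch of that logic (the function name and numeric version encoding are our own, purely for illustration):

```python
def effective_hdmi_version(*versions):
    """The chain runs at the lowest HDMI version among
    source device, cable, and display."""
    return min(versions)

# Computer with HDMI 2.0, an HDMI 2.1 cable, TV with HDMI 1.4:
print(effective_hdmi_version(2.0, 2.1, 1.4))  # -> 1.4
```

In other words, the most expensive cable in the world cannot lift an HDMI 1.4 TV to HDMI 2.1 performance.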
Beware of the certification system, which can be confusing, as it is not directly linked to the HDMI standards. The best thing to do when choosing a cable is to check its standard, but also its certification. Since the HDMI 2.1 standard was launched, there have been four levels of certification:
Standard: 1080i or 720p and data rates up to 4.95 Gb/s
High Speed: 4K at 30 fps and data rates up to 10.2 Gb/s
Premium High Speed: 4K UHD at 60 fps, HDR10, HDR10+, Dolby Vision and data rates up to 18 Gb/s
Ultra High Speed: 8K at 60 fps or 4K at 120 fps, HDR10, HDR10+, Dolby Vision and data rates up to 48 Gb/s
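To pick the cheapest certification that covers a given need, you can think of the upper three tiers as a simple bandwidth lookup. This is an illustrative sketch only: the dictionary and function are our own, not an official HDMI Forum API, and we list just the tiers you will typically shop for.

```python
# Rated bandwidth in Gb/s per certification tier (illustrative).
CERTIFICATIONS = {
    "High Speed": 10.2,
    "Premium High Speed": 18.0,
    "Ultra High Speed": 48.0,
}

def minimum_certification(required_gbps):
    """Cheapest certification whose rated bandwidth covers the need,
    or None if no tier is sufficient."""
    for name, gbps in CERTIFICATIONS.items():  # dicts keep insertion order
        if gbps >= required_gbps:
            return name
    return None

print(minimum_certification(23.9))  # 4K at 120 fps -> "Ultra High Speed"
```

A 4K60 signal (under 18 Gb/s) would only need Premium High Speed, for example, so paying for Ultra High Speed there buys you headroom rather than picture quality.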
Disclaimer: there are other certifications, but they either concern HDMI Type E, which is reserved for the automotive industry and therefore not relevant to the general public, or they merely add "with Ethernet". HDMI Standard with Ethernet, for example, offers the same performance as HDMI Standard, but with HEC support (this technology is explained in the section dedicated to HDMI 1.4).
It is also important to anticipate what your setup will look like so you know what length of HDMI cable you need. Too short, and you will have to reorganize everything because the cable doesn't reach device B; too long, and you pay more for excess cable that just lies around on the floor.
When connecting the cable, pay attention to which port you choose. It's not uncommon for TVs to offer multiple HDMI ports of which only one supports the most advanced features.