You guys notice how they’re pushing these HDMI 2.1 cables on Xbox One S owners with 4K TVs? Or even on people with 8K TVs that can’t actually do 120Hz and stuff like that.

It’s kind of crazy how they start marketing this s*** years and years in advance, so you end up buying a product that does nothing for you unless you can afford a seriously good gaming TV or monitor, or you’re a PC player who actually knows what you’re doing. Everyone else buys all this extra useless s*** without realizing it isn’t giving them any real quality upgrade, and by the time it actually would, 2.6 is out.

My buddy just bought an 8K TV, and his next-door neighbor told him, “oh bro, now you gotta go get that HDMI 2.1 cable.”

Your TV is not a gaming TV. Yes, it’s 8K compatible, but you can also achieve that with a 1.4 cable, which you already own three of. And you’re certainly not going to pull 120 frames, and if you somehow do, you’ll never pull 160. Not to mention your Xbox wouldn’t actually output 120 frames even if you manually forced 120 through it. Don’t really know where I was going with that. Point is: let alone 160.

I tried explaining the concept of bottlenecking to him probably 75 times, but he’s only ever owned one PC in his entire life, and I’m pretty sure it was a Best Buy prebuilt.

Capitalism is beautiful.

  • Vinny_93@lemmy.world · 3 months ago

    First off: cables don’t have version numbers. The host and the client have ports that adhere to a certain spec, and the HDMI Forum made that very unclear by folding 2.0b into 2.1, so now not every 2.1 port supports the same things. Cables are certified by their maximum bandwidth instead: High Speed, Premium High Speed or Ultra High Speed (plus “with Ethernet” variants). When marketers call something a “2.1 cable”, that just means it can carry some or all of the 2.1 feature set.

    Second: like you said, the only reason to get new HDMI cables is if you currently have a very old one and devices that actually make use of the extra bandwidth. And I’ll tell you right now, most High Speed cables will do just fine. It’s when you start doing 8K120 with HDR, VRR and eARC that you’ll need heftier cables. The only external devices that support that, though, either come with a cable in the box, because their makers don’t want you bottlenecking the device, or are PCs.
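    If you want to sanity-check the numbers, here’s a rough back-of-the-envelope sketch in Python (my own arithmetic, not anything from the HDMI spec; it ignores blanking intervals and TMDS/FRL encoding overhead, so real requirements run noticeably higher):

```python
# Uncompressed video data rate: pixels x refresh x bits per pixel.
# Rough figure only: ignores blanking intervals and link encoding overhead.
def data_rate_gbps(width, height, refresh_hz, bits_per_channel=8):
    bits_per_pixel = bits_per_channel * 3  # RGB / 4:4:4, no chroma subsampling
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Certified cable tiers and their max link bandwidth in Gbps
HIGH_SPEED = 10.2          # High Speed (HDMI 1.3/1.4 era)
PREMIUM_HIGH_SPEED = 18.0  # Premium High Speed (HDMI 2.0 era)
ULTRA_HIGH_SPEED = 48.0    # Ultra High Speed (HDMI 2.1 era)

for label, mode in {
    "4K60 8-bit":   (3840, 2160, 60, 8),
    "4K120 10-bit": (3840, 2160, 120, 10),
    "8K60 10-bit":  (7680, 4320, 60, 10),
}.items():
    print(f"{label}: {data_rate_gbps(*mode):.1f} Gbps")
```

    4K60 lands around 12 Gbps (fine on an 18 Gbps cable), 4K120 at 10-bit around 30 Gbps (Ultra High Speed territory), and 8K60 at 10-bit around 60 Gbps, which is why 8K modes lean on DSC compression even on a 48 Gbps link.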

    Third: the only reason HDMI is even a thing is that the joint venture behind it successfully lobbied its inferior product into TV manufacturers. DisplayPort has always been, and will always be, the better interface for video.

    • WolfLink@sh.itjust.works · 3 months ago

      I have a 4K 120Hz gaming monitor, and some of my HDMI cables can’t handle that.

      I also just use DisplayPort because it’s better anyway (e.g. lower latency).

    • Ptsf@lemmy.world · 3 months ago

      While I almost completely agree with you, never underestimate the power of the right tool for the right job. In my experience HDMI is actually far more resilient to signal corruption than DisplayPort, since it implements TMDS and the cables are more commonly well shielded, because they’re expected to live in device-dense environments. That doesn’t really matter to anyone familiar with technology (don’t bundle your cables next to something with significant RF noise/leakage, duh), but it does matter for the end-user setups these cables actually see. The fees HDMI charges are a scam though, fr, and we could demand better from the industry.

      • Vinny_93@lemmy.world · 3 months ago

        Mostly you’re unable to use certain features. Say your display supports 4K at 120Hz: with an inadequate cable you might still get 4K30 or 4K60, but not 4K120.
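        A quick way to see why, using simplified math (a sketch of my own that ignores blanking and encoding overhead, so real-world limits are lower than these numbers):

```python
# Rough max refresh rate a given link bandwidth allows at a resolution.
def max_refresh_hz(link_gbps, width, height, bits_per_channel=8):
    bits_per_frame = width * height * bits_per_channel * 3  # RGB / 4:4:4
    return link_gbps * 1e9 / bits_per_frame

# An 18 Gbps Premium High Speed cable at 4K 8-bit tops out around 90Hz
# even in this optimistic model, so 4K60 fits and 4K120 doesn't.
print(round(max_refresh_hz(18.0, 3840, 2160)))
```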