• cynar@piefed.social · ↑2 · 2 days ago

    Uncompressed 1080p is already approaching the eye’s resolution limit when viewed in a living-room environment. 4K is close to the limit for monitor use.

    The reason that 4K seems better is often down to bandwidth and colour depth.

    There’s zero benefit to an 8K TV. An 8K monitor might be useful, but is still well into the diminishing returns curve.

    There’s still some ground to be made up with colours and frame rates, but resolution is effectively maxed out already.
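A rough back-of-envelope check of that claim. All the numbers here are illustrative assumptions (a 65-inch 16:9 screen, a 2.7 m couch distance, and ~60 pixels per degree as the 20/20 acuity limit), not measured figures:

```python
import math

# Sketch of the acuity math: 20/20 vision resolves roughly 1 arcminute,
# i.e. about 60 pixels per degree of visual angle.
def pixels_per_degree(diag_inches, h_res, distance_m):
    """Pixels per degree of visual angle for a 16:9 screen."""
    # screen width from the diagonal, assuming a 16:9 aspect ratio
    width_m = diag_inches * 0.0254 * 16 / math.hypot(16, 9)
    pixel_pitch_m = width_m / h_res
    # visual angle subtended by one pixel, in degrees
    deg_per_px = math.degrees(2 * math.atan(pixel_pitch_m / (2 * distance_m)))
    return 1 / deg_per_px

for name, h in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    # 65" screen viewed from 2.7 m (a typical living-room distance)
    print(f"{name}: {pixels_per_degree(65, h, 2.7):.0f} px/deg")
```

Under these assumptions 1080p already lands right around the ~60 px/deg threshold from the couch, with 4K and 8K far past it — consistent with the comment.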

  • happydoors@lemmy.world · ↑60 · 7 days ago

    I am a filmmaker and have shot in 6K+ resolution since 2018. The extra pixels are great on the production side: pixel binning when stepping down resolutions gives lower noise, better color reproduction, and sharper detail, and leaves room for re-framing/cropping. 99% of my clients still want their deliverables in 1080p! I barely even feel the urge to jump up to 4K unless the quality of the project somehow justifies it. Images have gotten to a good place; more detail won’t add much to human enjoyment. I hope they continue to focus on dynamic range, HDR, color accuracy, motion clarity, efficiency, etc. I won’t say no when we step up to 8K as an industry, but computing as a whole is not close yet.

  • BlackVenom@lemmy.world · ↑47 · 7 days ago

    For what content? Video gaming (GPUs) has barely gotten to 4K. Movies? 4K streaming is a joke; you’re better off with a 1080p BD. If you care about quality, go physical… but UHD BD is hard to find, you have to wait and hunt to get discs at reasonable prices… and these days there are only a couple of UHD BD player manufacturers left.

  • n1ckn4m3@lemmy.world · ↑44 · 7 days ago

    As someone who stupidly spent the last 20 or so years chasing the bleeding edge of TVs and A/V equipment, GOOD.

    High end A/V is an absolute shitshow. No matter how much you spend on a TV, receiver, or projector, it will always have some stupid gotcha, terrible software, ad-laden interface, HDMI handshaking issue, HDR color problem, HFR sync problem or CEC fight. Every new standard (HDR10 vs HDR10+, Dolby Vision vs Dolby Vision 2) inherently comes with its own set of problems and issues and its own set of “time to get a new HDMI cable that looks exactly like the old one but works differently, if it works as advertised at all”.

    I miss the 90s when the answer was “buy big chonky square CRT, plug in with component cables, be happy”.

    Now you can buy a $15,000 4k VRR/HFR HDR TV, an $8,000 4k VRR/HFR/HDR receiver, and still somehow have them fight with each other all the fucking time and never work.

    8K was a solution in search of a problem. Even when I was 20 and still had good eyesight, sitting 6 inches from a 90-inch TV, I’m certain the difference between 4K and 8K would have been barely noticeable.

  • pulsewidth@lemmy.world · ↑25 · 6 days ago

    The consumer has spoken and they don’t care, not even for 4K. Same as happened with 3D and curved TVs, 8K is a solution looking for a problem so that more TVs get sold.

    In terms of physical media - at stores in Australia the 4K section for Blurays takes up a single rack of shelves. Standard Blurays and DVDs take up about 20.

    Even DVDs still sell well because many consumers don’t see a big difference in quality, and certainly not enough to justify the added cost of Bluray, let alone 4K editions. A current example: Superman is $20 on DVD, $30 on Bluray (a 50% cost increase), or $40 on 4K (a 100% cost increase). Streaming services have similar pricing curves for increased fidelity.

    It sucks for fans of high res, but it’s the reality of the market. 4K will be more popular in the future if and when it becomes cheaper, and until then nobody (figuratively) will give a hoot about 8K.

  • Showroom7561@lemmy.ca · ↑35 · 7 days ago

    The difference between 1080 and 4K is pretty visible, but the difference between 4K and 8K, especially from across a room, is so negligible that it might as well be placebo.

    There’s also the fact that 8K content takes up a fuckload more storage space. So, there’s that, too.
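For a sense of scale, here is a naive stream-size sketch. The bitrates are illustrative assumptions, not measured figures, and the 8K number just scales the 4K one by the 4× pixel count (real codecs scale sub-linearly):

```python
# Back-of-envelope stream sizes: megabits per second -> gigabytes per hour.
def gb_per_hour(mbps):
    return mbps * 1e6 / 8 * 3600 / 1e9

for label, mbps in [("1080p @ ~8 Mbps", 8),
                    ("4K @ ~25 Mbps", 25),
                    ("8K @ ~100 Mbps (naive 4x of 4K)", 100)]:
    print(f"{label}: {gb_per_hour(mbps):.1f} GB/hour")
```

Even with generous codec efficiency, 8K pushes tens of gigabytes per hour, which is why storage and bandwidth are real obstacles.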

  • skisnow@lemmy.ca · ↑15 · 6 days ago

    I hate the wording of the headline, because it makes it sound like the consumers’ fault that the industry isn’t delivering on something they promised. It’s like marketing a fusion-powered sex robot that’s missing the power core, and turning around and saying “nobody wants fusion-powered sex robots”.

    Side note, I’d like for people to stop insisting that 60fps looks “cheap”, so that we can start getting good 60fps content. Heck, at this stage I’d be willing to compromise at 48fps if it gets more directors on board. We’ve got the camera sensor technology in 2025 for this to work in the same lighting that we used to need for 24fps, so that excuse has flown.

  • FinishingDutch@lemmy.world · ↑25 · 7 days ago

    Not exactly surprising, considering TVs and monitors are outpacing the content creators and game development.

    A lot of gamers don’t even have GPUs that can crank out 4K at the frame rates most monitors are capable of. So 8K won’t do much for you. And movies and regular TV? Man, I’m happy there’s 4K available.

    A 4K screen will be more than most folks need right now, so buying an 8K at the moment is just wasted money. Like buying a Ferrari and only ever driving 25 mph.

    • melroy@kbin.melroy.org · ↑16 · 7 days ago

      Also, to add to this: 8K sounds twice as large as 4K, but it isn’t. 8K is four times the pixels of 4K (double the width and double the height), so imagine what kind of GPU or content stream you’d need for it to make sense…
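The arithmetic behind that, as a quick sketch:

```python
# "8K" names the horizontal resolution, which doubles from 4K —
# but pixels double in both dimensions, so the total quadruples.
pixels_4k = 3840 * 2160   # 8,294,400
pixels_8k = 7680 * 4320   # 33,177,600
print(pixels_8k / pixels_4k)  # 4.0
```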

    • TBi@lemmy.world · ↑5 · 7 days ago

      Also I think the improvements in HDR and brightness recently are more substantial than the update to 8K. At normal viewing TV distance you’d be hard pressed to see the individual pixels, even on a 1080p screen.

      Even for PCs there isn’t much reason to go above 2K screens (1440p).

        • GreatAlbatross@feddit.uk · ↑4 · 7 days ago

        This is why I often refer to 4K as UHD: WCG (wide colour gamut) and HDR becoming available to consumers is far more impactful than end users getting a few more pixels.

        (Also because I’m a snarky pedant, and consumer 4K UHD is only 3840 pixels wide, while DCI 4K is actually 4096.)

        • HugeNerd@lemmy.ca · ↑2 · 7 days ago

        But I need the surface brightness of the Sun in my living room! It adds so much depth to the characters and stories!

        • odelik@lemmy.today · ↑1 · 6 days ago

        4K monitors in portrait orientation are amazing for productivity. It’s a shame more people don’t do this.

          • odelik@lemmy.today · ↑1 · 5 days ago

            Screen space.

            I work in tech doing performance, memory management, and developer workflow tooling and automation for a large 3D Rendering/Creation tool.

            Being able to throw a long setup doc, or a large class file on a 4k portrait monitor allows me to read things through with a ton of context and far less scrolling.

            It’s also useful for tiling two windows with related content, or one window alongside its reference material.

            I currently have a TIE-fighter monitor setup (two 4K portrait monitors on either side of an ultrawide) and will put comms and email/calendar on the left monitor, core work in the center, and overflow reference/research on the right.

            It’s less hectic for personal use, but I still use all the space.

    • vacuumflower@lemmy.sdf.org · ↑3 · 7 days ago

      It’s just a race. Perhaps you don’t need the biggest and newest thing available, but you will also subconsciously dismiss as obsolete anything “less” than what you already have, or than what’s normal. This creates an engine for a race in which good-faith players can’t compete.

      Like with web browsers: a hypertext networked system, even with advanced formatting, executable content, and sandboxing, could be so simple that there’d be hundreds of independent implementations. But if you keep racing the de-facto standards forward at a speed that only you, the monopolist group, can maintain, and good-faith competitors can’t, then you’ll always be the “best”.

      The Matrix actually touches on this, with its “there is no spoon” moment. It’s not a usual market game, it’s a meta-market game. And most people don’t understand the rules of the meta layer, so they’re sitting ducks there.

      Nobody can compete with the industry leaders on their field. And unlike with steel or gasoline or even embedded electronics production, there’s no relativity in the field at all. But the new possible fields are endless. Everyone can discover new pastures here, because it’s not discovery, it’s conception. But since that’s counterintuitive, and the network effects work on psychology too, most people are not trying.

      It’s a bit like military logic. The West had “controlled escalation” doctrines, because slow, gradual escalation favors the side with the most resources, i.e. the West. The Soviets had “scientific-technical revolution” doctrines, which, despite the odd-sounding name, is apt: when you’re second in the race, your best chance lies in being unpredictable, unreasonable, and changing the rules. One reason Soviet doctrines gained such a poor reputation compared to Western ones is that they amount to preemptively going all out, guns blazing, before being forced to fight by the enemy’s rules. That requires willpower from the decision-makers (and the capability to actually do anything scientific and technical, LOL), and it means preparing for some sort of general battle (nuclear war, short highly concentrated offensives, that kind of thing) at the expense of “aggressive negotiations” scenarios. So: in our time, anyone trying to undo Silicon Valley’s effects is playing the USSR, and can only expect anything good from breaking the rules.

    • SaveTheTuaHawk@lemmy.ca · ↑2 · 7 days ago

      They’ll just use a shitty upscaling algorithm.

      You don’t sell performance to people, you sell numbers.

  • flop_leash_973@lemmy.world · ↑19 · 7 days ago

    Another possible reason consumers don’t seem to care about 8K is the common practice of content owners and streaming services charging more for access to 4K than for 1080p.

    Normalizing that practice invites the consumer to scrutinize the probable cost of something better than 4K against the probable return.

  • ☂️-@lemmy.ml · ↑21 · edited · 7 days ago

    i can’t tell the difference between 1080 and 4k at the distance i use it. let alone 8k.

    we already have nice enough tvs. what about you guys focus on healthcare and shit now?