• NIB@lemmy.world · 3 months ago

    The only thing keeping 4080 (and 5080) cards “reasonably” priced is the fact that they only have 16GB, so they aren't that good for AI shit. You don't need more than 16GB of VRAM for gaming. If those cards had more VRAM, the AI datacenters would pick them up, driving their price even higher than it already is.

    • bitwaba@lemmy.world · 3 months ago

      I have a 7900 XT and was using over 17GB in Jedi Survivor. No ray tracing, no frame gen. Just raw raster and max AA.

      Granted, that’s because that game is so horribly optimized. But still… I used more than 16GB.

    • Dark Arc@social.packetloss.gg · 3 months ago

      You kinda can… Nvidia card users have been having the toughest time with the Hunt Showdown update because CryEngine happily gobbles up VRAM. It's not a problem for AMD cards, but various Nvidia card owners have had bad experiences running at the resolutions they normally use.

      Maybe 16GB is the number where things are okay; I haven't heard complaints about cards above 12GB. Point being, Nvidia being stingy with VRAM has bitten some folks and at least one game developer.

      Still, 32GB seems EXCESSIVE.

    • Evil_Shrubbery@lemm.ee · 3 months ago

      Nvidia and production yields decide how much VRAM they're gonna give them.

      If it made financial sense (if a market existed), Nvidia would stop making desktop cards overnight.