Currently trying to refund the new Indiana Jones game because it's unplayable without raytracing. My card isn't even old; it's just that 8GB of VRAM is apparently the absolute minimum now, so my mobile 3060 is useless. I miss when I could play new games back in 2014 on my shitty AMD card at 20fps. Yeah, it didn't look great, but developers still included a very low graphics preset for people like me. Now you need to upgrade every 2 years to keep up.

  • machinya [it/its, fae/faer]@hexbear.net · 7 days ago

    that one is atrocious, but another thing i find nasty is the amount of disk space new games need. sure, buying more storage is way cheaper than getting more graphical power, but downloading 100+ GB for a game I might only play once feels like an incredible waste

    games should have a lo-fi version with lower-res textures and fewer graphical features for those of us who can't actually see the difference in graphics past the PS2 era

  • Gorb [they/them]@hexbear.net · 7 days ago (edited)

    I'm finding the latest visual advancements feel like a downgrade because of image quality. Yeah, all these fancy technologies are being used, but it's no good when my screen is a mess of blur, TAA, and artifacting from upscaling or framegen. My PC can actually play Cyberpunk with path tracing, but I can't even begin to appreciate the traced paths WHEN I CAN'T SEE SHIT ANYWAY.

    Currently binging Forza Horizon 4, which runs at 60fps on high on my Steam Deck and at 165fps maxed out with 8x MSAA on my PC, and it looks beautiful. Why? Because the image is sharp and I can actually see the details the devs put into the game. Half-Life: Alyx is another game on a whole other level, with crisp, clear visuals, and it ran on a 1070 Ti with no issues. Today's UE5 screen vomit can't even compare.

    All games these days know is stutter, smeary images, DX12 problems, and more stutter.

    • Gucci_Minh [he/him]@hexbear.net · 7 days ago

      TAA, DoF, chromatic aberration, motion blur, vignetting, film grain, and lens flare. Every modern dev just dumps that shit on your screen and calls it cinematic. It's awful, and everything is blurry. And sometimes you have to go digging in an .ini file to turn it off because it's not in the settings.
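      For Unreal Engine games in particular, most of these effects can often be forced off with console variables in Engine.ini. This is an illustrative sketch using standard UE cvar names; the config path and which cvars a given game actually honors vary from title to title:

```ini
[SystemSettings]
r.MotionBlurQuality=0            ; motion blur off
r.DepthOfFieldQuality=0          ; depth of field off
r.SceneColorFringeQuality=0      ; chromatic aberration off
r.Tonemapper.GrainQuantization=0 ; film grain off (UE4-era tonemapper)
```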

      • genderbitch [she/her, it/its]@hexbear.net · 7 days ago

        Chromatic aberration! When I played R&C: Rift Apart on PS5 I was taking screenshots and genuinely thought there was some kind of foveated rendering in play because of how blurry the corners of the screen looked. Turns out it was just chromatic aberration, my behated.

        Hate film grain too because I have visual snow and I don’t need to stack more of that shit in my games.

        • Gucci_Minh [he/him]@hexbear.net · 7 days ago

          Dev: should we give our game a distinctive, aesthetically appealing style? Nah, slap some noise on the screen and make it look like your character is wearing dirty Oakleys and has severe astigmatism and myopia. That'll do it.

  • GnastyGnuts [he/him]@hexbear.net · 7 days ago

    I just want ps2-level graphics with good art direction (and better hair, we can keep the nice hair) and interesting gameplay and stories. Art direction counts for so much more than graphics when it comes to visuals anyway. There are Playstation 1 games with good art direction that imo are nicer to look at than some “graphically superior” games.

  • Moonworm [any]@hexbear.net · 7 days ago

    I've got a special fucking bone to pick with Cities: Skylines 2. I've never had a game look like such vaseline-smeared ass while making my computer sound like it's about to take off. It's a shame, because the game has definitely come a long way and has some really nice buildings now, but when I play it I start to get nervous after like half an hour and have to let my computer cool down. Fuck that shit.

  • makotech222 [he/him]@hexbear.net · 7 days ago

    The gamers yearn for Forward+ rendering…

    Yeah, I think gaming as an industry is becoming 'more specialized', which is not necessarily good. Engine developers now mostly work on very generic graphics tech for Unreal or Unity, rather than engine dev being a position at a company that makes its own games, where the engine can be heavily optimized for a specific title.
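    For reference, the Forward+ being yearned for is forward shading plus a tiled light-culling pass: the screen is split into small tiles, each tile keeps only the lights that overlap it, and the shading pass loops over that short per-tile list instead of every light in the scene. A minimal illustrative sketch in Python (using screen-space light bounds as a simplification; real implementations cull against per-tile frusta in view space on the GPU):

```python
TILE = 16  # tile size in pixels, a common choice

def cull_lights(screen_w, screen_h, lights):
    """lights: list of (x, y, radius) in screen space.
    Returns a 2D grid of per-tile light index lists."""
    tiles_x = (screen_w + TILE - 1) // TILE
    tiles_y = (screen_h + TILE - 1) // TILE
    grid = [[[] for _ in range(tiles_x)] for _ in range(tiles_y)]
    for idx, (lx, ly, r) in enumerate(lights):
        # Range of tiles overlapped by the light's screen-space bounds
        x0 = max(0, int((lx - r) // TILE))
        x1 = min(tiles_x - 1, int((lx + r) // TILE))
        y0 = max(0, int((ly - r) // TILE))
        y1 = min(tiles_y - 1, int((ly + r) // TILE))
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                grid[ty][tx].append(idx)
    return grid
```

    The payoff is that a scene with hundreds of lights only pays for the handful touching each tile, which is why Forward+ can keep MSAA and a sharp image where deferred pipelines lean on TAA.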

  • peppersky [he/him, any]@hexbear.net · 7 days ago

    The new Indiana Jones is actually pretty decently optimized: I run it at 1080p, all high/ultra settings, on my RTX 3060 12GB with DLAA enabled, at a mostly locked 60fps. It's leagues better than any UE5 game; it's just the hard VRAM requirements that suck.

    I feel like a lot of the issues game graphics have nowadays come down to GPU prices being ridiculously inflated over the last two decades because of crypto/AI. It's not surprising that devs follow the newest trends and technologies when it comes to graphics, but the hardware needs of raytracing, global illumination, and the like are just too high for the GPU performance per dollar you can get in 2024. I just upgraded from an AMD RX 480 to a used Nvidia RTX 3060 12GB (which seemed to be the best bang for the buck; an RTX 4060 would have been much more expensive for not a lot more performance), and that upgrade gets you maybe double the performance, for a GPU that is almost five years newer (and no VRAM upgrade at all if you get the base model). These cards simply shouldn't cost as much as they do. If you don't have unlimited money to spend, you're going to have a much worse experience today than half a decade or a decade ago.

    • 7bicycles [he/him]@hexbear.net · 7 days ago (edited)

      They were kind of correct back then too, given the amount of upgrading the industry expected you to do. That petered out for a while, luckily. Seems to be back in full force now though.

      That said, at least back then all that stuff gave you actual graphical functionality, instead of raytracing on retinas or some bullshit you'd never notice.

      • KobaCumTribute [she/her]@hexbear.net · 7 days ago (edited)

        I think that has to do with consoles: when a console generation is outdated mid- or low-range hardware, it forces more general optimization and less added bullshit, especially when that generation drags on way too long and devs end up targeting what is basically a decade-old gaming computer. When consoles are loss leaders and there's a shorter window between generations or upgraded same-generation versions, devs only optimize enough to run on a modern mid-range gaming rig, and specifically on the console configuration of that.

        There's some extra stuff to it too. The Nvidia 10 series was an amazing generation of GPUs that stayed relevant for the better part of a decade, and the upper end of it is still sort of relevant now. Nvidia rested on their laurels after that and has been extremely stingy with VRAM, because their cash cow is now high-end server cards for AI bullshit and they want businesses to buy $5000+ cards instead of <$1000 ones that would work well enough if they just had a bit more VRAM. GPUs have also gotten more and more expensive because crypto and AI grifters let Nvidia know they can keep raising prices while delivering less and people will still buy their shit, with AMD just grinning and following after them, delivering better cards at lower prices, but not that much lower, since they can get away with it too.

  • AstroStelar [he/him]@hexbear.net · 7 days ago

    I'm lucky enough not to be that interested in high-spec AAA titles to begin with: of the 100+ games on my DIY wishlist, I'd say fewer than 10 fall into this category. It's mostly indie/retro, older, or mid-budget titles.

    • 7bicycles [he/him]@hexbear.net · 7 days ago

      I'm pretty sure that's on the way out. Up until the PS2/Xbox/GameCube era, you bought the game and then you played the game. Then came installing games on your console and managing disk space and shit. Now there are pro versions and graphical settings in console games. By the time the generation after next is in place, it's just going to be like choosing prebuilt PCs and futzing around with graphical settings all the same, except on way more locked-down hardware.

    • godlessworm [comrade/them]@hexbear.net · 7 days ago (edited)

      every game should just be made with the same engine as Doom 2016 (id Tech 6). it looks amazing, runs really smooth on modest hardware, and it isn't Unreal Engine 5

      the Frostbite engine too. Battlefield looks amazing and runs really well on modest/mid-tier hardware. i actually thought the new Indiana Jones WAS running on the Battlefield engine just based on how it looked, but apparently not

  • HappyTimeHarry@lemm.ee · 5 days ago

    I'm having the opposite experience with Indiana Jones. I expected it to be unplayable on my mobile 3060, but it actually runs great. I had to set a weird launch option that I found on ProtonDB, but once I did, it went from unplayable at under 10fps to a smooth 60fps at 1080p.

    Try enabling this setting

    __GL_13ebad=0x1
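    For anyone unsure where that goes: on Steam, environment variables like this are set in the game's launch options with %command% appended (this is generic Steam launch-option syntax, not specific to this variable):

```
# Right-click game → Properties → General → Launch Options
__GL_13ebad=0x1 %command%
```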
    
  • ouRKaoS@lemmy.today · 5 days ago

    The price of storage dropping quickly ruined optimization, so things went from "how do I get the most out of a tiny bit of space?" to "fuck you, upgrade your gear!"