• foggenbooty@lemmy.world · 5 points · 1 day ago

        To hide poor frame rate, that’s why. Motion blur was popularized on consoles by AAA studios that wanted everything to look really pretty, but couldn’t sustain a stable frame rate during rapid motion.

        If you have the FPS to afford it, turn that shit off.
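        (A rough sketch of the idea, in illustrative Python rather than any engine's real code: accumulation-style blur blends a fraction of the previous frame's output into the current one, so at low frame rates the smear papers over the judder between widely spaced frames. The function name and blend factor here are made up for illustration.)

            import numpy as np

            def blur_step(prev_output: np.ndarray, current_frame: np.ndarray,
                          blend: float = 0.35) -> np.ndarray:
                # Keep `blend` of last frame's output; a heavier blend means
                # a longer smear. Purely illustrative -- modern engines use
                # per-pixel velocity buffers instead of whole-frame blending,
                # but the "hide the gaps between frames" effect is the same.
                return (1.0 - blend) * current_frame + blend * prev_output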

    • BurgerBaron@piefed.social · 4 points · 2 days ago

      If that counts, then: in-game rendered intros on first launch that run at 720p, where you can't change the video/display settings until the game finally gives you control.

      • Trainguyrom@reddthat.com · 1 point · 5 hours ago

        I played Assassin’s Creed Origins during a free weekend a few years ago. It automatically set its own graphics settings and dumped me into the game with no way to open the menu, so my screen looked like it was covered in petroleum jelly half the time. About an hour in, when I could finally access the menu, I learned why: it had set every setting to its absolute maximum, but resolution scaling was enabled, so it was trying to render graphics my PC couldn’t handle and then internally dropping the resolution to 360p or so. Once I dialed in settings my computer could actually handle, without resolution scaling, it looked a million times better.
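        (A hypothetical sketch of what a dynamic resolution-scaling loop like that does; the function and parameter names below are invented for illustration, not Ubisoft's code. With every setting maxed, the frame time never gets under budget, so the scale stays pinned at its floor -- the ~360p smear described above.)

            def update_render_scale(scale: float, frame_ms: float,
                                    target_ms: float = 16.7,   # ~60 fps budget
                                    min_scale: float = 0.2,    # ~360p from 1080p
                                    max_scale: float = 1.0) -> float:
                if frame_ms > target_ms * 1.05:
                    # Over budget: render fewer internal pixels next frame.
                    scale *= 0.9
                elif frame_ms < target_ms * 0.85:
                    # Comfortable headroom: claw some sharpness back.
                    scale *= 1.05
                return max(min_scale, min(max_scale, scale))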