Starfield's performance is locked on Xbox, Todd Howard says, causing worry about the space sim on PC, but a God of War Ragnarok dev comes to Bethesda’s defense.
Yeah how dare consumers expect their products to be good
Good ≠ a single metric.
Every video game and every TV program for DECADES ran at 30fps. 29.97, actually. Nobody was motion sick or got eye strain.
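(Aside: the odd 29.97 figure is real — the NTSC color standard slowed the nominal 30 fps rate by a factor of 1000/1001, a tweak made to keep the color subcarrier from interfering with the audio carrier. A one-liner shows where the number comes from:)

```python
# NTSC color frame rate: nominal 30 fps divided by 1.001
ntsc_fps = 30000 / 1001
print(f"{ntsc_fps:.5f}")  # ≈ 29.97003
```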
People who were actually there at the time say otherwise. And so do I, because I was there too. Slow frame rates look like shit, and they have always looked like shit. The first video game I actually enjoyed because it wasn’t visually uncomfortable to look at was F-Zero X on the N64. Would you like to take a guess as to why?
Just because you’re okay with 30FPS doesn’t make it “fine” or “good” either. Higher FPS is objectively better. Period. That means 30FPS is bad when the other option is 60FPS (or higher, because the console is being DIRECTLY MARKETED to consumers as a 60FPS-120FPS console).
Wow, I didn’t realize you could speak on behalf of everyone’s personal reaction to FPS
Most games of the NES, Genesis, and SNES era ran at 240p, 60fps (in the NTSC regions).
The difference is that TV and movies have a consistent delay between frames. That is often not the case with video games.
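The consistency point can be made concrete with a toy sketch (illustrative numbers, not measurements from any game): two frame-time sequences can share the same ~30 fps average while one of them stutters badly.

```python
import statistics

# Both sequences average 33.3 ms per frame (~30 fps), but the second
# alternates between fast and slow frames — the uneven pacing that
# reads as stutter even though the "average FPS" is identical.
steady = [33.3] * 8           # locked 30 fps, zero jitter
uneven = [16.6, 50.0] * 4     # same mean, wild frame-to-frame swings

for name, times in [("steady", steady), ("uneven", uneven)]:
    mean = statistics.mean(times)
    jitter = statistics.pstdev(times)
    print(f"{name}: mean {mean:.1f} ms, jitter {jitter:.1f} ms")
```

Film projection is the `steady` case; an unlocked, fluctuating game framerate is the `uneven` one.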
Sure, but a game is objectively better if it can run at a higher framerate.
Bloodborne is excellent, but it would 100% be better if it ran at a solid 60 FPS.
Computers (including consoles) have limited resources, so at some point you need to deal with tradeoffs. For example, do you prioritize graphics quality or do you prioritize FPS? Do you want/need to have more resources available for the physics engine? That eats into the maximum possible FPS. Do you want to do real-time procedural generation? Do you want to use the GPU to run some kind of AI? All of these are design considerations, and there's no one-size-fits-all prioritization for all video games. Clearly the people working on Starfield believe that, for their intended game experience, graphical fidelity is more important than FPS, and this is a perfectly valid design choice even if you don't agree with it.
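That tradeoff has a concrete form: every target framerate is a per-frame time budget that all subsystems share. A rough sketch, with made-up subsystem costs purely for illustration:

```python
# Hypothetical per-frame costs in milliseconds — invented numbers,
# just to show how a workload that fits at 30 fps breaks at 60 fps.
costs = {"physics": 6.0, "AI/simulation": 5.0,
         "procedural gen": 4.0, "rendering": 14.0}

for fps in (30, 60):
    budget = 1000 / fps               # ms available per frame
    spent = sum(costs.values())
    verdict = "fits" if spent <= budget else "over budget"
    print(f"{fps} fps: budget {budget:.1f} ms, "
          f"subsystems use {spent:.1f} ms -> {verdict}")
```

The same 29 ms of work fits inside a 33.3 ms frame (30 fps) but blows past a 16.7 ms frame (60 fps) — which is the kind of arithmetic a team faces when choosing what to cut.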
It’s a matter of optimization, and Bethesda games have all had pretty poor optimization. They could get it running at a higher framerate, but there’s no need, because people will buy it even if it runs at 30fps.
If it were only a matter of optimization, we would all still be playing games on the original NES.
What’s so revolutionary or ambitious about Starfield that it couldn’t be optimized to hit an “acceptable” framerate? Pretty much everything Starfield does has been done before, and the Creation Engine isn’t some visual marvel that would burn down graphics cards. So where’s the performance going?