KronisLV 1 hour ago
Overall, I'm pretty happy with my Intel Arc B580; if framegen helps me squeeze a little more life out of that card before it becomes more or less obsolete, I'll gladly take it. Though, to be honest, with the amount of UE5 slop out there, I'll probably need to give an unreasonable amount of my money to Nvidia or AMD sooner or later (since many games don't exactly let you turn off Lumen and Nanite). It's just unfortunate that Intel itself won't provide the much-needed market competition in the form of a B770.
bpavuk 3 hours ago
I can't believe that some people are enjoying MFG, however small that group is. Me personally? I hate the cognitive dissonance of "it looks like 120 FPS yet the input lag is more like 40-60 FPS". Plus, FG itself has a performance tax, which in my case means an input lag tax.

It's input lag that defines the experience, not frame rate. I am comfortable with 30 FPS (sometimes fewer frames even fit the style of the game, e.g. Dishonored 2, Clair Obscur) as long as the game responds instantaneously.

PacificSpecific 1 hour ago
I remember around 2012 having discussions about how the 6 frames of input lag in the PS3 version of Ultimate Marvel vs. Capcom 3 made the game borderline unplayable, and that's why the PS3 version was not used at tournaments. Can't believe how far we've blasted past that benchmark.

Completely agree, input lag is the most important thing.

kingstnap 2 hours ago
I think frame or even multi frame generation combined with Asynchronous Reprojection / Frame Warp might be a very good idea.

https://youtu.be/f8piCZz0p-Y?si=OLq9iZUjuRMYKPDo

If you have never heard of it, the basic idea is that you make low FPS feel responsive in first person games by having the mouse motion warp the existing frame independently of when a new frame is actually rendered.

This could be combined with some AI techniques to help sort out the edge artifacts you get from this.
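A toy numpy sketch of the warp idea, with a plain horizontal pixel shift standing in for a real perspective reprojection (function and variable names here are hypothetical, just for illustration):

```python
import numpy as np

def reproject(last_frame: np.ndarray, mouse_dx_pixels: int) -> np.ndarray:
    """Crudely 'warp' the last rendered frame by shifting it opposite to
    the mouse motion, so the view responds on every display refresh even
    when no new frame has been rendered yet."""
    warped = np.roll(last_frame, -mouse_dx_pixels, axis=1)
    # Columns revealed by the shift have no data; real implementations
    # must in-paint or stretch these edge regions, which is exactly
    # where AI fill-in could help.
    if mouse_dx_pixels > 0:
        warped[:, -mouse_dx_pixels:] = 0
    elif mouse_dx_pixels < 0:
        warped[:, :-mouse_dx_pixels] = 0
    return warped

frame = np.ones((4, 8), dtype=np.uint8)  # stand-in 4x8 grayscale frame
warped = reproject(frame, 3)
print(warped[0])  # the 3 rightmost columns are blank: the edge artifact
```

The blank edge columns are the artifacts the video discusses; a full implementation would warp with the depth buffer and camera matrices rather than a flat shift.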

short_sells_poo 50 minutes ago
At 30 fps you already have ~33ms between frames, so you aren't getting anything close to instant input.
wmf 4 hours ago
There's nothing I want less than multi-frame generation. I guess some people want to feel like they're getting their money's worth from their 240 Hz monitors.
boyter 3 hours ago
If you have a high frame rate to start with, it's pretty nice and feels smoother. But a low frame rate turned into a high one looks good but feels laggy.

So arguably you never need frame gen for a game, since it only really works when things are already pretty nice.

out_of_protocol 1 hour ago
FPS gets increased but latency does not, and that's what's important.
ece 2 hours ago
Gamers chased high FPS, that's what they got.
boyter 1 hour ago
Chased the wrong thing. It’s the 1% lows that matter more generally.
ece 17 minutes ago
When getting rid of actual performance bottlenecks is too hard or costs too much, just make something up.

XeSS is actually pretty great: I played Talos Principle 2, a UE5 game, on the Steam Deck at 800p 30fps thanks to XeSS.

joe_mamba 2 hours ago
If you're on Intel integrated graphics, it's a free potential upgrade that makes use of existing silicon, and you don't have to turn it on. I don't get the hate: just don't enable it if you don't want it.

I get that people want more real frames rather than more "fake" frames, but in that case you wouldn't be buying integrated graphics; and if you did end up with an iGPU, you'd be aware of the limits and be happy for any improvements arriving via software.

It's like people let their hatred of the AI and LLM bubble blind them, and their brains can't compartmentalize good news from bad anymore.

bigyabai 2 hours ago
It's a great option to have. Once you reach the 2-7ms frame time territory, you're approaching the CPU bottleneck for many game engines even on the fastest hardware. For newer titles like GTA VI, framegen might be the only reliable path to 120+ FPS without pinning all of your cores.

Framegen is also a good fit for low-end hardware like the Steam Deck, which can hit 30 or 45 FPS in stuff like Elden Ring but is far from the max 90hz of the OLED model's panel. For a handheld, trading a bit of 720p visual clarity for locked 90hz gameplay is a solid trade if you can get it working.

Borealid 2 hours ago
Would you say a game is running at 90fps if, 45 times per second, two frames are produced, the second of which is a linear interpolation of the frame before and after it?

How about if the two frames are 100% identical?

Does either of these situations differ substantially from what is being discussed, wherein the render pipeline can only produce a new render 45 times per second?

close04 38 minutes ago
> the second of which is a linear interpolation of the frame before and after it

If I understand what you describe, this is generating a frame "in the past", an average between 2 frames you already generated, so not very useful? If you already have frames #1 and #2, you want to guess frame #3, not generate frame #1.5.

The higher the "real frame" rate, the smaller the differences from one to the next. This makes it easier to predict those differences, and "hide" a bad prediction. On the other hand if you have 10FPS you have to "guess" 100ms worth of changes to the frame which is a lot to guess or hide if the algorithm gets it wrong.
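A minimal numpy sketch of that distinction, tracking a single object's position rather than full frames (all values illustrative):

```python
import numpy as np

f1 = np.array([0.0, 0.0])   # object position in real frame #1
f2 = np.array([10.0, 4.0])  # object position in real frame #2

# Interpolation: an average of two frames you already have. It can only
# be shown "in the past" (between #1 and #2), so it adds smoothness at
# the cost of delay: frame #2 must exist before frame #1.5 can appear.
f1_5 = 0.5 * (f1 + f2)

# Extrapolation: a guess at frame #3 from the motion between #1 and #2.
# No added delay, but the guess is wrong whenever the motion changes.
f3 = f2 + (f2 - f1)

print(f1_5)  # midpoint position (5, 2)
print(f3)    # predicted position (20, 8)
```

This is also why the comment's point about frame rate holds: the shorter the gap between real frames, the smaller the motion delta being averaged or guessed, and the less visible a miss.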

Borealid 33 minutes ago
I chose the two scenarios I did to illustrate that "frames per second" is clearly not meant to be measured in terms of times the display refreshed, but rather in terms of times content was actually rendered by the game engine.

In my opinion it is quite difficult to provide a definition of "fps" that somehow makes 45-fps-native-with-frame-doubling be counted as 90 but doesn't also make either of the ludicrous examples I presented be counted as 90.

Incipient 2 hours ago
My understanding is that frame generation uses motion vectors to (slightly?) adjust the scene to produce a "highly plausible" next frame to drop in before the following "real" frame.

I've only seen videos, so from a somewhat unrealistic perspective, it seems like an acceptable compromise for low end hardware in particular.

Boosting 120hz to 240hz admittedly seems silly.

Borealid 29 minutes ago
My comment isn't denigrating frame generation, which can be useful.

It's pointing out the absurdity of calling "45fps plus 1-for-1 frame generation" "90fps" in any sense. It's not, and you aren't hitting a 90Hz refresh-rate target with it any more than you were without it. In fact, it lowers real FPS, because it consumes resources that would otherwise have been available to the render pipeline.

I wish reviewers in particular would stop saying e.g. "120fps with DLSS FG enabled" and instead call out the original render rate. It makes the discourse very confusing.

enjoykaz 1 hour ago
The Steam Deck case is the clearest test. In an Elden Ring boss fight, your inputs still land at 45Hz while your eyes see 90.

For readable patterns it's probably fine; for reaction-window timing you're being misled.

izacus 1 hour ago
What are you being "misled" about exactly?
enjoykaz 59 minutes ago
Frame gen creates frames the game engine never rendered. You see an enemy wind-up at an interpolated timestamp; react to it, and your input lands on the next real frame, up to ~22ms later. At least that's my understanding of how it works; happy to be corrected.
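The ~22ms figure works out to one real frame interval at 45 fps; a quick sanity check (helper name is made up):

```python
def worst_case_reaction_penalty_ms(real_fps: float) -> float:
    """If you react to a generated (interpolated) frame, the input can
    only take effect at the next real engine frame, so the worst-case
    extra delay is roughly one real frame interval."""
    return 1000.0 / real_fps

print(worst_case_reaction_penalty_ms(45))  # ~22.2 ms at 45 real fps
```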
DeathArrow 35 minutes ago
Does Intel even try to compete with Nvidia? Or are they content with the bottom end?