
Lossless Scaling


Guest The_Sextein

Recommended Posts

Guest the_sextein

One more tip. Baldur's Gate uses V-sync by default. In the Lossless Scaling app you can turn on the option to allow tearing, which improves your response times while V-sync still keeps tearing out of the display. I had a very minor issue with a momentary blur on the edge of the fog of war; doing this fixed it, so it could be down to CPU performance, but I'm pretty sure it's tied to response times. I've switched to 2X mode, which results in 60 FPS on a 120 Hz refresh. To me this just feels the best, probably because of how the frame times line up with my refresh rate. For games that can naturally go beyond 30 FPS, they mentioned locking the frame rate to half of your monitor's refresh rate. I wouldn't totally count out locking it to 1/3 and using 3X mode either. I locked my FPS to 40 and tripled the frame rate to 120 with some old shooter games and had no issues in some of them. For Baldur's Gate I settled on 2X mode, though.
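To make the arithmetic behind those multiplier choices concrete, here is a quick sketch (my own illustration, nothing to do with Lossless Scaling's internals) that checks which frame-rate cap and multiplier combinations land evenly on a given refresh rate:

```python
# Illustrative sketch only: which (cap, multiplier) pairs land exactly on a given refresh rate.
# The numbers are examples from this thread; the logic is plain arithmetic, not the app's API.

def fits(refresh_hz: int, base_fps: int, multiplier: int) -> bool:
    """True if base_fps * multiplier fits the refresh rate and divides it evenly."""
    output = base_fps * multiplier
    return output <= refresh_hz and refresh_hz % output == 0

for base, mult in [(30, 2), (30, 3), (40, 3), (60, 2)]:
    for hz in (60, 120, 144):
        print(f"{base} FPS x{mult} on {hz} Hz -> {'even fit' if fits(hz, base, mult) else 'uneven'}")
```

The idea is simply that the generated output rate should divide the refresh rate evenly, which is why 30 x 2 lines up nicely on a 60 or 120 Hz display but not on 144 Hz.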

Link to comment
Posted (edited)
On 7/4/2024 at 1:00 PM, Guest The_sextein said:

Verbal diarrhea and worthless, fallacious appeals to authority

The soap opera effect is entirely reliant upon the viewer's perception of what is expected versus what is observed. If you've watched films that play at 24 FPS for the first 25 years of your life and then some misguided jackass decides "hey, let's double up the frame rate by interpolating a bunch of made-up frames in between the real frames that don't actually add anything new but do screw with the viewer's perception of motion", you understandably might not like the effect. If the viewer happened to only ever have watched films that played at the doubled FPS (interpolated or otherwise) over the course of their life, I would expect they would instead find the standard of 24 FPS to be the one that looks unnatural. As I've already implied, this would be fine, and as I've actually already said, it is also fine for someone to like this. I do not, and for already stated reasons.

Now, do you know what Baldur's Gate happens to be? A game that I have played for 25 years at a set FPS of 30 that you are telling me to insert a bunch of made-up frames into that don't actually add anything new but do screw with my perception of motion in the game. You know what I call that? The soap opera effect, on account of it being a one hundred percent analogous situation to what one can observe when you interpolate additional frames with film. I am not the only one who has observed this, as you yourself might have noticed if you've read some of the negative reviews on Steam for Lossless Scaling - not to mention how bad/artifacted the generated frames look, or the fact that it actually makes games feel laggier despite the artificial visual smoothness. Now you can take the rest of the horseshit that you've crapped out and shove it straight back up your stupid ass.

On 7/4/2024 at 1:00 PM, Guest The_sextein said:

Nobody recommended that you watch anime with Lossless Scaling and it has nothing to do with this thread or Baldur's Gate.

Whoops, I guess that must be my mistake.

On 6/18/2024 at 4:48 PM, Guest The_sextein said:

Frame generation for games has no soap opera effect. It's like playing at 60 FPS while the game is running at 30. Frame generation brings smoother motion and animation with higher detail retention under movement. CGI cutscenes are also upgraded and run at 3x the frames, making the production levels seem higher. This tool can be used for watching movies or anime as well.

Oh no, it wasn't, because you did suggest trying exactly that, in a post directly after and replying to mine while explicitly talking about the frame generation feature. Wow, imagine that, someone being that big of a sorry sack of shit. I was not able to successfully record Baldur's Gate with Lossless Scaling during the time that I owned it (it was already a nightmare to record the little bit of the show that I did), but you are more than welcome to show everyone how great it looks if you're capable of doing so. If you did, you perhaps might even get a couple of people to buy it and jump in to defend the stupid garbage you're constantly spewing! Hopefully not, though - no, hopefully other people are smart enough to realize "it's okay if I like this when somebody else doesn't".

I remind everyone that it is perfectly fine to like the effect, which I said all the way back in my second post in the thread before I started getting attacked for not liking it. Go try it out yourself, as a matter of fact - keep and use the program if you do, refund it if you don't, Steam gives you a refund window of 2 hours of playtime, which should be more than enough to determine if you like Lossless Scaling or not. I am merely taking issue with one particularly lousy and unintelligent guest poster who insists that their subjective views are always the one hundred percent objectively correct ones all the time no matter the experiences of anyone or everyone else, not to mention in the face of all decency and common sense. "Autistic screeching" indeed...just a little more projection (and also some lovely ableist discrimination!) on the part of Mr. Shit-for-Brains guest poster here.

Edit: Having now watched Jarno's video, it is pretty amusing to see the video point out the same (and more) types of artifacts I'd already mentioned, as well as the input delay. Good to see it in recorded video and in slow motion. The author was sometimes fine with it, but at least they're cognizant of the drawbacks and why someone might not like it. They even talked about Command and Conquer 3, an M+KB-based game locked to 30 FPS (albeit not largely sprite-based like Baldur's Gate is) and pointed out that it didn't work well for it. I'd have been interested to see how they felt about, say, emulated S/NES games.

[Attached: nine screenshots of other users' comments describing the same soap-opera-like effect and artifacts.]

And the games shown to these commenters weren't even of an "only a set number of distinct frames exist in the first place" nature like Baldur's Gate is, which is where I personally feel the soap opera effect is most likely to show up, just as it is with movies/television in the same situation.

Edited by Bartimaeus
Link to comment
Guest the_sextein

Yes, there are lots of stupid people both for and against new technology. Quoting them doesn't help your case. I know people who play Counter-Strike at 600 FPS and people who play it at 60 FPS. Games are rendered by graphics cards, not cameras. All frames are fake, and film was only mentioned to explain why the technology is not the same as traditional frame interpolation. Nobody ever recommended that you watch film with it. Anime and movies are meant to be watched at 24 FPS, so of course I wouldn't ruin the experience by trying to watch them the wrong way. Games are meant to be played at any frame rate. Most games run at a variable frame rate and will switch from 90 FPS to 60 FPS when you move your point of view in-game.

The soap opera effect you are complaining about only occurs when you watch something that was recorded at a consistent 24 FPS and then had frames added to it to make the recording appear different from how it was meant to be viewed. I'm talking about generating frames in real time with a graphics card. Games are not recordings that are captured with a camera and edited for movie viewing. They are controlled in real time and are meant to be played at the highest frame rate your hardware can manage at any given time. This includes old games that had budget and technological limitations. If you don't like the way the game looks at 60 FPS I would assume you have it set up wrong for the refresh rate of your monitor, but I would respect anyone's opinion who said that they are used to seeing it at 30 FPS and just don't like the way it looks at 60 FPS. As long as they don't make false claims and argue against the facts like you did, I wouldn't challenge their opinion or try to change it. I won't lie, though: it's my opinion that these people are stuck in the past and are choosing an inferior way of playing just because they can't move on. In the same way, audiences wouldn't accept The Hobbit at 60 FPS even when it was intentionally shot and meant to be viewed at a higher frame rate. People get stuck on things sometimes.

Link to comment

So, having watched that video ...

- The upside potential here is ... barely even noticing the difference, because human eyes don't process input that fast. 20-some FPS is already enough to look like smooth motion from a human perspective.

- If things go wrong, there will be easily visible distortion effects.

I am not in the target audience of this program, at all. I don't even know the refresh rate of my laptop's screen - that display doesn't support changing the rate, so it doesn't even bother telling the user what it is.

Link to comment
Guest the_sextein
2 hours ago, jmerry said:

So, having watched that video ...

- The upside potential here is ... barely even noticing the difference, because human eyes don't process input that fast. 20-some FPS is already enough to look like smooth motion from a human perspective.

- If things go wrong, there will be easily visible distortion effects.

I am not in the target audience of this program, at all. I don't even know the refresh rate of my laptop's screen - that display doesn't support changing the rate, so it doesn't even bother telling the user what it is.

If it doesn't say the refresh rate in the graphics control panel, one thing you could do is simply run the game in 2X mode, which is what I would recommend anyway. Your monitor will be at least 60 Hz, and doubling the FPS from 30 to 60 will work out perfectly. I would stick with 60 FPS 2X mode even if you have a 120 Hz display. This software is constantly being updated, so we may see a 4X mode that could work at 120 FPS on a 120 Hz display in the future, but until then I think it's best to stay at 60 FPS 2X mode based on what I've learned. The game already has a locked 30 FPS by default, so you won't need Special K or any other software to lock the frame rate either.

Also, regarding artifacts in games: sometimes they are noticeable and sometimes they aren't. In Baldur's Gate, I can't see even the slightest hint of issues at the edge of the screen or on any of the sprite animations or spell effects. Camera scrolling speed can be controlled in the game options, and mouse speed can be controlled in the game options or in the Lossless Scaling app. Camera motion will be smoother and backdrop detail will hold up better in motion. Sprite animations and spell animations will have less jitter and will display more smoothly. Other than that, there is no difference. DLDSR can bring out a higher level of detail from the backdrops if your GPU supports it. You will need to check the graphics control panel; you can probably do that by right-clicking the desktop and looking for a control panel from Nvidia or AMD in the menu that pops up.

Link to comment
Guest the_sextein
3 hours ago, jmerry said:

So, having watched that video ...

- The upside potential here is ... barely even noticing the difference, because human eyes don't process input that fast. 20-some FPS is already enough to look like smooth motion from a human perspective.

- If things go wrong, there will be easily visible distortion effects.

I am not in the target audience of this program, at all. I don't even know the refresh rate of my laptop's screen - that display doesn't support changing the rate, so it doesn't even bother telling the user what it is.

One other thing that I thought of: DLDSR can be very demanding and it might be hard on a laptop. If you don't have a modern Nvidia card then you probably don't have the option anyway. You could try using the bicubic image scaling option in the Lossless Scaling app and increasing the sharpness through that. This usually does a pretty good job of bringing out further detail. I didn't notice any visual problems with the bicubic upscaling mode, but it's possible it could introduce some issues, so you may want to keep that in mind. Turning off scaling and using frame generation by itself is the safest option to avoid visual anomalies, but I think you should experiment and find what works best for you and your setup.

Link to comment
Posted (edited)
8 hours ago, Guest the_sextein said:

Yes there are lots of stupid people that are both for and against new technology. Quoting them doesn't help your case. I know people who play counterstrike at 600FPS and people who play it at 60FPS.  Games are rendered by graphic cards, not cameras.  All frames are fake and film was only mentioned to explain why the technology is not the same as traditional frame interpolation. Nobody ever recommended that you watch film with it.  Anime and movies are meant to be watched at 24FPS so of course I wouldn't ruin the experience by trying to watch them the wrong way.  Games are meant to be played at any frame rate.  Most games run at a variable frame rate and will switch from 90FPS to 60FPS when you move your point of view in game.

The soap opera effect you are complaining about only occurs when you watch something that was recorded at a consistent 24FPS and then added frames to it to make the recording appear differently than it was meant to be viewed. I'm talking about generating frames in real time with a graphic card.  Games are not recordings that are captured with a camera and edited for movie viewing.  They are controlled in real time and are meant to be played at the highest frame rate your hardware can manage at any given time.  This includes old games that had budget and technological issues.  If you don't like the way the game looks at 60FPS I would assume you have it set up wrong for the refresh rate on your monitor but I would respect anyone's opinion who said that they are used to seeing it at 30fps and just don't like the way it looks at 60fps.  As long as they don't make false claims and argue with facts like you did I wouldn't challenge their opinion or try to change their opinion.  I won't lie though,  It's my opinion that these people are stuck in the past and are choosing an inferior way of playing just because they can't move on.  In the same way that audiences wouldn't accept the Hobbit at 60 FPS even when it was intentionally shot and meant to be viewed at a higher frame rate.  People get stuck on things sometimes.

From the beginning, all I have said is that I think frame interpolation looks bad because [subjective observations and experiences]. I didn't say it wasn't worthwhile for Nvidia or other companies to pursue, I didn't say it was wrong for anyone else to like it, I didn't try to go "ackshually" on why everyone should realize it's bad because of my own reasons. I just said that I didn't like it and why, and that I couldn't find a use case where the positives subjectively seemed to outweigh the negatives, including even trying the specific examples you told us to try (yes, including trying it on animated shows, as was a suggestion from a post I directly quoted from you, unless you're saying somebody else made a post under your guest alias). But I also don't like motion blur in video games - or for that matter, depth of field effects the vast majority of the time as well. I think those effects actively make games look worse... Yet they're included and enabled by default in the vast majority of modern 3D games, despite the fact that a cursory search engine query shows how many people hate these visual effects. But do the majority of gamers hate them? I don't know, probably not, if the majority of gamers even notice them in the first place...but certainly a sizable subsection do. What I am trying to say by all of this is that appeals to authority in the way of Nvidia et al. trying to push this technology are completely irrelevant to the fact that plenty of people like myself think this effect, and others which are also designed to make games look better, instead look bad. Beyond maybe trying to make someone notice what the benefits are so that they aren't only noticing the drawbacks, there's absolutely nothing you can do or say that can change that. I can appreciate that it works the same way in reverse, which is why I have not attempted to convince you that you are wrong for liking it or are in dire need of education, beyond correcting your attempts to explain to me things I already know.

With regards to the soap opera effect...I see it, other people see it, there's clearly something there that is making people experience what seems like an identical or at least similar visual phenomenon to the soap opera effect, otherwise it wouldn't keep getting mentioned by many more people than just me. I also don't really understand why you're so stuck on "24 FPS and film" versus "30 FPS and video game" when it otherwise seems like a completely analogous situation. My brain expects 24 FPS when watching film and it gets upset when it sees these distracting interpolated frames that mess with perceived motion; my brain expects 30 FPS when playing BG and it gets upset when it sees these distracting interpolated frames (or generated frames, if you so insist - I don't see any difference except for the particular technique used to blend true/natural frames together) that mess with perceived motion. Is there some article you can point me to that says the soap opera effect can only apply to film? As an aside, your specific example of The Hobbit was actually a real 48 FPS in theaters (not 60, like soap operas have historically been shot at AFAIK...and I'm not sure if the motion smoothing setting of TVs attempts to do 60, just a doubled 48, or something else entirely). I never saw the 48 FPS version of The Hobbit, so I can't really comment on how its real 48 FPS might have looked compared to a 24 -> 48 interpolation, but given people's overwhelmingly negative reaction towards it even though it was real, it would seem to support my position of it being an issue of expectations...just like is the case for me with Baldur's Gate. So regardless of whether we're talking about film or video games, this is fundamentally about visuals and frame rate and expected vs. perceived motion, and in the particular case of the Infinity Engine games, these are games whose visuals have always displayed at a consistent FPS, just like film does. Well, that's not quite true, because silent films used to be shot and displayed at many different frame rates, anywhere between 12 and 40 FPS. I watch more films than I play video games these days by far, and I occasionally watch silent films, and I cannot say that I've ever experienced anything akin to what these interpolated (or "generated") frames look like. I've yet to see even a single example where they look or feel like true frames...as would be naturally output by the game engine (or video player) and subsequently rendered by your GPU, as opposed to some machine learning algorithm that just makes its best guess by looking at what the next and previous frames look like and trying to make some blended amalgamation of the two. If someone could make an engine hack that has the game render internally at 60 FPS while just doubling up the frames of the sprites (i.e. a natural 60 FPS for stuff like camera pans and cursor movement, but all the sprite animations stay the same because of being doubled up and not sped up or interpolated), I don't think we'd be having this conversation, because it just looks and feels fundamentally different to me.
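To illustrate what I mean by that last hypothetical (purely my own toy sketch, not anything the engine or Lossless Scaling actually does), here's the difference between doubling up real sprite frames at a higher output rate and inventing in-between frames by blending neighbours:

```python
# Toy sketch (not real engine code): a 30 FPS sprite animation shown at 60 Hz output.
# "Doubling" repeats each real frame; "interpolation" invents an in-between frame by blending.

frames_30fps = ["A", "B", "C", "D"]  # stand-ins for real sprite frames

# Frame doubling: every output frame is a real frame, each shown twice.
doubled_60hz = [f for f in frames_30fps for _ in range(2)]
# -> ['A', 'A', 'B', 'B', 'C', 'C', 'D', 'D']

# Interpolation/generation: every other output frame is a made-up blend of its neighbours.
def blend(a, b):
    return f"({a}+{b})/2"  # placeholder for whatever the interpolator actually produces

interpolated_60hz = []
for a, b in zip(frames_30fps, frames_30fps[1:] + frames_30fps[-1:]):
    interpolated_60hz += [a, blend(a, b)]
# -> ['A', '(A+B)/2', 'B', '(B+C)/2', 'C', '(C+D)/2', 'D', '(D+D)/2']

print(doubled_60hz)
print(interpolated_60hz)
```

The doubled stream never shows you anything that wasn't in the original animation; the interpolated stream shows a made-up frame every other refresh, which is exactly the part my eyes object to.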

4 hours ago, jmerry said:

The upside potential here is ... barely even noticing the difference, because human eyes don't process input that fast. 20-some FPS is already enough to look like smooth motion from a human perspective.

If you have a 480 Hz screen and watch 479 frames of the exact same image - let's say white - but one frame is totally different - black - you should be able to notice that single black frame. I'm not sure exactly what the upper limit is (some Google searching suggests the polling rate of the human eye is somewhere between 700 and 1,000 Hz), but extreme differences are perceptible at least up to 480 Hz. This is also pretty noticeable with...say, moving your white mouse cursor across a dark screen very quickly: the difference in the smoothness of motion of the mouse cursor between 20 and just 30 is huge, the difference between 30 and 60 (I don't think any computer/laptop screens have been made with less than 60 in the last 20 years at the very least) is about as big, and there's a pretty big difference between 60 and 120 as well. As you go beyond that, the differences start to get smaller, but given a big enough jump they can certainly still be perceptible (e.g. 120 to 180, or at least 240), though admittedly less so when you're not paying strict attention. Now, the tricky bit with this kind of interpolation technology is that it attempts to give you the perception of smoothness (just as you would get from a naturally high frame rate) by using generated frames that mostly kind of look like natural frames. By their very nature of deliberately trying to look like the previous and subsequent frames, the differences are not extreme and therefore much less perceptible, especially when they come in at a relatively decent speed like 120 FPS...but then again, if literally half of your frames are machine learning-generated (which...I mean, it just flat-out doesn't do a good enough job, at least not yet or in the implementations I've seen), you might still be able to notice that something is wrong after all, even if you can't always concretely put your finger on exactly what without recording and slowing down the footage. It will depend on the exact use case, how fast-paced it is, the type of camera you're wielding, and probably some other factors. Much more perceptible, in my opinion, is the input delay: the more input delay you have, the more sluggish and drunk your controls feel, no matter how smooth it looks, and that's not a trade-off I'm willing to make pretty much ever, even without the other problems...but especially so with them. I must reiterate that I genuinely struggle to understand how anyone could play a fast-paced first-person shooter like Cyberpunk 2077 with frame generation on: it feels so bad to look around while having that additional input delay. Even the mouse cursor in Baldur's Gate felt a little bit wonky while using frame generation, and it's a completely static isometric camera. I am famously hypersensitive, though...but I take comfort in the fact that I am by no means alone.
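For anyone who wants the raw numbers behind the above, here's a quick back-of-the-envelope calculation (my own arithmetic, nothing more):

```python
# Rough arithmetic only: how long a single frame stays on screen at various refresh rates,
# and what share of displayed frames are generated when the frame rate is doubled.

for hz in (30, 60, 120, 240, 480):
    print(f"{hz:>3} Hz -> each frame shown for {1000 / hz:.2f} ms")

base_fps, multiplier = 30, 2          # e.g. Baldur's Gate at 30 FPS with 2X frame generation
output_fps = base_fps * multiplier
generated_share = (output_fps - base_fps) / output_fps
print(f"{base_fps} FPS x{multiplier} -> {output_fps} FPS, "
      f"{generated_share:.0%} of displayed frames are generated")
```

At 480 Hz a rogue frame is on screen for only about 2 ms and can still register, while with 30 FPS doubled to 60, fully half of what you're shown was never produced by the game.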

Edited by Bartimaeus
Link to comment
Guest the_sextein
26 minutes ago, Bartimaeus said:

From the beginning, all I have said is that I think frame interpolation looks bad because [subjective observations and experiences]. [...] I am famously hypersensitive, though...but I take comfort in the fact that I am by no means alone.

That's fine; I'm tired of arguing about it, to be honest. I never tried to tell you to like it or not. I told you it is not the same as old-school interpolation and you are not experiencing the soap opera effect. Outside of that, I mentioned that the app can be used for anime and movies because it can. I never recommended that you use it for those things and would never use it that way myself. Unlike you, I'm a purist who plays back movies exactly as they are intended. I asked you to stop posting about it because it was off topic. You didn't state your opinion about Lossless Scaling on Baldur's Gate until after we argued about those things, and those are the things I have continued to argue about. I honestly don't care if you like Lossless Scaling or not. It's completely unimportant to me.

I read that The Hobbit was supposed to be 60 FPS but the studio refused to foot the bill. A test audience was shown a small amount of footage at 48 FPS and it didn't go over well, so they scrapped it. I could be wrong, but I don't think it ever saw the light of day above the usual 3D 24 FPS theater release.

Cyberpunk uses DLSS, which actually improves response times. Frame generation slows them back down, and then Reflex improves them again. As long as you have a 60 FPS base, the game feels fine with frame generation on, in my opinion. Baldur's Gate looks and feels the same as it always did in everything but animation fluidity, which I like, because the original is very jerky and has terrible blur issues when the camera moves. I get it if you are used to seeing something a certain way and don't want to change it. I don't have that issue, but I'm fine with it.

Link to comment
Posted (edited)
55 minutes ago, lynx said:

In GemRB you can simply set a higher (maximum) drawing FPS. It's not tied to 30 or the AI speed any more. :)

Well, fancy that. And now, having actually tested it out...I can truthfully report that GemRB looks and feels quite nice in comparison to the original frame-capped engine. I did it with an initially uncapped FPS and noticed from the noise of my GPU that my computer didn't seem to much like that (no surprise - an uncapped FPS that runs into the many hundreds or even thousands makes just about any GPU start to melt down, no matter how undemanding a game's visuals are), so I went to 60 FPS and thought it seemed really good...and then I restarted it at 30 FPS and thought "oh no". If this is what other people see and feel when they try Lossless Scaling's frame generation on an Infinity Engine game, I could understand why someone might like it so much, because it is quite swanky. But...it didn't look or feel like this to me at all.
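(For anyone who wants to reproduce this, the cap is just a value in GemRB's config file. I'm writing the key name below from memory, so treat it as a guess and check GemRB.cfg.sample for the exact spelling and default.)

```ini
# GemRB.cfg - illustrative snippet only; the key name is my guess, verify against GemRB.cfg.sample
CapFPS = 60    # drawing FPS limit used for the test above; raise it to try higher rates
```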

60: https://dl.dropboxusercontent.com/scl/fi/6a8zhj5m383sx9pqfdpa6/gemrb_3afu1jkbpf.mp4?rlkey=9dqnwt3rnjvkp1xj7hsi0wxza&dl=0

30: https://dl.dropboxusercontent.com/scl/fi/qjh35ttqbjysppjq742th/gemrb_c8qYBP52Tt.mp4?rlkey=y75g4456dvt5wkzqxflmqwjry&dl=0

The weather effects and clock in the bottom left of the screen seem like they're going at double speed, but I don't think anything else is. The fact that it so obviously looks and feels so much better (which I think can only be properly experienced by actually playing with both GemRB and Lossless Scaling's frame generation feature) really only makes me even more inclined to believe that frame interpolation always has been, currently is, and will always be bad...but hey, different strokes for different folks.

Edited by Bartimaeus
Link to comment
Guest the_sextein
1 hour ago, lynx said:

In GemRB you can simply set a higher (maximum) drawing FPS. It's not tied to 30 or the AI speed any more. :)

I appreciate the suggestion, but I'd rather stick with Lossless Scaling. GemRB delivers 30 FPS sprite animations and 60 FPS UI animations, with camera animations sped up. Lossless Scaling offers 60 FPS sprite, UI, and camera animations with zero changes to the game code, and it runs without any visible artifacts or sluggish behavior. Lossless Scaling does cost $7, but it works on all games and is constantly being improved. I just can't find a flaw with it in any of the Infinity Engine games. To each their own.

Link to comment

Thanks for the test! I've now fixed both problems you reported. :) Weather still moves with the viewport, but that's a separate known issue.

Link to comment
Guest the_sextein
29 minutes ago, Jarno Mikkola said:

A video of this would ... but there isn't one.

You can refund it if you don't like it. See for yourself, or listen to any of the reviewers on YouTube, who all praised it. In the video you posted they said the issues were hard to spot in most games and they found the program to be awesome most of the time. They especially praised it for old games like Baldur's Gate that are locked to 30 or 60 FPS. Even Command and Conquer worked well other than some sluggishness in the mouse (which has already been fixed with a mod that works with Lossless Scaling). They recommended it to gamers in general, and most user reviews are super positive as well. I have no reason to lie about it, that is for sure. Maybe your hardware will work differently than mine, but if that's the case you can always refund it.

Link to comment
5 hours ago, Guest the_sextein said:

I appreciate the suggestion but I'd rather stick with lossless scaling.  GemRB delivers 30FPS sprite animations and 60FPS UI animations and camera animations sped up.

Err, no, the FPS cap is global, so everything ends up at the same FPS. Of course, this means that the same frame might be drawn several times if there is nothing new in the data (e.g. some minor animations are set to 15 FPS in their files) or if the limit is high enough.

I don't know what you mean by camera animations though.

Link to comment
