Mentioned this in another thread yesterday:
Like many UE games over the years, they didn’t properly optimize Unreal itself for their use, and there were already several ini tweaks up on the Nexus to remedy this the day of launch.
Went from an average of 27 fps in exterior cells to a solid 60, on an unsupported GPU, just by using one of these ini tweaks.
While the updated config I installed helped, I still get noticeable frame drops on my pretty beefy PC in the overworld.
Nice, amateur hour it seems.
I’m having relatively good performance on an RX 6600 on Linux, but after a while there’s some sort of GPU memory leak (would be my guess) where fps halves until the game is restarted.
It’s poorly optimized. Version 0.4 was probably the first build that looked decent, with final art in place, but no QA or optimization done. My bet is that they either had to launch earlier than expected due to the rumors, or they extended way past the due date and the money for the project ran out. If it’s successful, optimization will probably happen, but they’re wagering on it.
I’ve been playing it on steam deck, it’s definitely playable but I wouldn’t call it smooth.
Does it freeze up all the time like in the Digital Foundry video?
If not I’m wondering if it’s that stupid shader compiling thing that has plagued PC games all generation.
I’ve gotten a lot of freezing and stuttering playing on my desktop PC (Linux with Proton). The deck actually seems to be more stable, though it is locked to 30 fps and textures still take a minute to load sometimes.
It is verified for the Steam deck though.
At 30 fps, if you call that verified.
That’s fine honestly, provided it’s smooth. In the video, there was a fair amount of hitching though…
It’s definitely not smooth. Interior cells run decently, but my OLED Steam Deck dips into the low 20s in exterior cells.
Ouch.
For day one, performance is actually fine. I have much bigger gripes than getting fps dips in the open zones. Like levelling ffs. I have 100 strength, willpower, and blades, but am doing less damage to mobs now than I was doing in the beginning of the game. Or levelled loot drops and quests.
So … just like the original Oblivion?
The key to Oblivion is to pick tag skills that you won’t use. If your build is a stealth archer, pick Block, Blunt, and Restoration. You only level when your tagged skills level, so your archery, illusion, and sneak will be 100 but your character will be sub level 10, so you’ll basically be a god.
The level system doesn’t work that way anymore. Now when you level up, it doesn’t matter which skills you raised to get there; you always get 12 points (called “virtues”) to spread around any stat. Luck, however, takes 4 virtues to raise one point, while the others are 1:1, and you can add up to 5 to a stat at a time.
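If it helps, here’s the allocation math in pseudocode-ish Python. To be clear, this is just my own sketch of the rules I described, not the game’s actual code, and the names are made up:

```python
# Sketch of the remaster's "virtue" allocation rules as I understand them.
# Assumptions: 12 virtues per level-up, Luck costs 4 virtues per point,
# every other attribute costs 1 virtue per point, max +5 per stat per level.

VIRTUES_PER_LEVEL = 12
LUCK_COST = 4           # 4 virtues buy 1 point of Luck
MAX_POINTS_PER_STAT = 5

def spend_virtues(allocation: dict[str, int]) -> bool:
    """Return True if a level-up allocation is legal under the rules above.

    allocation maps attribute name -> points added this level.
    """
    cost = 0
    for stat, points in allocation.items():
        if points < 0 or points > MAX_POINTS_PER_STAT:
            return False
        cost += points * (LUCK_COST if stat == "luck" else 1)
    return cost <= VIRTUES_PER_LEVEL

# +5 Strength, +5 Endurance, +2 Speed = 12 virtues -> legal
assert spend_virtues({"strength": 5, "endurance": 5, "speed": 2})
# +2 Luck costs 8 virtues, +5 Strength costs 5 -> 13 > 12, not legal
assert not spend_virtues({"luck": 2, "strength": 5})
```

So dumping everything into Luck gets you at most +3 per level, which is why it still feels expensive.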
I can level up entirely through using Agility linked skills but then put my stat points into Strength and Intelligence instead of agility.
The real issue has to do with the level scaling on enemies still being the worst of any Elder Scrolls game because they didn’t change anything about that from the OG. So once you’re level 50, everything has the best weapons and armor on them.
Doesn’t work in the remaster; they changed so that all skills contribute to level up progress.
So they took the mechanic in the game that was universally hated, and made it worse…?
Sort of. The new leveling system has minor skills contribute to your levels, to a lesser degree. IIRC it’s something like 10 major levels or 20 minor levels (or some combination thereof) to get a character level.
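If those numbers are right, the progress math would work out to something like this. Again, this is a hypothetical sketch from memory, not datamined values; the weights are assumptions:

```python
# Hypothetical model of remaster level progress, assuming a major skill-up
# is worth 1.0 progress, a minor skill-up 0.5, and 10.0 progress per level.

MAJOR_WEIGHT = 1.0
MINOR_WEIGHT = 0.5
PROGRESS_PER_LEVEL = 10.0

def levels_gained(major_ups: int, minor_ups: int) -> int:
    """Whole character levels earned from a number of skill-ups."""
    progress = major_ups * MAJOR_WEIGHT + minor_ups * MINOR_WEIGHT
    return int(progress // PROGRESS_PER_LEVEL)

assert levels_gained(10, 0) == 1   # 10 major skill-ups -> one level
assert levels_gained(0, 20) == 1   # or 20 minor skill-ups
assert levels_gained(5, 10) == 1   # or some combination thereof
```

The upshot is you can’t stay sub level 10 forever by only using untagged skills, which kills the old stealth-archer exploit.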
There are mods that help with this
If somebody didn’t realize it was almost certainly going to run poorly the second it was revealed to use UE5, I wouldn’t even know what to say to them.
Can we please stop blaming UE5 for sloppy development and poor QA?
I’m not blaming UE5, but I’m capable of pattern recognition. There’s a pattern of developers not fixing UE5 issues and releasing games with them still present. The fault lies with both game developers and UE developers.
You just touched on the problem, which is a confluence of Base Rate Neglect and Availability Bias.
UE is the most popular gaming engine, so it’s used on the most projects and has a high amount of visibility. No matter which engine you build a game with, there are many factors to keep in mind for performance, compatibility, and stability. The engine doesn’t do that for you.
One problem is that big studios build games for consoles first, since it’s easiest to build for predictable systems. PC then gets ignored, is minimally tested, and patched up after the fact.

Another is “Crysis syndrome”, where developers push for the best graphics they can manage and performance, compatibility, and stability be damned - if it certifies for the target consoles, that is all that matters.

There is also the factor of people being unreasonable about their hardware’s capabilities, expecting that everything should always be able to run maxed out forever… and developers providing options that push the cutting edge of modern (or worse, hypothetical future) hardware compounds the problem. But none of these things have anything to do with the engine, only with what developers themselves build on top of it.
A lot of the responses to me so far have been “that’s stupid because” and then everything after “because” is related to individual game development, NOT the engine. There is nothing wrong with UE, but there are lots of things wrong with game/software development in general that really should be addressed.
As soon as someone releases a UE5 game that doesn’t run like ass
Clair Obscur runs pretty well out of the gate.
I had to use upscaling to get it to 60 but there’s certainly worse
Avowed ran well for me.
Fortnite, Wukong, Tekken 8, Layers of Fear, Firmament, Everspace 2, Dark and Darker, Abiotic Factor, STALKER 2, Jusant, Frostpunk 2, Satisfactory, Expedition 33, Inzoi, Immortals of Aveum, Starship Troopers: Extermination, Ninja Gaiden 2 Black, Lords of the Fallen, Robocop, Myst (UE5 remake), Riven (UE5 remake), Palworld, Remnant 2, Hellblade 2, Subnautica 2… and the list keeps growing.
When a big studio skips QA and releases a broken game, it’s not the engine’s fault, it’s the studio’s fault. As long as consumers tolerate broken games that can maybe be fixed later (if we’re lucky), then companies will keep releasing broken, unfinished, unpolished, untested games. Blaming UE5 is like blaming an author’s word processor for a poorly written novel.
Well that’s a pretty shit list. You have there games that aren’t using UE5 (Layers of Fear 2), that are known to have poor performance (STALKER 2), that just released into early access (Inzoi) and that haven’t even released into early access (Subnautica 2).
I’d throw half the list out the window, actually probably more, because the other half are mostly games I don’t know well enough to evaluate their performance.
Also, “I don’t know what I’m talking about, so your list is invalid” isn’t the dig you seem to think it is.
I didn’t know the original game was remade. I assumed you meant Layers of Fear 2, because the original Layers of Fear wasn’t even on Unreal Engine and Layers of Fear 2 is on UE4. Nothing I said was explicitly wrong. It was wrong only in context, because you weren’t precise with what you were saying.
And how nice of you to pick out the one thing I was wrong on while completely ignoring all the other examples. For instance how the fuck can you put Subnautica 2 on that list when it’s not even in early access?
Satisfactory alone would be enough, the game runs so smoothly for the amount of shit going on there, it’s amazing.
Idk I think the only one of those on that list that I’ve played that ran well enough that I’d consider it ok was tekken and I’m assuming that’s more because it’s a fighting game.
So what you’re saying is that Tekken being a fighting game just magically made a “bad engine” run well?
Fighting games would run well on a fucking smart fridge. They’re by far the least performance-hungry game genre: there is no live loading of assets, the background scenery is 100% static, and there are usually just two characters on screen at any given moment. It would take actual effort to fuck up the performance of something so simple.
Right. So it’s not the engine, but what you do with it.
No I’m saying that it being a fighting game meant that it’s much easier to optimize because you have such a fixed camera angle and few characters on screen.
So it’s because the developers paid attention to optimization and polish to ensure the game ran well on the largest number of devices.
My point exactly. It’s not the engine, it’s what you do with it and how you do it.
It’s Lumen. It’s 100% Lumen.
Yes. It runs like dog water. And it seems people are just looking past it because of the nostalgia effect.
I had to tweak quite a bit but it’s running at a stable 60 fps at 1440p now. I wouldn’t say I’m looking past it, just enjoying it in spite of the performance issues.
My FPS drops from 60 to like 25, but that’s rarely. It’s not like it’s a constant 25.
And this is why I don’t buy day 1. Performance actually looks reasonable compared to other day 1 releases, but it’s still not what I want to play. I bet most of these issues will be resolved in a month or two, and definitely resolved by the first sale, so I’ll hold off. It’s not like there’s going to suddenly be content to miss out on, it’s a remaster, so waiting is absolutely reasonable.
PC and console experiences have so far shown to be very, very different. Console players, from what I can tell, are having a more stable experience.
Yeah I play on my PC and I’ll cross-play my save on my Xbox when I want to use the TV. The Series X is quite a bit smoother. Sucks lol. Every UE5 game I’ve played on PC has not been a good experience lol. (I can play Star Citizen at around 60 fps in cities, and KCD 2 on the highest settings, for reference.)
Avowed is UE5 and that ran well for me.
Yeah that’s a bummer. I’m curious how the new PC I’m building will do. 9080 + 9800X3D.
I run it on ultra at 1440p with RT on high and FSR performance upscaling. I get 60 fps consistently with these settings, no drops. I have a 9900X/7900 XT so I imagine you’ll be able to get quite a bit more out of it.
Nice thanks!
Probably good. What’s a 9080? Lol I have a 5800x3d. Love that chip
New AMD GPU. There’s a 9080 and 9080xt
Ah okay. Nice
i’m looking past it because my laptop is 7 years old and i’m happy it even runs lol
Right. But your laptop and my PC shouldn’t be playing the game at the same performance ya know.
yeah i’m aware, i wasn’t generalising it was just my personal experience
It runs like most UE5 games.
Like shit.
It’s playable though, that’s all I want right now.
It runs like most Bethesda games.
Like shit.
It’s Unreal 5 slop with OG Oblivion running in the background. Of course it has these issues.
Yeah, I wonder if that’s perhaps the result of basically stapling the old game engine onto UE5 in order to preserve the core gameplay. Back when Oblivion was first released, multicore CPUs were incredibly rare, so it’s likely the engine was not built to take advantage of them. But ever since then, most of the improvements in CPUs have come in the form of adding more cores rather than increasing clock speed, and it’s by no means trivial to convert single-threaded code into multi-threaded. Most likely it would require a complete rewrite, which they’d probably want to avoid in order not to introduce more bugs.
But of course, it could also just be UE5’s fault, since even a single core on a modern CPU should not be slower than a 2006 model.
Ngl I’m running a pirated copy through Lutris and it’s not too bad. Beggars can’t be choosers though lmao.
[Have a coin, beggar.]
Lmao I’ll buy it when I can. :)
Bethesda doesn’t need any more money, spend it on an indie game.
Tell that to the Gray Fox!
Metal gear! /s
I run it medium with a 7600xt at 3440x1440. Seems fine to me
I don’t trust this shit anymore after Cities: Skylines 2 ran just fine. A bunch of people lost their shit anyway.
I mean I enjoy CS2 as well but I can’t deny it had pretty major performance problems. It’s gotten better over time but launch day was a disaster.
And yet I had nothing other than some minor stuttering that was completely ignorable in a city builder.
Good for you.
It ran fine? I feel like it had all sorts of bottlenecks but it’s been a while
Just fine on my PC
X to doubt…
Maybe you just have low expectations, but I’ve seen so many threads about terrible performance on top tier hardware. That’s inexcusable.
I tried it again recently and starts out tolerable but gets worse the bigger your city gets, even when you lower settings. It would be one thing if the game looked amazing and had these deep, detailed simulations… but it just looks okay and the digital corner-cutting trickery becomes obvious when you start looking closely. I feel like there is something fundamentally wrong under the hood of Skylines 2.
I’ve had cities in the hundreds of thousands with no issues even if it does start to drop in framerate.
Heads up, because I imagine the DF guys were too PC master race to notice, but you can smooth out a lot of the hitches by using framegen.
There’s this weird implementation in the game where, if you set frame gen to auto, it seems to automatically turn off when you’re over the fps cap and turn back on when you drop below it, and it’s worth giving that a shot. It took some tweaking, but I did end up finding a mix, between that and VRR with a cap low enough to maintain most of the time but high enough to get acceptable latency, where the game is… mostly playable?
It was still a shock to go outside for the first time (most of the hitches seem to be around outdoors traversal) and it’s still not perfect, but it did clean up a lot of it. Well, some of it. Your mileage may vary based on hardware and expectations, though.
IME framegen hasn’t meaningfully reduced the open-world hitching. It gets the framerate nice and smooth while standing still, fighting a bandit or whatever… until you walk a bit and the game becomes CPU-limited while streaming in new cells, at which point you noticeably hitch.
The performance in interior cells (including cities) is very good even on Ultra settings.
I suspect that this is one of the compromises they made by keeping the old engine running under the hood, because as DF notes this also happened in the original.
That is entirely possible. My setup seems to be in this sweet spot where the normal performance is high enough over the cap AND the framegen gets you enough extra smoothness AND the VRR is able to eat enough milliseconds off the hitches that it is noticeably improved (but crucially not perfect, so if you’re more sensitive than me that may also be part of it). Still, even if it doesn’t help for everybody, it’s worth a shot and not covered in the video.
I bet there is something to having to load the world in chunks in the underlying engine and then having to render the chunk all at once in UE5 that makes UE5’s struggles even worse. Still, the game was a shadowdrop, you have to assume they could have taken some time to try to figure it out a bit better.
The worst case scenario is that further optimization isn’t an option, but… I mean, even if it is related to what people think it’s related they should be able to find some way to ease some of the load off. The observation that a lot of the performance hit is related to hardware Lumen alone points that way. Especially since having a faster base framerate does seem related to having smaller hitches. But hey, who’s to say? I guess we’ll see where they go from here.
Frame gen shouldn’t be a crutch, and by design it’s only supposed to enhance games that get above 60 fps naturally. It doesn’t do anything good for an open world that constantly tanks to 45. It’s not a master race thing, it’s a poor optimization thing.
No, the point is the DF video never even tested framegen or upscaling before deeming the issue unsolvable. I’m just trying to offer additional settings to tune that they don’t cover in the video and that may help.
Frame gen, for the record, is fundamentally a crutch. Specifically for CPU limits. It serves no other purpose. If you don’t need it as a crutch you don’t need it, period. It takes you from wherever you can get natively to hopefully closer to your monitor refresh rate. If you can reach your refresh rate then you don’t need it in the first place.
Or at least that’s how it works in the default implementation from GPU vendors, where you’re locked into uncapped, non-vsynced FPS when using it.
I’m calling out that there seems to be a specific implementation here to use it with a frame cap. And with that frame cap, if you can get yourself to, say, 45-60 fps, you can get a semi-decent 90 or 120 cap out on the other end that does trim down some of the stutters, especially if you also have VRR to eat a few extra milliseconds.
So it’s not ideal: you’re effectively locking the game to 90 or 120, then trying to scrape by at 45-60 and double up with frame gen just so an AI frame can slide in between the 45-80 ms spikes, with VRR eating the rest of the time difference. But hey, it kinda works, at least on my setup. Crutch or no crutch, it makes the game more playable for me. I don’t have the tools to measure exactly how much more playable, and I’d like to see DF test it, but at a glance it seems to help.
That doesn’t mean they shouldn’t look into the cause and patch improvements, but if it can take the game from unplayable to playable for some people on some setups that’s a good thing.