I know this has been asked a lot and upgrading Unity takes a lot of work, but… I’m gaming with an Nvidia 3080, and even then I have to drop a lot of settings quite low to come even close to 60 FPS at 4K in combat.
That is, imo, kind of unacceptable, considering there are more graphically detailed games out there that run more smoothly without having to sacrifice anything, frequently thanks to DLSS. Though it doesn’t help that Unity has never been known for its great performance.
If you need a sales pitch: how about reaching a larger audience, since more people would be able to run the game smoothly?
It’s unlikely to be your GPU; since all of the AI, procs, etc. are calculated by your PC, it’s more likely to be your CPU. I would therefore think it’s a reasonable assumption that multiplayer may help in this regard, as most of those calculations should be offloaded to the server.

Also, it’s not that they don’t want to upgrade to a version of Unity that supports this, it’s that they’ve not budgeted for it, and it would take a considerable amount of resources to do (why, I have no idea), meaning they’d have to drop other things to get it done.

I’m also not entirely sure that DLSS etc. would actually help much, since it’s not your GPU slowing things down.
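To illustrate the point (a minimal sketch with made-up frame timings, not actual LE measurements): a frame can only finish as fast as the slower of the two sides, so shrinking the GPU’s share does nothing once the CPU is the long pole. Note too that overall CPU usage can look low even when a single thread is maxed out, so a low usage percentage doesn’t rule out a CPU bottleneck.

```python
# Toy illustration of CPU- vs GPU-bound frames (numbers are made up, not LE measurements).

def frame_ms(cpu_ms: float, gpu_ms: float) -> float:
    """A frame can't finish faster than the slower of the two sides."""
    return max(cpu_ms, gpu_ms)

def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / frame_ms(cpu_ms, gpu_ms)

# Hypothetical busy-combat frame: 25 ms of CPU work (AI, procs), 18 ms of GPU work.
print(f"native 4K:       {fps(25.0, 18.0):.0f} FPS")  # ~40 FPS, CPU-bound
# An upscaler might halve the GPU time, but the CPU cost is unchanged.
print(f"with upscaling:  {fps(25.0, 9.0):.0f} FPS")   # still ~40 FPS
# Offloading sim work to a server (multiplayer) attacks the actual bottleneck.
print(f"server-side sim: {fps(12.0, 18.0):.0f} FPS")  # ~56 FPS, now GPU-bound
```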
Thanks for clarifying Llama! I assumed it was my GPU as it’s making a fair bit of noise when playing and my CPU doesn’t go that high in terms of usage. Either way, it’ll definitely be interesting to see what happens performance-wise when multiplayer hits.
It’s just like any piece of software that is central to the development and build process, really - never to be taken lightly. There’s a myriad of things that need to be tested, may have to change, and can go wrong, and it can be very time consuming to get it right. Done with insufficient caution and time, it can very easily wreak havoc on your whole pipeline and bring development to a screeching halt. So, dedicating a lot of resources to it makes good sense.
As an example, I believe (I was only involved towards the end, so I’m not 100% sure) my own team’s most recent foray into upgrading Unity took 3-6 months of cumulative work across multiple teams, and even then there were still explosions once the upgrade was merged into mainline; it had to be backed out at least once. Granted, our build system is somewhat old and brittle as hell, and almost nobody who originally built much of it is still around, which didn’t help, but still.
I think the main issue here is the underlying performance of LE itself - both CPU & GPU…
DLSS & FFX are ways to improve certain performance by utilising the capabilities of the GPU independant of the games own optimisation. In my opinion, these are “band-aid” technologies to the primary issue of performance optimisation in the game and a lot of people cannot even use them anyway (older or unsupported gpus). And in LE’s specific case (as pointed out by Llama8) the CPU is more likely the culprit in most of the instances of low FPS.
First prize for me would be that the game didn’t NEED you to even consider using DLSS/FFX to get better than 60 FPS at 4K.