this post was submitted on 18 Aug 2023
274 points (100.0% liked)

Gaming


From video gaming to card games and stuff in between, if it's gaming you can probably discuss it here!

Please Note: Gaming memes are permitted to be posted on Meme Mondays, but will otherwise be removed in an effort to allow other discussions to take place.

See also Gaming's sister community Tabletop Gaming.


This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.

founded 2 years ago

I am probably unqualified to speak about this, as I'm using a low-profile RX 550 with a 768p monitor and almost never play newer titles, but I want to kickstart a discussion, so hear me out.

The push for more realistic graphics has been going on for longer than most of us can remember, and it made sense for most of that time, as anyone who has looked at an older game can confirm - I'm the kind of person who has fun making fun of weird-looking 3D people.

But I feel games' graphics have reached the point of diminishing returns. AAA studios today spend millions of dollars just to match the graphical level of their previous titles - often sacrificing other, more important things along the way - and people are unnecessarily spending lots of money on electricity-consuming, heat-generating GPUs.

I understand getting an expensive GPU for high-resolution, high-refresh-rate gaming, but for 1080p? You shouldn't need anything more powerful than a 1080 Ti for years. I think game studios should just slow down their graphical improvements, as they are unnecessary - in my opinion - and just prevent people with lower-end systems from enjoying games. And who knows, maybe we will start seeing 50-watt gaming GPUs that are viable and capable of running games at medium/high settings, going for cheap - even iGPUs render good graphics now.

TLDR: why pay more and hurt the environment with higher power consumption when what we have is enough - and possibly overkill?

Note: it would be insane of me to claim that there is no big difference between the two pictures - Tomb Raider 2013 vs. Shadow of the Tomb Raider 2018 - but can you really call either of them bad, especially the right picture (5 years old)?

Note 2: this is not much more than a discussion starter that is unlikely to evolve into something larger.

top 50 comments
[–] LanAkou@lemm.ee 87 points 1 year ago (6 children)

Is it diminishing returns? Yes, of course.

Is it taxing on your GPU? Absolutely.

But, consider Control.

Control is a game made by the people who made Alan Wake. It's a fun AAA title that is better than it has any right to be. Packed with content. No microtransactions. It has it all. The only reason it's as good as it is? Nvidia paid them a shitload of money to put raytracing in their game to advertise the new (at the time) 20 series cards. Control made money before it even released thanks to GPU manufacturers.

Would the game be as good if it didn't have raytracing? Well, technically yes. You can play it without raytracing and it plays the same. But it wouldn't be as good if Nvidia hadn't paid them, and that means raytracing has to be included.

A lot of these big budget AAA "photorealism" games for PC are funded, at least partially, by Nvidia or AMD. They're the games you'll get for free if you buy their new GPU that month. Consoles are the same way. Did Bloodborne need to have shiny blood effects? Did Spiderman need to look better than real life New York? No, but these games are made to sell hardware, and the tradeoff is that the games don't have to make piles of money (even if some choose to include mtx anyway).

Until GPU manufacturers can find something else to strive for, I think we'll be seeing these incremental increases in graphical fidelity, to our benefit.

[–] Kir@feddit.it 45 points 1 year ago (2 children)

This is part of the problem, not a justification. You are saying that companies such as Nvidia have so much power/money that the whole industry must waste effort making more demanding games just to keep their products relevant.

[–] RiikkaTheIcePrincess@kbin.social 19 points 1 year ago (1 children)

Right? "Vast wealth built on various forms of harm is good actually because sometimes rich people fund neat things that I like!" Yeah sure, tell that to somebody who just lost their house to one of the many climate-related disasters lately.

I'm actually disgusted that "But look, a shiny! The rich are good actually! Some stupid 'environment' isn't shiny cool like a videogame!" has over fifty upvotes to my one downvote. I can't even scrape together enough sarcasm at the moment to bite at them with. Just... gross. Depressing. Ugh.

[–] jmcs@discuss.tchncs.de 27 points 1 year ago (1 children)

So the advantage is that it helps create more planned obsolescence and make sure there will be no one to play the games in 100 years?

[–] LanAkou@lemm.ee 22 points 1 year ago (1 children)

Is that a real question? Like, what are we even doing here?

The advantage is that game companies are paid by hardware companies to push the boundaries of gamemaking, an art form that many creators enjoy working in and many humans enjoy consuming.

"It's ultimately creating more junk so it's bad" what an absolutely braindead observation. You're gonna log on to a website that's bad for the environment from your phone or tablet or computer that's bad for the environment and talk about how computer hardware is bad for the environment? Are you using gray water to flush your toilet? Are you keeping your showers to 2 minutes, unheated, and using egg whites instead of shampoo? Are you eating only locally grown foods because the real Earth killer is our trillion dollar shopping industry? Hope you don't watch TV or go to movies or have any fun at all while Taylor Swift rents her jet to Elon Musk's 8th kid.

Hey, buddy, Earth is probably over unless we start making some violent changes 30 years ago. Why would you come to a discussion on graphical fidelity to peddle doomer garbage? Get a grip.

[–] Gaywallet@beehaw.org 31 points 1 year ago (1 children)

reminder to be nice on our instance

[–] LanAkou@lemm.ee 12 points 1 year ago (1 children)

Sorry hoss lost my cool won't happen again 😎

[–] theangriestbird@beehaw.org 12 points 1 year ago* (last edited 1 year ago) (4 children)

Would the game be as good if it didn’t have raytracing? Well, technically yes. You can play it without raytracing and it plays the same. But it wouldn’t be as good if Nvidia hadn’t paid them, and that means raytracing has to be included.

To place another point on this: Control got added interest because its graphics were so good. Part of this was Nvidia providing marketing money that Remedy didn't have before, but I think the graphics themselves helped this game break through to the mainstream in a way that their previous games did not. Trailers came out with these incredible graphics, and critics and laygamers alike said "okay, I have to check this game out when it releases." Now, that added interest would mean nothing if the game wasn't also a great game beyond the initial impressions, but that was never a problem for Remedy.

For a more recent example, see Baldur's Gate 3. Larian plugged away at the Divinity: OS series for years, and they were well-regarded, but I wouldn't say they quite hit "mainstream". Cue BG3, where Larian got added money from Wizards of the Coast that they could invest into the graphics. The actual gameplay is not dramatically different from the Divinity games, but the added graphics made people go "this is a Mass Effect," and suddenly this is the biggest game in the world.

We are definitely at a point of diminishing returns with graphics, but it cannot be denied that high-end, expensive graphics drive interest in new game releases, even if those graphics are not cutting-edge.

[–] MentalEdge@sopuli.xyz 51 points 1 year ago* (last edited 1 year ago) (6 children)

Shadow can definitely look a lot better than this picture suggests.

The biggest advancements in game graphics have not occurred in characters, except for perhaps in terms of animation and subsurface scattering tech.

The main character always gets a disproportionate graphical resource allocation, and we achieved "really damn good" in that category a while ago.

Adam Jensen didn't look that much better in Mankind Divided than he did in Human Revolution, but Prague IS SO MUCH MORE DETAILED than Detroit was.

Then there are efficiency improvements in rendering brought by systems like Nanite, material shader improvements, more detailed lighting systems, and more efficient ambient occlusion.

Improvements in inverse kinematics are something I'm really excited about, as well.
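Since inverse kinematics came up: the core idea can be sketched with a textbook analytic two-bone solver in 2D (think a shoulder-elbow arm reaching for a target). This is a generic illustration, not any particular engine's animation system, and all names here are made up.

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Return (shoulder, elbow) angles in radians that place the tip of a
    two-bone arm (bone lengths l1, l2) at target (tx, ty)."""
    d = math.hypot(tx, ty)
    # Clamp unreachable targets to the annulus the arm can actually cover.
    d = max(abs(l1 - l2), min(l1 + l2, d))
    # Law of cosines gives the elbow bend.
    cos_elbow = (d * d - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Aim the shoulder at the target, corrected for the elbow bend.
    shoulder = math.atan2(ty, tx) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow)
    )
    return shoulder, elbow

def forward(l1, l2, shoulder, elbow):
    """Forward kinematics: where the arm's tip ends up; used to verify."""
    ex = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    ey = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return ex, ey
```

Real animation systems run a solver like this (in 3D, with joint limits and smoothing) every frame so that feet plant on uneven ground and hands actually grip ledges.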

[–] mcforest@kbin.social 43 points 1 year ago (3 children)

The thought that today's state of technology is enough and we should stop improving sounds pretty Amish to me.

[–] DragonTypeWyvern@literature.cafe 22 points 1 year ago (1 children)

The word you want is Luddite

[–] verbalbotanics@beehaw.org 22 points 1 year ago (1 children)

The original Luddites were pretty rad. They were anti-tech for anti-capitalist reasons.

I agree that Luddite is the more correct term since it's more general now, but I hate that the term got warped over time to mean anyone who hates any new tech.

[–] T0RB1T@lemmy.ca 9 points 1 year ago

I quite like what Cory Doctorow has to say about it. (Author of Little Brother, coiner of the term "enshittification" (and much much more obviously))

Article

Podcast version

[–] Doods@infosec.pub 15 points 1 year ago (1 children)

I didn't mean we should stop improving; what I meant is we should focus more on efficiency and less on raw power.

[–] mcforest@kbin.social 11 points 1 year ago* (last edited 1 year ago) (2 children)

I think game and engine developers should do both. If it's possible to improve efficiency and performance, it should be done. But at the same time, hardware is improving as well, and that performance gain should be used.

I'm kinda worried a little bit about recent developments in hardware, though. At the moment GPU power mostly increases with energy consumption and only a little with improved architecture. That was different some years ago. But in my eyes that's a problem the hardware manufacturers have, not the game developers.

[–] icesentry@lemmy.ca 8 points 1 year ago* (last edited 1 year ago)

Performance is always about doing as much as possible with as little as possible. Making a game run faster automatically makes it more efficient, because the only way it can run faster is by doing less work. And whenever it runs faster, the game has more room for other things.
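That "faster means doing less work" point can be shown with a deliberately tiny example: two ways to compute the same value, where the fast one simply performs a constant amount of work instead of n steps (the example is mine, not from the comment).

```python
def sum_squares_loop(n):
    # O(n) work: one multiply-add per element, like brute-forcing every pixel.
    total = 0
    for i in range(n):
        total += i * i
    return total

def sum_squares_formula(n):
    # O(1) work: the closed-form identity
    # 0^2 + 1^2 + ... + (n-1)^2 = (n - 1) * n * (2n - 1) / 6.
    # Same answer, almost no work.
    return (n - 1) * n * (2 * n - 1) // 6
```

The formula version isn't running on better hardware; it just does less. That is the only sense in which any renderer ever gets faster, and the saved work becomes headroom for everything else.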

[–] usrtrv@lemmy.ml 26 points 1 year ago (3 children)

I understand the sentiment, but it seems like you're drawing arbitrary lines in the sand for the "correct" amount of power for gaming. Why waste 50 watts of GPU (or more like 150 total system watts) on a game that something like a Steam Deck will draw 15 watts to render almost identically? Ten times less power for definitely not ten times less fidelity. We could go all the way back to the original Game Boy at 0.7 watts; the fidelity drops, but so does the power. What is the "correct" wattage?

I agree that the top-end GPUs are shit at efficiency and we could cut back. But I don't agree that fidelity and realism should stop advancing. Some type of efficiency requirement would be nice, but every year games should get more advanced and every year GPUs should get better (and hopefully stay efficient).
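As a back-of-the-envelope version of the ratio argument above (the wattages come from the comment; the fidelity scores are invented purely for illustration):

```python
# Rough system power draws, in watts, from the comment above.
desktop_watts = 150
steam_deck_watts = 15

power_ratio = desktop_watts / steam_deck_watts  # 10x the power...

# Hypothetical "fidelity" scores on a 0-1 scale; made-up numbers that
# only encode the claim "not ten times worse looking".
desktop_fidelity = 1.0
steam_deck_fidelity = 0.7
fidelity_ratio = desktop_fidelity / steam_deck_fidelity  # ...for ~1.4x the fidelity
```

Perceived fidelity scales far more slowly than power draw, which is the diminishing-returns argument in a nutshell; where exactly to stop on that curve is the arbitrary line the comment is pointing at.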

[–] Phen@lemmy.eco.br 22 points 1 year ago

Because it impresses people and so it sells. If they didn't do that, all those EAs and Ubisofts would have to find a new selling point like making their games good or something.

[–] QuentinCallaghan@sopuli.xyz 22 points 1 year ago (1 children)

Yes, it basically boils down to diminishing returns. It also eats up all the good potential for creative graphic design.

[–] The_Terrible_Humbaba@beehaw.org 21 points 1 year ago* (last edited 1 year ago) (3 children)

I already wrote another comment on this, but to sum up my thoughts in a top comment:

Most (major) games nowadays don't look worlds better than The Witcher 3 (2015), but they still play the same as games from 2015 (or older), while requiring much better hardware with high levels of energy consumption. And I think it doesn't help that something like an RTX 4060 (the power consumption of a 1660, with performance slightly above a 3060) gets trashed for not providing a large increase in performance.

[–] McLovin@beehaw.org 17 points 1 year ago (2 children)

It's not so much the card itself; it's the price and the false marketing around it (two versions to trick the average buyer, one with literally double the memory). It's also a downgrade from the previous-gen, now cheaper 3070. It's corporate greed with purposeful misleading. If the card were €100 cheaper, it would actually be really good. I think that's the consensus among reviewers like GN, but don't quote a random dude on the Internet, haha.

[–] ArtZuron@beehaw.org 10 points 1 year ago (1 children)

I remember seeing an article somewhere about this. Effectively, there are really bad diminishing returns with these game graphics. You could triple the detail, but there's only so much that can fit on the screen, or in your eyes.

And at the same time, they're bloating many of these AAA games' sizes with all manner of garbage too, while simultaneously cutting the corners of what is actually good about them.

[–] Kolanaki@yiffit.net 9 points 1 year ago

still play the same as games from 2015

I wish they played more like games from the late 90's, early 2000's, instead of stripping out a lot of depth in favor of visuals. Back then, I expected games to get more complex and look better. Instead, they've looked better, but played worse each passing year.

[–] GreenMario@lemm.ee 21 points 1 year ago (1 children)

I like seeing advances in graphics technology, but if the cost is a 10-year dev cycle and the game still comes out s-s-s-stuttering on high-end PCs and current-gen consoles, then scale back some.

I think we hit a point where it's just not feasible enough to do it anymore.

[–] dillekant@slrpnk.net 39 points 1 year ago (8 children)

"I want shorter games with worse graphics made by people who are paid more to work less and I'm not kidding"

[–] Renacles@discuss.tchncs.de 20 points 1 year ago

I've been honestly blown away with how newer games look since I upgraded my graphics card.

Remnant 2 is not even a AAA game but does such a good job with light and reflections that it looks better than anything released 5+ years ago.

Then you have games like Kena: Bridge of Spirits, which have a very nice art style but take advantage of current hardware to add particles everywhere.

[–] SenorBolsa@beehaw.org 20 points 1 year ago* (last edited 1 year ago) (2 children)

I think in some cases there's a lot of merit to it. For example, Red Dead Redemption: both games are pretty graphically intensive (if not cutting edge), but it's used to further the immersion of the game in a meaningful way. Red Dead Redemption 2 really sells this rich natural environment for you to explore and interact with, and it wouldn't quite be the same game without it.

Also, that example of Tomb Raider is really disingenuous; the level of fidelity in the environments is night and day between the two, as is the quality of animation. In your example the only real thing you can compare is the skin shaders, which are not even close between the two. SotTR really sells that you are looking at real people, something the 2013 game approached but never really achieved IMO.

If you don't care, then good for you! My wallet wishes I didn't, but it's a fun hobby nonetheless to try and push things to their limits, and I am personally fascinated by the technology. I always have some of the fastest hardware every other generation, and I enjoy playing with it and doing stuff to make it all work as well as possible.

You are probably correct in thinking that for the average person we are approaching a point where they just really don't care. I just wish they would push for more clarity in image presentation at this point; modern games are a bit of a muddy mess sometimes, especially with FSR/DLSS.

It mattered a lot more early on, because doubling the polygon count on screen meant you could do a lot more gameplay-wise: larger environments, more stuff on screen, etc. These days you can pretty much do what you want if you are happy to drop a little fidelity in individual objects.

[–] DebatableRaccoon@lemmy.ca 20 points 1 year ago (2 children)

Yahtzee from The Escapist recently did a video on this exact topic.

Here's the link for those interested.

[–] Kir@feddit.it 18 points 1 year ago (1 children)

Everything is ruined by marketing and its capitalist roots, and game development is no exception.

They push for fidelity just because it sells well; the fact that this creates the need for much more powerful hardware is not a drawback for them. It's actually good, since it's something you can profit on.

Games need artistic direction and vision much more than they need photorealism (which is great for some kinds of games, but not a universal standard).

[–] Stuka@lemmy.ml 18 points 1 year ago (2 children)

I'm with you. I barely notice the changes in graphics, just the increase in my GPU fan speeds over the years.

I'm more interested in games with graphics that look good enough, but that do more interesting things with the extra horsepower we have these days.

[–] HappyMeatbag@beehaw.org 18 points 1 year ago* (last edited 1 year ago)

Games don’t need better, more complex graphics. They need adequate time and resources during the development process. They need to actually be completed by their release date, not just barely playable. They need to be held to a higher standard of quality when publishers judge if they’re ready to sell.

[–] sculd@beehaw.org 17 points 1 year ago (8 children)

Pushing for even more realistic graphics will make the cost of making games even higher, with no significant change in players' enjoyment.

Players enjoyed games when we had Super Nintendos and DOS games. They actually gave players more room for imagination.

[–] ColdWater@lemmy.ca 16 points 1 year ago* (last edited 1 year ago)

Realistic graphics fucking suck. Not only do they require a fairly high-end PC to get somewhat playable fps, they also eat up so much storage space - which is unnecessary if you play on medium or low graphics, which defeats the purpose of "realism".

[–] Poggervania@kbin.social 16 points 1 year ago (1 children)

Art style will always be more important than graphic fidelity, imo. Having a less realistic-looking game with a strong art style allows the game to age better with time.

Take a look at the first Borderlands - it's from 2009, but it still looks good today because of the cel-shaded art style it went with. Meanwhile, the first Uncharted looks goofy as hell today because it was trying to look realistic in the same timeframe that Borderlands was released.

[–] squaresinger@feddit.de 15 points 1 year ago

That's why I almost exclusively play indie games. They don't invest massively in graphics, microtransactions, or dumb features not related to the game (like the chess/darts/drinking simulators in Watch Dogs). Instead, they focus on making games that do one thing and do that one thing great.

[–] sizzle_burn@feddit.de 13 points 1 year ago (3 children)

I prefer good gameplay over great graphics. The brain knows how to fill in the gaps and allow for immersion. That's why, most recently, BattleBit Remastered has left such a big impression. Its gameplay is really good, and it is a much better game than the last Battlefield, although Battlefield has much better graphics.

The same goes for some of my other all-time favourites. Deep Rock Galactic, Terraria, and Minecraft all do well without amazing graphics, and take up much less drive space as a nice side effect.

[–] sab@kbin.social 10 points 1 year ago* (last edited 1 year ago) (1 children)

I think some of the issue also has to do with art style more than graphics. Realism is by far the hardest style to achieve, and it seems to be the preferred one for a lot of gamers, probably because it makes them feel more adult or something. But I think a lot of games could gain a lot from striving to look good rather than realistic - settling on an art style and committing to it.

From what I've seen around, it seems like Baldur's Gate is doing just that, and it seems to have shaken up the entire industry.

[–] Kolanaki@yiffit.net 13 points 1 year ago* (last edited 1 year ago) (1 children)

Play something like The Quarry and you'll want them to be a tad more realistic, cuz it's not quite there and triggers the uncanny valley effect. Seeing the likeness of a real person in a video game, even with the best graphics available, is still very easily recognized as a video game. Some stuff in the works right now, yet to be released and built on UE5, is even closer. Some things have been shown to look almost photorealistic, such as Unrecord or The Matrix tech demo - so much so that people thought they were fake at first.

[–] Wahots@pawb.social 12 points 1 year ago

Mostly I just want games with good stories that are really, really fun to play. And games where I can play with 1-8 of my friends. Games like Sons of the Forest or Raft are perfect for this.

[–] jasonhaven@kbin.social 10 points 1 year ago (2 children)

I'm a big proponent of going with a distinct art style over "realism", because the latter tends to fall apart over time as technology improves. The former will always look good, though.

[–] xuxxun@beehaw.org 10 points 1 year ago

I care about the story of the game and general enjoyment. As long as I can see and understand what is going on on my screen, I do not care that much about the graphics. I am fine with playing old games with potato graphics. Also, I know we could have both, but if studios had the choice between more accessibility options for more demographics or better graphics, I wish they would always choose to put the resources into accessibility.

I feel that sometimes realistic graphics are what a game needs - like some simulators or horror titles going for that form of immersion. We're not quite over the uncanny valley in AAA titles, and pushing the boundaries of real-time rendering technology can lead to improvements in the efficiency of existing processes.

Other times, pushing the boundaries of realism can lead to new game mechanics. Take Counter-Strike 2 and their smoke grenades: they look more realistic and shooting through them disturbs only a portion of the cloud.
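That smoke mechanic can be sketched as a voxel cloud from which a shot carves out only the voxels near the bullet's path. This is a toy model of the idea, not Valve's actual implementation; the grid size and clearing radius here are arbitrary.

```python
import math

def point_segment_dist(p, a, b):
    # Distance from voxel centre p to the bullet's path, segment a-b.
    ab = tuple(bi - ai for ai, bi in zip(a, b))
    ap = tuple(pi - ai for ai, pi in zip(a, p))
    ab2 = sum(c * c for c in ab)
    t = 0.0 if ab2 == 0 else max(0.0, min(1.0, sum(u * v for u, v in zip(ap, ab)) / ab2))
    closest = tuple(ai + ci * t for ai, ci in zip(a, ab))
    return math.dist(p, closest)

def shoot_through(cloud, start, end, radius=1.5):
    # Keep every voxel farther than `radius` from the shot; the rest of
    # the cloud is untouched, so the smoke is only locally disturbed.
    return {v for v in cloud if point_segment_dist(v, start, end) > radius}

# A 5x5x5 block of smoke voxels, shot straight through the middle.
cloud = {(x, y, z) for x in range(5) for y in range(5) for z in range(5)}
after = shoot_through(cloud, (-1, 2, 2), (6, 2, 2))
```

In the real game the cleared channel would also refill over time as the volumetric simulation diffuses smoke back in.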

I do miss working mirrors in games, though.

[–] whataboutshutup@discuss.online 9 points 1 year ago (2 children)

Honestly, I agree to an extent. I like looking at well-designed scenery, but I think it hurts games if it takes priority. I'm not playing games for that, but for cool game-design ideas and my own experiences with mechanics. That's the tl;dr; what follows is my rant, for I had a long bus ride.

Graphics are very marketable and ad-friendly, easier to implement/control than changes to the engine or scripts (which you have to understand first), and they can cover up shortcomings in other departments. Effective managers love that. CGI artists at Disney are striking because this sentiment held true in the movie industry too; they are overloaded, filming whole movies over chromakey. Computer graphics have almost replaced everything else.

From my perspective, this trend in AAA lowers the quality of the end product: it makes development safer (formulaic reiteration) but the games merely okay to play, mostly unremarkable. Indie and small game studios can't compete with them in visuals, so they take risks and experiment, bring novelty, and sometimes win a jackpot.

Take Minecraft, obviously, which was initially coded by Notch alone. It invented the indie scene as we know it now. It put tech and mechanics over looks, and the whole world was playing it. No one cared how abstract it was once they were addicted to the gameplay.

Playing older games, I see that they were in this race too - like how the (recently remastered) Quake 2 was a great visual upgrade over Quake 1. People sold an arm and a leg to play them on HIGH at the time, and nodded along: yeah, it's just like real life, looking at a 640x320 screenshot - or however the marketers sold it. But somehow those games were made completely different in many ways, not graphics alone, and that's for a braindead shooter. I feel it with my fingers. I see it in how the game logic works. That sensation was greater for me than anything I saw on the screen.

Not being able to recall what happened in which CoD game, I've become more amused by how game design, expressed via code, affects the feel of a game. How in Disco Elysium all those mental features made it stand out. How Hotline Miami made extreme violence so stylish. How Dwarf Fortress taught me to care about ASCII symbols on my screen while accepting the fun of losing them. How the first MGS's Psycho Mantis read my save files from other games and vibrated my controller on the floor with his psychic power.

These moments and feelings can't be planned and managed like the creation of visual assets. And they are why I like games, as outdated as NES ones or as ugly as a competitive Quake config looks. Like making love with a loving partner, they hit different than a polished act from a fit and thin sex worker. They bring a unique experience instead of selling you a horse-painted donkey.

And that's why I don't really care about graphics and dislike their unending progress.
