20 years ago AAA games could still experiment, but only because back then they had about the same budget as big indie games do now.
You just can’t gamble if you have 10k employees and hundreds of millions riding on it.
Imagine having 10k employees and not setting aside an indie dev team or two for passion projects.
The vast majority of indie games never make any real money. This link is a little older, but it claims that 50% of indie games on Steam never make more than $4,000, only 25% ever make more than $26,000, and only 14% cross the $100k mark.
Considering the cost of developers, the $100k mark covers roughly 1-2 man-years of work, and then there’s only a 14% chance of even recouping that.
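To put rough numbers on that (just a sketch: the percentiles are the ones claimed above, while the salary figure is my own assumption, not from the source):

```python
# Back-of-envelope sketch of the odds above. The revenue percentiles are the
# ones claimed for indie games on Steam; the developer cost is my own
# assumed figure, not from the linked stats.
yearly_dev_cost = 70_000  # assumed fully loaded cost of one developer per year

for years in (1, 2):
    print(f"{years} man-year(s) of development ≈ ${yearly_dev_cost * years:,}")

# Claimed distribution: 50% never gross $4k, 75% never gross $26k, and only
# 14% ever cross $100k -- so the break-even bracket for even a small
# 1-2 man-year project is one that only ~1 in 7 games ever reaches.
print("Chance of grossing $100k+: ~14%")
```

And that’s gross revenue, before the store’s cut and taxes.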
Passion projects only work because the people making them don’t count their time as work time and don’t draw a salary from it, and even then, in the vast majority of cases, they don’t work out financially.
This statement holds true for pretty much every other corporation. Imagine owning a huge farm and not setting aside a few farmhands to grow heirloom vegetables. Imagine owning a supermarket chain and not setting aside a few shops for exotic sweets from Central Africa. Imagine owning a fast-food chain and not setting aside a few restaurants for artisan burger variations.
Yes, every corporation could afford to do stuff like that, but they aren’t there to advance humanity by investing in arts and crafts; they’re there to wring out every last drop of money they can. And yes, there’s much to criticise about that goal, but little indie passion projects just don’t mix well with corporations.
Being “safe” is also a gamble: if you aren’t bringing anything new or unique, you’re gambling that the title or brand alone is sufficient for success.
Less so though.
Yes, being “safe” means you won’t make the next Minecraft, where a hobby budget turned into the best-selling game of all time. But it also means that the people who buy every instalment of FIFA or Assassin’s Creed will buy this one too.
These popular franchises almost always turn a calculable profit as long as they don’t experiment and do something new that bombs.
As sad as it is, it actually does work out.
That’s why we gamers shouldn’t count on AAA titles to bring something great to the market. If you want to play a game the way you watch linear TV (plonk down on the couch or in front of the PC and do whatever to relax and pass the time), then AAA is great. If you want to play something new, something exciting, something you haven’t played before, then go with lower-budget titles.
AAA is the McDonald’s of games. You don’t go to McDonald’s for the freaky hand-crafted vegan fusion-kitchen bacon burger with crazy Korean curry mayo and caramelized lettuce.
Fear of failure becomes self-fulfilling, yeah. You get so worried about making the wrong move and losing money that you let your spotlight get stolen by a challenger doing something fresher on a tenth of the budget.
20 years ago people were complaining about the same lack of creativity in the AAA scene, saying that gaming was better in the 90s. In fact I remember it was a common talking point that AAA gaming had gotten so bad that there would surely be another crash like the one in '83.
Here’s how I see it:
From a gameplay standpoint: My perception of the mid-to-late 2000s is that every AAA game was either a modern military shooter, a ‘superhero’ game (think Prototype or Infamous), or fell somewhere in the Assassin’s Creed / Far Cry / GTA triangle. Gameplay was also getting more and more trivial and braindead, with more and more QTE cutscenes. The perception among both game devs and journalists was that this was a good direction for the industry, since it was getting away from the ‘coin-sucking difficulty’ mentality of arcade games and moving towards games as art (i.e. cinematic experiences). There were of course a few games like Mirror’s Edge, and games released by Valve, but they were definitely the exception rather than the rule (and Valve eventually stopped making games). Then Dark Souls came out and blew all their minds that a game could have non-braindead gameplay and be artful at the same time.
Now I would say we’ve actually seen a partial reversal of this trend. Triple-A games are still not likely to be pioneers when it comes to gameplay, but we’ve seen a few mainstream franchises adopt things like Souls-like combat or immersive-sim elements, which IMO would have been unthinkable 15 years ago.
From an aesthetic standpoint: My perception of the mid-to-late 2000s is that everything was brown with a yellow piss filter over it. If you were lucky it could be grey and desaturated instead. This was because Band of Brothers existed, and because it was the easiest way to make lighting look good given how rendering worked at the time. As an aside, Dark Souls, a game where you crawl around in a sewer filled with poop and everyone is a zombie slowly dying of depression because the world is going to end soon and they’ve lost all hope, had more color than the average 2000s game where you’re some sort of hero or badass secret agent.
Things are absolutely better in the aesthetic department now. Triple-A studios remembered what colors looked like.
From a conceptual / narrative standpoint: I don’t think AAA games were very creative in this department in the 2000s, and I don’t think they’re very creative now. They mostly just competed to see who could fellate the player the hardest to make them feel like a badass. If you were lucky, the player character was also self-destructive and depressed in addition to being a badass.
Then and now your best bet for a creative premise in a high budget game is to look to Japanese developers.
From a consumer friendliness / monetization standpoint: In the 2000s people were already complaining about day-one DLC, battle passes, and having to pay multiple times just to get a complete game.
Now it’s worse than it’s ever been, IMO. Not only do AAA games come out completely broken and unfinished, but really aggressive monetization strategies are completely normalized. Companies are also pretty reluctant to make singleplayer games now, since it’s easier to farm infinite gacha rolls from a multiplayer game (although this was kinda already the case in the 2000s).
Overall I think we’re now in a golden age for indie games, and things like Clair Obscur and Baldur’s Gate 3 give me a lot of hope for AA games.
I think your perception might be 10 years off.
Assassin’s Creed 1 came out in 2007, less than 20 years ago. It was mind-bogglingly fresh and innovative back then. An open world where you couldn’t just run anywhere you wanted, but climb anywhere too? And your character dynamically scaled walls, finding handholds everywhere? That was amazing at the time. It was the first game that even attempted anything like that, and it was really, really good. AC only became lame when they started doing the same thing over and over with little change.
Similar story with Far Cry. FC1 came out in 2004, and FC2 was the only other entry released that decade (2008). Both were doing something new, fresh, and genre-defining. Looking back from today, yes, these games look like everything that followed them, but that’s because they defined the formula.
But in that decade we saw a lot of other genre-defining games, like Warcraft 3 (2002/2003), WoW (2004), KOTOR (2003), Bioshock (2007), Crysis (2007), Fable (2004), Batman: Arkham Asylum (2009), and Portal (2007), as well as a lot of AAA flops born of too much experimentation and shooting for the stars, like Spore (2008).
And most of the games I listed above don’t have a piss filter.
So, when I mention the Assassin’s Creed / Far Cry / GTA triangle I really mean the poor imitators of those games. They did do some very innovative things when they first came out, but just like modern military shooters took regenerating health and the two-weapon limit from Halo while leaving behind all the other gameplay mechanics that made those work, so too did many games adopt the open world and the general way you interact with it while removing anything interesting. By “the way you interact with it” I’m referring specifically to the map unlocking, the collectables, the village / territory faction control, and the “heat” system that spawns enemies depending on how much attention you are generating.
IMO those sorts of games were very much the other side of the coin from CoD-likes, and the problem was that while the extremely linear levels of CoD-likes were too restrictive, these open world games had no structure at all. In games like Blood, Quake, or what have you, encounters are designed to flow in a certain way, with each one having its own flavor and favoring certain approaches over others. In some games you can even think of enemy encounters as a puzzle you need to solve. Level design and enemy placement of course form the two halves of encounter design. In good games this sort of thing extends to the structure of the game as a whole, with the ebbs and flows in the action, and different gameplay happening in different sections so the formula keeps getting changed up. But in games where the level design is an open world that lets you approach from any angle, and where enemy placement is determined on the fly by a mindless algorithm, there is no encounter design.

At the same time, the way enemy spawning works is too orchestrated to produce interesting emergent gameplay. For example, if an algorithm made an enemy patrol spawn an hour ago, and the player can see it from across the map, they can come up with their own plan for dealing with that unique situation. If the player gets one bar of heat and the algorithm makes an enemy spawn around a corner, they can’t anticipate that at all; it’s just mindless. This has implications for the gameplay itself (no enemy can be very tough or require much thinking or planning if you’re just going to spawn them around a corner) but also, as previously stated, for the entire structure of the game.
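To make that contrast concrete, here’s a toy sketch (entirely my own construction, not taken from any real game or engine) of a pre-simulated patrol versus a heat-triggered spawn, on a one-dimensional map:

```python
import random

# Toy illustration of the two spawning approaches contrasted above, on a
# 1-D "map" with positions 0..99. All names and numbers are made up.
random.seed(0)

# Approach 1: pre-simulated patrol. The enemy was placed an hour ago and has
# been walking its route ever since, so the player can spot it from afar
# and make a plan around a fact of the world.
patrol_start = random.randrange(100)
minutes_walked = 60
patrol_pos = (patrol_start + minutes_walked) % 100
print(f"Patrol currently at position {patrol_pos}; scoutable, plannable.")

# Approach 2: heat-triggered spawn. The enemy only comes into existence once
# the player's "heat" crosses a threshold, placed just outside sight range,
# so there is nothing to observe or plan around beforehand.
player_pos, player_heat, sight_range = 50, 1, 10
if player_heat >= 1:
    offset = random.choice([-1, 1]) * (sight_range + 1)
    spawn_pos = (player_pos + offset) % 100
    print(f"Enemy pops in at position {spawn_pos}, just out of sight.")
```

In the first case the enemy’s position is a discoverable fact of the world; in the second it’s conjured in reaction to the player, which is exactly why no such enemy can demand much planning.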
As for the other games you mention, I want to bring up Bioshock in particular. It’s true, that game is a master class in presentation and aesthetics, and one I would highly recommend, but it’s actually one of the games I remember people complaining about when they said gaming was better in the 90s. Specifically, the way Bioshock is very dumbed down compared to its predecessor System Shock, both from a general game and level design standpoint, but also because of the inclusion of Vita-Chambers and the compass pointer that leads you around by the nose. (One place I will give Bioshock points, though, is that it has way more of an ecosystem than most imm-sims in the way enemies interact with each other; it even beats out imm-sim darling Prey 2017 in this regard.)
This is admittedly a way more niche complaint than people complaining about QTEs or games being piss/brown, but it was definitely a smaller part of the much larger “games are getting dumbed down” discourse.
I could talk about Crysis and Spore too, but this comment is already really long. I haven’t played the rest of the games you list, so I can’t offer an opinion on them, though I have heard that KOTOR was very good.
So, when I mention the Assassin’s Creed / Far Cry / GTA triangle I really mean the poor imitators of those games.

That only happened in the 2010s. That’s when the Ubisoft formula really took off. Assassin’s Creed 1 was only released in 2007, Far Cry 2 in 2008 (FC1 was quite a different game). GTA also only started to get imitated in the 2010s.
Open World in that sense (non-scripted encounters that can be approached from many different angles, with a “living” world) only became a thing in the late 2000s, precisely because of games like Assassin’s Creed and Far Cry 2.
I remember reading a pre-release article about Far Cry 2 in a game magazine, where they were all hyped about the many different ways a player could take out an enemy camp, e.g. going in guns blazing, setting a fire that would spread to the camp, or startling wild animals that would then stampede through it.
While I do get your point about hand-crafted, deterministic enemy placement, these are just two different approaches that work for different players.
When you say “dumbed-down”, I understand you to mean that the difficulty was too low, is that correct? While some players love or even need punishing difficulty levels, others play for other reasons. (Maybe check out the Bartle taxonomy of player types. It’s a bit outdated, but it shows some of these different motivations quite well.) If you want to just kick back and relax after a hard day of work, punishing difficulty might not be the right thing. Some players want to have to learn (or even memorize) levels/bosses/encounters and retry them repeatedly until they know exactly which button to press when, and that’s fine. For others that’s just tedious busywork; everyone’s different. I quite enjoyed Far Cry 2, with its random encounters and having to adapt to different scenarios all the time.
I haven’t played the rest of the games you list, so I can’t offer an opinion on them, though I have heard that KOTOR was very good.

Forgive me for saying that, but it’s quite harsh to call a whole decade of games uncreative if you haven’t played a lot of the greatest and most creative games of that time.
To get back to the original point:

20 years ago people were complaining about the same lack of creativity in the AAA scene, saying that gaming was better in the 90s. In fact I remember it was a common talking point that AAA gaming had gotten so bad that there would surely be another crash like the one in '83.

That was in the 2010s, not in the 2000s. In the 90s, game development was pretty much completely low-budget, with games rarely having more than 5 programmers on staff, plus maybe 5-10 content creators. In the 2000s games started getting bigger, but the studios were still led by game developers, not by finance dudes. Budgets were still nowhere near where they are today. Assassin’s Creed 1, for example, had a budget of $20 million. Compare that to, e.g., the $175 million that AC Valhalla cost to make. And AC1 was comparatively expensive back then.
It was only in the 2010s that finance really got into gaming, budgets ballooned, and risk tolerance was pared down to nothing.
You’re right; as is so often the case when people talk about a decade, I’m thinking more of its latter half and the first half of the next one.
But in my defense I did say “the mid to late 2000s”.
I have a few more thoughts, but I’ll have to make another reply in a bit.