I want to shed light on a tactic that involves collecting data as you play, feeding this data into complex algorithms and models that then alter the rules of your game under the hood to optimize spending opportunities.
Being a lifelong pessimist, I am getting really tired of article headlines telling me that [x] is worse than I think.
Loot boxes, for example, aren’t inherently predatory; they can add an exciting and rewarding surprise element when balanced with noble intentions.
When you sell them, they’re unregulated gambling that children can access.
When designing a battle pass, a designer must answer questions like “How much faster should a premium player progress compared to an F2P player?” and “How long should it take for a player to finish the battle pass?” I’ve seen designers balance it fairly, like by requiring 30 minutes of daily play to complete the free track or $5 to unlock the premium pass.
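To make those questions concrete, here’s a minimal sketch of the kind of arithmetic involved (Python; the season length, XP rates, and the 20% premium speedup are all illustrative assumptions, not figures from any real game):

```python
# Hypothetical battle pass tuning sketch. All numbers are made up.

SEASON_DAYS = 70            # length of the season
DAILY_MINUTES_TARGET = 30   # "fair" free-track time budget per day
XP_PER_MINUTE = 10          # average XP earned per minute of play
FREE_TRACK_TIERS = 50       # tiers on the free track
PREMIUM_SPEEDUP = 1.2       # assume premium players progress 20% faster

# Total XP a free player earns by hitting the daily budget all season.
total_free_xp = SEASON_DAYS * DAILY_MINUTES_TARGET * XP_PER_MINUTE

# Size each tier so the free track finishes right at season's end.
xp_per_tier = total_free_xp / FREE_TRACK_TIERS

# Days a premium player needs on the same daily budget.
premium_days = SEASON_DAYS / PREMIUM_SPEEDUP

print(f"XP per tier: {xp_per_tier:.0f}")
print(f"Premium finishes in ~{premium_days:.0f} of {SEASON_DAYS} days")
```

Run it and the two design questions fall out directly: the tier size sets how long the pass takes to finish, and the speedup constant sets how much faster premium progression feels.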
I still don’t see how this could ever be anything other than creating an incentive to play the game for reasons beyond the game being fun, no matter how “fair” it is to players, whether they spend money or not. They’re still artificially creating another body in the matchmaking pool that creates value for someone more willing to part with their dollar. If your player base dries up when you stop offering your battle pass incentives, I’d say that was some artificial retention, and it’s kind of gross.
I definitely didn’t need more reasons to hate live services. The business model has always affected the game design, and a lot of the author’s bullet points could be seen as far back as the arcades, but I don’t think we’ve ever had a better business model for all parties than “sell a good product at a fair price”.
Well, the missing context is that this is how a lot of gaming is tuned regardless. It’s pretty basic economy tuning to look at how long a task takes to complete and tune based on that (for games with grind, anyway; think RPGs).
So if you’re playing “Perfectly Fair Single Player RPG 3” there’s a more than fair chance that the developers looked at the expected completion time of a quest, plugged that time into some spreadsheet and assigned XP and other rewards to the quest based on that, just to keep the XP curve of the game somewhat predictable. This is a big rabbit hole with a bunch of nuance, but for these purposes we can assume they at least started by doing that flat on all quests.
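For illustration, a minimal sketch of that spreadsheet logic (Python; the quest names, completion times, and target rate are all hypothetical):

```python
# Hypothetical quest-reward tuning, as described above: rewards derive
# from expected completion time so the XP curve stays predictable.

TARGET_XP_PER_MINUTE = 12  # the pacing knob the designer controls

# Expected minutes to complete each quest (made-up playtest estimates).
quest_minutes = {
    "rat_cellar": 8,
    "lost_caravan": 25,
    "dragon_hunt": 60,
}

for quest, minutes in quest_minutes.items():
    xp_reward = minutes * TARGET_XP_PER_MINUTE
    print(f"{quest}: {minutes} min -> {xp_reward} XP")
```

Layering difficulty multipliers and curve shaping on top of this flat rate is where the rabbit hole starts, but the time-to-reward mapping is the foundation either way.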
If you have an F2P game and you’re charging for things you can also grind, I frankly don’t see a much better place to start.
Now, if your premise is that all design for engagement in F2P is gross because it’s servicing your business and all design for engagement in paid games is fine because that’s just seeking “fun”… well, I don’t know how that gets fixed. I agree that pay-up-front games can benefit from getting the ugly matter of taking money from players out of the way early, but these days even those games are trying to upsell you into later content, sequels and other stuff, so the difference is rarely that stark.
I think there’s a conversation to be had about whether “good”, “fun” and “makes people want to engage more” should be seen as the same thing and, if not, what the difference is. It’s tricky and nuanced and I don’t know that you can expect every game to be on one end of that conversation. Sometimes a person just wants to click on a thing to make number go up, and that’s alright.
I think the incentives matter. Diablo II is about making number go up, but Diablo IV has an active incentive to slow you down and make that number go up at a certain rate so that they can upsell you again later. And rather than taking a hardline position, I’d at least ask the question out loud: Is it possible to have a business model for a game other than selling a good product at a fair price and not have it eventually evolve into something gross? Maybe the old shareware model, which is basically just a demo, but other than that, I’m not sure.
I think there’s an argument to be made that some level of retention strategy may be a necessary evil in today’s market, especially when all your competitors are doing it. No developer wants to run the risk of letting that playerbase dry up. You can have the best multiplayer game in the world, but all the brownie points for playing fair wouldn’t mean much if I’m sitting in an empty queue with no one to play with.
It’s a fine line to walk to make sure players are coming back for the right reasons, but you do want them to come back.
I think there’s also an argument to be made that all of this desire to suck up our attention has made it more difficult for the same developers to market their next game, since their potential customers are all preoccupied with something they haven’t stopped playing. It’s extremely natural for most people to fall off of a game after its initial release, and it’s definitely going to happen once they take their thumbs off the scales.
Now you have a prisoner’s dilemma. A lot of studios need to take their thumbs off the scale at the same time, or you’re just sending your customers to someone else.
I’d argue they’re different markets. The people who play every new Call of Duty and the people who play Spec Ops: The Line are not the same people.
I’ve been looking for deathmatch shooters for a long time, like what we got from the late ’90s to the mid-2010s. There are very, very few. I don’t care if I or anyone else moves on quickly, because I primarily want to play with my friends, and the deathmatch mode typically came alongside a campaign and maybe co-op modes. That’s not a prisoner’s dilemma, and the market just isn’t making games like that anymore. Same for things like arcade racers akin to F-Zero or Burnout.