In 1996, when the Nintendo 64 launched in the United States, it sold 1.6 million units (priced at $200 each) in its first quarter. Its closest competitor for the holiday season was a $30 Tickle Me Elmo doll, which sold around a million units in the same window. More than 20 years later, when Nintendo’s $300 Switch sold 1.5 million units in its first week, there was far more competition, and not just for the holiday season.
The business of gaming has changed dramatically since its early days. The widespread adoption of the internet has driven a pronounced shift in the gaming landscape, from basic monetization through the sale of physical and digital copies of games to in-game monetization through microtransactions. While the previous millennium’s video game studios depended on revenue from selling games and gaming hardware, today’s goliaths don’t expect you to buy their games at all.
Nintendo is a relatively rare example of a large gaming studio that hasn’t waded too deep into the microtransaction waters. Fortnite rakes in around $5 billion per year for Epic Games, and with numbers like that, you can bet most gaming companies are at least investigating the free-to-play model. However, this shift in consumer mindset from deep loathing to moderate acceptance of microtransactions has been a long, arduous process.
Fortnite was far from the first game to introduce microtransactions, but it was one of the first mainstream examples of a live-service game that relied purely on in-game purchases. This came at a time when the concept of microtransactions evoked images of toxic loot-box economies and luck-based purchases that turned games into “pay-to-win” ecosystems, and when consumers were growing increasingly wary of such monetization.