Project Natal Tells Game Developers the Xbox 360 Is Broken

In case you missed it, Project Natal (or Wave or whatever) is a huge admission that the next generation of video game consoles is FUBAR.


Departing Xbox Godfather Robbie Bach described Natal as a "midlife kicker for the 360". Now, if my math is correct, midlife generally falls at the halfway point of a life. With the Xbox 360 launching in November 2005, that means we can enjoy red-ringing for another four to five years. To put that in perspective, the entire life cycle of the original Xbox was five years.


Sure, Natal looks like fun - and a way to squeeze some more revenue juice out of a platform the money has already been sunk into. But there's also a cold, harsh reality behind the scenes - the economics of a new console generation don't make sense for video game developers (whether Xbox or PlayStation).

It's already extremely difficult for an independent game studio to break even on a console title, much less make a profit. Moving 1.5 million units as a 3rd party game developer is a big "so what". And if the cost to produce a game goes up the way it did last generation, good luck making real money unless you've got a top title (oh, and it takes something like $50M to even ante up for that game). A new console generation anytime soon would mean a bloody revolt among video game developers (and I'm not talking a Nintendo-style Revolution).
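
To make that arithmetic concrete, here's a back-of-envelope sketch in Python. The retail price, the developer's share of each sale, and the budget are all illustrative assumptions on my part, not sourced figures - plug in your own numbers.

```python
# Back-of-envelope break-even math for a hypothetical third-party console
# title. Every figure below is an illustrative assumption, not sourced data.

RETAIL_PRICE = 60.00     # assumed retail price of a boxed console game (USD)
DEVELOPER_TAKE = 0.15    # assumed share of retail that reaches the developer
                         # after retailer, publisher, and platform cuts
DEV_BUDGET = 50_000_000  # assumed AAA-scale development budget (USD)

def units_to_break_even(budget: float, retail_price: float, take: float) -> float:
    """Units that must sell before the developer recoups its budget."""
    return budget / (retail_price * take)

units = units_to_break_even(DEV_BUDGET, RETAIL_PRICE, DEVELOPER_TAKE)
print(f"Break-even: {units:,.0f} units")  # ~5.6 million units

# And why moving 1.5 million units is a big "so what":
revenue_at_1_5m = 1_500_000 * RETAIL_PRICE * DEVELOPER_TAKE
print(f"Developer revenue at 1.5M units: ${revenue_at_1_5m:,.0f}")  # ~$13.5M vs. a $50M budget
```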

Unless DLC and other innovative ways to squeeze more cash out of gamers, er, I mean, to gain incremental revenue beyond the retail price actually pan out, independent game devs won't make enough money to keep the lights on.

As Sean Dugan sneaks up on mid-life, he eagerly awaits his personal kicker. 



Image by Juan Pol via Creative Commons license

3 comments:

  1. Hmm, are they FUBAR, or AWOL? (Or MIA, as game development seems to be in a bloody revolutionary period right now...)
    I'm really curious to see how this all plays out, as both game developers and console makers are in a bind. Console makers can only convince people to buy the devices if there's some clear benefit over their current console, which traditionally has been better graphics. But it would take a huge leap in both processing power and game asset quality to have the same sorts of graphical improvements we've seen in the past (assuming it's even possible; it might have to be movie special-effects quality). There's talk that the next generation of consoles would double game development costs, which seems in line with historical trends, but doesn't take into account that sort of quantum leap. There's absolutely no reliable funding model for games of that cost.
    Games that rely on DLCs and RMTs to make a profit are taking on more risk than they do with simple box sales; you not only have to convince people the game is worth playing, you have to re-convince them that each additional bit of content is worth playing (and paying for) as well. I could see DLCs being used to add to the profits of hit games, but I'm skeptical of them doing much for the smaller games. A major problem is player expectations about content duration per dollar; I've read comments from a lot of buyers of DLCs who complain that since the DLC cost, say, 1/4 that of the game, it should have at least 1/4 the number of hours of gameplay as the original game. Obviously that isn't likely if the purpose of the DLC is to subsidize the base game. Unless players are really devoted to your game, they're better off just buying a new game instead of a couple of your DLCs.

    Better development tools would really help make DLCs economically viable. It seems to be a highly neglected area of game development. Developers might have been able to get away with that ten years ago, but now it's inexcusable. Very few developers I've spoken to have had great things to say about the tools they used for their previous projects unless the project used a licensed engine (and sometimes not even then...). I know of hugely expensive games where the development tools were absolutely appallingly bad; fatally so in many cases.

    I think this ties in with your previous post - it seems like developers at the moment need to stop thinking about games as single, finished products, but as a series from the very beginning (either as sequels or DLCs), even if they aren't also online social experiences (e.g. WoW, L4D, Farmville), which by their nature are continuing projects. Perhaps then they'll realize that the time and money they spend on tools are long-term investments.

  2. Yeah, I agree with your point about tools. It's a rare project that has truly excellent ones. Although, the main selling point of a new console is its graphics processing power - and it seems the more polys you try to put onscreen, the more work it takes from an artist. It means there are limits to how much efficiency you get from better tools.

    And the whole value proposition of DLC seems to rankle a lot of gamers. Microtransactions don't seem to bother casual gamers the way they do hardcore gamers, who are very concerned about how much more play value they get from the DLC, whether it was a real download or just an unlock already on the disc, etc.

    At some point, I gotta write something that breaks down the expense of a typical console title - the numbers sure aren't pretty!

  3. "It means there's limits to how much efficiency you get from better tools."
    This is true, but in the cases I'm thinking of, there seemed to be no limit to how much *less* efficient asset creation could be, due to the tools and the overall pipeline dictated by the needs of the engine. Asset creation took many times longer than it should have in these cases, with the actual creation of the high-poly models ending up being the fast part. In the most egregious case, the success of the game was predicated on the ability to quickly release new content, but the semi-functional tools made levels take so long to build (and require so much programmer support) that this was impossible. If your revenue model depends on releasing lots of content after the game ships, you'd better have the most efficient content creation & import process possible.

    Yeah, microtransactions and DLCs are polar opposites in many ways. Casual gamers seem to have completely different sets of expectations, in part due to the absence of an up-front cost for the games (though the differences go beyond that). The most successful RMTs have low to nonexistent asset costs and low coding costs; they're often just database changes (money, experience rates, etc.), which are exactly the sorts of things that are intolerable to hardcore players. I can't imagine disc unlocks, or clear gaps in content where DLCs will obviously be "plugged in" (e.g. as in Dragon Age), will *ever* go over well when players (justifiably) feel like they're being double-charged for a game they bought.

    Speaking of breaking down costs, I'm really curious whether anyone has figured out the true cost of outsourcing. Since asset creation is such a huge part of the cost of next-gen games, and outsourcing has already become almost standard, this seems like an important consideration. I've worked with so many people who were convinced that the art outsourcing done for their previous projects had ended up costing more money than it "saved," due to the amount of work that didn't meet requirements. My own experiences are with "in-sourcing" work done by Asian-based studios owned by the same parent company, where cost wasn't an issue but quality was poor, as the artists there had no experience working with current-gen graphics.
