Mobile Development is like Console Development

It's interesting to compare the PC market with the current mobile market in terms of the technical limitations of the platforms, and how developers work within those limitations.

Back in the day, a developer could release an application that ran just adequately on the hardware of the day, knowing that before long, the hardware would catch up. And users grew accustomed to the idea of buying software that was slow and upgrading their hardware to run it better.

Today, it's not uncommon for the release of a new PC game to send tens of thousands of PC users out to buy more RAM, faster processors or faster video cards. Or all of the above. Sure, the game will run on their old hardware, but it won't run very well. New software means it's upgrade time.

Sometimes the software was slow because of real limitations in the systems of the time - the software really was pushing the technical envelope - and sometimes it was just laziness or a lack of focus on performance. Either way, the result for the user was basically the same: the software's performance was fixed, and it was up to the user to make the hardware match it.

Mobile, however, is another story, and is actually a lot more like console development. An Xbox 360 or a PS3 or a Wii is your target system, and if you don't run well on the current hardware, you're dead in the market. The hardware's performance is fixed, and it's up to the developer to make the software run well on it.

I like that Apple has fairly static targets with predictable upgrade cycles. That goes a long way towards making a stable platform. If I'm planning to write an iPad app today, there's one hardware target. It's got a certain amount of RAM, a certain CPU, a certain video chipset, and that's it. If my software runs well on that hardware, then users are happy. If it doesn't, then users won't buy my software.

Apple's restriction of cross-compilation and code generation for iPhone applications seems to be targeted at performance. That's the only way it makes technical sense to me, anyway. It should be the developer's goal to write software that runs well on the target hardware. If a particular toolkit doesn't make that possible, then developers aren't going to adopt it. But if it does, and it makes it easier for a developer to write software (easier than languages that are 20+ years old), then doesn't everyone win?