App lust annoys me

posted by Jeff | Monday, October 1, 2012, 11:05 PM | comments: 0

I'll never forget the announcement of the original iPhone. By that time, I already had the first MacBook Pro and a Mac Pro, both Intel machines, and I was drinking the Kool-Aid®. Later that day, Apple posted the video from the announcement, and I watched it enthusiastically. The thing that really stuck with me was Steve Jobs' position that the Web itself was the app, and that there would not be native development for the phone. That seemed like an exceptionally bold, and in my opinion correct, position to take.

A small but vocal group of people went ape-shit over this. How dare Apple deny me the ability to write software for this phone! (Most people, as is often the case with Apple products, just went along with whatever Steve said.) I don't know what the reality of the situation was, whether the issue was really just that the SDK wasn't ready or they eventually gave in to market demand, but I think the original position was excellent. Since it took only weeks for people to come up with little frameworks and styles to make Web pages look like a native iPhone app, I was pretty happy to see how things were shaping up.

The App Store for the iPhone opened about a year later, and smartphones became apptastic. At that time, I downloaded a few games, but that was about it. The core of the phone still did most of the useful things I was looking for. With the tablet, things got slightly more complicated, because the bigger form factor means you can certainly do more with it, but I still find myself primarily Web browsing on it.

And of course, the tech pundits have no shame declaring the end of one thing or another, like this gem from MG Siegler, who says the PC is dead. He's actually arguing two different things. The first is that the "PC" is dead, which is pretty ridiculous hyperbole. This is largely an argument of semantics, because a personal computer is arguably anything from a desktop tower to a phone. It's true that we don't need the towers anymore, even on the desktop, but to suggest everyone is perfectly content with a screen 10 inches or smaller at all times is silly.

But even more silly is the suggestion that everything is going to be about "apps." While there's no question that phones all have some really cool thing that you (think you) can't live without, this notion of an app-dominant universe is not realistic for so many reasons. First, the Internet would suck if everything could only be accessed via apps and not Web sites. Good luck sharing a news article or some other content if it's walled in. Second, the world of phones is already insanely fragmented. Android accounts for two-thirds of sales, and new version adoption of the OS is awful. iPhone accounts for a quarter of sales, with everyone else filling out the rest. Third, people are already getting enormous utility from Web-based services that are platform agnostic, because they want that.

And by the way, we've been here before. It was widely predicted that Flash would replace the Web as an app platform. How'd that turn out? Didn't the world just cry out in protest against the use of Flash, beating it into submission?

If you strip away this hyperbolic nonsense, you can have a more sane discussion about how computers of all sizes and form factors are going to be used. The fact that I can write code on a MacBook Air, and consider it as good as any computer I've ever had, says a lot about how things have changed. It's true that I don't need a big tower on my desk. My phone is good for certain things, as is my tablet. This winter we've got an entirely new product cycle of computers that are both tablets and "PCs" in the traditional sense.

What I would like to see is more meaningful discussion about how to get back to platform-agnostic application design. Does anyone remember the crappy old days of computers? "Oh, that's available on Atari and Commodore, but not Apple." Later, it was, "Oh, you don't have a fast enough CPU or enough RAM." We're finally getting to the point where the hardware basically doesn't matter, and now you want to fragment with different operating systems, programming languages and frameworks? This desire is even more insane given that much of the hard stuff doesn't even happen in your hand; it happens on the network.

John Gage and Sun Microsystems turned out to be right that "the network is the computer," which is why this desire to sandbox everything with needless complexity into "apps" is so annoying. Yes, I get it, games and certain things have to run natively (for now). However, most of the apps I use, and even the hooks in the operating systems (Windows Phone, Windows 8, iOS and OS X), are just thin UI wrappers around services on the Internets. How is it efficient or OK to have to build these wrappers for every platform? It's not.
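To make that "thin wrapper" point concrete, here's a minimal sketch of the kind of call those apps boil down to. It's TypeScript, and the endpoint and response shape are made up purely for illustration; any platform with an HTTP stack can make the same request, and the only platform-specific work left is painting the result on screen.

// A hypothetical feed service; the URL and response shape are illustrative only.
interface FeedItem {
  id: string;
  title: string;
  postedAt: string;
}

// The same request works from a browser, a tablet, or a native client --
// the "app" part is just whatever renders the result.
async function fetchFeed(baseUrl: string): Promise<FeedItem[]> {
  const response = await fetch(`${baseUrl}/feed`, {
    headers: { Accept: "application/json" },
  });
  if (!response.ok) {
    throw new Error(`Feed request failed: ${response.status}`);
  }
  return (await response.json()) as FeedItem[];
}

// Example usage, with a made-up service URL.
fetchFeed("https://example.com/api")
  .then((items) => items.forEach((item) => console.log(item.title)))
  .catch((err) => console.error(err));

Every platform rewrites only the last, least interesting layer of that stack, which is exactly the duplicated effort I'm complaining about.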

