Technology, UX – Paul McAleer

We Lost Native Apps

This is a work in progress. More to come, including some organization.

Somewhere between the time the web really took off and today, we lost truly native apps for our computers.

I know. I hear you. “I have Spotify on my iPhone! And I have it on my Mac too!” And yes, you're right. Some would even point out that native apps are undergoing a renaissance. That's technically correct – especially thanks to the languages and frameworks that let people "skin" an app to look like its host OS – but something has been lost. Take Spotify and Netflix, for instance. The Spotify interface is practically identical from device to device. Netflix once supported native controls and native interfaces, but chose instead to unify on one interface across all devices as much as possible.

This isn't wholly good or bad, although I have an opinion on it.

So what?

A few decades ago, home computers were partially defined by their varying operating systems and programming languages. AppleBASIC was similar to BASIC 2.0, for instance, but not completely the same. Still, one could get the gist of BASIC on a TI-99/4A and mostly take it to a C64.

Once GUIs really took hold, we saw a lot of differentiation come to market. Mac OS was different than OS/2, which was different than BeOS, which was different than Windows, which was different than Amiga OS. Computers were bought and sold on the operating system, and on the PC side, MS-DOS mostly ran the show. Over time, the choices diminished until it was down to Mac OS and Windows.

The web grew in popularity. With it came an emerging common interface for accessing information – something new and different. The web, and a browser around it, started to take hold as the definition of what it meant to interact with a computer. That's notable: it wasn't Mac OS or Windows that instigated this change.

Once web apps came around and started to gain acceptance – around the arrival of AJAX and the first JavaScript libraries – the days of native apps were numbered. People got used to the idea of accessing things the same way on multiple devices, without a local OS getting in the way. That's significant: it followed through on the idea of the web as the interface for information, more so than any local OS, more so than anything else. You can see it in the lineage: much later, we get apps like Slack and Spotify that are web apps running in a local OS window, and that only show their cracks through bugs or quirks.

Once the need for a truly locally-influenced app is obviated, so is the need to store information locally. Why would you want your files on your home Mac when your work PC is where you need them? So we collectively shifted to storing things online, in the cloud, in places we have no real control over – all for convenience.

Write Once, Run Anywhere

Some application frameworks promise that you can write a single codebase and have the resulting app work naturally on different OSes – iOS, Android, et al. These frameworks distill the operating system down to a series of UI widgets at best, and an aesthetic exercise at worst. The apps end up being native technically (that is, they fulfill a requirement on a checklist), but not holistically. Slack and Spotify, for instance, don't operate quite like any other Windows or Mac OS apps. The conventions are similar enough to get the gist, but one can tell they're not exactly like the local OS's built-in apps.
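To make the "skinning" concrete: these frameworks typically keep one shared component tree and swap in per-OS styling at build or run time. Here's a minimal TypeScript sketch of that pattern – the names and style values are hypothetical, not any real framework's API:

```typescript
// The "operating system" reduced to a type: just a key into a style table.
type OS = "macos" | "windows" | "web";

interface ButtonStyle {
  cornerRadius: number;
  fontFamily: string;
}

// Hypothetical per-platform style table. This is the whole extent of the
// app's "nativeness": cosmetics, not interaction behavior.
const buttonStyles: Record<OS, ButtonStyle> = {
  macos:   { cornerRadius: 6, fontFamily: "SF Pro" },
  windows: { cornerRadius: 4, fontFamily: "Segoe UI" },
  web:     { cornerRadius: 8, fontFamily: "system-ui" },
};

// One shared definition of what a button is and does; only the skin varies.
function renderButton(os: OS, label: string): string {
  const s = buttonStyles[os];
  return `[${label}] radius=${s.cornerRadius}px font=${s.fontFamily}`;
}

console.log(renderButton("macos", "Play"));
console.log(renderButton("windows", "Play"));
```

Note what lives where in this sketch: everything about how the button behaves is in the shared code, and the OS contributes only a lookup row of surface styling – which is exactly the "aesthetic exercise" described above.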

What We Lost

Here's where we're at.

  1. OSes used to have solid standards for interaction models, enforced by the manufacturers.
  2. The web has never had solid standards for interaction models, though many patterns have emerged as de facto standards.
  3. Because the web lacked those standards and overtook local OSes in popularity, it ended up influencing the direction of those OSes.
  4. And because of that, apps no longer really need to adhere to OS standards. There's no real risk or consequence.

I am not certain whether this is ultimately good or bad. When I first started writing this, I really felt it was bad – it's something we lost, something we don't have anymore.

But on the flip side, we have the promise (if not the actual implementation) of universal interfaces that work from device to device, which is pretty awesome. The unfortunate part is that those interfaces don't need to be consistent from app to app, which is the real pain here: every app is potentially a new learning curve, a new way of interacting, a new way of processing, a new set of shortcuts, a new combination of gestures. That is what we've lost.

It is a different way of locking someone into an interaction model: if one learns to use Spotify, that knowledge isn't necessarily portable to iTunes (eeeesssshhhh), nor to Finder or Windows Explorer. It's an adjustment and a learning curve, or at least a situational change, every single time.

It would be ideal to say that we interact with our devices in a certain way, and they will work in that certain way no matter what. That yes, you can access the same app on different devices – but it will reflect the place you're using it more than anything else. That would be ideal. I'm not sure how, or if, we should get back there.