LOAD "*",8,1

Stay curious about how things work, and how you can use them to make things of your own.

The popular sugar substitute aspartame was discovered in 1965 by James M. Schlatter. He was researching an anti-ulcer drug and accidentally spilled some aspartame on his hand. When he licked his finger, he noticed that it had a sweet taste.

Imagine working in a laboratory on an experimental drug and accidentally spilling some chemicals. The sensible response is to quickly wash our hands. But to make something new we have to be the sort of person who wonders what it tastes like and is tempted to lick!

Everything was made by somebody - the places where we live and work, buildings, shops and offices; the vehicles we get around in, cars and bikes and planes; our public spaces, streets, bridges, parks; the clothes and shoes we wear; the sports we play and even the teams we support; the devices that power our lives: computers, phones, software (not to mention the power stations and electricity network that make it all possible); all the food we eat; all the medicines that keep us healthy or fix us when we’re broken; all the things which entertain us: television shows and movies, art, music, theatre; even the schools, companies, organisations and institutions that make up our communities, with their rules and traditions and processes.

It’s easy to assume that these all exist and our job is just to live with them. But it’s better to be curious about how things are built and how they work. The people who made all these things are no different from us.

It’s easy to assume that existing things are a bit magical. But once we take off the cover we can see the design and the thought that has gone into making them.

It’s an important thing to understand about the world: we can change it, influence it, improve it, and build our own things that others can use to make their own lives better.

Or, if not better, at least artificially sweet.

Atari 2600

The dark ages, circa 1985, from memory…

The state of the art, in our house at least, back then was an Atari 2600, a simple game console with a slot for game cartridges which were sold separately.1

We had a few of the classics – Pac-Man, Space Invaders, Missile Command.

This was hours of fun for all of us. We just inserted the game cartridge we wanted to play and it magically appeared on the screen. The console had a grand total of four switches – on/off, colour/black+white, game select and game reset. There was no real learning curve and it was pretty bulletproof.

I started to wonder how it all works on the inside (as I’ve since discovered for myself, little people can be annoyingly curious, and I was no exception). Who made these games we were playing and how? I enjoyed the games we had, although I never was (and still am not) much of a gamer, but it felt like it would be more fun to make my own.

Around the same time I was given some old BYTE magazines, which were full of articles about “computers” like the ZX Spectrum and Commodore 64. At the back there were pages and pages of gobbledegook which were apparently the instructions anybody could type into these machines to make them do different and interesting things. That sounded intriguing to me, so I started thinking of all the things I could build and tried to convince my parents to let me buy one. However, they didn’t see the need for another “game”. To be fair, the distinction between a console which I could play games on directly and a more expensive computer which I could type the code for games into and then play games on was subtle and I struggled to make the case.

Eventually I saved up enough money to take the decision out of their hands. I purchased a second-hand Commodore 16 and started to teach myself BASIC.2

It was slow going to begin with. My first project was to emulate the statistics shown during a one-day cricket game, with run rates for each batsman, and manhattan graphs and worms. It turned out to be far too ambitious. I eventually got it to work for 50 overs, but it would always crash at the change of innings. In hindsight I suspect that I may have needed more than 16 kilobytes of memory to achieve my grand vision. Either way I never let it defeat me. There was always a new technique to learn (discovering if statements and while loops was a revolution!) and I enjoyed the challenge of creating something of my own from scratch.
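The arithmetic behind those statistics is simple enough to sketch in a few lines. This is modern Python, not the original BASIC, and the function names are mine, purely for illustration: cumulative runs per over are the points on a “worm” graph, and the run rate is just runs divided by overs.

```python
# Illustrative sketch of one-day cricket scoring stats (hypothetical
# names, not the original program).

def run_rate(total_runs, overs_bowled):
    """Runs scored per over so far."""
    return total_runs / overs_bowled if overs_bowled else 0.0

def worm(runs_per_over):
    """Cumulative runs after each over - the points on a worm graph."""
    total, points = 0, []
    for runs in runs_per_over:
        total += runs
        points.append(total)
    return points

overs = [4, 6, 2, 8, 1]                  # runs scored in the first 5 overs
print(worm(overs))                       # [4, 10, 12, 20, 21]
print(run_rate(sum(overs), len(overs)))  # 4.2
```

Trivial on today’s hardware; keeping 50 overs of this per batsman, per innings, inside 16 kilobytes was another matter.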

iPhone & iPad

Fast forward a few years, to circa 2010…

The state of the art, in our house at least, was by then an iPhone and an iPad, both simple mobile devices running applications which were sold separately.

We had a few of the classics - Flight Control, Angry Birds, Shazam.

Again, it was hours of fun for all of us. We could just tap on the icon of the application we wanted to use and it magically appeared on the screen. They had a grand total of four switches – on/off, volume, mute and a home button. There was no real learning curve and they were pretty bulletproof.

I started to wonder how it all works on the inside. Who made these applications we were using, and how? While those we could download were pretty addictive, it felt like it would be more fun to make my own.

I found some tutorials online. They were full of square brackets and semicolons that anybody could type into a computer to create an application that would run on a device. I started thinking of all the things I could build. In the intervening years I’d completed a Computer Science degree and helped to build Trade Me a couple of times over, so I wasn’t quite as naïve. And thankfully this time around I didn’t need to convince anybody other than myself that this was a good idea.

Eventually I saved up enough time to begin experimenting. I installed Xcode and the SDK and started to teach myself Objective-C.

It was pretty slow going to begin with (it had been a few years since I’d had to allocate and deallocate memory, for goodness’ sake!). My first project was a power meter reader app. It probably took me about 10x longer to get working than it should have, while I came up to speed with some of the unique problems of designing and developing for a mobile device and a touch interface. Either way I never let it defeat me. There was always a new technique to learn (discovering autorelease was a revolution, for example) and I continue to enjoy the challenge of creating something of my own from scratch.

A couple of years after that we launched the first version of Triage, which was briefly a top-rating paid app on the App Store.3

The post-PC era?

At the D8 conference in 2010 Apple CEO Steve Jobs compared a PC to a truck - i.e. a heavy-duty vehicle that has its uses but is not the standard transport mode of choice for most people.4

With the benefit of hindsight he was right. He was describing the beginning of the post-PC era.

Of course, it’s not an either/or situation.

Around the same time as the first iPhone was launched, I got fed up with providing tech support for my parents and replaced their PC with an iMac (yes, they eventually realised that computers are not only about games but also useful for sharing photos of grandkids!). This was much more of a controlled computing experience than they were used to - maybe a minivan, to extend Jobs’ analogy? They loved it and the things they were able to do with technology blossomed. By 2015, when it was time to upgrade again, they made the jump to an iPad. A small car suited them much better than a truck.

The numbers tell the story loud and clear. Demand for cars outstrips demand for trucks by orders of magnitude. The iPhone and other equivalent mobile devices are so simple to use that nearly everybody in the world now carries a computer around wherever they go. Most people are happy to consume the technology that exists.

But trucks haven’t gone away. As long as there are people like me who want to create things of our own we’ll always need some trucks to do the heavy lifting.

Distribution, Distribution, Distribution

Ever since the App Store first launched in 2008 (it was called the iTunes App Store back then) it has attracted controversy - both for the opaque approval process and for the licence fees and commissions that Apple charge app developers.

As Chris Dixon explained (in a since deleted tweet), perhaps Atari has some responsibility for this:

In video game industry, it is widely believed that Atari died because of explosion of crappy games. Hence platforms have been curated since then.

I don’t know if that is correct. But, either way, it is true that the App Store is tightly controlled by Apple (is “curated” the right word?), and that remains a source of frustration for many developers, who are forced to wait for Apple to approve every application and update, and to pay handsomely for the privilege.

The App Store is a monarchy.5

I guess nobody likes to think of themselves as a serf - or worse, to find themselves banished from court for befriending a rival kingdom.

On the other hand, I struggle to get too angry about it. It’s an amazingly popular venue full of people looking for software to install. It does a pretty decent job of separating the wheat from the chaff (which is a bit depressing for those of us whose apps are in the latter category). And it takes care of many aspects of marketing and selling applications that are painful on other platforms.

Chris Dixon again in an article about how the iPhone permanently upended the mobile industry: 6

The people griping about Apple’s “closed system” are generally people who […] didn’t realize how bad it was before.

If only there were such an accessible and well-trafficked distribution channel for web applications. Many of the startups I’ve worked on over the years would certainly have benefited from an equivalent channel to reach the customers we were targeting with our software and services.

Those who begrudge paying Apple a 30% success fee probably overlook how much they would spend on sales, marketing, distribution, payment and fulfilment via any alternative channel.
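A back-of-the-envelope comparison makes the point. All of the figures below are hypothetical, chosen only to illustrate the shape of the trade-off, not taken from any real app:

```python
# Back-of-the-envelope comparison of two distribution channels.
# All figures are hypothetical, for illustration only.

PRICE = 2.99           # app price in dollars
UNITS = 10_000         # copies sold

# Channel 1: an App Store-style 30% success fee, which covers
# payment processing, hosting, distribution and some discovery.
store_net = PRICE * UNITS * (1 - 0.30)

# Channel 2: sell directly - keep 100% of revenue, but pay for
# payment processing and marketing yourself.
gross = PRICE * UNITS
payment_fees = gross * 0.029 + 0.30 * UNITS  # e.g. 2.9% + $0.30 per sale
marketing = 5_000                            # assumed ad spend
direct_net = gross - payment_fees - marketing

print(f"store:  ${store_net:,.2f}")
print(f"direct: ${direct_net:,.2f}")
```

With these (made-up) numbers the two channels land within a few hundred dollars of each other - the 30% headline figure looks very different once the costs it replaces are counted.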

Recently this feels like an open question again, with legal challenges7 and the emerging threat of regulation.8 Whatever happens, we will always need a method for getting software installed on devices. Perhaps it’s the App Store and equivalents on other proprietary platforms. Perhaps it’s a decentralised and more open equivalent, without the oversight of a single company. Perhaps it’s just the web?

Who knows?

History only ever repeats

Here is a quote from a 1996 issue of Wired Magazine: 9

The Web reminds me of the early days of the PC industry. No one really knows anything. There are no experts. All the experts have been wrong. There’s a tremendous open possibility to the whole thing. And it hasn’t been confined, or defined, in too many ways. That’s wonderful. There’s a phrase in buddhism, “beginner’s mind.” It’s wonderful to have a beginner’s mind.

— The Next Insanely Great Thing, Wired 4.02

That’s Steve Jobs again, talking about the coming wave of web applications, as he saw it way back then. I was able to ride that wave. It’s been a fun ride so far, and still has some distance to run, I think.

We can easily tweak that quote to apply to any new technology.

Different hardware will come and go. Different languages for developing software will come and go. Different channels to distribute software will come and go.

Don’t worry about any of them destroying what came before - that’s inevitable in time. Don’t be scared. Try to see the possibilities and keep a beginner’s mind. But balance that with a healthy scepticism - it’s rare that anything is as amazing as it promises to be in the short term. Often the biggest opportunities are revealed when we understand the limits and constraints.

All we need to do is ensure that people generally, and kids especially, remain curious about how things work, and most other things will take care of themselves.
