Stay curious about how things work, and how you can use them to make things of your own.
The popular sugar substitute aspartame was discovered in 1965 by the US chemist James M Schlatter. He was researching an anti-ulcer drug and accidentally spilled some on his hand. When he licked his finger, he noticed it had a sweet taste.
Imagine working in a laboratory and spilling chemicals on yourself. What would you do? The only sensible response is to quickly clean up the mess and be much more careful next time. To discover something new you have to be the kind of person who wonders what it tastes like and is tempted to lick. Curiosity is the key.
Just about everything was made by somebody – the places we live and work; the cars, bikes and planes we get around in; our public spaces, streets, bridges, parks; the clothes and shoes we wear; the devices that power our lives (not to mention the electricity networks that make it all possible); the meals we eat; the medicines that keep us healthy or fix us when we’re broken; the entertainment we enjoy and the sports teams we support; even the schools, companies, organisations and institutions that make up our communities, with their rules and traditions and processes.
It’s easy to take all of these things for granted and just live with them. But it’s much more rewarding to be curious: how were they built and how do they work? Sometimes we can look inside and see the design. We can try to deduce what the people who made these things were thinking. We discover there is no magic. Then we can change them, fix them, and build our own new and better things that everybody can use to make their own lives better. Or, if not better, at least artificially sweet.
Join me back in the dark ages, circa 1985…
The state of the art, in our house at least, back then was an Atari 2600, a simple game console with a slot for game cartridges which were sold separately.
We had a few of the classics – Pac Man, Space Invaders, Missile Command.
This was hours of fun for all of us. We just inserted the game cartridge we wanted to play and it magically appeared on the screen. It had a grand total of four switches – on/off, colour/black+white, game select and game reset. There was no real learning curve and it was pretty bulletproof.
It wasn’t long before I started to wonder: how does it all work on the inside? (As I’ve since discovered as a parent, little people can be exasperatingly curious.) Who made these games we played and how? I enjoyed the games we had, but it felt like it would be more fun to make my own.
Around the same time I was given some old Byte magazines, which were full of articles about new personal computers like the ZX Spectrum and Commodore 64. At the back there were pages and pages of gobbledegook which were apparently the instructions anybody could type into these machines to make them do different and interesting things. That sounded intriguing. I started thinking of all the things I could build and tried to convince my parents to buy one. Bad news. They didn’t see the need for another “game”. To be fair, the distinction between a console which I could play games on directly and a more expensive computer which I could type the code for games into and then play games on wasn’t obvious and I struggled to make the case.
Eventually I saved up enough money to take the decision out of their hands. I purchased a second-hand Commodore 16 and started to teach myself BASIC.
It was slow going to begin with. My first project was to emulate the statistics shown during a one-day cricket game, with run rates for each batsman and Manhattan and worm graphs. It was far too ambitious. I eventually got it to work for 50 overs, but it would always crash at the change of innings. In hindsight, I suspect that I may have needed more than 16 kilobytes of memory to achieve my grand vision. Either way I never let it defeat me. There was always something new to learn (discovering fundamental programming techniques like if statements and while loops was a revelation) and I enjoyed the challenge of creating something of my own from scratch.
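For anyone curious about what that project involved, the core calculations are simple. Here is a minimal sketch in modern Python rather than the original BASIC – the function names and sample scores are my own illustration, not the code I wrote back then:

```python
# Run rate in one-day cricket: runs scored per six-ball over.
def run_rate(runs: int, balls: int) -> float:
    """Runs per over, given total runs and legal balls faced."""
    if balls == 0:
        return 0.0
    return runs / (balls / 6)

# A "Manhattan" graph is just runs per over, drawn as bars.
def manhattan(runs_per_over: list[int]) -> str:
    return "\n".join(f"Over {i + 1:2}: " + "#" * r
                     for i, r in enumerate(runs_per_over))

# A "worm" graph tracks the cumulative total across the innings.
def worm(runs_per_over: list[int]) -> list[int]:
    total, points = 0, []
    for r in runs_per_over:
        total += r
        points.append(total)
    return points

overs = [4, 6, 2, 10, 5]
print(run_rate(sum(overs), len(overs) * 6))  # 5.4 runs per over
print(manhattan(overs))
print(worm(overs))  # [4, 10, 12, 22, 27]
```

Trivial on today's hardware – but fitting two innings' worth of this, plus the graphs, into 16 kilobytes was another matter.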
Fast forward a generation, to circa 2010…
The state of the art, in our house at least, took the form of an iPhone and iPad, both simple mobile devices running applications which were sold separately.
We had a few of the classics – Flight Control, Angry Birds, Shazam.
Again, it was hours of fun for all of us. We could just tap on the icon of the application we wanted to use and it magically appeared on the screen. There were a grand total of four physical controls – on/off, volume, mute and a home button. There was no real learning curve and they were pretty bulletproof.
Once again, I started to wonder: how does it all work on the inside? Who made these applications we were using and how? While those we could download were pretty addictive, it felt like it would be more fun to make my own.
I found some tutorials online. They were full of square brackets and semicolons that anybody could type into a computer to create an application that would run on a device. I started thinking of all the things I could build. In the intervening years I’d completed a computer science degree and helped to build Trade Me a couple of times over, so I wasn’t quite as naive. And thankfully this time around I didn’t need to convince anybody other than myself that it was a good idea.
Eventually I saved up enough time to begin experimenting. I started to teach myself Objective-C, the programming language that all iPhone applications were written in back then.
It was slow going to begin with (it had been a few years since I’d had to allocate and deallocate memory). My first project was a power meter-reader app. It took me about 10 times longer to get it working than it should have, while I got up to speed with some of the unique problems of designing and developing for a mobile device and a touch interface. Either way I never let it defeat me. There was always a new technique to learn (discovering key Objective-C memory management concepts like autorelease was a revelation, for example) and I continue to enjoy the challenge of creating something of my own from scratch.
A couple of years after that we launched the first version of Triage, which was briefly a top-rating paid app on the App Store.
At the D8 conference in 2010 Apple CEO Steve Jobs compared a PC to a truck – a heavy-duty vehicle that has its uses but is not the standard transport mode of choice for most people.1
With the benefit of hindsight he was right. He was describing the beginning of the post-PC era.
Of course, it’s not an either/or situation. Around the same time as the first iPhone was launched, I got fed up with providing tech support for my parents and replaced their PC with an iMac (yes, they eventually realised that computers are not only about games but also useful for sharing photos of grandkids!) This was much more of a controlled computer experience than they were used to – maybe a minivan, to extend Jobs’ analogy? They loved it and the things they were able to do with technology blossomed. By 2015, when the time came to upgrade again, they made the jump to iPad. A small car suited them much better than a truck.
The numbers tell the story loud and clear. There is orders of magnitude more demand for cars than for trucks. The iPhone and other equivalent mobile devices are so simple to use that nearly everybody in the world now carries a computer around wherever they go. Most people are happy to consume the technology that exists. But trucks haven’t gone away. As long as there are people like me who want to create things of our own we’ll always need some trucks to do the heavy lifting.
Ever since the first App Store was launched in 2008 (it was called the iTunes App Store back then) it has attracted controversy – both for the opaque approval process and for the licence fees and commissions that Apple charge app developers.
As Chris Dixon explained (in a since deleted tweet), perhaps Atari has some responsibility for this:
In video game industry, it is widely believed that Atari died because of explosion of crappy games. Hence platforms have been curated since then.
I don’t know if that is correct. But, either way, it is true that the App Store is tightly controlled by Apple (is “curated” the right word?), and that remains a source of frustration for many developers who are forced to wait for them to approve every application and update and pay handsomely for the privilege.
The App Store is a monarchy.2 I guess nobody likes to think of themselves as a serf or, worse, find themselves banished from court for befriending a rival kingdom. On the other hand, I struggle to get too angry about it. It’s an amazingly popular venue full of people looking for software to install. It does a pretty decent job of separating the wheat from the chaff (which is a bit depressing for those of us whose apps are in the latter category). And it takes care of many aspects of marketing and selling applications that are painful on other platforms.
Chris Dixon again in an article about how the iPhone permanently upended the mobile industry:3
The people griping about Apple’s “closed system” are generally people who […] didn’t realize how bad it was before.
If only there was such an accessible and well-trafficked distribution channel for web applications. Many of the startups that I’ve worked on over the years would certainly have benefited from an equivalent channel to reach the customers we were targeting with our software and services. Those who begrudge paying Apple a 30% success fee probably overlook how much they would spend on sales, marketing, distribution, payment and fulfilment via any alternative channel. Recently this feels like an open question again, with legal challenges4 and the emerging threat of regulation.5 Whatever happens we will always need a method for getting software installed on devices. Perhaps it’s the App Store and equivalents on other proprietary platforms. Perhaps it’s a decentralised and more open equivalent, without the oversight of a single company? Perhaps it’s just the web?
Who knows?
The Web reminds me of the early days of the PC industry. No one really knows anything. There are no experts. All the experts have been wrong. There’s a tremendous open possibility to the whole thing. And it hasn’t been confined, or defined, in too many ways. That’s wonderful. There’s a phrase in Buddhism, “beginner’s mind.” It’s wonderful to have a beginner’s mind.
— The Next Insanely Great Thing, Wired 4.02
That was Steve Jobs in the February 1996 issue of Wired magazine, talking about the coming wave of web applications, as he saw it way back then.6 I was able to ride that wave. It’s been a fun ride so far, and still has some distance to run, I think.
We can easily tweak that quote to apply to any new technology. Different hardware will come and go. Different languages for developing software will come and go. Different channels to distribute software will come and go. We don’t need to worry about any of them destroying what came before – that’s inevitable in time. Rather than being scared, we should try to see the possibilities and keep a beginner’s mind. But, at the same time, balance that with a healthy scepticism – it’s rare that anything is as amazing as it promises to be in the short term. Often the biggest opportunities are revealed much later, once we understand the limits and constraints.
All we need to do is ensure that people generally, and kids especially, remain curious about how things work, and most of the rest will take care of itself.
Steve Jobs in 2010, at D8 Conference (Full Video), YouTube. ↩︎
Web Services as Government, Brad Burnham @ Union Square Ventures (via Internet Archive), July 2010. ↩︎
Steve Jobs single-handedly restructured the mobile industry, Chris Dixon. ↩︎
Judge’s ruling may take a bite out of Apple’s App Store, but falls short of calling the iPhone maker a monopolist, Washington Post, September 2021. ↩︎
Apple to Allow Outside App Stores in Overhaul Spurred by EU Laws, Bloomberg, December 2022. ↩︎
The Next Insanely Great Thing, Wired, 4.02, 1st February 1996. ↩︎