The End of Moore's Law and What It Means for Software

A few weeks ago I saw a talk by Bob Colwell, one of the brains behind Intel's Pentium Pro (whose architecture has evolved into the CPUs that power Macs today). The gist is that the exponential growth of the number of transistors per chip (Moore's Law) will taper off, and we will enter a new age of computing where innovation can no longer be fueled by ever-increasing processing power. At the same time, the startup Upthere made its presence publicly known. Backed by industry veterans such as Bertrand Serlet, Chris Bourdon and Alex Kushnir, it has received sizable funding from venture capitalists. Their goal is to usher in a new era of cloud computing where all of your data lives in the cloud, and storage on the device is at best a cache. Upthere's admittedly very ambitious plan is to offer an end-to-end solution, from server hardware to end-user software.

These two seemingly disparate threads led me to a question: what will software look like in the future?

In the past, hardware advances have outpaced software advances, and while there were a few pieces of software (such as OS X initially) that demanded more performance than the hardware of the day could deliver, new platforms such as watchOS and tvOS are now designed specifically to work well with the hardware available at the time. What is more, the current breed of iOS devices goes toe-to-toe with Apple's notebook offerings in terms of processing speed.

Once there are no more easy gains to be had through advances in chip manufacturing technology, you either have to improve the architecture of the chips or improve the software. From that point on, performance gains will be much harder to come by, and advancements from better software will become increasingly important. Which trends are likely to play a more prominent role as the exponential growth of chip complexity tapers off?

Specialized hardware integrated with software

While specialized hardware has been around literally since the beginning of computing, the cost-benefit analysis has usually been skewed towards general-purpose hardware. It was simply easier and cheaper to wait for one or two iterations of general-purpose hardware than to design specialized hardware (which is manufactured at a smaller scale) and wait for software to take advantage of it.

Ever since the advent of the smartphone, specialized hardware for things like image processing and encryption has gained importance, albeit mostly for the sake of power efficiency. In other areas such as storage, too, one finds specialized SoCs from Intel, Annapurna Labs and others that are optimized for specific tasks. Once CPU and GPU performance levels out, the prospect of dedicating specialized hardware to certain tasks becomes more and more appealing. (This is another reason why I think Intel will be in big trouble in the consumer space: it cannot offer the same level of diversity the ARM ecosystem offers.)

Software, of course, needs to take advantage of custom hardware, so companies that make both their own hardware and their own software are at an advantage. Here, one should not just think of Apple, but also of Amazon, Backblaze, Facebook, Google and Oracle, all of which are designing custom hardware to run their software.

Cross-device and cloud-based computing

Speaking of the datacenter, many future pieces of software will integrate the cloud in a smart way. By this I do not mean cloud document storage, but cloud-based processing of your data. One simple example is cloud-based indexing of your files: Google's Photos service not only processes an image file's metadata, it uses sophisticated image recognition algorithms to analyze what is actually in the photo. Even if those algorithms could run on your Mac, iPhone or iPad, they would have to be optimized to work within the power, performance and memory constraints the hardware imposes on the system.
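
To make the idea of content indexing concrete, here is a minimal sketch: a hypothetical labels(for:) function stands in for the server-side recognition model, and the index it feeds simply maps labels to photo identifiers so that a query such as "beach" never has to rescan the originals. Everything here is illustrative; nothing about Google's actual pipeline is implied.

```swift
import Foundation

// Hypothetical stand-in for a server-side recognition model; the real
// model (and the heavy compute) lives in the datacenter.
func labels(for photo: Data) -> [String] {
    return [] // e.g. ["beach", "sunset", "two people"]
}

// Build an inverted index from label to photo identifiers, so that a
// search only consults the index instead of reprocessing every image.
func buildIndex(for photos: [String: Data]) -> [String: [String]] {
    var index: [String: [String]] = [:]
    for (id, data) in photos {
        for label in labels(for: data) {
            index[label, default: []].append(id)
        }
    }
    return index
}
```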

Software running on dedicated, specialized server hardware is not subject to such limitations; instead, it can be designed to do just one thing, say image processing, very, very well. Software engineers can implement much more advanced algorithms and let the servers shoulder the heavy load of certain computing jobs. “The application” is then designed from the start to be split into several parts that run across several devices in concert. The Apple Watch can be seen through this lens: it has limited compute power and no dedicated LTE connection, so it has to outsource networking and heavy processing to an iPhone. And while the Watch's reliance on an iPhone is a drawback, without it the Apple Watch would not have been a viable product with current technology.
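
As a rough sketch of that split, a watch app can hand a request to its paired iPhone and merely display the reply. The message keys and the summarizing task below are made up for illustration, but passing such messages via WatchConnectivity is the standard mechanism.

```swift
import WatchConnectivity

// Runs on the watch: instead of doing the networking or heavy lifting
// itself, it forwards the request to the paired iPhone and shows the result.
final class PhoneOffloader: NSObject, WCSessionDelegate {
    private let session = WCSession.default

    override init() {
        super.init()
        guard WCSession.isSupported() else { return }
        session.delegate = self
        session.activate()
    }

    // Ask the phone to fetch and summarize a URL on the watch's behalf.
    // "fetchSummary" and "summary" are illustrative keys, not a real protocol.
    func requestSummary(of url: String, completion: @escaping (String?) -> Void) {
        session.sendMessage(["fetchSummary": url], replyHandler: { reply in
            completion(reply["summary"] as? String)
        }, errorHandler: { _ in
            completion(nil)
        })
    }

    // Required delegate callback; a real app would inspect the activation state.
    func session(_ session: WCSession,
                 activationDidCompleteWith activationState: WCSessionActivationState,
                 error: Error?) {}
}
```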

While in some cases this is done out of necessity, in others a cloud element actually improves efficiency. Podcast clients such as Overcast and many RSS readers do not let the on-device part of the app crawl RSS feeds individually; instead they have dedicated servers do the job for them. Tens or hundreds of thousands of requests, one from each individual instance of the app, are replaced by a single one, and the devices talk to one central server instead.
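
The server-side half of that arrangement boils down to a few lines: one process polls each feed once per interval and caches the result, and every client reads from that cache rather than hitting the feed itself. This is only a sketch of the pattern, not how Overcast or any particular reader actually implements it.

```swift
import Foundation

// One crawler polls each feed once per interval; thousands of clients
// then read from this cache instead of fetching the feeds themselves.
struct FeedCache {
    private(set) var latest: [URL: Data] = [:]

    mutating func refresh(feeds: [URL]) {
        for feed in feeds {
            // A single fetch per feed per interval, regardless of how
            // many subscribers that feed happens to have.
            if let data = try? Data(contentsOf: feed) {
                latest[feed] = data
            }
        }
    }

    // What a client hits instead of the original feed.
    func cachedCopy(of feed: URL) -> Data? {
        return latest[feed]
    }
}
```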

Such integration across several devices is not limited to what one may call “consumption of content” (a term that is often used disparagingly); it also features in services like Adobe CC. Once you subscribe, you have access to Adobe's whole app portfolio as well as community components such as Bēhance. The apps are designed to work with one another: you can start a drawing in Sketch and continue it in Illustrator. After you are done, you can put it into your digital portfolio or solicit feedback from others.

A focus on software quality and stability

The rapid release cycle of hardware has forced the software divisions of Apple and Google, as well as third-party software vendors, to keep pace. While the criticism of some commentators who lamented the state of Apple's recent software was overblown, software that runs on “computers” is very different from the software that runs the various controllers and gadgets in a car or an airplane. People expect their car to just work and would not accept it if their “radio app” crashed.

Longer release cycles, more modern programming languages and better software frameworks could pave the way for software quality and stability becoming a higher priority. The effects would initially be subtle but add up over time: every crash or bug gnaws at the bond of trust between user and device, so fewer crashes mean people can place more trust in their devices. And for certain tasks, you have to aim higher: if Apple and Google really do end up building the rumored cars for consumers, self-driving or not, they had better raise the bar for software quality and stability.

Moreover, many of the hard problems are UI and UX problems: how are certain ideas and paradigms implemented, and how are new features exposed to users without overwhelming them in the process? Here, too, longer development cycles could allow more thought to go into the design and give software engineers time to tackle hard problems (such as getting the replacement for mDNSResponder right or implementing a new filesystem) that cannot be solved within the 6-8 month window they currently have.

Slower progress: the new normal

The breakneck speed of the computing industry is an outlier; other industries move at a much slower pace. A 5-year-old car is not only acceptable, it is in all likelihood still comparable to its younger 2015 sibling: the horsepower has not doubled or quadrupled, fuel economy is not through the roof, and all the controls are essentially still the same. You would have to go back much further in time to have a markedly different experience. Compare a 2010 iPad to a 2014 iPad Air 2 or a 2015 iPad Pro, by contrast, and the difference is night and day.

While it is understandable that some would bemoan the end of rapid performance and battery life improvements (and all the benefits that come with them), that does not mean our experience has to evolve that much more slowly. Other trends, such as the abundant availability of networking and the ability to cheaply put full-fledged computers into more and more everyday objects, will continue to fundamentally change the world. (A new Raspberry Pi Zero costs just $5; one even came bundled with an issue of the MagPi magazine.) I am optimistic that software engineers will rise to the challenge.

Managing transitions and expectations

Recently, there have been increasingly loud voices criticizing Apple's software quality, along with concerns about whether Apple has abandoned the »pro« market. Articles and blog posts centering on these themes seem to ebb and flow at regular intervals.

While I do think there are often good reasons to make any one of these arguments, it is not very productive to look at them in isolation. For instance, is Apple's abandoning development of Aperture a sign that Apple no longer believes in serving the »pro« market? Or is it just one incarnation of a larger issue inside the company?

Moments of transition

With the announcement of Windows 10, Microsoft has embarked on a journey from a desktop-PC-centric approach to a world where each person has several devices. Interestingly, another major piece of Microsoft software is going through a chrysalis as well, with some awkward consequences. I am speaking of Office: it was Office for the iPad that showed the way. Based on a core shared with all other versions of Office, it showed off the new design language Microsoft intends to use. A lot of Microsoft's other software is being rewritten along the same lines, so that one universal app works across all devices.

That sounds familiar, doesn't it? Just like Microsoft Office, Apple's office suite is going through a similarly awkward transition: Apple cut features from the desktop versions in favor of starting afresh with a codebase shared between the iOS and OS X apps.

There are other victims of this period of transition; Apple's iPhoto and Aperture come to mind. Here, the motivation is not so much a shared codebase as a tectonic shift in philosophy. iPhoto and Aperture were born in the digital hub era, where the personal computer, the Mac, sits at the center of your digital life and holds »the truth«. With the introduction of iCloud, Steve Jobs pronounced the end of the digital hub and put the cloud in its place. Step by step, Apple has been updating existing apps such as iTunes to take care of »the cloud«. Clearly, at some point it would have been iPhoto's and Aperture's turn. While I don't think it was wise to cancel Aperture before a similarly powerful replacement was ready (more on that below), in the grander scheme of things the direction at least makes sense.

From simple to complex

From a glass-half-empty perspective, people on the desktop are sacrificing functionality in favor of a platform they may not even use, and this is clear evidence that Apple's focus now lies primarily on iOS rather than OS X. An optimist would counter that this is a chance to re-evaluate old pieces of software with aging codebases and make them more user-friendly. Microsoft and Apple took the opportunity to start from scratch.

I find this trend quite interesting, because it reverses the direction in which features have usually flowed: in the past, professional, complex apps were often the starting point for simpler, less powerful apps. Photoshop gave birth to Photoshop Elements. Apple derived Final Cut Express from Final Cut Pro. iPhoto eventually inherited the database format Aperture was based on.

Nowadays, more powerful desktop software is being reimagined after its makers have had to design software for tablet and phone operating systems. Microsoft Office for Windows 10 is based on Office for the iPad. The restrictions force software designers to re-evaluate their decisions, and they often find that simpler interfaces are more appealing than overwhelming users with choice. I see this as a huge opportunity to make desktop software more user-friendly, and the personal computer as a whole more humane.

Cross-device interoperability

The trend is also towards people owning several computers; even thermostats and lightbulbs are computers now. So a lot of effort is going into making these devices work together seamlessly. Microsoft will allow you to stream Xbox One games to all your Windows 10 devices, meaning you can continue to play on your Surface or PC if someone else wants to watch TV. Apple's Continuity features enable users to »move« tasks across devices. The lines between the different computers are blurred, and most of these features rely on the cloud, which is the major force behind all these moves.

Managing software transitions, managing expectations

Apple has a mixed track record when it comes to software transitions: they have excelled in some (e.g. migrating from the classic Mac OS to OS X and from PowerPC-based Macs to Intel-based Macs), but more recently their record has been much more, ahem, mixed. The first botched transition was that from Final Cut Pro to Final Cut Pro X. While I am not a video editor and not fully qualified to judge, it seems to me that a major flaw was Apple's management of expectations. I am fairly certain that if Apple had called Final Cut Pro X “Final Cut Xpress” and announced that future versions of Final Cut Pro would be based on it, the reaction would have been much calmer. Instead, a vocal part of the Final Cut community got their pitchforks out and flayed Apple. With Aperture, too, I think it was a big mistake to half-heartedly continue development of one of the most popular paid apps on the Mac App Store and then discontinue it before a suitable replacement was ready.

Lack of lighthouse apps

What is really disconcerting to me is the scarcity of Apple lighthouse apps, meaning apps where Apple shows off what, in its view, a good Mac app should look like while fulfilling a real need. A long time ago, these lighthouse apps were the iApps on the consumer side as well as pro apps such as Keynote, Final Cut and Aperture. I don't think the significance of these apps can be overstated for people who switched to the Mac before it became en vogue to do so. These apps were revved regularly, and often in lock-step with the OS. However, the UI of many of these apps got worse (the most glaring offender here is iTunes, but iPhoto, too, gained interface elements that seem very un-Apple-y).

And it's not just the Mac that seems to be suffering; it is iOS, too. Right now, iOS hardware seems overpowered for the type of apps iPads and iPhones have to deal with. For comparison, an iPad Air 2 from 2014 is roughly as powerful as a 2011 11" MacBook Air. iOS also seems to have all the plumbing to make it very powerful. What is missing are Apple apps that take advantage of this. Apple tried with iPhoto for the iPad, but it is safe to say that this experiment has failed. If Apple subscribes to the vision that iOS devices will supplant traditional PCs for most tasks, they need to give us apps that do the things for which you still need a PC today.

Software quality issues

While I don't subscribe to the hysteria that Apple is doomed, I think software quality across the board is an issue: it starts with bugs in the OSes and continues with infrequent updates to popular apps. Hardware-wise, Macs are best-of-breed, and Apple has consistently anticipated trends correctly, which makes the obvious lack of finish of its software products all the more glaring. From the outside it looks as if Apple is constantly redlining its software teams, so that major pieces of software (including consumer-grade software such as iPhoto) are no longer updated on a regular cycle. It is part of the failure of its software division to keep up with the growth of its hardware division. In my view, Apple's biggest challenge is not to enter new markets or redefine yet another category; it is to bring its software to where its hardware is.