Photo: Matt Mets / Flickr / Creative Commons

We take for granted just how effortlessly consistent our daily life is. Our stoves burn with the swoosh of a knob, our stop signs (at least in the U.S.) are a stable hue of red, our wall outlets haven’t changed prongs in decades. The stairs, the shower head, the garage door — hardware devices that may not be the pinnacle of design, but at least tirelessly function as one would expect.

That stable world is being shaken though. Hardware is getting smart, and thus dumb. Smart fridges crash after a software update and can’t be opened. Smart light bulbs can’t find a Wi-Fi signal and fail to illuminate. Autonomous cars don’t stop properly, or they just drive away when the cops pull them over. As anyone who has ever tried to trick out their home with smart devices knows, “just plug it in” isn’t exactly the user experience they discover.

C.P. Snow once coined “two cultures” to describe the epistemological differences between science and the humanities, but a modern version within engineering would divide the “hard” engineering fields of electrical, civil, and hardware from the “soft” engineering of, well, software. When the two worlds fuse, the kludge of software tends to dominate over the stability of hardware.

Our scientist in residence Sam Arbesman wrote a short post recently on the lack of long-term thinking in the world of software. He notes, “But here’s the problem: tech, and especially software, is inherently transient. Code, in many ways, is fragile and delicate: it can easily fail or become buggy or be rendered obsolete. An online service rarely operates the way it should even just a few years after it is created.” In his view, that culture creates a focus on rapid but ephemeral action: if you try to code a stable product, watch as others race ahead.

Indeed, the half-life of software viability can sometimes be as short as hours for those in quantitative financial trading. Meanwhile, there are roads built by the Romans that are still functional today.

This is a choice. And it’s an important one to critique as software continues to dominate not just the economy but also the future of labor. In March in “We’re going to try to stop him,” I noted that:

The pressure will only intensify. As I wrote in TechCrunch in late 2020, the no-code generation is arriving. Empowered by platforms like Roblox and Minecraft, kids growing up today are learning digital skills and coding at earlier ages and with greater facility than ever before. If I am optimistic about anything, it is that the next generation is going to have an incredible wealth of talent to use technology to change the world.

There is certainly bountiful software talent maturing, but what happens when that acumen and ambition mixes with the current ephemerality of our software culture?

Nothing short of a generational waste.

Software is about abstractions, with each layer of code building on the others below. When lower layers of code change, it forces changes to all downstream layers. That’s why the kernel of an operating system tends to be extremely stable while the software libraries built on top of it are mostly stable but evolving. No application could be written if that infrastructure were any less permanent.
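The ripple effect of an unstable lower layer can be made concrete with a minimal sketch (all names here are hypothetical, for illustration only): when a foundational function changes its contract, every downstream caller breaks at once, which is why kernels and core libraries freeze theirs.

```python
# Lower "layer": a hypothetical config library whose contract
# downstream code depends on.
def read_config(path: str) -> dict:
    """v1 contract: returns a dict with a 'retries' key."""
    return {"path": path, "retries": 3}

# Downstream "layer": an application helper built on that contract.
def retries_for(path: str) -> int:
    return read_config(path)["retries"]

# As long as the lower layer honors its contract, the upper layer
# just works. If read_config later returned, say, a tuple instead
# of a dict, every caller like retries_for would break at once --
# the "earthquake" that stable kernels are designed to prevent.
print(retries_for("/etc/app.conf"))  # prints 3
```

The point of the sketch is not the trivial code but the dependency direction: stability must flow upward from the lowest layers, because a contract change at the bottom forces rework everywhere above it.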

Unfortunately, far too many software layers in our world today are unstable. Every application feels like it’s built on a fault line, just waiting for the earthquake to shake all the code asunder. Apple and Google tweak their mobile operating systems, and tens of thousands of developers have to assess the damage.

That chaos and rapid adaptation made sense over the past few decades as software engineers explored just what was possible with computing and code. There are still plenty of areas where that hyper iteration should remain the rule, but simultaneously, there are more and more areas where the expected functions of software are well defined and stability should reign supreme. We need to transition more of our software into the latter zones from the former.

Interestingly, the mostly immutable nature of crypto and blockchain technologies is one bright light in the search for software stability. The Bitcoin protocol and its reference implementation have been updated over the years, but fundamentally, the system remains essentially indistinguishable from the technology described in the Satoshi white paper published in 2008. Fourteen years is a very long time in software.

The immutability of crypto protocols has been mocked relentlessly with each new bug and hack discovered. But that mocking misses a key observation: software engineers today are not at all comfortable with immutability. The very notion that software can never be patched or updated is anathema to all but a handful of software engineers working in high-availability technologies like phone switching or air traffic control.

That’s precisely the cultural change we need to start inculcating though. We need to lengthen the half-life of software, and in order to do that, we need to reset the expectations developers have for the quality of their work. Coding will need to be slower and more deliberate. More peer reviews will be required, and some level of that dreaded “bureaucracy” is all but inevitable to raise the quality bar.

With new, higher standards in place though, something miraculous will happen: software can and will just work. Cybersecurity issues that plague hastily written software connected to the internet will be minimized, if not eliminated entirely. Perhaps most importantly, users will enjoy a consistent experience and a sense of relaxation when their favorite app doesn’t suddenly fail or change without notice.

There are a couple of accelerants that can move the industry forward. From a decentralized perspective, software engineers can determine the culture of their own teams and the stability of their products. They have the power to change the trajectory of software for the better by instituting their own higher standards.

The key accelerant though in my mind is around insurance and warranties. Cyberhacks remain a measly cost for most companies, which face limited consequences for poorly written software. In addition, companies offer only short guarantees for their software interfaces without long-term enterprise support contracts.

We can change the balance here. Cyber risk insurance can be made mandatory, with much more variance in premiums between companies with strong and stable coding practices and companies with churn-and-burn approaches. Warranties on software could be required and extended, forcing companies to consider how to support their software for more than just the short-term throwaway timeframe they are using today.

Such a standard would change not just the artifact of software itself, but would also instill a mode of thinking for the next generation of software engineers rising through school and into the workforce. No child needs to provide a warranty for their Roblox games, but a transition to professionalization will naturally happen if the software industry actually has high professional standards around its work.

Our expectations around software are so low, it doesn’t even faze us when even the simplest features fail. We need to demand better. No designer changes the color of stop signs every year, and no software engineer should have to change their software on such a rapid cadence. It’s time to do what Snow asked with “two cultures” and E.O. Wilson discussed in his book “Consilience” — we need to bridge the systems of thinking that divide our engineering disciplines and bring them together so that the smart devices of the future are actually smart.
