by adam mathes

You Probably Like Bad Software

The challenge with software is it gets worse over time.

It seems counter-intuitive that the more people work on something, the longer it takes to get done, but that’s a well-established principle in software (Brooks’s law).

What’s harder to fathom is that it also gets worse. But that is the default outcome. Outside of extraordinary circumstances and extreme measures taken, it’s what you should expect.

Software is a world created by thought, where real work and progress over time turn into an endless ouroboros: engineers making software that breaks other software, then having to fix the broken thing, only for it to break again in new ways.

Forever.

If you’re an engineer, it’s very likely you are making software worse every day.

It’s ok, most people who work on software are.

If you are using software, you probably use bad software. You probably like bad software.

It’s ok, most people like bad software.

The alternatives are usually worse.

Perverse Incentives

Software gets worse over time because what people change often isn’t related to making the software better in a coherent, measurable way.

It is not fixing bugs (boring! not fun!) or improving security (nobody cares until it’s too late!) or making things faster (who cares! computers and phones are faster every year! Moore’s law makes optimization forever unnecessary!)

What’s sexy and interesting in the world of software is adding features, redesigning a user interface, or integrating it with some other unrelated piece of software to help it (synergy!) or monetization – which these days usually means spying on users to better target ads, serving ads, delivering ads, or in rare cases selling things people don’t need more efficiently.

Often this is how individuals working in software show they did something, and that’s how they are judged.

(People brag about the new software they make, nobody brags about the terrible awful bugs they had to fix.)

But if there’s a piece of software people are already using, by definition, it is useful and used.

Most of the above is likely going to get in the way of that existing usage.

If you’re not fixing bugs or improving performance – which, unless you’re properly testing and measuring things (also boring!), you’re probably harming – then you’re making something that people use worse.

That work is probably attempting to solve a company’s problems, not users’ problems. And the accidental outcome is worse software.

Again, that’s ok, most people make bad software.

Most people use bad software. Most of the software industry is predicated on selling, supporting, and monetizing bad software and making it worse over time.

Underneath It’s Even Worse

The perverse incentives of individuals who work on software are one thing – but it’s when you start moving down the levels of abstraction that things get really scary.

Let’s start with operating systems.

Now the accumulated cruft, random interface changes, inconsistent features, and whatever “me too” garbage gets thrown in to remain “competitive” don’t just impact a little corner of the software world via an application; they have the potential to fuck up every process and program running on top of it.

Eventually, the weight of this nonsense led people to jump – leaping with joy! – from their computers to phones.

Snobs/weirdos like me in linux/unix/macos/beos/amiga/whateverbsd land were somewhat insulated from this, but it’s not hard to understand how using an iPhone 4S with a 3.5” screen and a consistent touch interface would be a massive improvement over any version of Windows released after 1995 – which we too quickly forget was basically a wasteland of crashes (blue screens of death) and virus-filled malware.

“Getting rid of the garbage on your parents’ windows machine” is an annual ritual for many people.

2007-10 era smartphones were a clean slate – there just hadn’t been enough time for programmers, product managers, marketing hacks, sales guys, and aesthetic-obsessed designers to fuck it up by larding on complexity.

For those in the future who are baffled, let me set the scene.

It’s 2017, and the Apple iPhone 7, a device previously heralded as one of the most beautiful, usable, and understandable products, has a tentpole feature called “3D touch” – a rebranding of the disastrously named “force touch” – that performs different actions depending on the pressure applied while tapping.

Which is different than the different actions performed based on the duration of tapping.

So trying to rearrange the icons on the home screen – already an undiscoverable action, but one users learned over a 10-year period – can now, depending on how hard you press, accidentally trigger a nonsense “app menu” which by default includes a single item – “Share.”

Nobody wants to “share” their apps. That is solving developer and company problems (use more apps you don’t need!) not user problems.

And the few that do want to share an app definitely don’t want to do it by pressing REALLY HARD on the icon. The only reason people press really hard on an icon is to move it – which they had to discover somehow over 10 years, since there’s no affordance for it.

And the iPhone is probably one of the best case scenarios. Some people at Apple are really good at this stuff – they just seem to be increasingly overruled or making bad decisions.

It’s not just them. These things seem inevitable.

Laws Of Bad Software

Given enough popularity, hardware will be mediated by bad software trying to solve corporate problems, not user problems.

Given enough additional code, all software will become bloated and incomprehensible.

Now imagine these software stacks – applications built on frameworks using libraries dependent on operating systems with kernel bugs, all packaged into containers deployed on hypervisors built on other operating systems running on virtual machines managed via orchestration systems, all of which eventually, somewhere, runs on real CPUs and memory and disk drives.

That doesn’t make any sense because nothing makes sense anymore in software.

We used to call people who understood things end to end “full stack engineers” but that’s a bit laughable now because nobody really understands anything end to end anymore.

This Is Your Program, And It’s Ending One Millisecond at a Time

If you aren’t measuring latency, it’s probably getting worse, because every stupid feature you’re adding is slowing things down.
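
To make “measuring latency” concrete, here’s a minimal sketch in Python of the kind of always-on timing that catches the drift. Everything in it – the decorator, the stats, the names – is an illustrative assumption, not anything from this essay.

    # A minimal latency-tracking sketch (illustrative; names are made up).
    import time
    import statistics
    from functools import wraps

    _samples = {}  # function name -> list of latencies in milliseconds

    def timed(fn):
        """Record the wall-clock latency of every call to fn."""
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                elapsed_ms = (time.perf_counter() - start) * 1000
                _samples.setdefault(fn.__name__, []).append(elapsed_ms)
        return wrapper

    def report():
        """Print median and worst-case latency per tracked function."""
        for name, latencies in sorted(_samples.items()):
            print(f"{name}: p50={statistics.median(latencies):.1f}ms "
                  f"max={max(latencies):.1f}ms n={len(latencies)}")

Decorate a hot path with @timed, print a report() before and after shipping a feature, and you at least know what the feature cost. If you never look, the numbers only move one way.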

Most software would be improved if people just started turning features off.

Turning features off generally doesn’t sell additional units to consumers, close a sale, or make for a PR fluff piece, so people only do it in times of extreme failure or consequences.

I’ve regularly seen an inverse correlation between the amount of engineering time spent on features and their usage. I’ve seen a direct correlation between the amount of time spent on features and higher latencies pretty much constantly.

Some software cultures understand this and put tight controls in place to prevent regressions (because, you know, it turns out to be a real revenue and/or usage problem when people abandon your software because it’s too slow), but if your software is already painfully slow due to low standards and atrophy, good luck convincing people to fix it.
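
What “tight controls” can look like, sketched in Python: a latency budget enforced by the test suite, so a slow feature fails the build instead of shipping. The handler, the sample count, and the 50ms budget are all hypothetical.

    # A hypothetical latency regression gate (names and budget made up).
    import time

    LATENCY_BUDGET_MS = 50.0  # a number the team actually agreed on

    def handle_request():
        ...  # stand-in for the real code path under guard

    def test_latency_budget():
        latencies = []
        for _ in range(200):
            start = time.perf_counter()
            handle_request()
            latencies.append((time.perf_counter() - start) * 1000)
        latencies.sort()
        p95 = latencies[int(len(latencies) * 0.95)]
        assert p95 <= LATENCY_BUDGET_MS, f"p95 {p95:.1f}ms blew the budget"

The point isn’t the specific numbers – it’s that the budget is written down and enforced by a machine, so “make it slower” becomes a decision instead of an accident.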

Software bloat is the seemingly inevitable and sad reality of nearly all software.

Security

As the layers of complexity start to overwhelm end users, you can only imagine what it’s like for the poor programmers stuck making all this work.

It’s layers upon layers of filth nobody wants to even wade through.

Kind of like how you’d be willing to pay lawyer-level fees just to avoid dealing with legal contracts yourself? Well, tech is like that too now – hence tech wages.

The terrible truth of software security isn’t that people are incompetent or lazy (though that probably happens sometimes). It’s that the interactions between components, dependencies, and overall systems are now so awful that they may be impossible to secure at a reasonable cost.

That’s not a metaphor – literally, the costs of connectivity may now outweigh the benefits, according to insurance risk assessments:

“A future where the annual costs of being connected outweigh the benefits is not only possible, it is happening now. According to our project models, annual cybersecurity costs in high-income economies like the U.S. have already begun to outweigh the annual economic benefits arising from global connectivity.”

How To Stop Bad Software

1. Death

Dead software can’t accumulate additional bugs. It can’t get new features. It can’t get any worse. It also can’t make assumptions about how fast today’s hardware is.

George R. R. Martin writes 1000-page novels on a DOS machine, in WordStar 4.0.

If you disconnect hardware from the internet and run old software (or hide it in a virtual machine), it may actually run better, as the inevitable pace of hardware improvement provides speed upgrades without software engineers using that additional power to get in your way.

Bruce Sterling once said, “If your grandfather’s doing a better job at it, you can put that aside for later, when you’re dead, like him.”

If 25-year-old dead software is doing a better job of it, then maybe stop trying to top it.

But nobody wants to actually be a neo-luddite and refuse to use any normal technology. It’s like, do you really want to never use Facebook and miss out on everything because you insist on using a god damned email mailing list? (I do, but I was always anti-social, hence the social aspects of the web were always sort of an anomaly in my life.)

2. Fight complexity

You can use OpenBSD or similarly esoteric systems whose developers are even more jaded than me but actually do something about it – see “features are faults, redux.”

This is fighting the good fight – having taste, being smart and proactive, outsmarting and outwitting an endless array of opponents.

The problem is that some of those opponents start to look like forces of nature (hostile nation states, friendly nation state three-letter agencies, corporations with more money than most nation states) and actual forces of nature (entropy), and it’s just fucking tiring, because you know it’s probably a losing battle that never ends. Everybody is just fucking annoyed at you the 99.9% of the time a disaster isn’t happening, and the 0.1% of the time one is, people are really fucking annoyed when you say I told you so.

3. Begin Again

When Microsoft’s and Intel’s duopoly led to a certain terrible low quality / high boredom in mass market hardware and software, it provided the fertile ground for the world wide web. By adding a new magical layer of abstraction (the web!) that made the underlying garbage of Wintel a commodity, there was a whole new world of adventure.

Normal people could, like, look at the source of a web page, understand what was going on, and write their own!

Now, 20 years later, when nobody can understand which of the dozen competing Javascript frameworks to use to even write something that will reliably manipulate a Document Object Model efficiently enough to re-render a page of malformed, bastardized SGML-derived gunk after a click returns an asynchronously fetched incomprehensible stream of data nested between parentheses and quotes, that seems like a quaint concept.

The clean slate of mobile applications – where limited memory, screen size, CPU, and battery actually provided enough constraints to force engineers, designers, and the software industry to shut up long enough to solve some actual problems in a comprehensible way – seems to be ending.

In my tech lifetime it seems that we only get about 5 years of “non-insane complexity” in our platforms before the “ecosystem” shifts into a swampish nightmare, and then 5-10 years of complete hell before we can move on. (My deep worry is that this pace may be accelerating.)

The current hot place to jump next (internet of things) is going to be pretty fun when it works!

But when that stuff gets too complex, and all the newly networked objects around us start speaking Portuguese we don’t understand and firing off spam emails because nobody bothered to secure the SSH and SMTP daemons on the ancient versions of linux lurking just beneath the surface, we’re going to be in for a world of pain.
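
You don’t have to take those daemons on faith – a few lines of Python will tell you what a gadget on your network is actually listening on. A minimal sketch, with a made-up device address; the ports match the daemons above.

    # A minimal port check (the device address is a placeholder).
    import socket

    DEVICE = "192.168.1.50"  # hypothetical camera or gadget on your LAN
    PORTS = {22: "ssh", 25: "smtp"}

    for port, name in PORTS.items():
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.settimeout(1.0)
        exposed = sock.connect_ex((DEVICE, port)) == 0
        sock.close()
        print(f"{name} (port {port}): {'OPEN' if exposed else 'closed'}")

If port 22 or 25 answers on something you bought at a big-box store, the rest of this section will not surprise you.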

That’s already happening now – we’re already in trouble.

How 1.5 Million Connected Cameras Were Hijacked to Make an Unprecedented Botnet

As many predicted, hackers are starting to use your Internet of Things to launch cyberattacks.

At some point we have to ask ourselves, why the hell would anybody allow a camera that runs linux into their lives? Linux is an operating system that thinks it is talking to a teletype.

· · ·

We’re building the future of super-intelligent robots, but we’ve put brains in them that are hardwired to think it’s the 1970s.

When things inevitably go wrong, I hope that they’ll let me watch Star Wars.

· · ·

An earlier version of this essay was published as trenchant.org letter #9 – you probably like bad software.
