trenchant.org

by adam mathes

The Case for Freeing MacOS

Professionals and power users have been upset with Apple’s high-end computers for some time, but in the last six months it’s reached a boiling point.

I don’t think much in the last round of updates will change that.

I’ve worked at enough big companies to understand that external perception and internal reality diverge a lot more than people realize, so it’s hard to know how this happened.

I’m not particularly interested in the explanations – for me the interesting point is one of strategic misalignment and the opportunity for Apple to do something really bold to address it.

Why Pros Are Angry

Basically, if you want the absolute fastest processing and graphical power – high-power, high-thermal desktop computing – you’re hampered. Apple isn’t just losing from a price/performance perspective – in some cases it’s not even competing anymore. The 2013 Mac Pro going essentially un-updated for years is the most grievous offense, but the more recent MacBook Pro, with no decent GPU or keyboard and an idiotic touch UI instead, is just offensive to those of us who actually work on computers for a living.

Exhibit 1 – The 2016 MacBook Pro

Exhibit 2 – The 2013 Mac Pro is ancient

Exhibit 3 – The 201x Mac Pro Announcement “coming next year-ish maybe!”

Exhibit 4 – The Hackintosh

Basically, people are unhappy, and often the best option is to build an illegal, hacked-up machine from parts that has better performance. Or just use a Windows/Linux PC with better components.

What Is The Point Of The Mac

Today Apple is, from a business perspective, an iPhone company.

The iPhone is the most successful consumer product in the history of consumer products by just about any objective measure. It is unclear whether we will see another consumer product this successful in my lifetime.

Given what I understand of Apple’s functional internal structure (rather than business units) – one would expect all the other product lines to suffer as Apple puts more and more of its effort into the business that makes all its other businesses seem small.

What’s interesting is that, in my experience, this holds even if benevolent management recognizes it as a problem and tries to adjust staffing / compensation / priorities to invest in other things. Because the potential rewards and recognition from working on “winning” supported projects end up influencing individuals’ project decisions, this can be challenging. The rich get richer, in that successful projects attract better talent. See also: The Innovator’s Dilemma

People like me look at the Mac as a general purpose computer with which to do interesting things (write, program, create art, type in terminal windows for a few decades). Historically the Mac has been Apple’s primary product that is created and sold at high margins.

The problem is that, from a macro business perspective, that isn’t the Mac’s purpose anymore – its purpose is to support the iPhone.

The purpose of the Mac is to enable the creation of software and content experiences that make iPhones better.

And since the relative market scale of laptops, smartphones, and new devices and experiences is unlikely to change, this is likely the reality for the next decade. There will be more smartphone users than computer users, and they will have a faster upgrade cycle. It’s a market that makes others seem tiny.

So it may be time to embrace that reality.

(Note that this equation changes if iPhones/iPads become platforms to create iOS software, but there’s been little to indicate that is planned in the near term.)

Apple Trucks

When computer products are compared to automobiles, the trite analogy now is that desktop and laptop computers like the Mac are trucks, while smartphones and tablets are cars.

Most people just need a car. Sometimes you might need a truck for specific purposes. Businesses need trucks.

The analogous problem here is that Apple’s car business is so large it seems almost irrational to care about the trucks.

But you need the trucks to make cars – they haul in the parts and people needed.

The problem is trucks have stagnated to the point where the truck drivers who bring the parts to assemble their cars are miserable and looking to buy something else.

The weird thing is Apple only allows Apple trucks to bring the parts for Apple cars, so when the drivers stop buying Apple trucks, Apple cars suffer.

Commodify Your Complement

Many of the big successes in the tech business world have come from a strategy of commodifying your complement. The classic example was Microsoft creating a standard operating system that worked on a plethora of commodity computer hardware. Anybody could assemble PCs, so fierce competition followed, which made PCs cheaper and more prevalent.

Which was great for Microsoft, because every PC sold meant another Windows license.

Windows was the product, PCs the complement that became further commodified – you could buy any IBM PC compatible system, run Windows, and do what you needed.

Pundits suggested Apple follow this same course (and they briefly did in the 90’s with clone manufacturers before Steve Jobs returned) but it never really made sense because Apple computers weren’t about commodity hardware and solving all problems – they were about charging a premium for an integrated experience that worked (this was much harder in the 90’s – Windows “worked” on all kinds of hardware, but poorly).

So it’s 2017 and I’m making the totally discredited suggestion that Apple sell its OS and let hardware manufacturers compete on hardware?

Sort of.

Understand Your Complement

My hypothesis is that Apple needs as many developers using their software as possible to maintain dominance in smartphones and the next generation of hardware (AR, VR, whatever).

Their current high-margin computers are making this somewhere between hard (programmers) and impossible (virtual reality developers – though the most recent WWDC keynote and external GPU enclosure suggest they are trying to take this from impossible down to hard).

Let’s take things to one extreme for the sake of argument.

Free MacOS

MacOS is already “free” – Apple has stopped charging for upgrades. The cost of MacOS is just hidden in the cost of buying a Mac, and Apple wants everyone to have the latest version for ease of maintenance and market size for developers.

But what if MacOS were free and ran on commodity hardware (which it basically already does, if you bend the law and build a Hackintosh)?

A few interesting things happen here.

The first is less direct Mac profits – via cannibalization of the existing Mac product lines.

But there are some potentially offsetting gains that are better in the long run –

  1. More MacOS users – via decreased cost of hardware, increased hardware support
  2. Increased innovation on the platform – via (1) and more students, starving garage developers, hobbyists choosing MacOS
  3. Better, stickier app ecosystem on iOS and new Apple hardware – via happier, larger pool of developers
  4. Support for virtual reality, augmented reality, and other hardware-dependent hacking becomes easier
  5. The demand for Apple services (iCloud, Music, etc) goes up significantly, especially for current iPhone users who also adopt MacOS powered desktop/laptops

There are less extreme iterations on this –

  • MacOS supports more hardware but licenses are only available with an iOS device purchase
  • MacOS licenses are sold to support some homebrew hardware but with limited/no customer support

Why Not

Crappy, ugly, commodity hardware is fundamentally “off-brand” for Apple, and enabling MacOS to work across more hardware inevitably leads to experiences that are sub-optimal compared to today’s fully integrated Mac hardware/software stack.

There’s also a serious strategic discussion of whether the potential gains offset the revenue declines and other issues.

It’s easy to pontificate on these things externally, it’s a lot harder to make these decisions when you have hard numbers in front of you and shareholders to be accountable to.

And it’s hard for an executive to cannibalize existing business lines – people generally fight tooth and nail for short-term gains over long-term strategy that carries risk.

Why Yes

Apple is a beloved company that is having trouble coming up with its next hit.

Hits take time and Apple has a cash hoard that can buy time, acquisitions, or a few small countries, any of which might help them at this point.

Getting developers on their side – getting a small army of Apple lovers tinkering to make the best tricked-out, hot-rodded Macs instead of Windows and Linux boxes – may be one of those things with immeasurable “brand lift” (imagine the ads linking Homebrew Computer Club-era Apple II users and today’s garage hackers doing AR on weird-looking Mac hardware) that helps cultivate a new generation of developers.

And there’s something fundamentally “Apple” about making desktop computers simple, easy, and affordable. That’s what the Apple computer was, deep down, and everything good (Apple II, Mac, iPhone) that followed.

It may be that by giving more software away, Apple will make their software and services available to more people, make them happier, and improve long term businesses.

Or it may just lose them a lot of money – if it was an obvious win, they’d probably already be doing it.

Either way, I’m typing this on a MacBook Pro with an abysmal keyboard and Touch Bar, and it’s insane to me that this is the best they can do.

If they don’t start shipping better hardware or freeing their OS, Apple will lose key influencers.

Serious Changes

Today I thought about how I wanted to change some things.

Then I opened .emacs

;; cursor
(blink-cursor-mode 0)            ; disable cursor blinking
(setq-default cursor-type 'box)  ; box cursor everywhere, not a line

Cursors should not blink. Cursors should be boxes, not lines.

Small victories, tiny bits of autonomy.

Ace

I love this art from the 15th anniversary of the Ace Attorney franchise by Takuro Fuse.

There’s something beautiful in this image split between Phoenix, starting alone with his mentor, next to himself years later as the mentor, surrounded by the people he’s bonded with.

· · ·

I finished Spirit of Justice yesterday.

Ace Attorney games are a precious thing in this modern world. I hope there’s another 15 years ahead.

37

Birthday coffee for a birthday vacation.

Video Game Consumption: Q1 2017

I played surprisingly few games the last couple months.

Glittermitten Grove

Beneath the surface, it’s a masterpiece.

(Trust me – I’m in it.)

★★★★★

Deus Ex: Mankind Divided

The original Deus Ex came out in 2000, and is now a cult classic. It was ambitious and brilliant and weird and also a bit of a mess because it tried to do so much.

But the essence of Deus Ex was that it used a first-person-shooter engine to create a first-person action adventure that wasn’t just about shooting things. Violence is one tool for solving problems, but by itself it would almost never work. Stealth, exploration, and outwitting your opponents through clever use of skills and the environment were key. (That, and the “conspiracy theories / Illuminati are real” stuff.)

Despite the previous entry, Deus Ex: Human Revolution, being a masterwork that revitalized the action RPG genre, this time the series feels like it has run out of steam and ideas. The gameplay and mechanisms feel repetitive and dated rather than fresh after 5 years. The storyline is incomprehensible (even for Deus Ex) and seems completely unfinished and unsatisfying. It feels plodding. Rather than leaving me wanting the next chapter, it left me bored.

Huge disappointment.

★★

Dishonored 2

Where Mankind Divided fails, Dishonored 2 succeeds. As an action-stealth-play-as-you-want RPG, it enables all sorts of different, varied play styles. Lethal or non-lethal, loud or stealthy, indirect or head-on, and all manners in between.

The characters, voice acting, plot and more seem improved.

Dishonored made Dunwall feel real and interesting. The most remarkable thing here is how vibrant, varied, and yet cohesive the larger Empire of the Isles becomes in this sequel, and how exciting it is each time we see more of it. The level design and setting combine with the mechanics and story to create something spectacular.

The choices and how you play again feel like they have weight and impact the world. Choosing to sow chaos has repercussions. Seeing how Emily and Corvo have changed over the years was actually interesting. Very much enjoyed this one.

★★★★

Ace Attorney: Dual Destinies

Despite more or less buying a 3DS to play this game, I never actually completed it. (I got through the first case and stopped.) Part of it was that playing on a 3DS annoyed me.

I then bought it for iOS when it came out and played through the second case and stopped. I got bored.

This time, though, for whatever reason, the love of Phoenix Wright games overtook me again as I completed the other three cases.

If you have never played Ace Attorney, the iOS re-releases are the easiest way to experience the games – despite the flaws in the ports, it’s a lot easier than tracking down Nintendo GBA imports or DS versions now.

Anyway, I love these games so much, and I hope they keep making them forever.

★★★★★

How to build a tolerable gaming PC

The nice thing about building your own PC is you get exactly the parts you want.

The bad thing about building your own PC is it’s hard to know exactly what you want.

The last time I built a computer (about 2.5 years ago) was my first time building one completely from scratch in the modern era. (I’d cobbled together some tiny linux boxes from barebones PC’s, but hadn’t done the whole thing, and not for gaming.)

This is something I probably should be leaving to professionals. But I wanted the satisfaction of doing it myself.

I ended up with something that worked and ran modern games effectively on a weird 34” ultrawide monitor, but it looked sort of absurd, and I’m pretty sure I never got the thermal situation right – fans were loud and always running, and it seemed to generate what I thought was an inordinate amount of heat.

This is also the bad thing about building your own computer – how do you even know you did it right?

It’s complicated.

Anyway, I learned a lot –

  • many cases are embarrassingly ugly
  • many parts want to generate obnoxious lights
  • if you’re not careful you will end up with a weird looking monstrosity that has branded lighted logos flashing everywhere

Clearly I did not learn “just let the pros do it next time” because I’m stubborn.

So spurred on by my need for Thunderbolt 3 support discussed yesterday, I embarked on a new PC building mission.

Part Picking

Fractal Design Define R5

A beautiful but functional monolith, without obnoxious branding, windows, or colors.

Focuses on quiet computing so includes sound dampening and quiet fans.

I bought the “blackout edition” which makes even the internals and fans and everything black. It’s nice.

Intel Core i5-7500 3.4GHz Quad-Core Processor // BX80677I57500

Chose the i5 since it already seems like overkill for gaming, and I wanted less power draw and heat than I’d get dealing with the i7.

Cooler Master Hyper 212 EVO 82.9 CFM Sleeve Bearing CPU Cooler

This is a super popular cooling solution that was recommended to me.

Seems OK. It was kind of a pain to install but seems far quieter and more effective than using the stock CPU fan like I did last time. If I do another build I might try something different / quieter / more expensive.

Asus PRIME Z270-A ATX LGA1151 Motherboard

The decision was driven by the need to support Thunderbolt 3 and the latest generation of Intel chips. ASUS hardware, software, BIOS, etc. seem relatively inoffensive and functional.

Went with the “stock” Z270 board – it was unclear what value most of the higher-end motherboards with various add-ons actually provided.

Asus ThunderboltEX 3

The add-on card needed to drive my LG UltraFine 5K display via a PC over Thunderbolt.

Corsair Vengeance LPX 16GB (2 x 8GB) DDR4-3200 Memory

Damn RAM is fast now. Very fast.

Samsung 960 Evo 500GB M.2-2280 Solid State Drive

These tiny little M.2 drives are mind-boggling. Progress in size/speed/cost even over the past 2 years is significant. Definitely splurged on this because I was sick of worrying about disk space.

EVGA SuperNOVA GS 550W 80+ Gold Certified Fully-Modular ATX Power Supply

Has been super quiet and efficient.

Video Card

I transplanted my old EVGA GeForce GTX 980 Ti into this one as it doesn’t seem quite worth it to upgrade yet. If I was buying something I think I’d go with the EVGA GeForce GTX 1070 or 1080.

See also: the PCPartPicker list for this build.

Gotchas

Everything actually went really smoothly this time, other than that I was somewhat confused about how to properly set up the CPU cooler. I think it went OK, but I did spend like an hour watching YouTube videos of people doing it first.

Also, I plugged in the ATX power but forgot the separate CPU power, and spent 30 minutes checking everything but that – rookie mistake, but whatever. Helps to build character? It makes the end product feel like more of a triumph, maybe.

Conclusion

I have a system that is quiet and sleek, with just a single white LED on top to indicate power – nothing embarrassingly loud or flashing ugly lights under my desk.

LG 5K Monitor with a Windows PC

Despite being officially unsupported, the LG UltraFine 5K Display can mostly work with a Windows PC that supports Thunderbolt 3.

You can even use your existing GPU to drive it with the right hardware. The USB-C ports on the monitor are recognized and work properly. The speakers work too. (They’re terrible, but they work.)

Major caveat: 4K is the maximum resolution – 5K is trickier right now.

This is fine for my usage – gaming on PC, everything else on Mac. But if you’re looking for true 5K you may need to get the pricier Dell 5K monitor or try one of the few motherboards that claim to support 5K out of the box mentioned below.

How

Hardware:

I used an Asus PRIME Z270-A, which isn’t on that list but which Asus explicitly notes is compatible.

I suspect most recent Intel boards with a 5-pin Thunderbolt header will work, as it did for John Griffin, who used a setup similar to mine but with a Gigabyte motherboard.

After connecting the add-on card to the motherboard, you run an external connection from the DisplayPort output on your GPU to the Mini DisplayPort input on this card with the included cable.

Then connect to the LG 5K monitor with the Thunderbolt-3 cable.

The display powered up and worked at boot instantly for me, including showing the POST screens.

Support for the USB-C hub and speakers on the monitor required me to make a few BIOS changes to enable Thunderbolt 3. I guess TB3 support needs to be enabled explicitly, but somehow the DisplayPort passthrough on the card doesn’t require it? Which was convenient but very confusing.

What Works

On a PC with Windows 10 –

  • 4K – 3840×2160 @60hz
  • Speakers – (but again, why) and volume
  • Hot swapping the cable between Mac/PC
  • USB-C hub (probably)

What doesn’t work –

  • brightness adjustments (though maybe that’s fixable)
  • reliable USB device recognition on hot-swaps
  • the webcam (haven’t tried to figure out why)

The USB-C hub passthrough was recognized and appeared to work, but wasn’t reliable for me on hot swaps. It was always fine on a fresh boot.

That’s hard to debug or speak definitively on as I’m only using it with legacy USB-A peripherals (keyboard, mouse, speakers) that are connected to a cheap Amazon Basics hub, which is then connected via Apple’s USB-A to USB-C adapter. (I think the USB-A hub is the unreliable part.)

Hot swapping the cable generally worked fine, except for flakiness in recognizing USB devices, and sometimes plugging individual devices in and out of the hub fixed it. It worked enough of the time that I suspect swapping out my old USB-A hub may fix it.

Why

I bought the 2016 MacBook Pro a few months ago, and part of my excitement around it was the LG UltraFine 5K Display that Apple announced at the time.

It has a native resolution of 5120×2880 at 27”, so it offers the same screen real estate as a 27” 2560×1440 display but with double the pixels per inch and clarity. It looks really good!

My challenge was that I have a gaming PC running Windows so I can play real computer games and use my HTC Vive, and I didn’t really want to have two monitors on my desk. I also really like having my monitor act as a USB hub and KVM, since I use both a Mac laptop and a desktop PC.

Previously I was using the LG 34UM95 34” ultrawide monitor for this – it supported Thunderbolt 2 input from my Mac and DisplayPort from the PC, and happily swapped the USB devices between them.

So I’m losing the KVM aspect with this setup – I have to actually swap the cable between the desktop and laptop.

But the gains are worth it – 5K retina resolution on the screen I use most makes a huge difference – and while I enjoyed the 21×9 aspect ratio for gaming, it is much less hassle and equally nice to go back to 16×9 and up the resolution to 4K. (Performance at 3440×1440 and 4K on my GPU, a 980 Ti, is usually about the same.)

Other Compatibility Notes

The LG UltraFine 5K will also work with the older MacBook Pros I tried, but at lower resolutions.

As noted in Apple’s support page, most Macs from the past 3 years will drive it at 4K – 3840×2160 at 60Hz – and the 2014 MacBook Pro I tested worked as expected.

Not noted on that page but tested and verified by me: my first-generation 2012 MacBook Pro with Retina Display will drive the display via the Thunderbolt 3 (USB-C) to Thunderbolt 2 adapter, but only at 2560×1440. I couldn’t get USB devices to be recognized or work, just video.

While it doesn’t make sense to buy this monitor except to pair with a recent-generation MacBook Pro that can drive it at 5K, you will (in most cases) be able to use it as a secondary display for older machines if needed, which is nice.

Alternatives on the PC side – the Gigabyte GC-Alpine Ridge looks like it should also work, with dual Thunderbolt 3 ports and (seemingly) two DisplayPort inputs. It claims to enable 4K video, and I suspect it supports dual-stream and could drive 5K in some cases, but from what I could determine nobody seems able to actually buy this card in the US or test it.

Gigabyte also produces some motherboards that natively support 5K output over Thunderbolt 3 –

I assume these would be limited to the integrated graphics on the board, so they wouldn’t be interesting for my gaming setup, but at least one person on the tonymacx86 forums has gotten the Z170X Designare to drive 5K.

Conclusion

The LG UltraFine 5K is on sale (30% off) at Apple through the end of March, so if PC compatibility was what was holding you back, there are now at least a few documented examples of it working. Mostly.

It’s still early days of super-high resolution displays, so things are a little trickier to get working. I suspect in a year or two 5K and 8K monitors and supporting motherboards and GPUs will be much more prevalent. If you want to live on the bleeding edge now, you have to be picky about your parts.

But What About Your Personal Brand

Gershon, a professor of anthropology at Indiana University, Bloomington, spent a year interviewing and observing job seekers and employers in Silicon Valley and around the US. Her new book, Down and Out in the New Economy: How People Find (Or Don’t Find) Work Today explains that branding is largely a boondoggle advanced by inspirational speakers and job trainers. It doesn’t help people get jobs. But it does make us more accepting of an increasingly dehumanized job market that treats workers as products rather than people. […]

When people think of themselves as brands, they are speaking the language of reputation, appearance, and marketing. It’s hard to switch from that to a discussion of moral responsibility. […]

“Maybe instead of thinking about people as property or businesses, we could think of people as craftsman.”

Noah Berlatsky – Our obsession with personal branding reveals a dark truth about the future of work, Quartz

So the conclusion was your personal brand won’t help you get a job. But it may make you more accepting of dehumanization in the postmodern economy.

The book: Down and Out in the New Economy, Ilana Gershon [via]

Throw Away Your Operating System

I’ve been thinking about the complexity of modern technology stacks. See You probably like bad software from earlier this week.

Some of the interesting approaches to dealing with this eschew operating systems almost entirely.

Finding ways to reduce attack surfaces and create more maintainable systems is critical in a world where the cost of hardware approaches zero, because we’re going to have a lot more of these systems.

No person (or company) is going to be able to effectively manage an infinite number of unix systems spread across dozens of devices indefinitely. I can barely maintain the one unix server running this site, and I have a computer science degree and two decades of experience running it.

Some interesting approaches / existing work –

unik – The Unikernel Compilation and Deployment Platform

UniK (pronounced you-neek) is a tool for compiling application sources into unikernels (lightweight bootable disk images) rather than binaries. UniK runs and manages instances of compiled images across a variety of cloud providers as well as locally on Virtualbox. UniK utilizes a simple docker-like command line interface, making building unikernels as easy as building containers.

Unikernels are interesting – throw out the OS and write monolithic kernels with a single application on top. They sit somewhere between “research concept” and “possibly a production-ready technology,” but this toolkit makes testing and applying unikernels in their various current forms pretty straightforward – I had some basic Go code running on a rump kernel very quickly.
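For flavor, the build-and-run loop looked roughly like the following – I’m paraphrasing the UniK README from memory, so treat the exact flags as assumptions rather than documentation:

$ # compile a Go app into a rump-based unikernel image, then boot it in VirtualBox
$ unik build --name hello --path ./hello --base rump --language go --provider virtualbox
$ unik run --instanceName hello-1 --imageName hello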

Unikernels – rethinking cloud infrastructure

Unikernels are specialised, single-address-space machine images constructed by using library operating systems. Unikernels shrink the attack surface and resource footprint of cloud services. They are built by compiling high-level languages directly into specialised machine images that run directly on a hypervisor, such as Xen, or on bare metal. Since hypervisors power most public cloud computing infrastructure such as Amazon EC2, this lets your services run more cheaply, more securely and with finer control than with a full software stack.

gokrazy is a pure-Go userland for your Raspberry Pi 3 appliances

For a long time, we were unhappy with having to care about security issues and Linux distribution maintenance on our various Raspberry Pis. Then, we had a crazy idea: what if we got rid of memory-unsafe languages and all software we don’t strictly need?
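Getting started is refreshingly small, if I recall the README correctly – the device path below is a placeholder, so treat the exact invocation as an assumption:

$ go get -u github.com/gokrazy/tools/cmd/gokr-packer
$ # write a bootable, pure-Go userland image for the Pi directly to an SD card
$ gokr-packer -overwrite=/dev/sdx github.com/gokrazy/hello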

MirageOS

MirageOS is a library operating system that constructs unikernels for secure, high-performance network applications across a variety of cloud computing and mobile platforms. Code can be developed on a normal OS such as Linux or MacOS X, and then compiled into a fully-standalone, specialised unikernel that runs under a Xen or KVM hypervisor.

Library OS in OCaml.
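The basic workflow, roughly, from my memory of the mirage 3 docs (targets and flags are assumptions): configure the same sources for a target, then build – a unix binary for development, or a Xen unikernel for deployment:

$ mirage configure -t unix   # or -t xen for a bootable unikernel
$ make depend
$ mirage build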

Rump Kernels

Rump kernels enable you to build the software stack you need without forcing you to reinvent the wheels. The key observation is that a software stack needs driver-like components which are conventionally tightly-knit into operating systems — even if you do not desire the limitations and infrastructure overhead of a given OS, you do need drivers.

Uses NetBSD drivers and enables a large amount of existing software to more or less “just work” as a unikernel – some example packages include mysql, nginx, leveldb, haproxy, rust, tor, zeromq.

Unshareable Concepts

I’m uninterested in the latest viral content.

I want more exposure to things that are good, even if you don’t want to share them. Or can’t share them in a moment easily.

Or things that won’t get repeatedly re-shared because they are actually complex and require thought, and are therefore less likely to become a meme.

Or they matter too much to you to share without thought.

Engagement optimized social media underexposes these things systematically.

It’s like we have a biological ecosystem where the most infectious virus won, and we’re slowly seeing all the complex organisms die.

This is a hard problem. All the incentives around attention and money are generally going the opposite direction.

Unfiltered [vghf, bn2b, jglg, docomoji code-poetry]

The Video Game History Foundation

We’re preserving the history of video games, one byte at a time.

Frank Cifaldi’s destiny is this foundation. (At least, that’s what I’ve been telling him.) Preserving video game culture is important – it’s great that Frank has a structure and team to do this full time now. First special collection is great – NES Launch Collection

Twitter Is Bad, Part 400000000000

I think Twitter actually has some sort of weird philosophical stance where brands, consumers, and Russian propaganda bots are all people, and they all stand on equal footing, and must be treated equally. Everybody in charge at Twitter was like “Wow, we live in a world where corporations have all the same rights as people and… it’s turned out great, we better emulate that!”

So great to see Andrew writing. Also, he’s right that foundational assumptions in networks like Twitter (all nodes are equal, anyone can contact people) have huge implications. And they’re hard to change. (See: death of Orkut.com)

Jon Glaser Loves Gear

season 1 trailer

Not sure how I missed Jon Glaser getting a new show until my brother sent it to me.

Well, I do know how. Probably because I pay no attention to anything, and it’s on truTV – and when John Hodgman plugged the show on Comedy Bang Bang and explained he was playing Gear-i, a Siri-like artificial intelligence on a phone that helps Jon Glaser choose what gear to buy, I assumed he was fucking with the audience. But that’s all actually true, and it’s awesome.

I made a 2001-era emoji font! That you can use!

Last month, my coworker casually told me he still has a 2001 era DoCoMo phone, which is one of the first phones to have emoji […] I then took a 10 hour flight to Europe and, for lack of better things to do while watching every movie that came out this year, I drew every one of those emoji as a sprite. 166 emoji in total, 12x12px each, in one of six colors

An amazing hand-tuned, tiny-pixel, usable modern rendition of one of the earliest emoji fonts.

http://code-poetry.com

This website displays a collection of twelve code poems, each written in the source code of a different programming language. Every poem is also a valid program which produces a visual representation of itself when compiled and run.

This is inspiring – both in concept and execution.

You Probably Like Bad Software

The challenge with software is it gets worse over time.

It seems counter-intuitive that the more people work on something, the longer it takes to get done, but that’s a well-established principle in software.

What’s harder to fathom is that it also gets worse. But that is the default outcome. Outside of extraordinary circumstances and extreme measures taken, it’s what you should expect.

Software is a world created by thoughts, where real work and progress over time turn into an endless ouroboros: engineers make software that breaks other software, then have to fix the broken thing, only for it to break again in new ways.

Forever.

If you’re an engineer, it’s very likely you are making software worse every day.

It’s ok, most people who work on software are.

If you are using software, you probably use bad software. You probably like bad software.

It’s ok, most people like bad software.

The alternatives are usually worse.

Perverse Incentives

Software gets worse over time because what people change often isn’t related to making the software better in a coherent, measurable way.

The work is not fixing bugs (boring! not fun!) or improving security (nobody cares until it’s too late!) or making things faster (who cares! computers and phones are faster every year! Moore’s law makes optimization forever unnecessary!)

What’s sexy and interesting in the world of software is adding features, redesigning a user interface, or integrating it with some other unrelated piece of software to help it (synergy!) or monetization – which these days usually means spying on users to better target ads, serving ads, delivering ads, or in rare cases selling things people don’t need more efficiently.

Often this is how individuals working in software show they did something and that’s how they are judged.

(People brag about the new software they make, nobody brags about the terrible awful bugs they had to fix.)

But if there’s a piece of software people are already using, by definition, it is useful and used.

Most of the above is likely going to get in the way of that existing usage.

If you’re not fixing bugs or improving performance – which, unless you are properly testing and measuring things (also boring!), you probably aren’t – you’re probably harming those things. You’re making something that people use worse.

The work is probably attempting to solve a company’s problems, not users’ problems. And the accidental outcome is worse software.

Again, that’s ok, most people make bad software.

Most people use bad software. Most of the software industry is predicated on selling, supporting, and monetizing bad software and making it worse over time.

Underneath It’s Even Worse

The perverse incentives of individuals who work on software are one thing – but it’s when you start moving down the levels of abstraction that things get really scary.

Let’s start with operating systems.

Now the accumulated cruft, random interface changes, inconsistent features, and whatever “me too” garbage gets thrown in to remain “competitive” don’t just impact a little corner of the software world via an application – they have the potential to fuck up every process and program running on top of it.

Eventually, the weight of this nonsense led to people jumping – leaping with joy! – from their computers to phones.

Snobs/weirdos like me in linux/unix/macos/beos/amiga/whateverbsd land were somewhat insulated from this, but it’s not hard to understand how using an iPhone 4S, with a 3.5” screen and consistent touch interface, would be a massive improvement over any version of Windows released after 1995 – which, we too quickly forget, was basically a wasteland of crashing (blue screens of death) and virus-filled malware.

“Getting rid of the garbage on your parents’ windows machine” is an annual ritual for many people.

2007-10 era smartphones were a clean slate – there just hadn’t been enough time for programmers, product managers, marketing hacks, sales guys, and aesthetic-obsessed designers to fuck it up by larding on complexity.

For those in the future who are baffled, let me set the scene.

It’s 2017, and the Apple iPhone 7, a device previously heralded as one of the most beautiful, usable, and understandable products, has a tentpole feature called “3D Touch” – a rebranding of the disastrously named “Force Touch” – that performs different actions depending on the pressure applied while tapping.

Which is different than the different actions performed based on the duration of tapping.

So trying to rearrange the icons on the home screen – already an undiscoverable action, but one users learned over a 10-year period – can, depending on how hard you press, accidentally trigger a nonsense “app menu” which by default includes a single item – “Share.”

Nobody wants to “share” their apps. That is solving developer and company problems (use more apps you don’t need!) not user problems.

And the few who do want to share an app definitely don’t want to do it by pressing REALLY HARD on the icon. The only reason people press really hard on an icon is to move it – an action they somehow had to discover over 10 years, since there’s no affordance for it.

And the iPhone is probably one of the best case scenarios. Some people at Apple are really good at this stuff – they just seem to be increasingly overruled or making bad decisions.

It’s not just them. These things seem inevitable.

Laws Of Bad Software

Given enough popularity, hardware will be mediated by bad software trying to solve corporate problems, not user problems.

Given enough additional code, all software will become bloated and incomprehensible.

Now imagine these software stacks – applications built on frameworks using libraries dependent on operating systems with kernel bugs all packaged into containers deployed on hypervisors built on other operating systems running on virtual machines managed via orchestration systems that eventually somewhere runs on real CPUs and memory and disk drives.

That doesn’t make any sense because nothing makes sense anymore in software.

We used to call people who understood things end to end “full stack engineers” but that’s a bit laughable now because nobody really understands anything end to end anymore.

This Is Your Program, And It’s Ending One Millisecond at a Time

If you aren’t measuring latency, it’s probably getting worse, because every stupid feature you’re adding is slowing things down.

Most software would be improved if people just started turning features off.

Turning features off generally doesn’t sell additional units to consumers, close a sale, or make for a PR fluff piece, so people only do it in times of extreme failure or consequences.

I’ve regularly seen an inverse correlation between the amount of engineering time spent on features and their usage. I’ve seen a direct correlation between the amount of time spent on features and higher latencies pretty much constantly.

Some software cultures understand this and put tight controls in place to prevent regressions (because, you know, it turns out to be a real revenue and/or usage problem when people abandon your software for being too slow), but if your software is already painfully slow due to low standards and atrophy, good luck convincing people to fix it.

Software bloat is the seemingly inevitable and sad reality of nearly all software.

Security

As the layers of complexity start to overwhelm end users, you can only imagine what it’s like for the poor programmers stuck making all this work.

It’s layers upon layers of filth nobody wants to even wade through.

Kind of like how you’d be willing to pay lawyer-like fees just to avoid dealing with legal contracts yourself? Well, tech is like that too now – hence tech wages.

The terrible truth of software security isn’t that people are incompetent or lazy (though that probably happens sometimes.) It’s that the interactions between components, dependencies, and overall systems are now so awful that they may be impossible to secure at a reasonable cost.

That’s not a metaphor – literally, the costs of connectivity may outweigh the benefits, according to insurance risk assessments –

“A future where the annual costs of being connected outweigh the benefits is not only possible, it is happening now. According to our project models, annual cybersecurity costs in high-income economies like the U.S. have already begun to outweigh the annual economic benefits arising from global connectivity.”

How To Stop Bad Software

1. Death

Dead software can’t accumulate additional bugs. It can’t get new features. It can’t get any worse. It also can’t make assumptions about how fast today’s hardware is.

George R. R. Martin writes 1000-page novels on a DOS machine, on WordStar 4.0

If you disconnect hardware from the internet and run old software (or hide it in a virtual machine), it may actually run better, as the inevitable pace of hardware improvement provides speedups without software engineers using that additional power to get in your way.

Bruce Sterling once said, “If your grandfather’s doing a better job at it, you can put that aside for later, when you’re dead, like him.”

If 25 year old dead software is doing a better job of it, then maybe stop trying to top it.

But nobody wants to actually be a neo-luddite and refuse to use any normal technology. It’s like, do you really want to never use Facebook and miss out on everything because you insist on using a goddamned email mailing list? (I do, but I was always anti-social, hence the social aspects of the web were always sort of an anomaly in my life.)

2. Fight complexity

You can use OpenBSD or similarly esoteric systems, where the developers are even more jaded than I am but actually do something about it – see features are faults, redux

This is fighting the good fight – having taste, being smart and proactive, outsmarting and outwitting an endless array of opponents.

The problem is some of those opponents start to look like forces of nature (hostile nation states, friendly nation state three letter agencies, corporations with more money than most nation states) and actual forces of nature (entropy) and it’s just fucking tiring because you know it’s probably just a losing battle that never ends and everybody is just fucking annoyed at you the 99.9% of the time a disaster isn’t happening and the 0.1% of the time it is, people are really fucking annoyed when you say I told you so.

3. Begin Again

When Microsoft’s and Intel’s duopoly led to a certain terrible low quality / high boredom in mass market hardware and software, it provided the fertile ground for the world wide web. By adding a new magical layer of abstraction (the web!) that made the underlying garbage of Wintel a commodity, there was a whole new world of adventure.

Normal people could like, look at the source of a web page, understand what was going on, and write their own!

Now, 20 years later, when nobody can understand which of the dozen competing Javascript frameworks to use to even write something that will reliably manipulate a Document Object Model efficiently enough to re-render a page of malformed, bastardized SGML-derived gunk after a click returns an asynchronously fetched incomprehensible stream of data nested between parentheses and quotes, that seems like a quaint concept.

The clean slate of mobile applications – where limited memory, screen size, CPU, and battery actually provided enough constraints to force engineers, designers, and the software industry to shut up long enough to solve some actual problems in a comprehensible way – seems to be ending.

In my tech lifetime it seems that we only get about 5 years of “non-insane complexity” in our platforms before the “ecosystem” shifts into a swampish nightmare, and then 5-10 years of complete hell before we can move on. (My deep worry is that this pace may be accelerating.)

The current hot place to jump next (internet of things) is going to be pretty fun when it works!

But when that stuff gets too complex, and all the newly networked objects around us start speaking Portuguese we don’t understand and firing off spam emails because nobody bothered to secure the SSH and SMTP daemons on the ancient versions of Linux lurking just beneath the surface, we’re going to be in for a world of pain.

That’s already happening now – we’re already in trouble.

How 1.5 Million Connected Cameras Were Hijacked to Make an Unprecedented Botnet

As many predicted, hackers are starting to use your Internet of Things to launch cyberattacks.

At some point we have to ask ourselves: why the hell would anybody allow a camera that runs Linux into their lives? Linux is an operating system that thinks it is talking to a teletype.

· · ·

We’re building the future of super-intelligent robots, but we’ve put brains in them that are hardwired to think it’s the 1970’s.

When things inevitably go wrong, I hope that they’ll let me watch Star Wars.

· · ·

An earlier version of this essay was published as trenchant.org letter #9 – you probably like bad software.

How I Write, 2017

Amethyst

A tiling window manager. Set to “widescreen tall” mode.

Safari

Faster, leaner, and more dependable than the alternatives, in my opinion, on MacOS.

(Although not as fast or lean as w3m if you’re willing to go that route.)

Aquamacs

A version of emacs for MacOS. It’s sort of the spiritual successor to XEmacs, the weird version of emacs for X from the 90’s, kind of. It’s an acquired taste but I love it.

Used with –

Terminal.app

I use standard MacOS terminals when I’m not using my weird fork of CRT.

Usually there are more interesting things going on, but here we have a live-preview setup for my new static site generator, snkt.

This is using entr to monitor a directory of text files. entr is great! It’s why I haven’t written a file-system watcher in snkt.

With something like

$ ls ~/dailytxt | entr -r snkt -b

Anytime a file changes, snkt rebuilds the site. This takes less than a second on my MacBook Pro.

The other window is looking at the directory of HTML files created, also using entr, and reloads Safari with a reload-browser script.

So anytime I save a text file in that directory, it will rebuild the site on my local machine, and reload the browser automatically, making for a nice fast feedback loop.
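If you’re curious what such a reload-browser script can contain, the whole thing can be an AppleScript one-liner. A minimal sketch – the ~/www/daily output path is hypothetical, and this may not be the exact script in use here:

$ # watch the generated HTML and poke Safari whenever anything changes
$ ls ~/www/daily/*.html | entr reload-browser

where reload-browser just re-assigns the front tab’s URL to itself, forcing a reload:

#!/bin/sh
# reload-browser: reload the frontmost Safari tab
osascript -e 'tell application "Safari" to set URL of current tab of front window to (URL of current tab of front window)'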

I suppose in the modern era I could be using WYSIWYG tools, but I find this to be more satisfying and efficient.

Behind The Scenes

The text files that compose trenchant daily are synchronized in real time via Syncthing to my Linux server. Syncthing is a great alternative to Dropbox if you have a lot of time, energy, and desire to not use Dropbox.

The local configuration on my MacBook Pro for snkt shows all entries, even future-dated ones, but the one on my server will ignore anything with a future date, so if future entries get sync’ed they are ignored.

Because the site only updates once a day, I don’t listen for changes there, but simply have a cron job that rebuilds the site every morning. (Builds take about 1-2s of real time, which is pretty good considering I have 2028 entries.)
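The crontab entry for that is a one-liner – a sketch, with the hour being my guess since the schedule isn’t specified:

# rebuild the site once every morning (assumes snkt is on cron's PATH)
0 6 * * * snkt -b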

I also turned back on a script that tweets new entries out when I include certain metadata in the entry (which is probably the only reason you clicked this!)

Hardware

2016 MacBook Pro, 15”

USA Filco Ninja Majestouch-2, Cherry-MX Black – still the best keyboard I’ve ever owned.

SteelSeries Sensei Mouse – great despite awful aesthetics, but the battery seems to have lost longevity over long usage.

LG 34UM95 34” ultra-wide monitor – the 21x9 aspect ratio is cinematic, and I vacillate between thinking it’s awesome and being like wtf why did I get this. I tried replacing it with one of the new 5K monitors but failed (you’d know the whole tragic story if you were reading the trenchant.org list.)

Sixteen

trenchant daily turned 16 yesterday.

16!

trenchant.org daily’s format – one daily post across multiple media formats, focused on quotes, essays, photographs, and links – was intended to be a departure from the weblogs that were dominant.

And now, here we are, a decade and a half and change later, and not only is having a weblog an archaic concept – so is having a personal web site that is regularly updated.

In 2001 the web felt like a place of beautiful chaos where anything could happen, and the frictionless creation and distribution of information would change everything. That mostly happened, though not always in the ways I expected.

the persistence of daily

In past years I used to reflect on whether I’d still be doing this site in the future or not, and how old I’d be.

Now I think about it a little differently – it’s remarkable how little maintaining a web site has changed over these years, technology-wise.

HTML, CSS, HTTP, not-quite-Unix servers.

Good technologies and protocols are durable. They last. You can expect they will work in a few years when you need them.

The tech is better and cheaper. For $5 a month you can run a hosted virtualized server that has more power to serve web pages than you’ll probably ever need.

What’s changed is that readers spend time in other places now. The web isn’t it anymore.

it’s charming

But the modern, hyper-optimized, aggregated social systems lack the charm of the personal web.

I was trying to explain to coworkers earlier this week that when we removed web design as a part of web publishing, we lost something magical – the “ugly” web of early web sites, and even the centralized services of LiveJournal, Diaryland, Pitas, Blogger, etc., generally had a voice and personality that you can’t get when you decontextualize web pages into aggregated social streams.

Also it’s hard to be charming when you are basically highly optimized surveillance technology for more efficient advertising.

I used to think everyone should have a web site and then we had the dystopia of social networks and I changed my mind but maybe everyone should have a web site. They should just have to learn HTML and UNIX system administration first.

· · ·

The web has a look and feel and voice and authenticity and power that you can’t get elsewhere.

It’s still thrilling to have my own domain name and server and site and make it all look and work and feel exactly how I want.

I hope it still does in another sixteen years.

· · ·

If you enjoyed these posts, please join my mailing list