My all-time favorite video games

Why not list them here. In rough chronological order. Incomplete, very much WIP.

  • Shadowgate (NES) (1989)
  • Final Fantasy VI (1994)
  • Chrono Trigger (1995)
  • World of Warcraft (Original trilogy, Vanilla/TBC/WotLK) (2004)
  • Shadowrun Returns, Shadowrun: Dragonfall (2013)
  • Rimworld (2013)
  • Technobabylon (2015)
  • Tormentum – Dark Sorrow (2015)
  • Valheim (2021)

Asian Geopolitical Predictions, September 2022

The caveat here is that nothing really bad has happened since the end of the Cold War, but I like to write down predictions and find out in a few years whether I was right. No specific timeline for these.

A) Encouraged by deterioration of global supply chains, domestic unrest, and Xi Jinping thought/Wolf Warrior diplomacy/whatever, China takes Taiwan, giving it a strategic position in the first island chain.

B) US fails to retake Taiwan and commences a military buildup in the region. New US bases from Japan down to Southeast Asia.

C) The resulting sanctions and embargoes end supply chains as we know them. Small and export-led economies are the hardest hit. North America fares the best as it has the capacity to be self-sufficient in energy, manufacturing etc. Europe somewhere in between.

And here I was just last week feeling I was underweight on ex-US stocks…

A splash of cold water for the Mars futurists

I think for an average Martian Joe, any type of outdoor recreation would be a rare thing. Rip your suit in this rugged, rocky landscape and almost immediately you start to develop serious injuries, pass out and die. And what if you vomit? How do you pee? Cheaply protecting someone from radiation, near-vacuum, and below-freezing temperatures while keeping them comfortable is way beyond our current level of tech.

Most likely, if we actually built a city of people on Mars, life would be similar to living in an Earth city, except with low-G and you never go outdoors. We’d replicate typical Earth city attractions and add a low-G spin where appropriate. Access to nature would be a problem (an expensive fake Earth hab dome, dangerous outdoor games, no unfiltered sunlight on your skin, etc.).

Elon’s ravings notwithstanding, what makes sense to me is extensive robotic exploration of Mars, perhaps operated by a skeleton crew of people who don’t need quite the extensive training and qualifications of an astronaut. With enough exploring, experimenting and digging, we’re likely to find something very exciting.

Arcane: Season One is a Stupendous Achievement of Animated Storytelling

Arcane is a nine-episode animated series based on the world and characters of the video game League of Legends.

I played League of Legends many years ago but was never a huge fan. When this adaptation was released in November, I gave it a miss. LoL didn’t strike me as a particularly fertile medium for incubating good television.

A few months later I learned, through the accident of one recommendation algorithm or another, that Sting had written a song for the Arcane soundtrack, titled What Could Have Been. Really? THE Sting? Holy shit, this is actually one of his better songs in my opinion.

OK, I reasoned that if they were willing to hire Sting and he made the effort to get this right — I might as well give Arcane a watch.

What followed was the fastest series binge of my life.

Critics have almost universally acclaimed Arcane as the greatest video game adaptation ever made. Admittedly, this is a low bar. That genre is filled with cash grabs and low budget productions. (Not the case with Arcane, which took six years to create, and is rumored to be the most expensive animated series ever produced.)

In fact, Arcane’s triumph in its own category is so obvious that the question which has preoccupied fans and critics since its release has been: Is Arcane the best animation ever created in the West, period? I struggle to think of anything better. How about in the East? That raises the bar substantially. Are we dealing with something here which rivals the ethereal creativity of Miyazaki’s capstones or the existential heft of the better Ghost in the Shell adaptations? I’m not sure, but owing to the progress of technology and the talent of Fortiche Productions, the quality of Arcane’s animation surpasses them all. Its music is also one hell of a ride (Sting, Raymond Chen, Imagine Dragons, and Pusha T all contributing original work to one album? WTF*$!#! planet am I on?!).

And we still haven’t gotten to the real strength of Arcane: its story and character development, which have lit the Internet on fire with endless plot analyses and fan reaction videos. This show makes people cry, man. It makes them cry buckets. It tackles subject matter which is deeply relevant to modern audiences, but it’s smart enough not to beat them over the head with moral conclusions. I question whether a story like Arcane’s could ever be told in a live action production, and I’m not talking about the fantasy elements. I’m talking about the personal impact of the issues around which the plot revolves, the flaws of its heroes, the humanity of its villains, and how close to home the hard parts clearly hit for a surprising number of people; it’s the kind of material a Hollywood studio in 2022 would struggle to green-light. Joker comes to mind as a live action production in this category, and that’s about it. (I think Arcane is better than Joker.)

So ironically this is a computer animated production which manages to be more real than just about all the live action out there.

Arcane certainly has its flaws, which I’ve started to notice after watching it three times, but it feels like dwelling on them would be doing it a disservice at this stage. It has no equal within its category. In the future, people may talk about animation in terms of what came before Arcane and what came after it. Right now, the thing to do is just to watch it.

Can Twitter Be Salvaged?

Twitter’s very popular, but the way it reduces communication to a series of hastily crafted, artificially truncated blurbs produces dystopian quantities of toxic and fallacious discourse. Quality Twitter is a website which attempts to provide a better experience. It presents Twitter threads with a Medium-style reading experience. (A Twitter thread is a series of tweets which are all written by the same author and have been linked together by that author.) Individual, non-thread tweets as well as all comments are excluded. Threads are displayed as if they were “long read” style articles. Popular topics or hashtags are featured on the homepage like a news site, so the user can browse through multiple “articles” on the topic he’s interested in.

The theory behind this is that Tweet threads tend to be the highest quality content on Twitter, with single tweets and comments generally containing most of the knee-jerk trash. (A casual scan of recent Twitter threads found this to be true, though a lot of threads are still garbage, and many feel disjointed when you read them because they weren’t composed as a single piece of prose.)
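The “unroll” step at the heart of this idea can be sketched in a few lines. Everything below is hypothetical: the tweet shape (`id`, `author`, `text`, `in_reply_to`) is a simplification, and a real version would map these fields from whatever the Twitter API actually returns.

```python
# Sketch of the core "unroll" step: follow an author's self-reply chain
# from the root tweet and join the texts into one long-read document.
# The tweet shape here is made up for illustration.

def unroll_thread(tweets, root_id):
    """Stitch an author's self-replies into a single article string."""
    by_parent = {t["in_reply_to"]: t for t in tweets}
    root = next(t for t in tweets if t["id"] == root_id)
    parts = [root["text"]]
    current = root
    while True:
        child = by_parent.get(current["id"])
        if child is None or child["author"] != root["author"]:
            break  # thread ended, or the next reply is by someone else
        parts.append(child["text"])
        current = child
    return "\n\n".join(parts)

thread = [
    {"id": 1, "author": "alice", "text": "A thesis.", "in_reply_to": None},
    {"id": 2, "author": "alice", "text": "Supporting point.", "in_reply_to": 1},
    {"id": 3, "author": "bob", "text": "Nice thread!", "in_reply_to": 2},
]
article = unroll_thread(thread, 1)  # "A thesis.\n\nSupporting point."
```

Note that bob’s reply is dropped on purpose: comments and non-thread tweets are exactly the content Quality Twitter would exclude.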

Quality Mastodon might work too, if Mastodon supports threads, but Twitter has more content.

This idea is similar to threadreaderapp.com, but they differ in a few ways:

1. TRA requires opt-in: they only “unroll” a thread if someone requests it. Opt-in is irrelevant to my idea; as far as I know, presenting any Tweet thread in this format is compliant with the Twitter ToS. The focus of Quality Twitter is to expose the best content for popular topics or hashtags, so it would instead monitor those hashtags and add new threads as they emerge.

2. TRA has a poor discovery, browsing and reading experience. You can only discover threads by searching for hashtags. They don’t give you any hints as to which hashtags might be interesting to search for. The thread browsing and reading experience is mediocre. URLs are not SEO friendly. Overall, the site doesn’t feel like a destination. This is all pretty easy stuff to fix so it probably just isn’t the focus of the small team behind TRA.

Would Quality Twitter work as a business? Maybe, but I’m not sure it’s a slam dunk. The content would be better than Twitter, but worse than Medium. The only means of revenue generation I can think of are ads and subscriptions, and traditional publications struggle to make ends meet online through these methods. The presentation of the content might be more satisfying than Twitter itself, but it would definitely be less addictive. Still, there might be a lot of people who would like to read the more thoughtful content posted to Twitter and avoid the rest, and the idea isn’t capital intensive (the engineering is pretty simple and there’s not much else to the product). Dependency on Twitter is an obvious liability, but it could perhaps be mitigated by introducing other sources of content (any textual social media would be a candidate for inclusion).

This points to a deeper problem: in the world we live in today, billions of dollars are being poured into software which hijacks our lizard brains in order to maximize profits for its owner, even when this isn’t in the best interest of the user. Examples include most popular social media as well as pay-to-win/lootbox video games, on which compulsive addicts spend tens of thousands of dollars. These products are the Frosted Flakes of software: you don’t eat them because they’re healthy, you eat them because your brain is hooked on sugar. Most people who use this software don’t know any better or are using it for the same reasons we all get addicted to something: we crave a distraction to dull the pain of existence. Have we ever found a way to counteract the lure of tobacco, alcohol, drugs or gambling?

My initial answer to this question was no: all we’ve come up with is regulating these things to limit the damage they cause in excess. But I realized we actually do have another trick up our collective sleeve, which is to substitute something that’s equally addictive, but less harmful. This is a big part of why vaping is so popular: it’s about as addictive as smoking but probably less deadly, so the switch makes sense to both the lizard limbic system and the rational prefrontal cortex. Sure enough, a lot of people are making this switch. I have to think more about this.

Cliffs notes on wearing a mask for coronavirus

I’ll explain whether you should wear a mask for the novel coronavirus and why the debate has been complicated.

The short answer is yes, make or buy a reusable cloth mask, wash it and wear it regularly.

  • Wearing a mask properly reduces your risk of contracting COVID-19. Crucially, if you already have the disease, it decreases the chance you will spread the coronavirus to other people.
  • For the purpose of preventing COVID-19, any brand new mask is probably better than no mask, since it’s primarily spread through liquid droplets.
  • Wearing masks properly includes not touching anything other than the ear straps, and changing to a new mask regularly. Use of the mask degrades its effectiveness.
  • The outcome of wearing a mask improperly is neither well studied nor well understood; it may help, hurt, or have no effect in different circumstances.
  • Wearing a mask is not a substitute for social distancing, which includes maintaining 6 feet of distance from other people at all times.
  • The supply of masks is limited and as a result health care workers in many places don’t have enough masks to use them properly. If health care workers become infected it can lead to a catastrophic outcome because they easily become super-spreaders.
  • Advice from various authorities to not wear a mask made sense because the supply is limited and it needs to be reserved for health care workers.
  • Advice to make your own mask, wear and wash it regularly also makes sense because while we don’t know much about this approach, it’s probably better than nothing.

Any questions?

Netflix pulled the plug on a feature designed to get kids addicted to Netflix

https://www.vanityfair.com/hollywood/2018/03/netflix-patch-testing-kids-binge-watching

Any metric or KPI is going to get gamed in harmful ways if it’s prioritized too highly. So the answer is to use A/B testing judiciously: to make sure a design change isn’t regressing on your goals, and to validate whether the change is moving things in the right direction. But beyond that you have to do the hard slog of mastering the discipline of UX, real, human-centric design, and accept that not everything important is measurable.
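As a sketch of what that validation step boils down to statistically, here is a two-proportion z-test on conversion counts from a control (A) and a variant (B), using only the standard library. The experiment numbers are made up for illustration.

```python
# A two-proportion z-test: is variant B's conversion rate significantly
# different from control A's?
from math import sqrt, erf

def ab_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for the difference in rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)            # pooled rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf, doubled for the two-sided tail
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Made-up experiment: 200/1000 conversions on A, 260/1000 on B
z, p = ab_z_test(200, 1000, 260, 1000)
significant = p < 0.05
```

Of course, a real experiment also needs sample sizes chosen up front and a guard against peeking; this only shows the arithmetic at the end.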

We have plenty of examples of how doing real UX instead of playing a numbers game can differentiate your business; Apple has applied this philosophy consistently over many years.

The root of the problem is a “fuck you, market share at all costs” culture that has come to permeate Silicon Valley. And you can argue (somewhat cynically) that this philosophy makes sense in a blue ocean where you have no competition and just need to gobble up people and turn them into cash before someone else does. But I think going forward this mentality may actually become a liability as more humane alternatives to heavily despised products emerge. Many of the current crop of giants seem to have forgotten that a company’s most valuable asset is always its brand.

On monopolies

Winners in the tech industry turn into giants via vertical integration and predatory pricing. They keep adding stuff related to the segment they first won in. They keep making stuff cheap or free to undercut competitors.

There are many examples of this and we have two big examples in the US of anti-trust regulators stepping in and hobbling the giant: IBM in the 60s/70s and Microsoft in the 90s/00s. In both cases the government forced the giant to change its behavior and the industry eventually experienced a new wave of small firms that filled the void, competed, and innovated.

Notably in both cases the giant also got off relatively easy in the end (no breakup). And both are successful companies to this day. But they were tied up in court long enough to lose their ability to squelch competitors and dominate new markets.

Now you could argue this was not fair to IBM or to Microsoft (in the future, to Google?). In both cases you’d be right. I’d argue you don’t have to be fair to the winner especially once he starts playing dirty tricks to keep others down. You be fair to the little guys and you get tough on the big guys, that’s the foundation for both a healthy market and a healthy society.

You could also argue that making things free is good for consumers. You’d be right again, except in the case where the freebies are designed to kill off competition and stagnate the product category (Internet Explorer being a classic case where Microsoft bundled it, gave it away for free, then stopped improving it for years).

We have 50 years of precedents and the people at the DOJ know their history. I suspect it’s just a matter of political winds and timing before Alphabet has to follow in IBM and Microsoft’s footsteps.

Social discussion vs social soundbiting

The centralization of social media is a problem, but I think maybe we’re dealing with an even deeper problem. I joined Mastodon, the potential is clear, and there is already an early adopter audience sharing content I find more interesting than Twitter/FB.

But it only took me a few days to realize that I wasn’t thrilled with Mastodon’s content either, and I realized why: it’s all still short-form.

The best and most enlightening discussions I’ve had are usually slow, drawn-out exchanges of a few hundreds of words per message.

And the Internet has had a medium for this since very early on: mailing lists. Well-curated mailing lists and newsgroups (almost the same thing) were long a mainstay of the Internet’s best communities. Unfortunately they are not webby, and they have declined in popularity over time.

I think there is another reason the discourse is all short-form these days. Short-form is easier on a phone. No one writes a couple hundred words of their best thoughts from their phone. Almost universally if you look at the content we post from our phones, most of it is lightweight, low intellectual value, not a product of critical thinking. It is either a straight up bad idea (Elon Musk style career-ending pedo and stock tweets) or it is low value stuff (share a news article, food photo #120,315,838,284,942).

It’s very bad for our public discourse to be so heavily influenced by throwaway content.

Marshall McLuhan said that the medium is the message. Maybe we are all producing crap because we spend so much time on devices that are crappy at authoring. (On the other hand, radio, a medium where you can’t reply at all, produced Hitler — by moving to a medium where you can make shitty replies, have we moved forward or backward?)

I don’t know of many social media platforms which promote thought-provoking discourse (and I’m not really talking about 3,000 word Medium-style longreads which no one can possibly respond to in full). I always go back to Hacker News as my favorite one and I think they are successful in large part due to having a narrow focus and strict, pro-intellectual moderation.

The history of web self-publishing from a certain angle

About 20 years ago something new arose on the Web: web logging, or ‘blogging’. People would write long and thoughtful articles about their interests.

And a few years later we got RSS, which was a way of syndicating the content of your blog. Other people could read it in whatever app they preferred.
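For anyone who never saw one up close, an RSS feed is just a small XML document listing your posts: a channel with one item per post. A minimal sketch of generating one (the blog title and URLs are made up):

```python
# Build a minimal RSS 2.0 feed: a <channel> with one <item> per post.
import xml.etree.ElementTree as ET

def build_feed(title, link, posts):
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    ET.SubElement(channel, "link").text = link
    for post in posts:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = post["title"]
        ET.SubElement(item, "link").text = post["link"]
    return ET.tostring(rss, encoding="unicode")

feed = build_feed("My Blog", "https://example.com",
                  [{"title": "Hello, world", "link": "https://example.com/hello"}])
```

That simplicity was the point: any reader app could fetch this document on a schedule and show you new posts.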

And a few years after that, Google (back when “don’t be evil” was still part of their code of conduct) launched Google Reader, an easy way to scoop up and read the content from your favorite blogs. Using Reader to follow the RSS feeds of your favorite blogs made for a complete and compelling ecosystem of high quality, independent publishing.

In the mid-2000s a few other publishing services came into being, and a few big winners emerged — notably Facebook and Twitter. These services were very different from the old blogging ecosystem. They were centrally controlled and their design encouraged people to spend many hours consuming bite-size, throwaway content because that was the best way to sell more ads. Google entered this category with Google+ in 2011, and shortly thereafter discontinued Reader. Reader could be seen as a competitor to G+ and it never made much money.

The old blogging and syndication scene never really recovered from the death of Reader. It is still used and appreciated by many today, but it’s Facebook, Twitter and (kind of) Google who shape the world zeitgeist. A zeitgeist in which we grapple with fake news, elections rigged by foreign powers, and privacy relegated to a past age; congressional hearings and EU regulations to keep these monsters at bay.

One wonders if the world itself might be a very different place today, had Google shown some vision and doubled down on something like Reader. Doubled down on promoting thoughtful, independent content instead of A/B optimizing for maximum engagement with advertising.

Maybe economics dictated that the world of today was inevitable, maybe not. But from Mastodon to Medium, most new publishing tools are seeking a cure for an industry that is quite clearly sick. One wonders.